I am encountering the issue below when mounting an Azure Data Lake Storage Gen2 file system using Python on Azure Databricks. Has anyone faced a similar issue and found a solution?
My Storage account Name: projectstoragegen2
My Blob Container Name/File System: gen2loading
The error says ‘Invalid configuration value detected for fs.azure.account.key’ and points to the first spark.conf.set command below.
spark.conf.set("fs.azure.account.key.projectstoragegen2.dfs.core.windows.net", dbutils.secrets.get(scope = "proj-adb-gen2-blob", key = "accountkey"))
spark.conf.set("fs.azure.createRemoteFileSystemDuringInitialization", "true")
dbutils.fs.ls("abfss://gen2loading@projectstoragegen2.dfs.core.windows.net/")
spark.conf.set("fs.azure.createRemoteFileSystemDuringInitialization", "false")
My next step would be:

configs = {"fs.azure.account.auth.type": "OAuth",
           "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
           "fs.azure.account.oauth2.client.id": "<your-service-client-id>",
           "fs.azure.account.oauth2.client.secret": dbutils.secrets.get(scope = "<scope-name>", key = "<key-name>"),
           "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/<your-directory-id>/oauth2/token"}

# Optionally, you can add <your-directory-name> to the source URI of your mount point.
dbutils.fs.mount(
    source = "abfss://<your-file-system-name>@<your-storage-account-name>.dfs.core.windows.net/",
    mount_point = "/mnt/<mount-name>",
    extra_configs = configs)
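Once the mount works, I would verify and use it along these lines (a small usage sketch; the mount name and folder are placeholders matching the code above):

# Confirm the new mount point shows up
display(dbutils.fs.mounts())

# Read data through the mount (folder name below is a placeholder)
df = spark.read.csv("/mnt/<mount-name>/<some-folder>/", header=True)

# Unmount first if the mount needs to be recreated with different credentials
dbutils.fs.unmount("/mnt/<mount-name>")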
Please let me know if I am missing something here.
Thank you
Hi! Did you resolve your problem? I'm having the same issue...
Thanks!
Running into the same issue. Is there a workaround or fix for setting this config?
Having the same issue. Did you solve it?
Dear Members,
I figured out the issue.
# instead of this line
"fs.azure.account.oauth2.client.secret": dbutils.secrets.get(scope = "<scope-name>", key = "<key-name>"),
# use the value directly, as below
"fs.azure.account.oauth2.client.secret": "<key-name>",
Thanks and Regards,
Vijay