How you can use Azure Key Vault to store your credentials and secret keys and use them in Databricks

Aditya.Somwanshi
3 min read · Nov 30, 2021


Hello everyone! In this blog we will see how we can use Azure Key Vault to store our sensitive information in a secure way, and then use that information in Azure Databricks.

Step 1:
First of all, we need a Databricks workspace, so go ahead and create a Databricks resource in the Azure portal. Select the basic pricing tier and a minimum number of nodes, as we are dealing with small data for now.

Once you are in your workspace, first create a cluster with a minimum configuration.

After the cluster, it's time to create a notebook with Python as the language; you will see the cluster you created in the dropdown.
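If you prefer to script this setup rather than click through the portal, here is a minimal sketch that creates a small cluster through the Databricks Clusters REST API. This is my own assumption, not something this post relies on: the workspace URL, token, runtime version, and node type below are placeholders you must replace with your own.

import requests

# Placeholders (assumptions): your workspace URL and a personal access token
DATABRICKS_HOST = "https://adb-<workspace-id>.azuredatabricks.net"
TOKEN = "<personal-access-token>"

# A minimal single-worker cluster; the values are illustrative
payload = {
    "cluster_name": "small-demo-cluster",
    "spark_version": "9.1.x-scala2.12",
    "node_type_id": "Standard_DS3_v2",
    "num_workers": 1,
    "autotermination_minutes": 30,
}

resp = requests.post(
    f"{DATABRICKS_HOST}/api/2.0/clusters/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=payload,
)
print(resp.json())  # contains the new cluster_id on success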

Step 2:
Now it's time to create an Azure Key Vault that will store our secrets. Again, go to the Azure portal, search for key vaults, and create that resource, keeping the pricing and configuration at the basic settings.
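If you would rather create the vault from code, a minimal sketch with the azure-mgmt-keyvault package could look like the following; the subscription ID, resource group, vault name, region, and tenant ID are all placeholders you need to fill in.

from azure.identity import DefaultAzureCredential
from azure.mgmt.keyvault import KeyVaultManagementClient

credential = DefaultAzureCredential()
client = KeyVaultManagementClient(credential, "<subscription-id>")

# Create (or update) the vault with the standard SKU
poller = client.vaults.begin_create_or_update(
    "<resource-group>",
    "<vault-name>",
    {
        "location": "<region>",  # e.g. the region of your Databricks workspace
        "properties": {
            "tenant_id": "<tenant-id>",
            "sku": {"family": "A", "name": "standard"},
            "access_policies": [],
        },
    },
)
vault = poller.result()
print(vault.properties.vault_uri)  # this is the DNS name we will need in Step 6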

Step 3:
For now, we will connect Databricks with Azure Blob Storage, where we have our dataset for transformations. Go to your storage account, then to the Access keys section, and copy key1.
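If you want to fetch key1 programmatically instead of copying it from the portal, a minimal sketch with the azure-mgmt-storage package is below; the subscription ID, resource group, and storage account name are placeholders.

from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient

credential = DefaultAzureCredential()
client = StorageManagementClient(credential, "<subscription-id>")

# list_keys returns both key1 and key2 for the storage account
keys = client.storage_accounts.list_keys("<resource-group>", "<StorageAccName>")
key1 = keys.keys[0].value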

Step 4: Now go to the Key Vault and create a secret. The name can be anything; the value will be the key that you have just copied.
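The same secret can also be created from code with the azure-keyvault-secrets package. A minimal sketch, assuming a placeholder vault name and using datasetblob as the secret name (the name this post uses later in the notebook code):

from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

client = SecretClient(
    vault_url="https://<vault-name>.vault.azure.net",
    credential=DefaultAzureCredential(),
)

# Store the copied storage account key under the name "datasetblob"
client.set_secret("datasetblob", "<key1-you-copied>")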

Step 5:
Now that the key is in the Key Vault, we will connect Databricks with the Key Vault. For this, we will have to create a secret scope in Databricks. Add the following after your Databricks URL: https://adb*************.azuredatabricks.net/#secrets/createScope

Basically, delete everything after .net/ and add #secrets/createScope.
You will see the Create Secret Scope screen.

Step 6:
Give the scope a name and note it somewhere, because we will need this name in our code. Manage Principal can be set to All Users; this will allow all users of this workspace to use this scope. Now let's link this to the Azure Key Vault.
The DNS name will be the Vault URI from the Azure Key Vault, and the Resource ID will be the resource ID of the Key Vault. Both can be found in the Properties section of the Key Vault.
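Once the scope is created, you can verify it from a notebook. dbutils.secrets has list helpers for exactly this; myscope is the scope name used in this walkthrough, so substitute yours.

# List all secret scopes visible in this workspace
print(dbutils.secrets.listScopes())

# List the secret names inside our scope (names only, never values)
print(dbutils.secrets.list("myscope"))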

Step 7: Now let's use this scope in our code. In the notebook you created earlier, write the following code.

# Configure Spark to access the storage account with the key fetched from Key Vault
spark.conf.set("fs.azure.account.key.<StorageAccName>.blob.core.windows.net", dbutils.secrets.get(scope="myscope", key="datasetblob"))

# Path to the dataset inside the blob container
filepath = "wasbs://<containername>@<StorageAccName>.blob.core.windows.net/iris data.json"

# Read the JSON file and display the resulting DataFrame
irisDf = spark.read.format("json").load(filepath)
display(irisDf)

Here myscope is my scope name (you will have to put yours), datasetblob is the secret name created in Step 4, and the file path is where the file is stored in the blob container.
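One thing worth knowing: Databricks redacts secret values in notebook output, so even deliberately printing the key will not leak it.

# The value is usable by Spark, but notebook output shows it as [REDACTED]
print(dbutils.secrets.get(scope="myscope", key="datasetblob"))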

Execute this code and you should see the contents of the dataset displayed as a table.

In this way, you can use Azure Key Vault to store your credentials and use them in Databricks. Hope this helps!
