Storing and Accessing Secrets in Azure Key Vault

Author

Miel Hostens, Enhong Liu et al.

Published

July 25, 2025

Before accessing the secret in Databricks, make sure to:

  1. ✅ Talk to Enhong or Miel to store the key in the Azure Key Vault.
  2. ✅ Use the secret scope bovi_analytics_secrets_scope in Databricks.

🔐 Access the Key in Databricks

Use the following code snippet to retrieve the secret:

```python
# Retrieve the secret from Azure Key Vault via the Databricks secret scope
# Replace <data> with the name of the data source you want to access
# Replace <your-key-name> with the actual key name (see the table below)
<data>_secret = dbutils.secrets.get(scope="bovi_analytics_secrets_scope", key="<your-key-name>")

# Print the secret (only for debugging; avoid in production)
print(<data>_secret)
```
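If you do need to eyeball a retrieved value during debugging, printing only a masked version is safer than printing it in full. A minimal sketch (the `mask_secret` helper and the dummy value are illustrative, not part of the Databricks API):

```python
def mask_secret(secret: str, visible: int = 4) -> str:
    """Return the secret with all but the last `visible` characters masked."""
    if len(secret) <= visible:
        return "*" * len(secret)
    return "*" * (len(secret) - visible) + secret[-visible:]

# Example with a dummy value (never hard-code real secrets):
print(mask_secret("abc123XYZ789"))  # ********Z789
```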

Overview of blob storage accounts and keys

| blob-storage-account | scope                        | key                                  |
|----------------------|------------------------------|--------------------------------------|
| lab-us               | bovi_analytics_secrets_scope | azure-bovi-analytics-lab-us          |
| lab-eu               | bovi_analytics_secrets_scope | azure-bovi-analytics-lab-eu          |
| gpluse               | bovi_analytics_secrets_scope | azure-bovi-analytics-gpluse-eu       |
| playbehavior         | bovi_analytics_secrets_scope | azure-bovi-analytics-playbehavior-eu |
| methanedata          | bovi_analytics_secrets_scope | azure-bovi-analytics-methanedata-us  |
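The account-to-key mapping above can also be kept in code so the right key name is looked up by storage-account name rather than copied by hand. A minimal sketch (the dictionary and helper names are illustrative, not part of the official setup):

```python
# Lookup table mirroring the scope/key table above
SECRET_SCOPE = "bovi_analytics_secrets_scope"
ACCOUNT_KEYS = {
    "lab-us": "azure-bovi-analytics-lab-us",
    "lab-eu": "azure-bovi-analytics-lab-eu",
    "gpluse": "azure-bovi-analytics-gpluse-eu",
    "playbehavior": "azure-bovi-analytics-playbehavior-eu",
    "methanedata": "azure-bovi-analytics-methanedata-us",
}

def secret_lookup(account: str) -> tuple:
    """Return the (scope, key) pair for a given blob storage account name."""
    return SECRET_SCOPE, ACCOUNT_KEYS[account]

print(secret_lookup("lab-eu"))
# ('bovi_analytics_secrets_scope', 'azure-bovi-analytics-lab-eu')
```

The returned pair can then be passed straight to `dbutils.secrets.get(scope=..., key=...)` inside Databricks.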


Real Example: Accessing Bovi-Analytics Data from Azure Blob Storage

In this example, we use a secret stored in Azure Key Vault (retrieved via Databricks Secret Scope) to authenticate and access data stored in an Azure Blob Storage account.

```scala
%scala
// Set the Azure Blob Storage account key using the Databricks secret
// Replace <blob-storage-account> and <your-key-name> with values from the table above
spark.sparkContext.hadoopConfiguration.set(
  "fs.azure.account.key.<blob-storage-account>.blob.core.windows.net",
  dbutils.secrets.get(scope = "bovi_analytics_secrets_scope", key = "<your-key-name>")
)
```

```python
%python
# Set the Azure Blob Storage account key using the Databricks secret
# Replace <blob-storage-account> and <your-key-name> with values from the table above
spark.conf.set(
  "fs.azure.account.key.<blob-storage-account>.blob.core.windows.net",
  dbutils.secrets.get(scope="bovi_analytics_secrets_scope", key="<your-key-name>")
)
```
```python
# Access farm data stored in blob storage
# Replace <container-name> and <blob-storage-account> with your own values
file_location = "wasbs://<container-name>@<blob-storage-account>.blob.core.windows.net/path-to-file/"
file_type = "text"  # Spark's plain-text reader; use "csv" or "parquet" as appropriate
data_set = spark.read.format(file_type).load(file_location)
display(data_set)
```
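Both the Spark config property name and the `wasbs://` URI follow a fixed pattern, so small helpers can build them and avoid typos in the long strings. A minimal sketch (the helper names and the `farmdata`/`lab-eu` example values are hypothetical):

```python
def blob_conf_key(account: str) -> str:
    """Build the Spark config property name for a Blob Storage account key."""
    return f"fs.azure.account.key.{account}.blob.core.windows.net"

def wasbs_path(container: str, account: str, path: str) -> str:
    """Build a wasbs:// URI for a file or folder in Blob Storage."""
    return f"wasbs://{container}@{account}.blob.core.windows.net/{path.lstrip('/')}"

print(blob_conf_key("lab-eu"))
# fs.azure.account.key.lab-eu.blob.core.windows.net
print(wasbs_path("farmdata", "lab-eu", "sensors/2025/"))
# wasbs://farmdata@lab-eu.blob.core.windows.net/sensors/2025/
```

Inside Databricks these would be used as `spark.conf.set(blob_conf_key(account), dbutils.secrets.get(...))` followed by `spark.read.format(...).load(wasbs_path(...))`.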