🔑 Managing Secrets in Azure Databricks using dbutils.secrets
When working with cloud resources such as Azure Data Lake, databases, APIs, or third-party services, you often need to handle sensitive credentials like client secrets, access keys, tokens, and passwords. Databricks Secrets Utility helps you manage these secrets securely within notebooks and clusters.
Let’s dive into how to implement secrets utility in Databricks notebooks and clusters with best practices and examples.
🧰 What is dbutils.secrets?
dbutils.secrets is part of Databricks Utilities and lets you access secret scopes configured either:
- Natively in Databricks (Databricks-backed scope)
- Or via Azure Key Vault integration
🔧 Basic Syntax:
dbutils.secrets.get(scope="scope-name", key="key-name")
🧪 Implement Secrets Utility in Databricks Notebooks
✅ Step-by-step Example:
- Create a Secret Scope (Databricks UI or CLI)
Example CLI command:
databricks secrets create-scope --scope demo-scope
- Add a Secret to the Scope
databricks secrets put --scope demo-scope --key my-secret-key
(This opens an editor where you enter the secret value; the legacy CLI also accepts --string-value.)
- Access the Secret in a Notebook:
# Securely read the secret
my_secret = dbutils.secrets.get(scope="demo-scope", key="my-secret-key")
# Use it in your logic
print("Secret value retrieved successfully.")
🔒 The secret value is never printed in plaintext in logs or outputs.
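The same notebook code often needs to run outside Databricks (local development, unit tests), where dbutils is not defined. One hedged pattern is a small wrapper with an environment-variable fallback; the SCOPE__KEY naming scheme below is purely illustrative, not a Databricks convention:

```python
import os

def get_secret(scope: str, key: str) -> str:
    """Read a secret via dbutils.secrets on Databricks; fall back to an
    environment variable when running locally. Naming scheme is illustrative."""
    try:
        # dbutils is injected into Databricks notebooks automatically
        return dbutils.secrets.get(scope=scope, key=key)
    except NameError:
        # Not on Databricks: derive an env var name like DEMO_SCOPE__MY_SECRET_KEY
        env_name = f"{scope}__{key}".upper().replace("-", "_")
        return os.environ[env_name]
```

Usage stays identical in both environments: token = get_secret("demo-scope", "my-secret-key").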
⚙️ Implement Secrets Utility in Databricks Clusters
You can also pass secrets as part of cluster-level configurations so all jobs and notebooks within the cluster can access them.
✅ Spark Config using secrets:
spark.conf.set("fs.azure.account.key.<storage_account>.dfs.core.windows.net",
dbutils.secrets.get(scope="azure-secrets", key="adls-key"))
Or in the cluster’s Spark Config UI:
spark.hadoop.fs.azure.account.key.<storage_account>.dfs.core.windows.net {{secrets/azure-secrets/adls-key}}
🛑 The {{secrets/...}} syntax only works in the cluster's Spark config UI, not in notebooks.
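Both forms target the same Spark config key; they differ only in where the secret reference is resolved. As a sketch (the storage account, scope, and key names are placeholders), a small helper can keep the two spellings consistent:

```python
def adls_spark_conf(storage_account: str, scope: str, key: str) -> tuple[str, str]:
    """Return (Spark config key, UI secret reference) for an ADLS Gen2
    account key. All names here are placeholders."""
    conf_key = f"fs.azure.account.key.{storage_account}.dfs.core.windows.net"
    # The {{secrets/scope/key}} form is what goes into the cluster Spark config UI
    ui_value = f"{{{{secrets/{scope}/{key}}}}}"
    return conf_key, ui_value
```

In a notebook you would instead pass dbutils.secrets.get(scope, key) as the value for the same config key.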
🔐 Best Practices for Managing Secrets
| Best Practice | Description |
|---|---|
| ✅ Use Azure Key Vault | Enterprise-grade secret storage with central rotation and auditing |
| 🔐 Avoid hardcoding secrets | Always retrieve credentials via dbutils.secrets.get() |
| 🔁 Rotate secrets regularly | Limits the blast radius of a leaked credential |
| 🧪 Separate scopes per environment | e.g., dev-scope, prod-scope |
| 👥 Control access to scopes | Use scope ACLs to restrict who can read or manage secrets |
📌 Summary
| Task | How to Do It |
|---|---|
| Access secrets in notebooks | dbutils.secrets.get(...) |
| Create a secret scope | Databricks CLI or UI |
| Use secrets in cluster configs | {{secrets/scope/key}} in the Spark config UI |
| Store secrets externally | Azure Key Vault-backed scope |
📘 Wrap-up
Using Databricks Secrets Utility is the safest and most reliable way to work with sensitive information in your notebooks and clusters. Whether you’re accessing Azure services, APIs, or any confidential data, never hardcode credentials — always use secure secret scopes!