Mohammad Gufran Jahangir April 20, 2025


🔑 Managing Secrets in Azure Databricks using dbutils.secrets

When working with cloud resources such as Azure Data Lake, databases, APIs, or third-party services, you often need to handle sensitive credentials like client secrets, access keys, tokens, and passwords. Databricks Secrets Utility helps you manage these secrets securely within notebooks and clusters.

Let's dive into how to implement the secrets utility in Databricks notebooks and clusters, with best practices and examples.


🧰 What is dbutils.secrets?

dbutils.secrets is part of Databricks Utilities and lets you read secrets from secret scopes configured either:

  • Natively in Databricks (a Databricks-backed scope)
  • Through Azure Key Vault integration (a Key Vault-backed scope)

🔧 Basic Syntax:

dbutils.secrets.get(scope="scope-name", key="key-name")

🧪 Implement Secrets Utility in Databricks Notebooks

✅ Step-by-step Example:

  1. Create a secret scope (via the Databricks UI or CLI). Example CLI command:
     databricks secrets create-scope --scope demo-scope
  2. Add a secret to the scope:
     databricks secrets put --scope demo-scope --key my-secret-key
  3. Access the secret in a notebook:
# Securely read the secret
my_secret = dbutils.secrets.get(scope="demo-scope", key="my-secret-key")

# Use it in your logic
print("Secret value retrieved successfully.")

🔒 If you try to display a secret, Databricks redacts it in notebook output (it appears as [REDACTED]), so the value is never printed in plaintext in logs or results.
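On a cluster, `dbutils` is predefined, so the call above just works; outside Databricks it raises a NameError. For developing the same code locally, one hedged pattern is a small wrapper that falls back to an environment variable. The helper name and the env-var naming convention below are assumptions for illustration, not a Databricks API:

```python
import os

def get_secret(scope: str, key: str) -> str:
    """Read a secret from Databricks, or fall back to an env var locally.

    Hypothetical helper: `dbutils` exists only on a Databricks cluster.
    Locally, "demo-scope"/"my-secret-key" maps to DEMO_SCOPE_MY_SECRET_KEY.
    """
    try:
        return dbutils.secrets.get(scope=scope, key=key)  # noqa: F821
    except NameError:  # not running on Databricks
        env_name = f"{scope}_{key}".upper().replace("-", "_")
        return os.environ[env_name]
```

Locally you would export DEMO_SCOPE_MY_SECRET_KEY before running; on a cluster the wrapper behaves exactly like dbutils.secrets.get.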


βš™οΈ Implement Secrets Utility in Databricks Clusters

You can also pass secrets as part of cluster-level configurations so all jobs and notebooks within the cluster can access them.

✅ Spark Config using secrets:

spark.conf.set("fs.azure.account.key.<storage_account>.dfs.core.windows.net", 
               dbutils.secrets.get(scope="azure-secrets", key="adls-key"))

Or in the cluster's Spark Config UI:

spark.hadoop.fs.azure.account.key.<storage_account>.dfs.core.windows.net {{secrets/azure-secrets/adls-key}}

🛑 This syntax ({{secrets/...}}) works only in the cluster's Spark config UI, not in notebooks.
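Since the account-key setting follows a fixed naming pattern, it can be convenient to build the config key programmatically. A minimal sketch; the storage-account, scope, and key names here are placeholders, not real resources:

```python
def adls_account_key_conf(storage_account: str) -> str:
    """Build the Spark config key for ADLS Gen2 account-key authentication."""
    return f"fs.azure.account.key.{storage_account}.dfs.core.windows.net"

# On a cluster you would then call (hypothetical scope/key names):
# spark.conf.set(adls_account_key_conf("mystorageacct"),
#                dbutils.secrets.get(scope="azure-secrets", key="adls-key"))
```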


πŸ” Best Practices for Managing Secrets

  • ✅ Use Azure Key Vault: for enterprise-grade secret storage
  • 🔐 Avoid hardcoding secrets: always use dbutils.secrets.get()
  • 🔁 Rotate secrets regularly: for an improved security posture
  • 🧪 Use separate scopes per environment: e.g., dev-scope, prod-scope
  • 👥 Control access to scopes: assign permissions to restrict who can read or edit them
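One way to act on the separate-scopes-per-environment practice is to resolve the scope name from a deployment setting. A sketch; the scope names and the DEPLOY_ENV variable are assumptions for illustration:

```python
import os
from typing import Optional

# Placeholder scope names; substitute the scopes defined in your workspace.
SCOPES = {"dev": "dev-scope", "prod": "prod-scope"}

def scope_for_env(env: Optional[str] = None) -> str:
    """Pick the secret scope for the current deployment environment."""
    env = env or os.environ.get("DEPLOY_ENV", "dev")
    if env not in SCOPES:
        raise ValueError(f"Unknown environment: {env}")
    return SCOPES[env]

# e.g. dbutils.secrets.get(scope=scope_for_env(), key="my-secret-key")
```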

📌 Summary

  • Access secrets in notebooks: dbutils.secrets.get(...)
  • Create a secret scope: via the CLI or Databricks UI
  • Use secrets in cluster configs: {{secrets/scope/key}} in the Spark config
  • Store secrets externally: integrate with Azure Key Vault

📘 Wrap-up

Using the Databricks Secrets Utility is the safest and most reliable way to work with sensitive information in your notebooks and clusters. Whether you're accessing Azure services, APIs, or any confidential data, never hardcode credentials; always use secure secret scopes!
