Mohammad Gufran Jahangir, August 17, 2025

What is Databricks Secret Management?

A built-in way to store credentials (keys, passwords, tokens) safely and read them from notebooks/jobs without printing them. Values you fetch with dbutils.secrets.get() are auto-redacted in outputs. (Microsoft Learn, Databricks Documentation)

What are Secret Scopes?

A scope is a named folder of secrets (key-value pairs). Scopes can be:

  • Databricks-backed: secrets are stored encrypted by Databricks.
  • Azure Key Vault-backed: secrets live in your Key Vault; Databricks reads them via the scope (read-only from the Databricks side). (Microsoft Learn)
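To make the difference concrete, here is a minimal in-memory sketch (illustration only, not Databricks code): a Databricks-backed scope is writable through Databricks tooling, while an AKV-backed scope is read-only from the Databricks side.

```python
# Illustrative model only — not Databricks' implementation.
class SecretScope:
    """A named collection of key/value secrets."""

    def __init__(self, name, backend="DATABRICKS"):
        self.name = name
        self.backend = backend  # "DATABRICKS" or "AZURE_KEYVAULT"
        self._secrets = {}

    def put(self, key, value):
        # AKV-backed scopes can't be written from Databricks;
        # create/rotate those secrets in Azure Key Vault instead.
        if self.backend == "AZURE_KEYVAULT":
            raise PermissionError("AKV-backed scopes are read-only from Databricks")
        self._secrets[key] = value

    def get(self, key):
        return self._secrets[key]


db_scope = SecretScope("my-scope")
db_scope.put("username", "svc_user")
print(db_scope.get("username"))  # svc_user
```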

Save & use secrets (fast path)

You need to install the Databricks CLI before creating a Databricks-backed scope (installation steps are covered later in this post).

1) Create a Databricks-backed scope

databricks secrets create-scope my-scope
# (optional) grant who can MANAGE the scope:
# databricks secrets put-acl my-scope my-group MANAGE

2) Put secrets into that scope

databricks secrets put-secret my-scope username
databricks secrets put-secret my-scope password
# You’ll be prompted to paste values (hidden).

CLI refs: create-scope, put-secret, put-acl. (Databricks Documentation)

3) Use in a notebook

user = dbutils.secrets.get("my-scope", "username")
pwd  = dbutils.secrets.get("my-scope", "password")
# Example: JDBC
df = (spark.read.format("jdbc")
      .option("url","<jdbc-url>")
      .option("dbtable","<table>")
      .option("user", user)
      .option("password", pwd)
      .load())

Secrets fetched this way show as [REDACTED] if printed. (Microsoft Learn)
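Conceptually, the redaction works by replacing any fetched secret value with a placeholder before output is displayed. A rough sketch of the idea (Databricks performs this for you; this is not its actual mechanism):

```python
# Conceptual sketch only — Databricks redacts fetched secrets automatically.
def redact(output, fetched_secrets):
    """Replace every known secret value in the output with [REDACTED]."""
    for value in fetched_secrets:
        output = output.replace(value, "[REDACTED]")
    return output


print(redact("connecting as hunter2", {"hunter2"}))
# connecting as [REDACTED]
```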


Create & use Azure Key Vault with Databricks (AKV-backed scope)

What you’ll do

  1. Create a Key Vault and add your secrets (in the Azure portal).
  2. In Databricks, create an AKV-backed scope that points to that vault (by DNS Name + Resource ID).
  3. Read secrets with dbutils.secrets.get(scope, key) the same way. (Microsoft Learn)
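Before pasting the two values, it can help to sanity-check their shape. These helpers are my own hypothetical additions (not part of any Databricks tooling); they just encode the standard formats:

```python
import re

# Hypothetical validators — not part of the Databricks CLI or SDK.
def looks_like_akv_dns(dns):
    """Key Vault DNS names look like https://<kv-name>.vault.azure.net/"""
    return re.fullmatch(r"https://[A-Za-z0-9-]+\.vault\.azure\.net/?", dns) is not None


def looks_like_akv_resource_id(rid):
    """Resource IDs follow the standard ARM path for a Key Vault."""
    pattern = (r"/subscriptions/[^/]+/resourceGroups/[^/]+"
               r"/providers/Microsoft\.KeyVault/vaults/[^/]+")
    return re.fullmatch(pattern, rid) is not None


print(looks_like_akv_dns("https://mykv.vault.azure.net/"))  # True
print(looks_like_akv_resource_id(
    "/subscriptions/abc/resourceGroups/rg/providers/"
    "Microsoft.KeyVault/vaults/mykv"))                      # True
```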

UI path (quickest):

  • Open https://<your-workspace>#secrets/createScope, choose Azure Key Vault-backed, paste Key Vault DNS & Resource ID, set who can manage the scope. (Microsoft Learn)

CLI/API path (scriptable):

# Requires CLI v0.205+ and an authenticated profile
databricks secrets create-scope --json '{
  "scope": "my-akv-scope",
  "scope_backend_type": "AZURE_KEYVAULT",
  "backend_azure_keyvault": {
    "resource_id": "/subscriptions/<sub>/resourceGroups/<rg>/providers/Microsoft.KeyVault/vaults/<kv-name>",
    "dns_name": "https://<kv-name>.vault.azure.net/"
  }
}'

(Use --json to pass the AKV details.) (Databricks Documentation)
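If you script this, generating the JSON payload programmatically avoids shell-quoting mistakes. A hedged sketch (field names taken from the command above; the placeholder values are yours to substitute):

```python
import json

# Placeholders — substitute your own subscription, resource group, and vault name.
payload = {
    "scope": "my-akv-scope",
    "scope_backend_type": "AZURE_KEYVAULT",
    "backend_azure_keyvault": {
        "resource_id": "/subscriptions/<sub>/resourceGroups/<rg>"
                       "/providers/Microsoft.KeyVault/vaults/<kv-name>",
        "dns_name": "https://<kv-name>.vault.azure.net/",
    },
}
print(json.dumps(payload, indent=2))
```

You could then feed the printed JSON to the CLI's --json flag instead of hand-writing it.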

Notes

  • AKV-backed scopes are read-only from Databricks: create/rotate secrets in Azure, not in the workspace. (Microsoft Learn)
  • Ensure proper Key Vault roles/permissions before creating the scope. (Databricks Knowledge Base)

What is a Databricks-backed secret scope?

A scope where Databricks stores the encrypted secrets for you (in a Databricks-managed database). You can add/rotate/list secrets with the CLI/API and control access with ACLs. (Databricks Documentation)
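The ACLs use three permission levels — READ, WRITE, MANAGE — where a higher level implies the lower ones. An illustrative model of that ordering (Databricks enforces this server-side; this is not its code):

```python
# Illustration only — Databricks enforces secret ACLs server-side.
LEVELS = {"READ": 1, "WRITE": 2, "MANAGE": 3}


def can(granted, required):
    """A higher permission level implies the lower ones."""
    return LEVELS[granted] >= LEVELS[required]


print(can("MANAGE", "READ"))  # True
print(can("READ", "WRITE"))   # False
```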


Install the Databricks CLI (modern, v0.205+)

Pick one:

macOS/Linux (Homebrew)

brew tap databricks/tap
brew install databricks

Windows (winget)

winget search databricks
winget install Databricks.DatabricksCLI

Any OS (curl)

curl -fsSL https://raw.githubusercontent.com/databricks/setup-cli/main/install.sh | sh

Verify:

databricks -v   # expect 0.205+ 

(Microsoft Learn)
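If you automate the install check, parsing the reported version looks roughly like this. A hedged sketch — the helper is my own, and it assumes output shaped like "Databricks CLI v0.230.0", which may vary by version:

```python
import re

# Hypothetical helper — assumes output like "Databricks CLI v0.230.0".
def meets_minimum(version_output, minimum=(0, 205)):
    """Return True if the major.minor version is at least the minimum."""
    match = re.search(r"v?(\d+)\.(\d+)", version_output)
    if match is None:
        return False
    return (int(match.group(1)), int(match.group(2))) >= minimum


print(meets_minimum("Databricks CLI v0.230.0"))  # True
print(meets_minimum("Databricks CLI v0.18.0"))   # False
```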


Authenticate the Databricks CLI (simple options)

A) OAuth (interactive—recommended)

# Workspace login
databricks auth login --host https://adb-<workspace-id>.<region>.azuredatabricks.net
# (For account-level commands, also run:)
# databricks auth login --host https://accounts.azuredatabricks.net --account-id <account-id>

Follow the browser prompt; tokens are cached locally. (Microsoft Learn)

B) Personal Access Token (classic, workspace-only)

databricks configure
# Host: https://adb-<workspace-id>.<region>.azuredatabricks.net
# Token: <PAT you created in the workspace UI>

Stores a profile in ~/.databrickscfg. (Microsoft Learn)
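For reference, the stored profile in ~/.databrickscfg looks roughly like this (both values below are placeholders, not real credentials):

```ini
[DEFAULT]
host  = https://adb-<workspace-id>.<region>.azuredatabricks.net
token = <your-personal-access-token>
```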

(There are also service principals, managed identity, and Azure CLI auth if you need non-interactive automation.) (Microsoft Learn)

1) Install the CLI (Windows)

Open PowerShell (a regular, non-admin session is fine) and run:

winget search databricks
winget install Databricks.DatabricksCLI

Then close & reopen your terminal so the databricks command is on PATH. Verify:

databricks -v

(WinGet is the recommended way on Windows; Chocolatey and a curl-based installer are alternatives if you don’t have winget.) (Microsoft Learn)

2) Sign in (workspace auth)

Run OAuth login against your workspace URL:

databricks auth login --host https://adb-<workspace-id>.<region>.azuredatabricks.net

Follow the browser prompt; the CLI caches a token profile locally. (Microsoft Learn, Databricks Documentation)

3) Create a Databricks-backed secret scope

databricks secrets create-scope my-scope

Add a secret:

databricks secrets put-secret my-scope my-password
# paste the secret value when prompted (it’s hidden)

You can now read it in notebooks/jobs:

pwd = dbutils.secrets.get("my-scope", "my-password")

(Secrets are redacted in outputs; manage ACLs if needed.) (Microsoft Learn, Databricks Documentation)

If winget isn’t available

  • Chocolatey (experimental): choco install databricks-cli
  • curl installer is also supported; see the official install page for steps. (Microsoft Learn)


Tiny FAQ

  • Where do I see scopes/secrets?
    From the CLI: databricks secrets list-scopes and databricks secrets list-secrets <scope>. From a notebook: dbutils.secrets.listScopes(). (Databricks Documentation)
  • Will secrets print in outputs?
    They’re redacted ([REDACTED]) when fetched via dbutils.secrets.get(). (Microsoft Learn)
  • When should I choose AKV-backed?
    If you already use Key Vault for rotation/governance and want centralized control; Databricks just reads them. Use Databricks-backed for quick, workspace-local storage. (Microsoft Learn)
