Databricks provides the dbutils module as a built-in set of utilities to interact with Databricks services directly from notebooks.
These utilities allow you to:
- Manage files in DBFS
- Interact with jobs
- Handle secrets securely
- Control notebook execution flow
- Work with widgets for parameterization

1. Hierarchy Overview
The main structure of dbutils is:
dbutils
│
├── credentials (DatabricksCredentialUtils)
├── data (DataUtils)
├── fs (DbfsUtils)
├── jobs (JobsUtils)
├── library (LibraryUtils)
├── meta (MetaUtils)
├── notebook (NotebookUtils)
├── preview (Preview)
├── secrets (SecretUtils)
└── widgets (WidgetsUtils)
2. Utilities in Detail with Examples
2.1 credentials — DatabricksCredentialUtils
Purpose: Manage and interact with credentials in notebooks.
Example:
# Show the role currently in use and the roles available
print(dbutils.credentials.showCurrentRole())
print(dbutils.credentials.showRoles())
Use Case: Debugging credential configurations when using cloud storage or API connections.
2.2 data — DataUtils (EXPERIMENTAL)
Purpose: Interact with datasets (currently experimental).
Example:
# summarize() profiles a Spark DataFrame (it does not take a file path)
df = spark.read.csv("dbfs:/mnt/raw/data.csv", header=True, inferSchema=True)
dbutils.data.summarize(df)
2.3 fs — DbfsUtils
Purpose: Manipulate Databricks File System (DBFS) like a local file system.
Key Functions:
- ls(path) — List files
- cp(src, dst, recurse) — Copy files
- mv(src, dst, recurse) — Move files
- rm(path, recurse) — Remove files
- put(path, contents, overwrite) — Create a file with content
Example:
# List DBFS directory
display(dbutils.fs.ls("/mnt/raw"))
# Create a file
dbutils.fs.put("/mnt/raw/sample.txt", "Hello Databricks!", overwrite=True)
# Read file
display(dbutils.fs.head("/mnt/raw/sample.txt"))
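Note that dbutils.fs.ls only lists one directory level. A minimal sketch of a recursive listing, with the lister injected (pass dbutils.fs.ls on Databricks) so the walk logic itself can be exercised anywhere; it assumes entries expose .path and .isDir() like the FileInfo objects dbutils.fs.ls returns:

```python
def walk_dbfs(ls_fn, path):
    """Yield the paths of all files under `path`, descending into
    subdirectories. `ls_fn` must behave like dbutils.fs.ls: return
    entries with a .path attribute and an .isDir() method."""
    for entry in ls_fn(path):
        if entry.isDir():
            # Recurse into the subdirectory
            yield from walk_dbfs(ls_fn, entry.path)
        else:
            yield entry.path

# On Databricks: list(walk_dbfs(dbutils.fs.ls, "/mnt/raw"))
```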
2.4 jobs — JobsUtils
Purpose: Manage and interact with job features inside notebooks.
Example:
# Set a task value that downstream tasks in the same job can read
dbutils.jobs.taskValues.set(key="status", value="completed")
# Get a task value written by an upstream task
value = dbutils.jobs.taskValues.get(taskKey="MyTask", key="status", default="unknown")
Use Case: Pass values between tasks in a Databricks multi-task job.
2.5 library — LibraryUtils
Purpose: Manage session-isolated libraries.
Example:
# Install a Python package for the session (pre-DBR 7.0 runtimes only)
dbutils.library.installPyPI("pandas")
# Restart the Python process so newly installed libraries take effect
dbutils.library.restartPython()
Note: installPyPI was removed in Databricks Runtime 7.0 and later; install notebook-scoped packages with the %pip install magic command instead (followed by dbutils.library.restartPython() when needed). Cluster libraries and init scripts remain the options for cluster-wide installs.
2.6 meta — MetaUtils (EXPERIMENTAL)
Purpose: Hook into Databricks compiler internals.
Example:
# List the experimental commands currently exposed
dbutils.meta.help()
Use Case: Rarely used in production — more for internal/expert debugging.
2.7 notebook — NotebookUtils
Purpose: Control flow between notebooks.
Key Functions:
- run(path, timeout_seconds, arguments) — Run another notebook
- exit(value) — Exit a notebook with a return value
Example:
# Run another notebook
result = dbutils.notebook.run("/Shared/ETL_Notebook", 60, {"date": "2025-08-10"})
print(result)
# Exit current notebook
dbutils.notebook.exit("Notebook Finished Successfully!")
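dbutils.notebook.run raises an exception when the child notebook fails or times out, so pipelines often wrap it in a retry. A minimal sketch; the runner is injected (pass dbutils.notebook.run on Databricks) so the retry logic is testable outside a cluster:

```python
def run_notebook_with_retry(run_fn, path, timeout_seconds, arguments, max_retries=3):
    """Call a notebook runner, retrying up to `max_retries` times on failure."""
    last_error = None
    for attempt in range(1, max_retries + 1):
        try:
            return run_fn(path, timeout_seconds, arguments)
        except Exception as exc:  # a failed or timed-out child run
            last_error = exc
    # All attempts failed: surface the last error to the caller
    raise last_error

# On Databricks:
# result = run_notebook_with_retry(
#     dbutils.notebook.run, "/Shared/ETL_Notebook", 60, {"date": "2025-08-10"})
```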
2.8 preview — Preview
Purpose: Experimental/preview utilities.
Note: These may change or be deprecated.
2.9 secrets — SecretUtils
Purpose: Securely store and retrieve secrets (keys, tokens, passwords).
Example:
# Get a secret value
token = dbutils.secrets.get(scope="my-scope", key="api-token")
# List all scopes
scopes = dbutils.secrets.listScopes()
print(scopes)
Use Case: Prevent storing plain text passwords in notebooks.
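When a notebook needs several secrets from one scope, a small helper keeps the calls in one place. A sketch with the getter injected (pass dbutils.secrets.get on Databricks); the scope and key names are illustrative:

```python
def load_secrets(get_fn, scope, keys):
    """Return {key: secret_value} for every key in `keys` from one scope."""
    return {key: get_fn(scope=scope, key=key) for key in keys}

# On Databricks:
# cfg = load_secrets(dbutils.secrets.get, "my-scope", ["api-token", "db-password"])
```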
2.10 widgets — WidgetsUtils
Purpose: Create and get values of input widgets for parameterized notebooks.
Key Functions:
- text(name, defaultValue, label)
- dropdown(name, defaultValue, choices, label)
- combobox(...)
- multiselect(...)
- get(name)
- remove(name)
Example:
# Create widgets
dbutils.widgets.text("param1", "default_value", "Enter parameter")
dbutils.widgets.dropdown("country", "US", ["US", "UK", "IN"], "Select Country")
# Get widget value
country_value = dbutils.widgets.get("country")
print("Country Selected:", country_value)
Use Case: Pass parameters dynamically to notebooks when running from a job.
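dbutils.widgets.get always returns a string, so job parameters usually need explicit parsing. A small sketch of a typed getter; the getter is injected (pass dbutils.widgets.get on Databricks), and the widget names below are illustrative:

```python
from datetime import date

def get_widget(get_fn, name, cast=str):
    """Fetch a widget value and cast it, e.g. with int or a date parser."""
    return cast(get_fn(name))

def parse_iso_date(value):
    """Parse an ISO date string like '2025-08-10' into a date object."""
    return date.fromisoformat(value)

# On Databricks:
# run_date = get_widget(dbutils.widgets.get, "run_date", parse_iso_date)
# batch_size = get_widget(dbutils.widgets.get, "batch_size", int)
```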
3. Practical Workflow Example
Imagine an ETL pipeline:
- widgets: Collects parameters for the date range.
- secrets: Retrieves the API token securely.
- fs: Reads raw files from the DBFS mount.
- notebook: Runs the transformation notebook.
- jobs: Passes the processed-data status to the next task.
4. Best Practices
- Use secrets instead of hardcoding credentials.
- Avoid experimental utilities (meta, data) in production unless stable.
- Use widgets for parameterized jobs — improves reusability.
- Use notebook.run() for modular code separation.
- Avoid storing sensitive data in DBFS without encryption.