Overview
Tool Name
databricks_action
Purpose
The databricks_action tool provides programmatic control of Databricks workspaces. It enables notebook CRUD, execution, imports and exports, permissions management, and job or cluster interactions so you can orchestrate Databricks work without using the UI.
Functions Available
`databricks_action`: Performs Databricks operations, including notebook create, read, update, delete, run, list, import, export, and status checks through the REST API.
Key Features
Notebook Lifecycle
Create, read, update, list, and delete notebooks programmatically in workspace folders.
Execute & Monitor
Run notebooks on clusters with parameters and monitor run state, logs, and results.
Import & Export
Move notebooks across environments in SOURCE, HTML, JUPYTER, or DBC formats.
Permissions & Sharing
Manage access to notebooks and align with workspace sharing policies.
Jobs and Clusters
Target specific clusters or jobs for scheduled and batch execution flows.
Input Parameters for Each Function
1. databricks_action
Parameters
| Name | Definition | Format |
|---|---|---|
| action | Databricks operation to perform. Common values: `create_notebook`, `run_notebook`, `get_notebook`, `delete_notebook`, `list_notebooks`, `export_notebook`, `import_notebook`. | String (required) |
| workspace_url | Databricks workspace base URL. Example: `https://adb-1234567890123.4.azuredatabricks.net`. | String (required) |
| access_token | Personal access token used for API authentication. | String (required) |
| notebook_path | Workspace path to the target notebook. Example: `/Shared/etl/run_daily`. | String |
| notebook_content | Notebook source for create or update actions. | String |
| cluster_id | Cluster identifier to run the notebook on. | String |
| notebook_format | Import or export format. One of `SOURCE`, `HTML`, `JUPYTER`, `DBC`. | String |
| language | Language for new notebooks. One of `PYTHON`, `SCALA`, `SQL`, `R`. | String |
| parameters | Key value pairs passed to the run. | Object |
| timeout_seconds | Maximum runtime before the execution is marked timed out. | Integer |
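As a quick illustration, a `run_notebook` request assembled from these parameters might look like the sketch below; the call shape is an assumption based on the table, not a confirmed client signature.

```python
# Hypothetical run_notebook request assembled from the parameter table above.
# The exact invocation shape depends on how databricks_action is exposed
# in your environment.
run_request = {
    "action": "run_notebook",
    "workspace_url": "https://adb-1234567890123.4.azuredatabricks.net",
    "access_token": "<personal-access-token>",  # keep in a secret store, not in source
    "notebook_path": "/Shared/etl/run_daily",
    "cluster_id": "0101-123456-abcde123",       # illustrative cluster id
    "parameters": {"run_date": "2024-01-15", "env": "staging"},
    "timeout_seconds": 3600,
}
```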
Genesis Tip
When running production jobs, pass parameters for date ranges or environment flags instead of hardcoding them in notebook cells.
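On the notebook side, Databricks widgets are the usual way to pick up those parameters; a minimal sketch (widget names are illustrative and must match the `parameters` object passed to the run):

```python
# Notebook-side sketch: read run parameters from widgets instead of
# hardcoding values in cells. `dbutils` is provided by the Databricks
# notebook runtime; the widget names must match the keys in the
# `parameters` object passed to run_notebook.
dbutils.widgets.text("run_date", "")   # declares the widget with an empty default
dbutils.widgets.text("env", "dev")

run_date = dbutils.widgets.get("run_date")
env = dbutils.widgets.get("env")

print(f"Processing {run_date} against the {env} environment")
```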
Use Cases
- Automated data processing: Run ELT notebooks on a schedule and collect run logs for audit.
- ML training orchestration: Kick off parameterized training runs and export resulting notebooks or metrics.
- Cross-environment promotion: Export a notebook as SOURCE or DBC, review in Git, then import into another workspace (see the sketch after this list).
- Operational monitoring: Poll execution status, capture stdout and error traces, and surface results to stakeholders.
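For the promotion case, the underlying Workspace API calls that `export_notebook` and `import_notebook` map to look roughly like this sketch; workspace URLs, tokens, and paths are illustrative.

```python
# Sketch of cross-environment promotion using the Databricks Workspace API
# directly (the same operations databricks_action wraps as export_notebook /
# import_notebook). URLs, tokens, and paths are illustrative.
import requests

SRC = "https://adb-1111111111111.1.azuredatabricks.net"
DST = "https://adb-2222222222222.2.azuredatabricks.net"
SRC_TOKEN = "<source-token>"
DST_TOKEN = "<destination-token>"

# Export notebook source (base64-encoded) from the source workspace.
exported = requests.get(
    f"{SRC}/api/2.0/workspace/export",
    headers={"Authorization": f"Bearer {SRC_TOKEN}"},
    params={"path": "/Shared/etl/run_daily", "format": "SOURCE"},
).json()

# Import it into the destination workspace, overwriting any existing copy.
requests.post(
    f"{DST}/api/2.0/workspace/import",
    headers={"Authorization": f"Bearer {DST_TOKEN}"},
    json={
        "path": "/Shared/etl/run_daily",
        "format": "SOURCE",
        "language": "PYTHON",
        "content": exported["content"],
        "overwrite": True,
    },
).raise_for_status()
```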
IMPORTANT: Notebook execution can modify data and incur compute costs. Use non-production clusters for testing and set `timeout_seconds` conservatively.
Workflow/How It Works
- Authenticate: Provide `workspace_url` and `access_token`.
- Choose an action: For example `create_notebook`, `run_notebook`, or `export_notebook`.
- Provide context: Set `notebook_path`, optional `cluster_id`, `parameters`, and `timeout_seconds` as needed.
- Execute and observe: Receive job or run identifiers, then fetch logs or results until completion (see the sketch after this list).
- Promote or share: Export artifacts, store with `file_manager_tools`, and link runs back to tasks or tickets.
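The same workflow, expressed directly against the Databricks Jobs API, looks roughly like the sketch below; host, token, cluster id, and notebook path are illustrative, and the tool's own response shape may differ.

```python
# End-to-end sketch of the workflow above using the Jobs API directly:
# authenticate, submit a one-time notebook run, then poll until it finishes.
import time
import requests

HOST = "https://adb-1234567890123.4.azuredatabricks.net"
HEADERS = {"Authorization": "Bearer <personal-access-token>"}

# Choose the action and provide context: a one-time notebook run.
submit = requests.post(
    f"{HOST}/api/2.1/jobs/runs/submit",
    headers=HEADERS,
    json={
        "run_name": "run_daily via API",
        "timeout_seconds": 3600,
        "tasks": [{
            "task_key": "etl",
            "existing_cluster_id": "0101-123456-abcde123",
            "notebook_task": {
                "notebook_path": "/Shared/etl/run_daily",
                "base_parameters": {"run_date": "2024-01-15"},
            },
        }],
    },
)
submit.raise_for_status()
run_id = submit.json()["run_id"]

# Execute and observe: poll the run until it reaches a terminal state.
while True:
    run = requests.get(
        f"{HOST}/api/2.1/jobs/runs/get",
        headers=HEADERS,
        params={"run_id": run_id},
    ).json()
    state = run["state"]
    if state["life_cycle_state"] in ("TERMINATED", "SKIPPED", "INTERNAL_ERROR"):
        print("Result:", state.get("result_state"), state.get("state_message", ""))
        break
    time.sleep(30)
```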
Use consistent workspace paths like `/Repos/team/project/` to simplify automation and reviews.
Integration Relevance
- `data_connector_tools`: Open connections and run SQL inside notebooks that the action executes.
- `file_manager_tools`: Capture notebook exports or logs and attach them to missions or tasks.
- `project_manager_tools`: Track notebook work as todos with linked runs and outcomes.
- `git_action`: Version notebook source, diffs, and promotion artifacts.
- `dbt_action`: Coordinate dbt transformations before or after a Databricks notebook run.
Configuration Details
- Generate a Databricks personal access token with the permissions required for your operations.
- Use the full `workspace_url` including protocol and host.
- Ensure `cluster_id` points to a running or auto-start cluster that your token can access.
- Align `notebook_format` with your import or export destination.
- Parameter names in `parameters` must match widgets or parameter parsing in the notebook.
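A small pre-flight check along these lines can catch the most common configuration mistakes before any API call is made; the helper itself is hypothetical, but the field names mirror the parameter table above.

```python
# Hypothetical pre-flight validation for a databricks_action request;
# the checks mirror the configuration notes above.
from urllib.parse import urlparse

VALID_FORMATS = {"SOURCE", "HTML", "JUPYTER", "DBC"}

def validate_request(req: dict) -> list[str]:
    problems = []
    url = urlparse(req.get("workspace_url", ""))
    if url.scheme != "https" or not url.netloc:
        problems.append("workspace_url must include protocol and host")
    if not req.get("access_token"):
        problems.append("access_token is required")
    fmt = req.get("notebook_format")
    if fmt and fmt not in VALID_FORMATS:
        problems.append(f"unsupported notebook_format: {fmt}")
    if req.get("action") == "run_notebook" and not req.get("cluster_id"):
        problems.append("run_notebook needs a cluster_id your token can access")
    return problems
```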
Limitations or Notes
- Requires valid workspace access and API permissions.
- API rate limits and cluster capacity can throttle concurrent runs.
- Large exports or logs may take time to transfer.
- Some advanced workspace features are not exposed in REST endpoints.
- Tokens expire and must be rotated according to your security policy.
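Because rate limits surface as HTTP 429 responses, callers often wrap REST calls in a simple backoff loop; the retry policy below is an illustrative sketch, not part of the tool.

```python
# Sketch of retry-with-backoff around Databricks REST calls to soften
# rate limiting (HTTP 429). The policy here is illustrative.
import time
import requests

def call_with_backoff(method, url, *, max_attempts=5, **kwargs):
    delay = 1.0
    for _ in range(max_attempts):
        resp = requests.request(method, url, **kwargs)
        if resp.status_code != 429:
            return resp
        # Honor a numeric Retry-After header when present, otherwise back off exponentially.
        retry_after = resp.headers.get("Retry-After")
        time.sleep(float(retry_after) if retry_after and retry_after.isdigit() else delay)
        delay *= 2
    return resp
```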
Output
- Create and update: Confirmation with notebook metadata and path.
- Run execution: Run or job identifier, life-cycle state, result summary, and log access.
- Listing: Arrays of notebooks with paths, languages, and modification timestamps.
- Import and export: Content in the requested `notebook_format` and related download details.
- Errors: HTTP status, Databricks error code, message, and troubleshooting hints.
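Error payloads from the Databricks REST API typically carry an `error_code` and `message` alongside the HTTP status; a minimal handling sketch (endpoint and path are illustrative):

```python
# Sketch of surfacing an error result: Databricks REST errors commonly
# return an error_code and message alongside the HTTP status.
import requests

resp = requests.get(
    "https://adb-1234567890123.4.azuredatabricks.net/api/2.0/workspace/get-status",
    headers={"Authorization": "Bearer <personal-access-token>"},
    params={"path": "/Shared/etl/does_not_exist"},
)
if not resp.ok:
    err = resp.json()
    print(f"HTTP {resp.status_code}: {err.get('error_code')} - {err.get('message')}")
```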