This template enables AI-assisted development in Databricks by leveraging the Databricks Command Execution API through an MCP server. Test code directly on clusters, then deploy with Databricks Asset Bundles (DABs).
- ✅ Run and test code directly on Databricks clusters
- ✅ Auto-select clusters - no need to specify a cluster ID
- ✅ Create and deploy Databricks Asset Bundles (DABs)
- ✅ All from natural language prompts!
Just describe what you want → AI builds, tests the code on Databricks, and deploys the complete pipeline.
Clone and set up the MCP server somewhere on your machine:
```bash
git clone https://github.com/databricks-solutions/databricks-exec-code-mcp.git
cd databricks-exec-code-mcp
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
```

Add to your `~/.zshrc` or `~/.bashrc`:
```bash
export DATABRICKS_HOST=https://your-workspace.cloud.databricks.com
export DATABRICKS_TOKEN=dapi_your_token_here
```

To get your Personal Access Token (PAT): Databricks workspace → Profile → Settings → Developer → Access Tokens → Generate new token.
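Before wiring up the MCP server, you can sanity-check these credentials with a short `requests` call against the workspace. This is a minimal sketch, assuming the standard SCIM `Me` endpoint and the environment variables above (`requests` is already in the server's requirements):

```python
import os
import requests

# Read the workspace URL and PAT exported above
host = os.environ["DATABRICKS_HOST"].rstrip("/")
token = os.environ["DATABRICKS_TOKEN"]

# Call the SCIM "Me" endpoint as a quick authentication check
resp = requests.get(
    f"{host}/api/2.0/preview/scim/v2/Me",
    headers={"Authorization": f"Bearer {token}"},
    timeout=30,
)
resp.raise_for_status()
print("Authenticated as:", resp.json().get("userName"))
```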
Create your project directory and install the Databricks skills:
```bash
# Create and enter your project
mkdir my-databricks-project && cd my-databricks-project

# Install skills for your AI client (downloads from remote)
curl -sSL https://raw.githubusercontent.com/databricks-solutions/databricks-exec-code-mcp/main/install_skills.sh | bash -s -- --cursor

# Or for Claude Code:
curl -sSL https://raw.githubusercontent.com/databricks-solutions/databricks-exec-code-mcp/main/install_skills.sh | bash -s -- --claude

# Or for both:
curl -sSL https://raw.githubusercontent.com/databricks-solutions/databricks-exec-code-mcp/main/install_skills.sh | bash -s -- --all
```

This creates:

- Cursor: `.cursor/rules/` with Databricks rules
- Claude Code: `.claude/skills/` with Databricks skills
Point your AI client to the MCP server you set up in Step 1.
For Cursor, create `.cursor/mcp.json` in your project:

```json
{
  "mcpServers": {
    "databricks": {
      "command": "/path/to/databricks-exec-code-mcp/.venv/bin/python",
      "args": ["/path/to/databricks-exec-code-mcp/mcp_tools/tools.py"]
    }
  }
}
```

For Claude Code, run in your project:

```bash
claude mcp add-json databricks '{"command":"/path/to/databricks-exec-code-mcp/.venv/bin/python","args":["/path/to/databricks-exec-code-mcp/mcp_tools/tools.py"]}'
```

Replace `/path/to/databricks-exec-code-mcp` with the actual path from Step 1.
💡 Smart Cluster Selection: If no `cluster_id` is provided, the MCP server automatically finds a running cluster in your workspace.
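Conceptually, that auto-selection can be as simple as picking the first cluster whose state is `RUNNING`. The snippet below is an illustrative sketch of that logic against the Clusters API, not the server's actual implementation:

```python
import os
from typing import Optional

import requests

def pick_running_cluster() -> Optional[str]:
    """Return the ID of any RUNNING cluster in the workspace, or None."""
    resp = requests.get(
        f"{os.environ['DATABRICKS_HOST'].rstrip('/')}/api/2.0/clusters/list",
        headers={"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"},
        timeout=30,
    )
    resp.raise_for_status()
    # Scan the cluster list and return the first one that is already running
    for cluster in resp.json().get("clusters", []):
        if cluster.get("state") == "RUNNING":
            return cluster["cluster_id"]
    return None
```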
Just describe what you want in natural language:
Data Engineering:
"Build a Data Engineering pipeline using Medallion Architecture on the NYC Taxi dataset and deploy it with DABs"
Machine Learning:
"Train a classification model on the Titanic dataset, register it to Unity Catalog, and deploy as a DAB job"
Quick Test:
"Run a SQL query to show the top 10 tables in my catalog"
The AI will create a complete DABs project:
```
your-project/
├── databricks.yml              # DABs configuration
├── resources/
│   └── training_job.yml        # Databricks job definition
├── src/<project>/
│   └── notebooks/
│       ├── 01_data_prep.py
│       ├── 02_training.py
│       └── 03_validation.py
└── tests/                      # Unit tests (optional)
```
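As one illustration, the generated `01_data_prep.py` for the NYC Taxi example might start with a bronze-layer ingest along these lines. This is a sketch only; the dataset path, catalog, and table names are placeholders.

```python
# 01_data_prep.py (illustrative sketch): bronze-layer ingest of raw NYC taxi trips.
# `spark` is the SparkSession provided by the Databricks notebook runtime.

# Read a built-in sample dataset (path is illustrative; adjust to your workspace).
raw_df = spark.read.format("delta").load(
    "dbfs:/databricks-datasets/nyctaxi/tables/nyctaxi_yellow"
)

# Write to a bronze Delta table in Unity Catalog (catalog/schema names are placeholders).
(
    raw_df.write.format("delta")
    .mode("overwrite")
    .saveAsTable("main.nyc_taxi.bronze_trips")
)
```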
| Feature | Description |
|---|---|
| Direct Cluster Execution | Test code on Databricks clusters via the Databricks Command Execution API |
| DABs Packaging | Production-ready bundle deployment |
| Multi-Environment | Support for dev/staging/prod targets |
| Unity Catalog | Models and data registered to UC for governance |
| MLflow Tracking | Experiment tracking and model versioning |
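The Unity Catalog and MLflow rows above typically map to MLflow's UC-backed model registry. Below is a minimal sketch of what registration might look like; the catalog, schema, and model names are placeholders, and the toy dataset stands in for real features.

```python
import mlflow
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Point the MLflow model registry at Unity Catalog.
mlflow.set_registry_uri("databricks-uc")

# Toy training data standing in for real features.
X, y = make_classification(n_samples=200, n_features=5, random_state=42)
model = LogisticRegression().fit(X, y)

# Log the model and register it under a three-level UC name (placeholder).
# The input_example lets MLflow infer the signature that UC registration requires.
with mlflow.start_run():
    mlflow.sklearn.log_model(
        model,
        artifact_path="model",
        input_example=X[:5],
        registered_model_name="main.ml_models.titanic_classifier",
    )
```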
© 2025 Databricks, Inc. All rights reserved. The source in this project is provided subject to the Databricks License.
| Package | License | Copyright |
|---|---|---|
| mcp | MIT License | Copyright (c) 2024 Anthropic |
| requests | Apache License 2.0 | Copyright 2019 Kenneth Reitz |
| python-dotenv | BSD 3-Clause License | Copyright (c) 2014, Saurabh Kumar |