Databricks MCP Server
A Model Context Protocol (MCP) server for Databricks that provides access to Databricks functionality via MCP. This allows LLM-powered tools to interact with Databricks clusters, jobs, notebooks, and more.
Features
MCP Protocol Support: Implements the MCP protocol to allow LLMs to interact with Databricks
Databricks API Integration: Provides access to Databricks REST API functionality
Tool Registration: Exposes Databricks functionality as MCP tools
Async Support: Built with asyncio for efficient operation
Available Tools
The Databricks MCP Server exposes the following tools:
list_clusters: List all Databricks clusters
create_cluster: Create a new Databricks cluster
terminate_cluster: Terminate a Databricks cluster
get_cluster: Get information about a specific Databricks cluster
start_cluster: Start a terminated Databricks cluster
list_jobs: List all Databricks jobs
run_job: Run a Databricks job
list_notebooks: List notebooks in a workspace directory
export_notebook: Export a notebook from the workspace
list_files: List files and directories in a DBFS path
execute_sql: Execute a SQL statement
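Each of these is exposed as a standard MCP tool, so a client invokes it with a tools/call request after the usual initialize handshake. The sketch below shows what a list_clusters call looks like on the wire; the start script name is an assumption (see Running the MCP Server below):

```bash
# Hypothetical stdio smoke test; each line is one JSON-RPC message.
# The start script name is an assumption -- substitute the actual server command.
(
  echo '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"smoke-test","version":"0.1.0"}}}'
  echo '{"jsonrpc":"2.0","method":"notifications/initialized"}'
  echo '{"jsonrpc":"2.0","id":2,"method":"tools/call","params":{"name":"list_clusters","arguments":{}}}'
) | ./start_mcp_server.sh
```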
Installation
Prerequisites
Python 3.10 or higher
uv package manager (recommended for MCP servers)
Setup
Install uv if you don't have it already:

```bash
# MacOS/Linux
curl -LsSf https://astral.sh/uv/install.sh | sh

# Windows (in PowerShell)
irm https://astral.sh/uv/install.ps1 | iex
```

Restart your terminal after installation.
Clone the repository:

```bash
git clone https://github.com/JustTryAI/databricks-mcp-server.git
cd databricks-mcp-server
```

Set up the project with uv:

```bash
# Create and activate virtual environment
uv venv

# On Windows
.\.venv\Scripts\activate

# On Linux/Mac
source .venv/bin/activate

# Install dependencies in development mode
uv pip install -e .

# Install development dependencies
uv pip install -e ".[dev]"
```

Set up environment variables:
```bash
# Windows
set DATABRICKS_HOST=https://your-databricks-instance.azuredatabricks.net
set DATABRICKS_TOKEN=your-personal-access-token

# Linux/Mac
export DATABRICKS_HOST=https://your-databricks-instance.azuredatabricks.net
export DATABRICKS_TOKEN=your-personal-access-token
```

You can also create an .env file based on the .env.example template.
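A minimal .env sketch, assuming it mirrors the two variables above (values are placeholders):

```bash
# .env (placeholders -- do not commit real tokens)
DATABRICKS_HOST=https://your-databricks-instance.azuredatabricks.net
DATABRICKS_TOKEN=your-personal-access-token
```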
Running the MCP Server
To start the MCP server, run:
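(The wrapper script names below are assumptions; check the repository root for the actual files.)

```bash
# Linux/Mac
./start_mcp_server.sh

# Windows (PowerShell)
.\start_mcp_server.ps1
```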
These wrapper scripts will execute the actual server scripts located in the scripts
directory. The server will start and be ready to accept MCP protocol connections.
You can also directly run the server scripts from the scripts directory:
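(Again, the exact filenames are assumptions; look in scripts/ for the real entry points.)

```bash
# Hypothetical direct invocation of the server script
./scripts/start_server.sh

# Windows (PowerShell)
.\scripts\start_server.ps1
```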
Querying Databricks Resources
The repository includes utility scripts to quickly view Databricks resources:
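(The script names below are assumptions; check the scripts/ directory for the actual files.)

```bash
# Hypothetical utility scripts for inspecting resources
uv run python scripts/show_clusters.py    # list clusters
uv run python scripts/show_notebooks.py   # list workspace notebooks
```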
Project Structure
See project_structure.md for a more detailed view of the project structure.
Development
Code Standards
Python code follows PEP 8 style guide with a maximum line length of 100 characters
Use 4 spaces for indentation (no tabs)
Use double quotes for strings
All classes, methods, and functions should have Google-style docstrings
Type hints are required for all code except tests
Linting
The project uses the following linting tools:
Testing
The project uses pytest for testing. To run the tests:
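(The runner script name below is an assumption; check the scripts/ directory for the actual file.)

```bash
# Hypothetical test runner script
./scripts/run_tests.sh
```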
You can also run the tests directly with pytest:
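(The tests/ and src paths below are assumptions based on a conventional layout.)

```bash
# Run the full suite
uv run pytest tests/

# With coverage (assumes pytest-cov is available as a dev dependency)
uv run pytest tests/ --cov=src --cov-report=term-missing
```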
A minimum code coverage of 80% is the goal for the project.
Documentation
API documentation is generated using Sphinx and can be found in the docs/api directory
All code includes Google-style docstrings
See the examples/ directory for usage examples
Examples
Check the examples/ directory for usage examples. To run examples:
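(The example filename below is hypothetical; use the actual files in examples/.)

```bash
# Make sure DATABRICKS_HOST and DATABRICKS_TOKEN are set, then run an example
uv run python examples/list_clusters_example.py
```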
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
Ensure your code follows the project's coding standards
Add tests for any new functionality
Update documentation as necessary
Verify all tests pass before submitting
License
This project is licensed under the MIT License - see the LICENSE file for details.