Execute a Databricks job by specifying the job ID and optional notebook parameters, enabling automation of workflows and task management within the Databricks environment.
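A minimal sketch of the underlying call this corresponds to, using the Databricks Jobs API `run-now` endpoint; the environment variables and the `run_job` helper are illustrative assumptions, not part of the tool itself:

```python
# Sketch: trigger a run of an existing job with optional notebook parameter overrides.
import os
import requests

DATABRICKS_HOST = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace>.cloud.databricks.com
DATABRICKS_TOKEN = os.environ["DATABRICKS_TOKEN"]  # personal access token (assumed auth method)

def run_job(job_id: int, notebook_params: dict | None = None) -> dict:
    """Start a run of the given job, optionally overriding its notebook parameters."""
    resp = requests.post(
        f"{DATABRICKS_HOST}/api/2.1/jobs/run-now",
        headers={"Authorization": f"Bearer {DATABRICKS_TOKEN}"},
        json={"job_id": job_id, "notebook_params": notebook_params or {}},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()  # contains the run_id of the triggered run

# run = run_job(123, {"date": "2024-01-01"})
```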
Retrieve detailed information about a Databricks cluster by providing its cluster ID, enabling efficient management and monitoring through the Databricks MCP Server.
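The equivalent REST call, sketched with the same assumed host and token variables as above, uses the Clusters API `get` endpoint:

```python
# Sketch: fetch the full cluster description (state, node type, Spark version, etc.).
import os
import requests

def get_cluster(cluster_id: str) -> dict:
    """Return detailed information for a single cluster."""
    resp = requests.get(
        f"{os.environ['DATABRICKS_HOST']}/api/2.0/clusters/get",
        headers={"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"},
        params={"cluster_id": cluster_id},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

# info = get_cluster("1234-567890-abcde123")
# print(info["state"], info["spark_version"])
```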
A FastAPI-based server that provides tools for local file management and Databricks operations, enabling users to create/edit files locally and interact with Databricks clusters, jobs, and DLT pipelines.
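To illustrate the shape of such a server (route names and request bodies here are assumptions, not the project's actual API), a FastAPI app might expose a local file tool alongside a Databricks passthrough:

```python
# Sketch: FastAPI server combining a local-file tool with a Databricks cluster tool.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="Local files + Databricks tools (sketch)")

class FileWrite(BaseModel):
    path: str
    content: str

@app.post("/files/write")
def write_file(req: FileWrite) -> dict:
    """Create or overwrite a local file with the given content."""
    with open(req.path, "w", encoding="utf-8") as f:
        f.write(req.content)
    return {"written": req.path, "bytes": len(req.content)}

@app.get("/clusters")
def list_clusters() -> dict:
    """Placeholder: a real server would proxy GET /api/2.0/clusters/list here."""
    return {"clusters": []}
```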
Enables LLM-powered tools to interact with Databricks clusters, jobs, notebooks, SQL warehouses, and Unity Catalog through the Model Context Protocol. Provides comprehensive access to the Databricks REST API, including cluster management, job execution, workspace operations, and data catalog operations.
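A sketch of how such a server might register a Databricks operation as an MCP tool, using the FastMCP helper from the Python MCP SDK; the tool name and credential handling are illustrative assumptions:

```python
# Sketch: expose a Databricks cluster-listing operation as an MCP tool.
import os
import requests
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("databricks")

@mcp.tool()
def list_clusters() -> list[dict]:
    """List clusters in the workspace via the Databricks REST API."""
    resp = requests.get(
        f"{os.environ['DATABRICKS_HOST']}/api/2.0/clusters/list",
        headers={"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("clusters", [])

if __name__ == "__main__":
    mcp.run()  # serves the tool over stdio for an MCP-capable client
```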
Enables AI assistants like Claude to interact with Databricks workspaces via secure OAuth authentication, allowing users to execute SQL queries, manage clusters, create jobs, and access workspace resources through natural language.
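As an example of the kind of call the assistant could issue for SQL, a sketch using the Databricks SQL Statement Execution API; here a bearer token stands in for the OAuth-obtained access token, and the warehouse ID is a placeholder:

```python
# Sketch: run a SQL statement against a SQL warehouse and return the API response.
import os
import requests

def execute_sql(statement: str, warehouse_id: str) -> dict:
    """Submit a SQL statement to a warehouse via /api/2.0/sql/statements."""
    resp = requests.post(
        f"{os.environ['DATABRICKS_HOST']}/api/2.0/sql/statements",
        headers={"Authorization": f"Bearer {os.environ['DATABRICKS_ACCESS_TOKEN']}"},
        json={"statement": statement, "warehouse_id": warehouse_id, "wait_timeout": "30s"},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()  # includes statement status and, when finished, the result rows

# result = execute_sql("SELECT current_catalog(), current_user()", "<warehouse-id>")
```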