## Server Configuration

Describes the environment variables required to run the server.

| Name | Required | Description | Default |
|------|----------|-------------|---------|

No arguments
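
Since the server needs no environment variables, a client only has to know how to launch it. Below is a minimal connection sketch using the official MCP Python SDK; the launch command (`python main.py`) is an assumption for illustration, not something this README specifies.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Hypothetical launch command -- adjust to however this server is actually started.
server_params = StdioServerParameters(
    command="python",
    args=["main.py"],
    env=None,  # no environment variables are required, per the table above
)

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Confirm the tools listed below are advertised by the server.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

if __name__ == "__main__":
    asyncio.run(main())
```
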
## Schema

### Prompts

Interactive templates invoked by user choice

| Name | Description |
|------|-------------|

No prompts

### Resources

Contextual data attached and managed by the client

| Name | Description |
|------|-------------|

No resources
### Tools

Functions exposed to the LLM to take actions (an invocation sketch follows the table)

| Name | Description |
|------|-------------|
| list_clusters | List all Databricks clusters. |
| create_cluster | Create a new Databricks cluster. Parameters: `cluster_name` (required), `spark_version` (required), `node_type_id` (required), `num_workers` (optional), `autotermination_minutes` (optional). |
| terminate_cluster | Terminate a Databricks cluster. Parameter: `cluster_id` (required). |
| get_cluster | Get information about a specific Databricks cluster. Parameter: `cluster_id` (required). |
| start_cluster | Start a terminated Databricks cluster. Parameter: `cluster_id` (required). |
| list_jobs | List all Databricks jobs. |
| run_job | Run a Databricks job. Parameters: `job_id` (required), `notebook_params` (optional). |
| list_notebooks | List notebooks in a workspace directory. Parameter: `path` (required). |
| export_notebook | Export a notebook from the workspace. Parameters: `path` (required), `format` (optional; one of SOURCE, HTML, JUPYTER, DBC). |
| list_files | List files and directories in a DBFS path. Parameter: `dbfs_path` (required). |
| execute_sql | Execute a SQL statement. Parameters: `statement` (required), `warehouse_id` (required), `catalog` (optional), `schema` (optional). |
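
To show how these tools might be invoked from a client, here is a hedged sketch using the MCP Python SDK's `call_tool`, assuming a `ClientSession` already initialized as in the connection example above. The cluster settings and `warehouse_id` are placeholder values, not defaults provided by this server.

```python
from mcp import ClientSession

async def demo(session: ClientSession) -> None:
    # List all Databricks clusters in the workspace.
    clusters = await session.call_tool("list_clusters", arguments={})

    # Create a cluster; required parameters per the table above, all values are placeholders.
    created = await session.call_tool(
        "create_cluster",
        arguments={
            "cluster_name": "example-cluster",    # placeholder name
            "spark_version": "14.3.x-scala2.12",  # placeholder Spark runtime label
            "node_type_id": "i3.xlarge",          # placeholder node type
            "num_workers": 2,                     # optional
            "autotermination_minutes": 30,        # optional
        },
    )

    # Run a SQL statement; warehouse_id is a placeholder for a real SQL warehouse ID.
    rows = await session.call_tool(
        "execute_sql",
        arguments={
            "statement": "SELECT 1",
            "warehouse_id": "<your-warehouse-id>",
            "catalog": "main",     # optional
            "schema": "default",   # optional
        },
    )

    print(clusters, created, rows)
```
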