Server Configuration

Describes the environment variables required to run the server.

Name             | Required | Description                            | Default
DATABRICKS_HOST  | Yes      | The URL of the Databricks host.        |
DATABRICKS_TOKEN | Yes      | Your Databricks personal access token. |
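
Both variables follow the standard Databricks authentication convention. As a quick sanity check before launching the server, you can confirm they are picked up by a Databricks client; the sketch below uses the Databricks SDK for Python, which is an assumption (nothing here documents which client the server uses internally), and the host and token values are placeholders.

import os

from databricks.sdk import WorkspaceClient

# Placeholder values; substitute your own workspace URL and token.
os.environ.setdefault("DATABRICKS_HOST", "https://<your-workspace>.cloud.databricks.com")
os.environ.setdefault("DATABRICKS_TOKEN", "<personal-access-token>")

# WorkspaceClient() reads DATABRICKS_HOST and DATABRICKS_TOKEN from the
# environment; if this call succeeds, the credentials are usable.
w = WorkspaceClient()
print(w.current_user.me().user_name)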

Capabilities

Features and capabilities supported by this server

Capability   | Details
tools        | { "listChanged": false }
prompts      | { "listChanged": false }
resources    | { "subscribe": false, "listChanged": false }
experimental | {}

Tools

Functions exposed to the LLM to take actions

build_wheel
Builds the Python wheel using 'uv build --wheel'.
Args:
  target: The path to the directory containing pyproject.toml.
Returns: The path to the generated wheel file.

upload_wheel
Uploads a local wheel file to the Databricks workspace.
Args:
  local_path: The local path to the wheel file.
Returns: The full remote path of the uploaded wheel.

create_job
Creates a Databricks job with the specified wheel and entry point.
Args:
  job_name: The name of the job to create.
  package_name: The name of the Python package.
  remote_wheel_path: The remote path to the uploaded wheel file.
Returns: The ID of the created job.

trigger_run
Triggers a run of the specified job.
Args:
  job_id: The ID of the job to run.
  job_args: A list of Python parameters to pass to the run.
Returns: The ID of the triggered run.

list_job_runs
Lists runs for a specific job.
Args:
  job_id: The ID of the job to list runs for.
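
Taken together, the tools cover a build, upload, create, trigger, inspect workflow. The sketch below walks through that sequence with the MCP Python client; the launch command ("uv run lakeflow-mcp"), the credential values, and all tool argument values are assumptions for illustration, and in practice each call_tool result would be parsed to feed the next step.

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    # How the server is started is an assumption; adjust command/args to
    # match your installation, and supply real Databricks credentials.
    params = StdioServerParameters(
        command="uv",
        args=["run", "lakeflow-mcp"],
        env={
            "DATABRICKS_HOST": "https://<your-workspace>.cloud.databricks.com",
            "DATABRICKS_TOKEN": "<personal-access-token>",
        },
    )
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # 1. Build the wheel from a local project directory.
            built = await session.call_tool("build_wheel", {"target": "./my_project"})

            # 2. Upload the built wheel; in practice, take the local path
            #    from the build_wheel result above.
            uploaded = await session.call_tool("upload_wheel", {
                "local_path": "dist/my_project-0.1.0-py3-none-any.whl",
            })

            # 3. Create a job that runs the uploaded wheel.
            job = await session.call_tool("create_job", {
                "job_name": "my-wheel-job",
                "package_name": "my_project",
                "remote_wheel_path": "<remote path returned by upload_wheel>",
            })

            # 4. Trigger a run, passing Python parameters through to the entry point.
            run = await session.call_tool("trigger_run", {
                "job_id": "<job id returned by create_job>",
                "job_args": ["--date", "2024-01-01"],
            })

            # 5. Inspect runs for the job.
            runs = await session.call_tool("list_job_runs", {"job_id": "<job id returned by create_job>"})
            print(runs)

asyncio.run(main())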

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/arahimi-hims/lakeflow-mcp'
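
The same endpoint can also be queried from Python; this is just a convenience wrapper around the curl call above, and the response shape is not documented here.

import requests

resp = requests.get("https://glama.ai/api/mcp/v1/servers/arahimi-hims/lakeflow-mcp")
resp.raise_for_status()
print(resp.json())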

If you have feedback or need assistance with the MCP directory API, please join our Discord server.