Databricks MCP Server

by samhavens

upload_file_to_dbfs

Transfer local files to Databricks File System (DBFS) for storage or execution. Supports overwrite, handles large files via chunked upload, and returns detailed upload results for tracking.

Instructions

Upload a local file to Databricks File System (DBFS).

Args:
  • local_file_path: Path to local file (e.g. './data/notebook.py')
  • dbfs_path: DBFS path (e.g. '/tmp/uploaded/notebook.py')
  • overwrite: Whether to overwrite an existing file (default: True)

Returns: JSON with upload results, including success status, file size, and upload time.

Example:

    # Upload a script to DBFS
    result = upload_file_to_dbfs(
        local_file_path='./scripts/analysis.py',
        dbfs_path='/tmp/analysis.py',
        overwrite=True
    )

Note: For large files (>10MB), the tool uses chunked upload with retry logic. DBFS is well suited to temporary files, scripts, and smaller datasets.
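To make the chunked-upload behavior concrete, here is a minimal sketch of how such an upload can be done against the Databricks DBFS REST API (`create` / `add-block` / `close`). This is an illustration of the general technique, not this server's actual implementation; the `DATABRICKS_HOST` and `DATABRICKS_TOKEN` environment variable names are assumptions, and retry logic is omitted for brevity.

```python
import base64
import json
import os
import urllib.request

CHUNK_SIZE = 1024 * 1024  # the DBFS add-block endpoint accepts at most 1 MB per call


def iter_chunks(data: bytes, size: int = CHUNK_SIZE):
    """Yield successive chunks of at most `size` bytes."""
    for offset in range(0, len(data), size):
        yield data[offset:offset + size]


def _dbfs_call(host: str, token: str, endpoint: str, payload: dict) -> dict:
    """POST a JSON payload to a DBFS REST endpoint and return the parsed reply."""
    req = urllib.request.Request(
        f"{host}/api/2.0/dbfs/{endpoint}",
        data=json.dumps(payload).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read() or b"{}")


def upload_file_to_dbfs(local_file_path: str, dbfs_path: str,
                        overwrite: bool = True) -> None:
    # Assumed env vars, e.g. https://<workspace>.cloud.databricks.com and a PAT.
    host = os.environ["DATABRICKS_HOST"]
    token = os.environ["DATABRICKS_TOKEN"]
    with open(local_file_path, "rb") as f:
        data = f.read()
    # 1. Open a streaming handle for the target path.
    handle = _dbfs_call(host, token, "create",
                        {"path": dbfs_path, "overwrite": overwrite})["handle"]
    # 2. Append base64-encoded blocks of at most 1 MB each.
    for chunk in iter_chunks(data):
        _dbfs_call(host, token, "add-block",
                   {"handle": handle, "data": base64.b64encode(chunk).decode()})
    # 3. Close the handle to make the file visible in DBFS.
    _dbfs_call(host, token, "close", {"handle": handle})
```

Splitting into 1 MB blocks is what the `add-block` endpoint requires, which is why large files need the create/add-block/close sequence rather than a single request.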

Input Schema

| Name | Required | Description | Default |
| ---- | -------- | ----------- | ------- |
| dbfs_path | Yes | DBFS destination path (e.g. '/tmp/uploaded/notebook.py') | |
| local_file_path | Yes | Path to the local file to upload | |
| overwrite | No | Whether to overwrite an existing file | true |

Input Schema (JSON Schema)

    {
      "properties": {
        "dbfs_path": { "title": "Dbfs Path", "type": "string" },
        "local_file_path": { "title": "Local File Path", "type": "string" },
        "overwrite": { "default": true, "title": "Overwrite", "type": "boolean" }
      },
      "required": ["local_file_path", "dbfs_path"],
      "title": "upload_file_to_dbfsArguments",
      "type": "object"
    }
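As an illustration of what this schema enforces, a minimal stdlib-only argument check (a sketch, not how the MCP server actually validates calls) could apply the `required`, `type`, and `default` keywords like so:

```python
# Illustrative sketch only: checks call arguments against the schema above.
SCHEMA = {
    "properties": {
        "dbfs_path": {"type": "string"},
        "local_file_path": {"type": "string"},
        "overwrite": {"default": True, "type": "boolean"},
    },
    "required": ["local_file_path", "dbfs_path"],
}

# Map JSON Schema type names to Python types.
TYPES = {"string": str, "boolean": bool}


def check_args(args: dict) -> dict:
    """Return a copy of args with defaults applied, or raise ValueError."""
    for name in SCHEMA["required"]:
        if name not in args:
            raise ValueError(f"missing required argument: {name}")
    out = dict(args)
    for name, spec in SCHEMA["properties"].items():
        if name not in out and "default" in spec:
            out[name] = spec["default"]  # e.g. overwrite defaults to True
        if name in out and not isinstance(out[name], TYPES[spec["type"]]):
            raise ValueError(f"{name} must be of type {spec['type']}")
    return out
```

For example, omitting `overwrite` yields a call with `overwrite=True`, while omitting `local_file_path` raises an error, matching the Required/Default columns above.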

MCP directory API

We provide all the information about MCP servers via our MCP API.

    curl -X GET 'https://glama.ai/api/mcp/v1/servers/samhavens/databricks-mcp-server'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.