API Registry MCP Server
A Databricks app that helps you discover, register, and manage external API endpoints with an AI-powered chat interface.
What is this?
An API discovery and management platform that runs on Databricks Apps:
AI Chat Interface: Register APIs using natural language powered by Claude
API Registry: Database-backed catalog of external API endpoints
Secure Auth: Support for public APIs, API keys, and bearer tokens
MCP Server: Programmatic API management tools
Smart Discovery: Automatic endpoint testing and documentation parsing
Quick Start
Prerequisites
Required Tools (Install on your local machine)
1. Python Package Manager - uv:
2. Databricks CLI:
3. Bun (Optional - only for frontend development):
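If you don't already have these tools, the standard installers for macOS/Linux are shown below; see each tool's documentation for Windows or alternative install methods.

```bash
# Install uv (Python package manager)
curl -LsSf https://astral.sh/uv/install.sh | sh

# Install the Databricks CLI
curl -fsSL https://raw.githubusercontent.com/databricks/setup-cli/main/install.sh | sh

# Optional: install Bun (only needed for frontend development)
curl -fsSL https://bun.sh/install | bash
```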
Databricks Workspace Requirements
Your workspace needs:
Databricks Apps enabled (Public Preview)
Foundation Model API with a tool-enabled model (Claude, Llama, etc.)
SQL Warehouse - At least one warehouse (create one)
Unity Catalog - With a catalog and schema you can write to
Detailed requirements: WORKSPACE_REQUIREMENTS.md
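One way to sanity-check these requirements from your terminal, assuming the Databricks CLI is already authenticated (these are standard CLI calls, not scripts from this repo; exact flags can vary by CLI version):

```bash
# Confirm you can reach the workspace as your user
databricks current-user me

# List SQL warehouses - you'll need a warehouse ID during setup
databricks warehouses list

# List Unity Catalog catalogs visible to you
databricks catalogs list
```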
Step 1: Clone and Setup
Run this on your local machine (not in Databricks):
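A typical sequence looks like the following; the repository URL is a placeholder, so substitute the actual clone URL for this project.

```bash
# Clone the repository (placeholder URL - use the real one)
git clone <repo-url> api-registry-mcp
cd api-registry-mcp

# Run the interactive setup script described below
./setup.sh
```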
The setup script will prompt you for:
| Prompt | What It's For | Default | Notes |
| --- | --- | --- | --- |
| Databricks Host | Your workspace URL | (no default) | Format: full workspace URL, e.g. `https://<workspace>.cloud.databricks.com` |
| Authentication Method | How to authenticate | `2` (PAT - Recommended) | Options: 1=OAuth, 2=PAT |
| Personal Access Token | Your Databricks PAT | (no default) | Required for PAT auth |
| SQL Warehouse ID | Warehouse for queries | Auto-detects first warehouse | Press Enter to use default |
| Unity Catalog | Target catalog | (script default) | Press Enter to use default |
| Unity Schema | Target schema | (script default) | Press Enter to use default |
⚠️ Important: Use Personal Access Token (PAT) authentication
PAT is the recommended method for local development
OAuth is experimental and may have issues
Get your PAT: Workspace → Settings → Developer → Access Tokens → Generate New Token
What this does:
Installs Python and JavaScript dependencies
Configures Databricks CLI authentication
Creates `.env.local` with your configuration
Validates your workspace connection
Step 2: Create the API Registry Table
Create the Delta table that stores API metadata:
Example:
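The exact invocation may differ; a plausible example, assuming `setup_table.py` takes the target catalog and schema as arguments (the flag names are illustrative, not confirmed by this README):

```bash
# Create the api_http_registry Delta table in your catalog.schema
# (flags are illustrative - check setup_table.py for its real interface)
uv run python setup_table.py --catalog main --schema api_registry
```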
What this does:
Creates `api_http_registry` table in your specified catalog.schema
Table stores: API name, endpoints, auth type, HTTP connection details, parameters
Required for the app to track registered APIs
Alternative - Manual SQL:
Run the SQL from setup_api_http_registry_table.sql in Databricks SQL Editor
Note: Ensure your catalog and schema exist first. Create them in Databricks SQL Editor if needed.
Step 3: Deploy to Databricks Apps
Deploy your application code to Databricks:
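Deployment is driven by the repo's deploy script; a minimal run looks like this:

```bash
# Deploy the app; you'll be prompted for an app name starting with "mcp-"
./deploy.sh
```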
During deployment, you'll be prompted for:
App name: Must start with `mcp-` (e.g., `mcp-api-registry`, `mcp-prod-api`)
What happens during deployment:
✅ Builds the frontend - Compiles React TypeScript to static assets
✅ Packages the backend - Prepares FastAPI server and MCP tools
✅ Creates Databricks App - Registers your app in the workspace
✅ Generates Service Principal - Automatically creates a service principal for your app
✅ Deploys code to the app - Uploads your code and automatically attaches it to the app compute
✅ Starts the application - Your app is now running and accessible
✅ Enables OAuth (OBO) - Configures On-Behalf-Of authentication automatically
⚠️ Important: No manual attachment needed!
The deploy.sh script handles the entire deployment pipeline. Your code is automatically:
Packaged into a deployable artifact
Uploaded to Databricks
Attached to the app's compute environment
Started and made accessible at the app URL
You don't need to manually connect code to compute - it's all handled by the deployment process!
Finding your deployed app:
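From the terminal, the repo's status helper prints the app details; recent Databricks CLI versions can also query the app directly (the app name below is an example):

```bash
# Repo helper - prints app status, URL, and service principal ID
./app_status.sh

# Or query the CLI directly (replace with your app name)
databricks apps get mcp-api-registry
```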
Or in Databricks UI:
Workspace → Compute → Apps → Click your app name
On-Behalf-Of (OBO) Authentication:
Databricks Apps automatically handles OAuth authentication:
✅ Users log in through Databricks UI - no separate auth setup
✅ All operations run with the user's permissions - proper access control
✅ Full audit logging - track who did what
✅ No manual OAuth configuration needed!
The app configuration (app.yaml) specifies required scopes. When users access the app, they automatically get an OAuth token with their Databricks permissions.
More details: See app.yaml in the project root
Step 4: Setup Secret Scopes (For Authenticated APIs)
⚠️ Important: Do this AFTER Step 3 - You need the Service Principal ID from deployment first!
Skip if you only use public APIs with no authentication.
For APIs requiring API keys or bearer tokens:
When prompted, enter your app's Service Principal ID from Step 3.
Where to find your Service Principal ID:
From terminal: Run `./app_status.sh` (shown in output)
From UI: Databricks workspace → Compute → Apps → Click your app → "Service Principal ID"
Format: Looks like `00000000-0000-0000-0000-000000000000`
What this script does:
Creates `mcp_api_keys` scope - for API key authentication
Creates `mcp_bearer_tokens` scope - for bearer token authentication
Grants your app's service principal WRITE access to both scopes
Verifies the permissions were set correctly
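For reference, a rough manual equivalent with the Databricks CLI looks like this (scope names are the ones this app expects; the service principal ID is a placeholder, and exact flags depend on your CLI version):

```bash
# Create the two secret scopes used by the app
databricks secrets create-scope mcp_api_keys
databricks secrets create-scope mcp_bearer_tokens

# Grant the app's service principal WRITE access to both scopes
# (replace the UUID with your app's Service Principal ID from Step 3)
databricks secrets put-acl mcp_api_keys 00000000-0000-0000-0000-000000000000 WRITE
databricks secrets put-acl mcp_bearer_tokens 00000000-0000-0000-0000-000000000000 WRITE
```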
Why this is needed:
API keys and bearer tokens must be stored securely
Databricks Secrets provide encryption at rest
The app's service principal manages secrets on behalf of all users
Users never see or handle raw credentials - they're encrypted automatically
Verification:
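A quick way to check the result, assuming the Databricks CLI is authenticated:

```bash
# Confirm both scopes exist
databricks secrets list-scopes

# Confirm the service principal has WRITE on each scope
databricks secrets list-acls mcp_api_keys
databricks secrets list-acls mcp_bearer_tokens
```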
Troubleshooting:
If scope creation fails: You may need admin permissions
If permission grant fails: Your SPN ID may be incorrect (check `./app_status.sh`)
Detailed guide: SECRETS_WORKAROUND.md
API Authentication Types
The app supports three authentication types:
| Type | When to Use | Where Credential Goes | Example APIs |
| --- | --- | --- | --- |
| `none` | Public APIs with no auth | N/A | Treasury, Public datasets |
| `api_key` | Key passed as query param | Query parameter in the URL | FRED, Alpha Vantage, NewsAPI |
| `bearer_token` | Token passed in header | `Authorization: Bearer <token>` header | GitHub, Stripe, Shopify |
Quick Examples
Public API (no auth):
API Key Authentication:
Bearer Token Authentication:
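The original chat-based quick examples aren't reproduced here. As an illustration of where the credentials for the two authenticated cases end up, a secret could also be seeded manually with the Databricks CLI (normally the app stores these for you; the secret key names below are hypothetical):

```bash
# API key auth: store the provider's key in the mcp_api_keys scope
# (secret key name "fred_api_key" is just an example)
databricks secrets put-secret mcp_api_keys fred_api_key --string-value "YOUR_FRED_KEY"

# Bearer token auth: store the token in the mcp_bearer_tokens scope
databricks secrets put-secret mcp_bearer_tokens github_token --string-value "ghp_your_token"
```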
How It Works
API Key Auth:
Key stored in `mcp_api_keys` scope
HTTP connection has empty bearer_token
Key retrieved from secrets and added to params at runtime
Bearer Token Auth:
Token stored in `mcp_bearer_tokens` scope
HTTP connection references the secret
Databricks automatically adds `Authorization: Bearer <token>` header
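To make the difference between the two styles concrete outside the app, here is how the same kinds of calls look with plain curl (URLs and environment variables are illustrative, not tied to this project):

```bash
# API key style: the key travels as a query parameter in the URL
curl "https://api.stlouisfed.org/fred/series?series_id=GNPCA&api_key=${FRED_API_KEY}&file_type=json"

# Bearer token style: the token travels in the Authorization header
curl -H "Authorization: Bearer ${GITHUB_TOKEN}" https://api.github.com/user
```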
Detailed auth mechanics: See "API Authentication Types" section in SECRETS_WORKAROUND.md
Using the App
Web Interface
Open your app URL to access:
Chat Playground - Natural language API registration and queries
API Registry - View, edit, delete registered APIs
Traces - Debug AI agent execution
MCP Info - View available MCP tools
Example Workflow
Configuration
Environment Variables (.env.local)
Created automatically by ./setup.sh:
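The exact variable names come from the setup script; the example below is illustrative only (values are placeholders, and apart from `DATABRICKS_HOST`, which this README references elsewhere, the names are assumptions):

```bash
# .env.local (illustrative - generated by ./setup.sh)
DATABRICKS_HOST=https://your-workspace.cloud.databricks.com
DATABRICKS_TOKEN=dapiXXXXXXXXXXXXXXXXXXXXXXXX      # your PAT
DATABRICKS_SQL_WAREHOUSE_ID=1234567890abcdef       # warehouse used for queries
UC_CATALOG=main                                    # hypothetical variable name
UC_SCHEMA=api_registry                             # hypothetical variable name
```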
Authentication
The app uses On-Behalf-Of (OBO) authentication by default:
Users authenticate with Databricks OAuth
All operations run with the user's permissions
Proper access control and audit logging
OBO details: See app.yaml configuration in the project root
Development
Local Development
Debugging
Multiple Environments
Deploy separate instances for dev/staging/prod:
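Since the app name is chosen at deploy time (and must start with `mcp-`), one straightforward approach is to run the deploy script once per environment under a distinct name:

```bash
# Each run prompts for an app name; use a distinct mcp- prefixed name
# per environment, for example:
#   mcp-api-registry-dev
#   mcp-api-registry-staging
#   mcp-api-registry-prod
./deploy.sh
```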
Project Structure
Troubleshooting
Deployment Issues - App Created But Code Not Working
If ./deploy.sh completes successfully but your app doesn't work properly, follow these steps:
1. Check App Logs (MOST IMPORTANT):
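Logs are served at the app's /logz endpoint (see the troubleshooting entries below for the URL pattern):

```bash
# View logs in the browser at <your-app-url>/logz, e.g.:
open https://your-app.databricksapps.com/logz   # macOS; use xdg-open on Linux
```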
2. Verify App Status:
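The repo's status helper is the quickest check; recent CLI versions can also list apps directly:

```bash
# Shows deployment state, app URL, and service principal ID
./app_status.sh

# Or list all apps in the workspace via the Databricks CLI
databricks apps list
```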
3. Common Causes & Fixes:
| Issue | Check | Fix |
| --- | --- | --- |
| Frontend build failed | Build output from `./deploy.sh` | Fix TypeScript errors; ensure the frontend build output exists |
| Missing Python dependencies | App logs | Rerun `./setup.sh` to reinstall dependencies |
| `app.yaml` misconfigured | `app.yaml` | Verify its entries are correct |
| Code not uploaded | Workspace source path | Check that the source path exists, then redeploy |
| App won't start | App logs | Look for Python import errors, missing env vars, port conflicts |
4. Redeploy with Verbose Output:
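Rerun the deploy script and read the full output; if it supports a verbose flag, use it (the flag below is hypothetical, so check the script for the option it actually accepts):

```bash
# Hypothetical flag - inspect deploy.sh for the real option name
./deploy.sh --verbose
```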
5. Manual Verification:
Authentication failures:
Run `databricks current-user me` to verify CLI auth
Check `.env.local` has correct `DATABRICKS_HOST`
Table not found:
Run `setup_table.py` or manually create the table via SQL Editor
Secret scope errors:
App not accessible:
Check deployment: `./app_status.sh`
View logs: `https://your-app.databricksapps.com/logz`
API calls failing after registration:
Verify secret exists: `databricks secrets list-secrets --scope mcp_api_keys`
Check app logs for connection creation errors
For API key auth: Ensure key is in `mcp_api_keys` scope
For bearer token auth: Ensure token is in `mcp_bearer_tokens` scope
Detailed troubleshooting:
WORKSPACE_REQUIREMENTS.md - Workspace setup issues
SECRETS_WORKAROUND.md - Secret scope issues
Key Features
MCP Tools Available
The app exposes these tools via its MCP server:
`smart_register_api` - One-step API registration with auto-discovery
`register_api_in_registry` - Manual API registration with full control
`check_api_http_registry` - List and search registered APIs
`discover_endpoints_from_docs` - Extract endpoints from documentation URLs
`test_api_endpoint` - Validate endpoints before registration
`execute_dbsql` - Run SQL queries against warehouses
AI Agent Capabilities
The chat interface can:
Parse API documentation to discover endpoints
Test endpoints automatically
Register APIs with proper authentication
Call registered APIs to answer queries
Combine multiple API calls for complex requests
Documentation
WORKSPACE_REQUIREMENTS.md - Prerequisites, setup, workspace configuration
SECRETS_WORKAROUND.md - Secret management, auth types, troubleshooting
SECURITY.md - Security policies
LICENSE.md - License information
License
See LICENSE.md
Security
Report vulnerabilities: See SECURITY.md