Alkemi MCP Server

Integrate your Alkemi data, connected to Snowflake, Google BigQuery, Databricks, and other sources, with your MCP Client.

This is a STDIO wrapper for the Streamable HTTP MCP Endpoint:

https://api.alkemi.cloud/mcp

Get your free API key at datalab.alkemi.ai
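If your MCP Client supports the Streamable HTTP transport directly, you can also connect to the endpoint without the STDIO wrapper. Below is a minimal TypeScript sketch using the official @modelcontextprotocol/sdk client; passing the API key as a Bearer token via requestInit headers is an assumption about how the endpoint authenticates, so verify it against the Alkemi documentation.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

// Assumption: the endpoint accepts the API key as a standard Bearer token.
const transport = new StreamableHTTPClientTransport(
  new URL("https://api.alkemi.cloud/mcp"),
  { requestInit: { headers: { Authorization: `Bearer ${process.env.BEARER_TOKEN}` } } }
);

const client = new Client({ name: "alkemi-smoke-test", version: "0.1.0" });
await client.connect(transport);

// List the tools the Alkemi server exposes, then close the session.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));
await client.close();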

Alkemi.ai

Querying a database requires knowledge of the table schemas, and often examples of the kinds of queries that answer specific questions; without that context, you can easily get wrong answers. Maintaining all of that information in every agent or MCP Client that queries your database is a challenge, and it doesn't scale for teams that want to share data.

The Alkemi MCP Server uses Alkemi to store the database metadata, generate correct queries, and run them against the database, so you can share your MCP Server with teammates and everyone has the same ability to query with quality.

Installation

To add the Alkemi MCP Server to Claude Desktop, add the server config:

On MacOS: ~/Library/Application Support/Claude/claude_desktop_config.json

On Windows: %APPDATA%/Claude/claude_desktop_config.json

Env Vars

  • MCP_NAME: The name of the MCP Server. This is optional, unless you configure multiple servers, in which case it is required so they do not share the same name in your MCP Client.

  • BEARER_TOKEN: The Bearer token for the Streamable HTTP MCP Server. This is required for the STDIO MCP Integration.

  • PRODUCT_ID: The ID of the Product if you want to narrow scope to just a single product. This is optional.
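These variables are read from the wrapper's process environment, so they work the same whether Claude Desktop spawns the server (as in the configurations below) or you launch it yourself. As a hedged sketch, here is how an MCP client built with the TypeScript SDK could spawn the published npx package and pass the same variables; the values shown are placeholders, not real IDs.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the STDIO wrapper exactly as Claude Desktop would, passing the
// env vars described above (placeholder values).
const transport = new StdioClientTransport({
  command: "npx",
  args: ["@alkemiai/alkemi-mcp"],
  env: {
    MCP_NAME: "customer-data",   // optional, useful when running several instances
    BEARER_TOKEN: "sk-12345",    // required
    PRODUCT_ID: "123",           // optional, narrows scope to one product
  },
});

const client = new Client({ name: "alkemi-stdio-test", version: "0.1.0" });
await client.connect(transport); // the wrapper forwards requests to https://api.alkemi.cloud/mcp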

Configuration

You can use it via npx in your Claude Desktop configuration like this:

{ "mcpServers": { "alkemi": { "command": "npx", "args": [ "@alkemiai/alkemi-mcp" ], "env": { "BEARER_TOKEN": "sk-12345" } } } }

Or, if you clone the repo, you can build it and use it in your Claude Desktop configuration like this:

{ "mcpServers": { "alkemi-data": { "command": "node", "args": [ "/path/to/alkemi-mcp/build/index.js" ], "env": { "BEARER_TOKEN": "sk-12345" } } } }

If you want the MCP Server to use a specific product, set the PRODUCT_ID environment variable. By also setting MCP_NAME, you can configure multiple servers:

{ "mcpServers": { "alkemi-customer-data": { "command": "node", "args": [ "/path/to/alkemi-mcp/build/index.js" ], "env": { "MCP_NAME": "customer-data", "PRODUCT_ID": "123", "BEARER_TOKEN": "sk-12345" } }, "alkemi-web-traffic-data": { "command": "node", "args": [ "/path/to/alkemi-mcp/build/index.js" ], "env": { "MCP_NAME": "web-traffic-data", "PRODUCT_ID": "234", "BEARER_TOKEN": "sk-12345" } } } }

Development

Install dependencies:

npm install

Build the server:

npm run build

For development with auto-rebuild:

npm run watch

Debugging

Since MCP servers communicate over stdio, debugging can be challenging. We recommend using the MCP Inspector, which is available as a package script:

npm run inspector

The Inspector will provide a URL to access debugging tools in your browser.

Acknowledgements


