aimm-mcp
Local Model Context Protocol server for the AI Model Manager. Captures
SQL data-model metadata under ~/Documents/AIMM/ and exposes it to Claude
Code (or any MCP client) over stdio.
Fork of the AIMM VS Code extension, rebuilt in Python with no UI: just tools the agent calls.
Why this exists
The VS Code extension lives inside an editor. This fork lets you skip
the editor entirely — install once with claude mcp add, and any
Claude session on the machine sees the same model.
Install
One line. Claude Code spawns the server via uvx (uv's npx) — no
prior install, no setup.
claude mcp add aimm --scope user -- uvx aimm-mcp

The first connection downloads the package (a few seconds). Subsequent
connections are served from the local cache. The server keeps its state
under ~/Documents/AIMM/, so every project on the machine shares one model.
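Once added, the registration can be checked from the CLI (a sketch; `claude mcp list` and `claude mcp get` are standard Claude Code subcommands, shown here as a verification step, not part of this project):

```shell
# Register the server once, user-wide
claude mcp add aimm --scope user -- uvx aimm-mcp

# Confirm it is registered and inspect its launch command
claude mcp list
claude mcp get aimm
```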
What lives under ~/Documents/AIMM/
~/Documents/AIMM/
├── aimm.json project header
├── tables/<name>.json per-table metadata
├── connections/<name>.json ODBC connection descriptors
├── model.mmd generated ER diagram (mermaid)
├── model_lineage.mmd generated upstream→downstream flowchart
├── lineage.json flat lineage edge list
├── relationships.json flat FK edge list
├── joins.json project-tracked joins (relationships + extracted SQL)
├── discovered_joins.json candidates from a folder scan
├── project_context.xml full agent-readable context
└── diagnostics.log every ODBC query the server issued

Engines
Three engines via ODBC: trino, sql_server, databricks. Connection
descriptors carry a DSN name (system DSN registered at the OS level)
plus the catalog / database qualifier.
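A connection descriptor might be written like this (an illustrative sketch: the field names `engine`, `dsn`, and `catalog` are assumptions about the descriptor shape, not the server's documented schema):

```python
import json
from pathlib import Path

# Hypothetical descriptor shape -- field names are illustrative,
# not the exact schema the server reads.
descriptor = {
    "engine": "databricks",  # one of: trino, sql_server, databricks
    "dsn": "AIMM_DBX",       # system DSN registered at the OS level
    "catalog": "main",       # catalog / database qualifier
}

path = Path.home() / "Documents" / "AIMM" / "connections" / "warehouse.json"
path.parent.mkdir(parents=True, exist_ok=True)
path.write_text(json.dumps(descriptor, indent=2))
```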
Tools (in v0.1)
aimm_init_project — bootstrap ~/Documents/AIMM/
aimm_read_project_context — XML or markdown dump
More tools land per follow-up PR. See aimm_mcp/tools/ for the
current set.
Why ODBC?
Every warehouse this targets exposes an ODBC driver. We never run user
SQL — only information_schema reads for column / table / schema
metadata. Drivers stay read-only at the credential level.
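The metadata reads look roughly like this (an illustrative sketch; `column_metadata_query` is a hypothetical helper, not the server's actual code, and a real driver call would bind parameters instead of interpolating strings):

```python
# Sketch of the kind of read the server issues: pure
# information_schema metadata, never user SQL.
def column_metadata_query(catalog: str, schema: str, table: str) -> str:
    """Build a read-only column-metadata query (illustrative only;
    a production version would use parameter binding)."""
    return (
        "SELECT column_name, data_type, is_nullable "
        f"FROM {catalog}.information_schema.columns "
        f"WHERE table_schema = '{schema}' AND table_name = '{table}' "
        "ORDER BY ordinal_position"
    )

sql = column_metadata_query("hive", "sales", "orders")
```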
Development
uv sync
uv run python -m aimm_mcp # starts the MCP stdio server
uv run pytest                 # tests

License
MIT.