
kothar MCP server

Context-aware MCP server advisor. Tells you what to install for your specific project — and why.

The problem

Glama has 19,000+ MCP servers. You have a project. Nobody bridges the gap.

Ask an LLM directly and it hallucinates servers that don't exist or recommends from stale training data. Directories give you search, not advice.

kothar fills the selection-under-context gap: not "here are 19,000 options" but "for your specific project, right now, here's what you need and why."

The two moments nobody is serving

Project start: "I'm building a Python data pipeline with DuckDB and FastAPI" → what do I install right now

Mid-project: "I just added an auth layer / I need to handle PDF ingestion" → what do I add now that I've reached this point

The second moment is the more valuable one. At project start, people can Google; mid-project, they're in flow.

Three tools

recommend_for_project(description)
  → top MCP servers for your stack with rationale

recommend_next(current_stack, new_context)
  → what to add as your project evolves

explain_why(server_name, project_description)
  → why a specific server fits your project
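To make the shape of these tools concrete, here is a minimal sketch of the selection logic behind something like recommend_next: score candidates against the new context and skip anything already installed. The registry and the word-overlap scoring are illustrative assumptions, not kothar's actual implementation (which ranks by embedding similarity).

```python
# Hypothetical sketch of recommend_next-style selection. The registry and
# the naive word-overlap scoring are illustrative only.
REGISTRY = {
    "stripe": "Stripe payments charges invoices customers",
    "pdf-tools": "PDF parsing text extraction invoice ingestion",
    "github": "GitHub repos issues pull requests",
    "filesystem": "Local filesystem read write access",
}

def recommend_next(current_stack: str, new_context: str, top_k: int = 3):
    installed = {s.strip() for s in current_stack.split(",")}
    context_words = set(new_context.lower().split())
    scored = []
    for name, desc in REGISTRY.items():
        if name in installed:
            continue  # already part of the project
        overlap = len(context_words & set(desc.lower().split()))
        if overlap:
            scored.append((overlap, name))
    scored.sort(reverse=True)  # highest overlap first
    return [name for _, name in scored[:top_k]]

print(recommend_next("github,filesystem", "adding stripe payments and pdf invoices"))
# → ['stripe', 'pdf-tools']
```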

Install

Prerequisites: uv

git clone https://github.com/yahiaklk/kothar
cd kothar
uv sync

Build the index (first run, ~30s):

uv run python -m kothar.indexer

Add to Claude Code

claude mcp add --scope user kothar -- uv run --directory /path/to/kothar python -m kothar.server

Add to Claude Desktop

Add to your claude_desktop_config.json:

{
  "mcpServers": {
    "kothar": {
      "command": "uv",
      "args": ["run", "--directory", "/path/to/kothar", "python", "-m", "kothar.server"]
    }
  }
}

Usage

Once connected, ask your AI assistant:

recommend_for_project("Python FastAPI backend with PostgreSQL and JWT auth")

recommend_next("github,filesystem", "adding Stripe payments and PDF invoices")

explain_why("postgres", "multi-tenant SaaS with row-level security")

How it works

  • Parses awesome-mcp-servers (2000+ curated servers)

  • Embeds descriptions with all-MiniLM-L6-v2 (local, no API cost)

  • Stores in DuckDB, queries with cosine similarity

  • Template-based rationale — grounded in the registry, not hallucinated
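The retrieval step boils down to cosine similarity between a query embedding and stored server embeddings. kothar does this with all-MiniLM-L6-v2 vectors inside DuckDB; the tiny hand-made 3-dimensional vectors below are assumptions that just demonstrate the ranking math.

```python
# Illustrative cosine-similarity ranking. The vectors here are made up;
# the real index stores 384-dim all-MiniLM-L6-v2 embeddings in DuckDB.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

servers = {
    "postgres":   [0.9, 0.1, 0.0],  # hypothetical embedding
    "filesystem": [0.1, 0.8, 0.2],
    "stripe":     [0.0, 0.2, 0.9],
}
query = [0.8, 0.2, 0.1]  # hypothetical embedding of the project description

ranked = sorted(servers, key=lambda n: cosine(query, servers[n]), reverse=True)
print(ranked)  # best match first
# → ['postgres', 'filesystem', 'stripe']
```

Because the rationale is filled from registry metadata for the servers this ranking returns, the explanation stays grounded in what the index actually contains.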

Rebuild the index

uv run python -m kothar.indexer --force

Docker

Multi-stage image with the embedding model + DuckDB index baked in — no runtime network dependency.

docker build -t kothar:0.3.0 .
docker run --rm -i kothar:0.3.0   # stdio transport, for local MCP clients

Wire into Claude Desktop:

{
  "mcpServers": {
    "kothar": {
      "command": "docker",
      "args": ["run", "--rm", "-i", "kothar:0.3.0"]
    }
  }
}

Non-root user (uid=10001), pinned Python 3.12, deps resolved from uv.lock, model cached under /app/.hf_cache with HF_HUB_OFFLINE=1 at runtime.

Stack

Python · FastMCP · DuckDB · sentence-transformers · uv

License

MIT
