
KB-MCP Server

A local-first Knowledge Base with Model Context Protocol (MCP) support. Give your AI a reliable memory. Run it locally. Stream answers in real time.


What is This?

A Knowledge Base (KB) is a structured collection of facts, documents, and embeddings stored in machine-readable form, with interfaces to:

  • Add knowledge

  • Query knowledge (semantic + keyword search)

  • Update/Delete knowledge

This MCP server exposes your KB to any MCP-compatible AI client (Claude, custom agents, etc.).
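Concretely, an MCP client reaches the KB through tool calls. As an illustration, a semantic query sent over the MCP `tools/call` method might look like the following (the tool name `query_knowledge` is listed below; the argument names `query` and `limit` are assumptions, not confirmed parameter names):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "query_knowledge",
    "arguments": {
      "query": "How does the ingestion pipeline work?",
      "limit": 5
    }
  }
}
```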


Why Local-First?

  • Privacy: No cloud leaks — your data stays on your machine

  • Zero latency: No network round-trips

  • Offline support: Works without internet

  • Full control: You own the data and the logic

  • No vendor lock-in: Swap components freely


Quick Start

Installation

npm install
npm run build

Run the Server

npm start

Or for development:

npm run dev

Configure with Claude Desktop

Add to your claude_desktop_config.json:

{
  "mcpServers": {
    "knowledge-base": {
      "command": "node",
      "args": ["/path/to/kb-mcp-server/dist/index.js"],
      "env": {
        "KB_DATA_DIR": "/path/to/your/data"
      }
    }
  }
}

Available Tools

  • ingest_document: Add a document with title, content, and metadata

  • query_knowledge: Semantic search across all documents

  • list_documents: List documents with pagination

  • get_document: Get a full document by ID

  • update_document: Update an existing document

  • delete_document: Remove a document from the KB

  • kb_stats: Get knowledge base statistics
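For example, adding a document via ingest_document might carry arguments like the following (a sketch based on the tool description above; the exact parameter and metadata field names are assumptions):

```json
{
  "name": "ingest_document",
  "arguments": {
    "title": "Deployment runbook",
    "content": "To deploy, run npm run build and restart the service.",
    "metadata": { "tags": ["ops", "deploy"] }
  }
}
```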


How It Works

1. User asks a question
       ↓
2. AI sends MCP query → KB-MCP Server
       ↓
3. KB retrieves relevant facts (semantic search)
       ↓
4. AI grounds the answer with real data
       ↓
5. Response streams to user
       ↓
6. (Optional) New insights stored back

Result: answers are grounded in real data, knowledge compounds over time, and hallucinations are sharply reduced.


Architecture

┌─────────────────┐
│   AI Client     │
│ (Claude, Agent) │
└────────┬────────┘
         │ MCP Protocol
         ↓
┌─────────────────┐
│  KB-MCP Server  │  ← stdio transport
│  ┌───────────┐  │
│  │  Tools    │  │  ingest | query | list | delete
│  └─────┬─────┘  │
│        ↓        │
│  ┌───────────┐  │
│  │  Engine   │  │  Embeddings + Similarity Search
│  └─────┬─────┘  │
│        ↓        │
│  ┌───────────┐  │
│  │   Store   │  │  JSON file (swap with Chroma/pgvector)
│  └───────────┘  │
└─────────────────┘
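The Engine layer above boils down to a few lines of vector math. Here is a minimal sketch of similarity search, assuming each stored document carries a precomputed embedding vector (the types and function names are illustrative, not the server's actual code):

```typescript
// Illustrative document shape: id, title, and a precomputed embedding.
type Doc = { id: string; title: string; embedding: number[] };

// Cosine similarity between two equal-length vectors.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb) || 1);
}

// Return the k documents most similar to the query embedding.
function topK(query: number[], docs: Doc[], k: number): Doc[] {
  return [...docs]
    .sort((x, y) =>
      cosineSimilarity(query, y.embedding) - cosineSimilarity(query, x.embedding))
    .slice(0, k);
}
```

Because the Store is a plain JSON file, swapping it for Chroma or pgvector only replaces where the `Doc` records come from; the ranking logic stays the same.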

Configuration

  • KB_DATA_DIR: Directory for storing knowledge base data (default: ./.kb-data)
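For example, to point the store at a custom directory (the path here is illustrative):

```shell
KB_DATA_DIR=/srv/kb-data npm start
```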


Production Enhancements

For production use, consider:

  1. Real embeddings: Replace hash-based embeddings with OpenAI, Cohere, or local models (Ollama)

  2. Vector database: Swap JSON store with Chroma, Qdrant, or pgvector

  3. Chunking: Split large documents into chunks for better retrieval

  4. Hybrid search: Combine semantic + BM25 keyword search

  5. Access control: Add authentication for multi-user setups
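For item 3, a basic chunker is only a few lines. A sketch of fixed-size chunking with overlap, so sentences cut at a boundary still appear intact in at least one chunk (the size and overlap defaults are arbitrary, not tuned values):

```typescript
// Split text into fixed-size chunks, each overlapping the previous one
// by `overlap` characters. Defaults are illustrative only.
function chunkText(text: string, size = 500, overlap = 50): string[] {
  if (size <= overlap) throw new Error("size must exceed overlap");
  const chunks: string[] = [];
  for (let start = 0; start < text.length; start += size - overlap) {
    chunks.push(text.slice(start, start + size));
    if (start + size >= text.length) break; // last chunk reached the end
  }
  return chunks;
}
```

Each chunk would then be embedded and stored as its own record, with metadata linking it back to the parent document.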


License

MIT — Use freely.


Author

Matrix Agent
