Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type "@" followed by the MCP server name and your instructions, e.g., "@Fraim Context MCP search for API authentication docs in deep mode".
That's it! The server will respond to your query, and you can continue using it as needed.
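Under the hood, the chat client translates that "@" mention into an MCP `tools/call` request framed as JSON-RPC 2.0. A minimal sketch of what such a request looks like; the tool name `search` and its arguments are assumptions for illustration, not the server's documented interface:

```python
import json

# Hypothetical tools/call request an MCP client might send for the
# query above. Check the server's actual tool listing for real names.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search",  # assumed tool name
        "arguments": {"query": "API authentication docs", "mode": "deep"},
    },
}
print(json.dumps(request, indent=2))
```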
Here is a step-by-step guide with screenshots.
# Fraim Context MCP
Semantic search MCP server for project documentation.
Version: 5.1.0
Status: In Development
## Overview
Fraim Context MCP exposes project documentation to LLMs via the Model Context Protocol (MCP). It supports:
- **Fast mode**: Direct cache/search for immediate results
- **Deep mode**: Multi-round synthesis for complex queries
- **Hybrid search**: Vector similarity + full-text search with pgvector
- **Smart caching**: Redis with corpus versioning for cache invalidation
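Hybrid search has to merge two differently scored result lists (vector similarity and full-text rank). One common way to do this is reciprocal rank fusion; the sketch below uses that technique with made-up document IDs, and is not necessarily how this server combines results:

```python
def rrf(rankings, k=60):
    """Reciprocal rank fusion: score each doc by 1/(k + rank + 1)
    summed across all rankings, then sort by fused score."""
    scores = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank + 1)
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical result lists from the two retrievers.
vector_hits = ["doc_b", "doc_a", "doc_c"]  # nearest embeddings first
text_hits = ["doc_a", "doc_d", "doc_b"]    # best full-text matches first

fused = rrf([vector_hits, text_hits])
print(fused)  # → ['doc_a', 'doc_b', 'doc_d', 'doc_c']
```

Docs that appear high in both lists (like `doc_a`) rise to the top without needing to normalize the two score scales against each other.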
## Quick Start
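For MCP clients configured by file (e.g. Claude Desktop's `mcpServers` block), registering the server typically looks like the fragment below. The `command` and the package name `fraim-context-mcp` are assumptions, not the project's published launcher; substitute the actual entry point:

```json
{
  "mcpServers": {
    "fraim-context": {
      "command": "uvx",
      "args": ["fraim-context-mcp"]
    }
  }
}
```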
## Development
This project uses Test-Driven Development (TDD). See DNA/DEVELOPMENT_PLAN.md for stages.
## Architecture
- **LLM Access**: Pydantic AI Gateway (unified key for all providers)
- **Database**: PostgreSQL + pgvector (1024-dim embeddings)
- **Cache**: Redis 7.x (native asyncio)
- **Observability**: Logfire (OpenTelemetry)
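To make the PostgreSQL + pgvector pairing concrete, here is a sketch of the kind of schema and query it enables; the table and column names are assumptions for illustration, not the project's actual schema:

```python
# Hypothetical DDL: a chunk table with both a generated full-text column
# and a 1024-dim embedding column (matching the dimension listed above).
ddl = """
CREATE EXTENSION IF NOT EXISTS vector;
CREATE TABLE doc_chunks (
    id        bigserial PRIMARY KEY,
    content   text NOT NULL,
    tsv       tsvector GENERATED ALWAYS AS
                (to_tsvector('english', content)) STORED,
    embedding vector(1024)
);
"""

# <=> is pgvector's cosine-distance operator; @@ is PostgreSQL's
# full-text match. Filtering on text and ordering by vector distance
# is one simple way to combine the two signals.
query = """
SELECT id, content
FROM doc_chunks
WHERE tsv @@ plainto_tsquery('english', %(q)s)
ORDER BY embedding <=> %(q_embedding)s
LIMIT 10;
"""
print(ddl, query)
```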
See DNA/specs/ARCHITECTURE.md for full details.
## License
MIT