ContextStream MCP Server
ContextStream MCP Server gives your AI coding assistant persistent memory, semantic code intelligence, and deep project context across sessions.
Initialize Sessions: Start with instant workspace context, recent memory, decisions, lessons, and semantic search on your first message.
Smart Context Retrieval: Automatically fetch relevant context before every AI response in a token-efficient format (~80% token savings vs. full chat history).
Semantic & Hybrid Search: Find code, memory, and knowledge by meaning, keyword, regex, or pattern across your entire workspace.
Session Management: Capture decisions, preferences, lessons learned, implementation plans, compress chat history, and restore context after compaction.
Memory & Knowledge Graph: Create and manage memory events, nodes, tasks, todos, diagrams, docs, roadmaps, and conversation transcripts.
Code Graph Analysis: Analyze module dependencies, function call paths, change impact, circular dependencies, and unused code.
Project Management: Create, index, and manage projects; ingest local folders for semantic code search; view indexing history and status.
Workspace Management: Associate folders with workspaces, bootstrap new workspaces, and manage multi-machine sync settings.
Team Collaboration: Share decisions, lessons, tasks, plans, and knowledge across team workspaces via GitHub, Slack, and Notion integrations.
Reminders: Create, snooze, complete, or dismiss time-based reminders linked to projects or workspaces.
Generate AI Rules: Auto-generate rule files for editors like Cursor, Claude Code, Cline, Roo, Aider, and more.
Help & Utilities: List available tools, check auth/version info, and install Claude Code hooks for real-time file indexing.
Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type @ followed by the MCP server name and your instructions, e.g., "@ContextStream MCP Server what did we decide about the authentication flow yesterday?"
That's it! The server will respond to your query, and you can continue using it as needed.
Here is a step-by-step guide with screenshots.
npx --prefer-online -y @contextstream/mcp-server@latest setup

Get Started (VS Code + Copilot)
Option 1: Rust MCP (recommended)
macOS/Linux:

curl -fsSL https://contextstream.io/scripts/mcp.sh | bash

Windows (PowerShell):

irm https://contextstream.io/scripts/mcp.ps1 | iex

Then run:

contextstream-mcp setup

Option 2: Node MCP

npx --prefer-online -y @contextstream/mcp-server@latest setup

After setup, restart VS Code/Copilot.
Works with: Claude Code • Cursor • VS Code • Claude Desktop • Codex CLI • OpenCode • Antigravity
This Isn't Just Memory. This Is Intelligence.
Other tools give your AI a notepad. ContextStream gives it a brain.
Your AI doesn't just remember things—it understands your entire codebase, learns from every conversation, pulls knowledge from your team's GitHub, Slack, and Notion, and delivers exactly the right context at exactly the right moment.
One setup. Instant transformation.
What Changes When You Install This
| Before | After |
| --- | --- |
| AI searches files one-by-one, burning tokens | Semantic search finds code by meaning in milliseconds |
| Context lost when conversations get long | Smart compression preserves what matters before compaction |
| Team knowledge scattered across tools | Unified intelligence from GitHub, Slack, Notion—automatically |
| Same mistakes repeated across sessions | Lessons system ensures your AI learns from every failure |
| Generic responses, no project awareness | Deep context about your architecture, decisions, patterns |
The Power Under the Hood
Semantic Code Intelligence
Ask "where do we handle authentication?" and get the answer instantly. No grep chains. No reading 10 files. Your AI understands your code at a conceptual level.
SmartRouter Context Delivery
Every message is analyzed. Risky refactor? Relevant lessons surface automatically. Making a decision? Your AI knows to capture it. The right context, every time, without you asking.
Team Knowledge Fusion
Connect GitHub, Slack, and Notion. Discussions from months ago? Surfaced when relevant. That architecture decision buried in a PR comment? Your AI knows about it.
Code Graph Analysis
"What depends on UserService?" "What's the impact of changing this function?" Your AI sees the connections across your entire codebase.
Context Pressure Awareness
Long conversation? ContextStream tracks token usage, auto-saves critical state, and ensures nothing important is lost when context compacts.
The Tools Your AI Gets
init → Loads your workspace context instantly
context → Delivers relevant context every single message
search → Semantic, hybrid, keyword—find anything by meaning
session → Captures decisions, preferences, lessons automatically
memory → Builds a knowledge graph of your project
graph → Maps dependencies and analyzes impact
project → Indexes your codebase for semantic understanding
media → Index and search video, audio, images (great for Remotion)
integration → Queries GitHub, Slack, Notion directly

Your AI uses these automatically. You just code.
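For illustration, an MCP client reaches these through the standard JSON-RPC tools/call request from the MCP specification. The tool name comes from the list above; the arguments shown here (a hypothetical semantic query) are assumptions for illustration, not the documented schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search",
    "arguments": {
      "query": "where do we handle authentication?",
      "mode": "semantic"
    }
  }
}
```

Your editor's MCP client builds and sends these requests for you; you never write them by hand.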
Global Fallback Workspace (Unmapped Folders)
ContextStream now supports a catch-all mode for random folders (for example ~ or ad-hoc dirs) that are not associated with a project/workspace yet.
- init(...) resolves normal folder mappings first (.contextstream/config.json, parent/global mappings).
- If no mapping exists, it uses a single hidden global fallback workspace (.contextstream-global) in workspace-only mode.
- Context/memory/session tools continue to work without hard setup errors.
- Project-bound actions (for example project(action="ingest_local")) return guided remediation to create/select a project instead of failing with a raw project_id required error.
- As soon as you enter a mapped project folder, that real workspace/project is prioritized and replaces the fallback scope.
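The resolution order described above can be sketched as a small shell function. This is an illustrative sketch of the documented behavior, not the actual implementation, and the function name is made up: walk up from the current folder looking for .contextstream/config.json, and fall back to the hidden global workspace when nothing matches.

```shell
# Sketch only: mirrors the documented resolution order.
resolve_workspace() {
  dir=$1
  while [ -n "$dir" ] && [ "$dir" != "/" ]; do
    # A mapped project folder always wins over the fallback.
    if [ -f "$dir/.contextstream/config.json" ]; then
      echo "$dir"
      return 0
    fi
    dir=$(dirname "$dir")
  done
  # No mapping found: hidden global fallback workspace, workspace-only mode.
  echo ".contextstream-global"
}
```

Because the mapped-folder check runs first, re-entering a real project directory immediately takes precedence over the fallback scope.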
Manual Configuration
Skip this if you ran the setup wizard.
claude mcp add contextstream -- npx --prefer-online -y @contextstream/mcp-server@latest
claude mcp update contextstream -e CONTEXTSTREAM_API_URL=https://api.contextstream.io -e CONTEXTSTREAM_API_KEY=your_key

{
"mcpServers": {
"contextstream": {
"command": "npx",
"args": ["--prefer-online", "-y", "@contextstream/mcp-server@latest"],
"env": {
"CONTEXTSTREAM_API_URL": "https://api.contextstream.io",
"CONTEXTSTREAM_API_KEY": "your_key"
}
}
}
}

Locations: ~/.cursor/mcp.json • ~/Library/Application Support/Claude/claude_desktop_config.json
Local server:
{
"$schema": "https://opencode.ai/config.json",
"mcp": {
"contextstream": {
"type": "local",
"command": ["npx", "-y", "contextstream-mcp"],
"environment": {
"CONTEXTSTREAM_API_KEY": "{env:CONTEXTSTREAM_API_KEY}"
},
"enabled": true
}
}
}

Remote server:
{
"$schema": "https://opencode.ai/config.json",
"mcp": {
"contextstream": {
"type": "remote",
"url": "https://mcp.contextstream.com",
"enabled": true
}
}
}

For the local variant, export CONTEXTSTREAM_API_KEY before launching OpenCode.
Locations: ./opencode.json • ~/.config/opencode/opencode.json
For GitHub Copilot in VS Code, the easiest path is the hosted remote MCP with built-in OAuth. Marketplace installs should write this remote server definition automatically.
Hosted remote MCP (recommended)
{
"servers": {
"contextstream": {
"type": "http",
"url": "https://mcp.contextstream.io/mcp?default_context_mode=fast"
}
}
}

On first use, VS Code should prompt you to authorize ContextStream in the browser and then complete setup without an API key in the config file.
npx @contextstream/mcp-server@latest setup now defaults VS Code/Copilot to this hosted remote when you are using the production ContextStream cloud. To force a local runtime instead, run setup with CONTEXTSTREAM_VSCODE_MCP_MODE=local.
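For example, using the environment variable named above (the two invocations are otherwise identical):

```shell
# Force a local stdio runtime instead of the hosted remote:
CONTEXTSTREAM_VSCODE_MCP_MODE=local npx --prefer-online -y @contextstream/mcp-server@latest setup

# Or force the hosted remote explicitly:
CONTEXTSTREAM_VSCODE_MCP_MODE=remote npx --prefer-online -y @contextstream/mcp-server@latest setup
```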
For self-hosted or non-default API deployments, local runtime remains the default:
Rust MCP (recommended)
{
"servers": {
"contextstream": {
"type": "stdio",
"command": "contextstream-mcp",
"args": [],
"env": {
"CONTEXTSTREAM_API_URL": "https://api.contextstream.io",
"CONTEXTSTREAM_API_KEY": "your_key",
"CONTEXTSTREAM_TOOLSET": "complete",
"CONTEXTSTREAM_TRANSCRIPTS_ENABLED": "true",
"CONTEXTSTREAM_HOOK_TRANSCRIPTS_ENABLED": "true",
"CONTEXTSTREAM_SEARCH_LIMIT": "15",
"CONTEXTSTREAM_SEARCH_MAX_CHARS": "2400"
}
}
}
}

Node MCP server
{
"servers": {
"contextstream": {
"type": "stdio",
"command": "npx",
"args": ["--prefer-online", "-y", "@contextstream/mcp-server@latest"],
"env": {
"CONTEXTSTREAM_API_URL": "https://api.contextstream.io",
"CONTEXTSTREAM_API_KEY": "your_key",
"CONTEXTSTREAM_TOOLSET": "complete",
"CONTEXTSTREAM_TRANSCRIPTS_ENABLED": "true",
"CONTEXTSTREAM_HOOK_TRANSCRIPTS_ENABLED": "true",
"CONTEXTSTREAM_SEARCH_LIMIT": "15",
"CONTEXTSTREAM_SEARCH_MAX_CHARS": "2400"
}
}
}
}

Use the Copilot CLI to interactively add the MCP server:
/mcp add

Or add to ~/.copilot/mcp-config.json (pick one runtime):
Rust MCP (recommended)
{
"mcpServers": {
"contextstream": {
"command": "contextstream-mcp",
"args": [],
"env": {
"CONTEXTSTREAM_API_URL": "https://api.contextstream.io",
"CONTEXTSTREAM_API_KEY": "your_key",
"CONTEXTSTREAM_TOOLSET": "complete",
"CONTEXTSTREAM_TRANSCRIPTS_ENABLED": "true",
"CONTEXTSTREAM_HOOK_TRANSCRIPTS_ENABLED": "true",
"CONTEXTSTREAM_SEARCH_LIMIT": "15",
"CONTEXTSTREAM_SEARCH_MAX_CHARS": "2400"
}
}
}
}

Node MCP server
{
"mcpServers": {
"contextstream": {
"command": "npx",
"args": ["--prefer-online", "-y", "@contextstream/mcp-server@latest"],
"env": {
"CONTEXTSTREAM_API_URL": "https://api.contextstream.io",
"CONTEXTSTREAM_API_KEY": "your_key",
"CONTEXTSTREAM_TOOLSET": "complete",
"CONTEXTSTREAM_TRANSCRIPTS_ENABLED": "true",
"CONTEXTSTREAM_HOOK_TRANSCRIPTS_ENABLED": "true",
"CONTEXTSTREAM_SEARCH_LIMIT": "15",
"CONTEXTSTREAM_SEARCH_MAX_CHARS": "2400"
}
}
}
}

For more information, see the GitHub Copilot CLI documentation.
VS Code + Copilot Tips
Run setup once and keep both config files:

- ~/.copilot/mcp-config.json
- .vscode/mcp.json

Tips:

- Rust install: use contextstream-mcp as the command.
- Node install: use npx --prefer-online -y @contextstream/mcp-server@latest as the command.
- Force local VS Code/Copilot setup with CONTEXTSTREAM_VSCODE_MCP_MODE=local.
- Force hosted remote VS Code/Copilot setup with CONTEXTSTREAM_VSCODE_MCP_MODE=remote.
- Use mcpServers in Copilot CLI config and servers in VS Code config.
Quick Troubleshooting
- Remove duplicate ContextStream entries across Workspace/User config scopes.
- Check that CONTEXTSTREAM_API_URL and CONTEXTSTREAM_API_KEY are set.
- Remove stale version pins like @contextstream/mcp-server@0.3.xx.
- Restart VS Code/Copilot after config changes.
Known Limitations
HTTP transport OAuth and vscode.dev dependency
The hosted HTTP MCP transport (https://mcp.contextstream.io/mcp) uses OAuth authentication that routes through vscode.dev for the redirect flow. This can fail in environments where vscode.dev is blocked (corporate networks, regional restrictions, CDN-level blocks).
Workaround: Use the stdio transport (Rust binary or Node.js) with API key authentication instead:
{
"contextstream": {
"type": "stdio",
"command": "npx",
"args": ["-y", "@contextstream/mcp-server@latest"],
"env": {
"CONTEXTSTREAM_API_KEY": "your-api-key"
}
}
}

SDK version compatibility
@modelcontextprotocol/sdk versions 1.28.0 and above introduce breaking changes. The package.json pins the SDK to >=1.25.1 <1.28.0 to prevent incompatible resolutions. If you experience Zod schema errors on startup, ensure your SDK version is below 1.28.0.
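If you hit those errors in a local checkout, you can inspect which SDK version npm actually resolved:

```shell
# Show the resolved @modelcontextprotocol/sdk version in the dependency tree
npm ls @modelcontextprotocol/sdk
```

If a transitive dependency pulls in an incompatible version, npm's standard overrides field in package.json can force a compatible one. The version here is illustrative, chosen only because it falls below the 1.28.0 boundary:

```json
{
  "overrides": {
    "@modelcontextprotocol/sdk": "1.27.0"
  }
}
```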
Marketplace Note
The MCP marketplace entry now targets the hosted remote MCP at https://mcp.contextstream.io/mcp?default_context_mode=fast so VS Code can use the native OAuth flow instead of writing a local npm-based stdio config.
Use the Rust or Node local runtime configs above only when you explicitly want local execution, custom/self-hosted endpoints, or editor environments that do not support the hosted remote flow.
Links
Website: https://contextstream.io
Docs: https://contextstream.io/docs