Supports configuration of API keys and agent IDs through environment variables stored in a .env file.
Allows connection to a repository containing the MCP server code, which can be cloned and customized for domain-specific RAG capabilities.
Uses the Contextual AI Python SDK to provide RAG capabilities, query processing, and potential extension to other features like agent management and retrieval settings.
Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type @ followed by the MCP server name and your instructions, e.g., "@Contextual MCP Server explain the RF345 microchip initialization sequence"
That's it! The server will respond to your query, and you can continue using it as needed.
Here is a step-by-step guide with screenshots.
Contextual MCP Server
A Model Context Protocol (MCP) server that provides RAG (Retrieval-Augmented Generation) capabilities using Contextual AI. This server integrates with a variety of MCP clients and gives you the flexibility to decide what functionality to offer. In this README, we will show integration with both Cursor IDE and Claude Desktop.
Contextual AI now offers a hosted server inside the platform available at: https://mcp.app.contextual.ai/mcp/
After you connect, you can use the tools the platform MCP server provides, such as query.
For a complete walkthrough, check out the MCP user guide.
Overview
An MCP server acts as a bridge between AI interfaces (Cursor IDE or Claude Desktop) and a specialized Contextual AI agent. It enables:
Query Processing: Direct your domain-specific questions to a dedicated Contextual AI agent
Intelligent Retrieval: Searches through comprehensive information in your knowledge base
Context-Aware Responses: Generates answers that are:
Grounded in source documentation
Backed by citations and attributions
Informed by the ongoing conversation context
Integration Flow
```
Cursor/Claude Desktop → MCP Server → Contextual AI RAG Agent
        ↑                                        │
        └──────── Response with citations ───────┘
```
Prerequisites
Python 3.10 or higher
Cursor IDE and/or Claude Desktop
Contextual AI API key
MCP-compatible environment
Installation
Clone the repository:

```bash
git clone https://github.com/ContextualAI/contextual-mcp-server.git
cd contextual-mcp-server
```

Create and activate a virtual environment:

```bash
python -m venv .venv
source .venv/bin/activate  # On Windows, use `.venv\Scripts\activate`
```

Install dependencies:

```bash
pip install -e .
```

Configuration
Configure MCP Server
The server requires some configuration before use. For example, the single_agent server should be customized with an appropriate docstring for your RAG agent.
The docstring for your query tool is critical as it helps the MCP client understand when to route questions to your RAG agent. Make it specific to your knowledge domain. Here is an example:
A research tool focused on financial data on the largest US firms

or

A research tool focused on technical documents for Omaha semiconductors

The server also requires the following settings for your RAG agent:
API_KEY: Your Contextual AI API key
AGENT_ID: Your Contextual AI agent ID
If you'd like to store these values in a .env file, you can specify them like so:
```bash
cat > .env << EOF
API_KEY=key...
AGENT_ID=...
EOF
```

The repo also contains more advanced MCP servers for multi-agent systems or a document agent.
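If you prefer not to pull in a dependency like python-dotenv, loading these variables can be sketched with only the standard library (the file name and variable names follow the example above; the real server may load them differently):

```python
import os

def load_env(path: str = ".env") -> dict:
    """Parse simple KEY=VALUE lines from a .env file into a dict
    and export them to the process environment."""
    values = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            # Skip blank lines and comments
            if not line or line.startswith("#"):
                continue
            key, _, value = line.partition("=")
            values[key.strip()] = value.strip()
    os.environ.update(values)
    return values
```

This treats everything after the first `=` as the value, so keys like API_KEY can hold values containing `=` without special handling.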
AI Interface Integration
This MCP server can be integrated with a variety of clients. To use with either Cursor IDE or Claude Desktop create or modify the MCP configuration file in the appropriate location:
First, find the path to your uv installation:

```bash
UV_PATH=$(which uv)
echo $UV_PATH
# Example output: /Users/username/miniconda3/bin/uv
```

Create the configuration file using the full path from step 1:
```bash
cat > mcp.json << EOF
{
    "mcpServers": {
        "ContextualAI-TechDocs": {
            "command": "$UV_PATH",
            "args": [
                "--directory",
                "\${workspaceFolder}",
                "run",
                "multi-agent/server.py"
            ]
        }
    }
}
EOF
```

Note that `$UV_PATH` is expanded by the shell, while `\${workspaceFolder}` is left for the client to replace with your project path; JSON does not allow comments, so keep annotations out of this file. Move the file to the correct location; see below for options:
```bash
mkdir -p .cursor/
mv mcp.json .cursor/
```

Configuration locations:
For Cursor:
Project-specific: .cursor/mcp.json in your project directory
Global: ~/.cursor/mcp.json for system-wide access
For Claude Desktop:
Use the same configuration file format in the appropriate Claude Desktop configuration directory
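As an alternative to the heredoc above, the same file can be generated programmatically. A minimal sketch using only the Python standard library (the server name, script path, and output location mirror the example above):

```python
import json
import shutil
from pathlib import Path

def build_mcp_config(project_dir: str,
                     server_name: str = "ContextualAI-TechDocs") -> dict:
    """Build an MCP config dict that tells the client to run the server via uv."""
    # Resolve the absolute uv path; fall back to the bare name if not on PATH.
    uv_path = shutil.which("uv") or "uv"
    return {
        "mcpServers": {
            server_name: {
                "command": uv_path,
                "args": ["--directory", project_dir, "run",
                         "multi-agent/server.py"],
            }
        }
    }

def write_cursor_config(project_dir: str) -> Path:
    """Write a project-specific .cursor/mcp.json inside project_dir."""
    target = Path(project_dir) / ".cursor" / "mcp.json"
    target.parent.mkdir(parents=True, exist_ok=True)
    target.write_text(json.dumps(build_mcp_config(project_dir), indent=2))
    return target
```

Because `json.dumps` produces the file, there is no risk of stray shell expansion or comments ending up in the JSON.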
Environment Setup
This project uses uv for dependency management, which provides faster and more reliable Python package installation.
Usage
The server provides Contextual AI RAG capabilities using the Python SDK, which makes a variety of commands available to MCP clients such as Cursor IDE and Claude Desktop. The current server focuses on the query command from the Contextual AI Python SDK, but you could extend it to support other features such as listing all agents, updating retrieval settings, updating prompts, extracting retrievals, or downloading metrics.
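As a rough illustration of what a query tool has to assemble before calling the SDK (the client interface named in the comment is an assumption to check against the Contextual AI SDK docs, not a confirmed signature):

```python
import os

def build_query_payload(prompt: str, history: list = None) -> dict:
    """Assemble the agent ID and message list a query tool would send.

    The actual request would then look something like (hypothetical sketch):
        client.agents.query.create(agent_id=payload["agent_id"],
                                   messages=payload["messages"])
    """
    # Carry forward any prior turns so the agent keeps conversation context.
    messages = list(history or [])
    messages.append({"role": "user", "content": prompt})
    return {
        "agent_id": os.environ.get("AGENT_ID", ""),
        "messages": messages,
    }
```

Passing the accumulated message list back in on each turn is what enables the follow-up questions described under Key Benefits below.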
Example Usage
```
# In Cursor, you might ask:
"Show me the code for initiating the RF345 microchip?"

# The MCP client will:
1. Determine if this should be routed to the MCP Server

# Then the MCP server will:
1. Route the query to the Contextual AI agent
2. Retrieve relevant documentation
3. Generate a response with specific citations
4. Return the formatted answer to Cursor
```

Key Benefits
Accurate Responses: All answers are grounded in your documentation
Source Attribution: Every response includes references to source documents
Context Awareness: The system maintains conversation context for follow-up questions
Real-time Updates: Responses reflect the latest documentation in your datastore
Development
Modifying the Server
To add new capabilities:
Add new tools by creating additional functions decorated with @mcp.tool()
Define the tool's parameters using Python type hints
Provide a clear docstring describing the tool's functionality
Example:

```python
@mcp.tool()
def new_tool(param: str) -> str:
    """Description of what the tool does"""
    # Implementation
    return result
```

Limitations
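The docstring matters because the MCP client decides where to route a question by reading tool descriptions. A toy, standard-library-only illustration of that idea (not how any particular client actually scores tools — real clients use the model itself to choose):

```python
def finance_tool(question: str) -> str:
    """A research tool focused on financial data on the largest US firms"""
    return f"finance answer to: {question}"

def semiconductor_tool(question: str) -> str:
    """A research tool focused on technical documents for Omaha semiconductors"""
    return f"semiconductor answer to: {question}"

def route(question: str, tools: list):
    """Pick the tool whose docstring shares the most words with the question."""
    def overlap(tool):
        doc_words = set(tool.__doc__.lower().split())
        return len(doc_words & set(question.lower().split()))
    return max(tools, key=overlap)

tool = route("Where can I find financial data on US firms?",
             [finance_tool, semiconductor_tool])
```

The more specific and domain-rich your docstring is, the easier it is for any routing mechanism, naive or model-based, to send the right questions to your agent.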
The server runs locally and may not work in remote development environments
Tool responses are subject to Contextual AI API limits and quotas
Currently only supports stdio transport mode
For all the capabilities of Contextual AI, please check the official documentation.