# MCP Server with LangChain and AI Tools

This project demonstrates how to build a **multi-tool AI assistant** using the **Model Context Protocol (MCP)**, LangChain, and Groq's Qwen model. It includes:

- A local **Math MCP Server**
- A simulated **Weather MCP Server**
- A conversational **AI agent** (MCP client) that talks to both
---
## Features

- Uses **LangChain MCP Adapters** to connect tools
- Powered by **Groq's Qwen LLM**
- Handles local and remote tool servers via MCP
- Interactive CLI chat with tool-usage detection
---
## Prerequisites

- Python >= 3.11
- `uv` for project/environment management (https://github.com/astral-sh/uv)
- An internet connection to reach the Groq-hosted LLM
---
## Setup Instructions

### 1. Create the Project
```bash
mkdir mcp_project
cd mcp_project
uv init
```
Set the Python version to >=3.11 in `.python-version` and `pyproject.toml`.
### 2. Create Virtual Environment
```bash
uv venv
source .venv/bin/activate       # macOS/Linux
source .venv/Scripts/activate   # Windows (Git Bash)
```
### 3. Add Dependencies
Create a `requirements.txt` file:

```text
langchain-mcp-adapters
langchain-groq
langgraph
mcp
```
Install them:
```bash
uv add -r requirements.txt
```
### Project Structure

```text
mcp_project/
├── math_server.py     # MCP server for math tools
├── weather_server.py  # MCP server for weather API simulation
├── client.py          # MCP client with AI agent
├── requirements.txt
├── .python-version
└── .env               # For storing Groq API key (GROQ_API_KEY)
```
## How to Run
### 1. Run the Weather Server
```bash
python weather_server.py
```
### 2. Run the Client (automatically starts the math server as a subprocess)
```bash
python client.py
```
## Example Conversation
```text
You: What is the output of 2*3/(4-2)
AI: The result is 3.0
You: What is the weather in New York?
AI: The current weather in New York is sunny.
You: thanks
AI: You're welcome!
```
## Notes

- The weather server is simulated; replace it with real API logic if needed.
- You can add more MCP servers for documents, search, databases, etc.
- Use `.env` to store your `GROQ_API_KEY`.
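The `.env` file needs only the key (the value below is a placeholder):

```text
GROQ_API_KEY=your_groq_api_key_here
```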