# 🧠 MCP Server with LangChain and AI Tools
This project demonstrates how to build a multi-tool AI assistant using the Model Context Protocol (MCP), LangChain, and Groq's Qwen model. It includes:
- 🔢 A local Math MCP Server
- 🌤️ A simulated Weather MCP Server
- 🤖 A conversational AI agent (MCP client) that talks to both
## 🧰 Features
- Uses LangChain MCP Adapters to connect tools
- Powered by Groq's Qwen LLM
- Handles local and remote tool servers via MCP
- Interactive CLI chat with tool-usage detection
## 📋 Prerequisites
- Python >= 3.11
- uv for project/environment management (https://github.com/astral-sh/uv)
- Internet connection for the Groq LLM API
## ⚙️ Setup Instructions
### 1. Create Project

Set the Python version in `.python-version` and `pyproject.toml` to `>=3.11`.
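With uv (listed in the prerequisites), a new project can be scaffolded like this; the directory name `mcp_project` matches the structure shown later:

```shell
# Scaffold a new uv-managed project and enter it
uv init mcp_project
cd mcp_project
```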
### 2. Create Virtual Environment
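A sketch of the usual uv workflow for creating and activating the environment:

```shell
# Create a .venv in the project root and activate it
uv venv
source .venv/bin/activate   # on Windows: .venv\Scripts\activate
```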
### 3. Add Dependencies

Add the required packages and install them:
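Based on the features above (LangChain MCP Adapters, a Groq-hosted LLM, and MCP servers), a plausible dependency set is the following; adjust the list to match your `requirements.txt`:

```shell
# Install the MCP SDK, the LangChain MCP adapters, an agent runtime, and the Groq integration
uv add mcp langchain-mcp-adapters langgraph langchain-groq
```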
## Project Structure
```
mcp_project/
├── math_server.py      # MCP server for math tools
├── weather_server.py   # MCP server for weather API simulation
├── client.py           # MCP client with AI agent
├── requirements.txt
├── .python-version
└── .env                # For storing Groq API key (GROQ_API_KEY)
```