LangChain Agent MCP Server
A production-ready MCP server exposing LangChain agent capabilities through the Model Context Protocol, deployed on Google Cloud Run.
🚀 Overview
This is a standalone backend service that wraps a LangChain agent as a single, high-level MCP Tool. The server is built with FastAPI and deployed on Google Cloud Run, providing a scalable, production-ready solution for exposing AI agent capabilities to any MCP-compliant client.
Live Service: https://langchain-agent-mcp-server-554655392699.us-central1.run.app
✨ Features
✅ MCP Compliance - Full Model Context Protocol support
✅ LangChain Agent - Multi-step reasoning with ReAct pattern
✅ Google Cloud Run - Scalable, serverless deployment
✅ Tool Support - Extensible framework for custom tools
✅ Production Ready - Error handling, logging, and monitoring
✅ Docker Support - Containerized for easy deployment
🏗️ Architecture

| Component | Technology | Purpose |
|-----------|------------|---------|
| Backend Framework | FastAPI | High-performance, asynchronous web server |
| Agent Framework | LangChain | Multi-step reasoning and tool execution |
| Deployment | Google Cloud Run | Serverless, auto-scaling hosting |
| Containerization | Docker | Consistent deployment environment |
| Protocol | Model Context Protocol (MCP) | Standardized tool and context sharing |
🛠️ Quick Start
Prerequisites
Python 3.11+
OpenAI API key
Google Cloud account (for Cloud Run deployment)
Docker (optional, for local testing)
Local Development
1. Clone the repository:

   ```bash
   git clone https://github.com/mcpmessenger/LangchainMCP.git
   cd LangchainMCP
   ```

2. Install dependencies:

   ```bash
   # Windows
   py -m pip install -r requirements.txt

   # Linux/Mac
   pip install -r requirements.txt
   ```

3. Set up environment variables by creating a `.env` file:

   ```
   OPENAI_API_KEY=your-openai-api-key-here
   OPENAI_MODEL=gpt-4o-mini
   PORT=8000
   ```

4. Run the server:

   ```bash
   # Windows
   py run_server.py

   # Linux/Mac
   python run_server.py
   ```

5. Test the endpoints:
Health: http://localhost:8000/health
Manifest: http://localhost:8000/mcp/manifest
API Docs: http://localhost:8000/docs
☁️ Google Cloud Run Deployment
The server is designed for deployment on Google Cloud Run. See our comprehensive deployment guides:
DEPLOY_CLOUD_RUN_WINDOWS.md - Windows deployment guide
DEPLOY_CLOUD_RUN.md - General deployment guide
QUICK_DEPLOY.md - Quick reference
Quick Deploy
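The deployment guides above contain the exact commands. As a rough sketch, a source-based deploy looks like the following; the service name is taken from the live URL, while the `--set-env-vars` flag is shown only for illustration (prefer Secret Manager for the API key, as noted under Security below):

```bash
# Deploy straight from source; Cloud Build produces the container image.
gcloud run deploy langchain-agent-mcp-server \
  --source . \
  --project slashmcp \
  --region us-central1 \
  --allow-unauthenticated \
  --set-env-vars OPENAI_API_KEY=your-openai-api-key-here
```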
Current Deployment
Service URL: https://langchain-agent-mcp-server-554655392699.us-central1.run.app
Project: slashmcp
Region: us-central1
Status: ✅ Live and operational
📡 API Endpoints
MCP Endpoints
Get Manifest

`GET /mcp/manifest`

Returns the MCP manifest declaring the tools available on this server.
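You can inspect the live manifest directly; the exact fields depend on the server's MCP implementation, so treat this as a way to see the real payload rather than as a schema:

```bash
# Fetch the manifest from the deployed service and pretty-print the JSON.
curl -s https://langchain-agent-mcp-server-554655392699.us-central1.run.app/mcp/manifest | python -m json.tool
```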
Invoke Tool

Invokes the wrapped LangChain agent tool with a user query and returns the agent's final answer.
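A minimal invocation sketch; the `/mcp/invoke` path, the request body shape, and the `langchain_agent` tool name are assumptions here, so check the interactive docs at `/docs` (or the manifest above) for the actual schema:

```bash
# Hypothetical request shape; verify the real path and payload at /docs.
curl -s -X POST \
  https://langchain-agent-mcp-server-554655392699.us-central1.run.app/mcp/invoke \
  -H "Content-Type: application/json" \
  -d '{"tool": "langchain_agent", "input": {"query": "Summarize the Model Context Protocol in one sentence."}}'
```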
Other Endpoints
`GET /` - Server information
`GET /health` - Health check
`GET /docs` - Interactive API documentation (Swagger UI)
🔧 Configuration
Environment Variables
| Variable | Description | Default | Required |
|----------|-------------|---------|----------|
| `OPENAI_API_KEY` | OpenAI API key | - | ✅ Yes |
| `OPENAI_MODEL` | OpenAI model to use | `gpt-4o-mini` | No |
| `PORT` | Server port | `8000` | No |
| | Optional API key for authentication | - | No |
| | Maximum agent iterations | | No |
| | Enable verbose logging | | No |
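On Cloud Run, the same variables can be set at deploy time or updated on the running service. A sketch using values from the table; the service name, project, and region match the deployment above, while the Secret Manager secret name is illustrative:

```bash
# Update plain environment variables on the deployed service.
# Note: PORT is managed by Cloud Run itself and must not be overridden there.
gcloud run services update langchain-agent-mcp-server \
  --project slashmcp \
  --region us-central1 \
  --update-env-vars OPENAI_MODEL=gpt-4o-mini

# Pull the OpenAI key from Secret Manager instead of a plain env var.
gcloud run services update langchain-agent-mcp-server \
  --project slashmcp \
  --region us-central1 \
  --update-secrets OPENAI_API_KEY=openai-api-key:latest
```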
📚 Documentation
📖 Full Documentation Site - Complete documentation with examples (GitHub Pages)
Quick Links:
Getting Started - Set up and run locally
Examples - Code examples including "Build a RAG agent in 10 lines"
Deployment Guide - Deploy to Google Cloud Run
API Reference - Complete API documentation
Troubleshooting - Common issues and solutions
Build Docs Locally:
Additional Guides:
README_BACKEND.md - Complete technical documentation
DEPLOY_CLOUD_RUN_WINDOWS.md - Windows deployment guide
INSTALL_PREREQUISITES.md - Prerequisites installation
SLASHMCP_INTEGRATION.md - SlashMCP integration guide
🧪 Testing
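A quick smoke test against a locally running server, assuming the default port and the endpoints listed under Local Development:

```bash
# With the server running (python run_server.py) in another terminal:
curl -f http://localhost:8000/health          # exits non-zero if the server is unhealthy
curl -s http://localhost:8000/mcp/manifest    # should list the exposed agent tool
```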
🗂️ Project Structure
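A partial layout, inferred only from files referenced (or implied) elsewhere in this README; the repository contains more than is shown here:

```
LangchainMCP/
├── run_server.py                  # FastAPI server entry point
├── requirements.txt               # Python dependencies
├── Dockerfile                     # Container image definition
├── .env                           # Local environment variables (not committed)
├── README_BACKEND.md              # Complete technical documentation
├── DEPLOY_CLOUD_RUN.md            # General deployment guide
├── DEPLOY_CLOUD_RUN_WINDOWS.md    # Windows deployment guide
├── QUICK_DEPLOY.md                # Quick deployment reference
├── INSTALL_PREREQUISITES.md       # Prerequisites installation
└── SLASHMCP_INTEGRATION.md        # SlashMCP integration guide
```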
🚀 Deployment Options
Google Cloud Run (Recommended)
Scalable - Auto-scales based on traffic
Serverless - Pay only for what you use
Managed - No infrastructure to manage
Fast - Low latency on Google's global network
See DEPLOY_CLOUD_RUN_WINDOWS.md for detailed instructions.
Docker (Local/Other Platforms)
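For local testing or other container platforms, a generic build-and-run sketch (the image tag is illustrative; the repository's Dockerfile defines the actual build):

```bash
# Build the image and run it with the local .env file.
docker build -t langchain-agent-mcp-server .
docker run --rm -p 8000:8000 --env-file .env langchain-agent-mcp-server
```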
📊 Performance
P95 Latency: < 5 seconds for standard 3-step ReAct chains
Scalability: Horizontal scaling on Cloud Run
Uptime: 99.9% target (Cloud Run SLA)
Throughput: Handles concurrent requests efficiently
🔒 Security
API key authentication (optional)
Environment variable management
Secret Manager integration (Cloud Run)
HTTPS by default (Cloud Run)
CORS configuration
🤝 Contributing
We welcome contributions! Please see our contributing guidelines.
Fork the repository
Create a feature branch
Make your changes
Submit a pull request
📄 License
This project is licensed under the MIT License.
🔗 Links
GitHub Repository: https://github.com/mcpmessenger/LangchainMCP
Live Service: https://langchain-agent-mcp-server-554655392699.us-central1.run.app
API Documentation: https://langchain-agent-mcp-server-554655392699.us-central1.run.app/docs
Model Context Protocol: https://modelcontextprotocol.io/
🙏 Acknowledgments
Built with LangChain
Deployed on Google Cloud Run
Uses FastAPI for the web framework
Status: ✅ Production-ready and deployed on Google Cloud Run