
LangChain Agent MCP Server

A production-ready MCP server exposing LangChain agent capabilities through the Model Context Protocol, deployed on Google Cloud Run.


🚀 Overview

This is a standalone backend service that wraps a LangChain agent as a single, high-level MCP Tool. The server is built with FastAPI and deployed on Google Cloud Run, providing a scalable, production-ready solution for exposing AI agent capabilities to any MCP-compliant client.

Live Service: https://langchain-agent-mcp-server-554655392699.us-central1.run.app

✨ Features

  • ✅ MCP Compliance - Full Model Context Protocol support

  • ✅ LangChain Agent - Multi-step reasoning with the ReAct pattern

  • ✅ Google Cloud Run - Scalable, serverless deployment

  • ✅ Tool Support - Extensible framework for custom tools

  • ✅ Production Ready - Error handling, logging, and monitoring

  • ✅ Docker Support - Containerized for easy deployment

πŸ—οΈ Architecture

| Component | Technology | Purpose |
|-----------|------------|---------|
| Backend Framework | FastAPI | High-performance, asynchronous web server |
| Agent Framework | LangChain | Multi-step reasoning and tool execution |
| Deployment | Google Cloud Run | Serverless, auto-scaling hosting |
| Containerization | Docker | Consistent deployment environment |
| Protocol | Model Context Protocol (MCP) | Standardized tool and context sharing |

🛠️ Quick Start

Prerequisites

  • Python 3.11+

  • OpenAI API key

  • Google Cloud account (for Cloud Run deployment)

  • Docker (optional, for local testing)

Local Development

  1. Clone the repository:

     ```bash
     git clone https://github.com/mcpmessenger/LangchainMCP.git
     cd LangchainMCP
     ```

  2. Install dependencies:

     ```bash
     # Windows
     py -m pip install -r requirements.txt

     # Linux/Mac
     pip install -r requirements.txt
     ```

  3. Set up environment variables by creating a `.env` file:

     ```
     OPENAI_API_KEY=your-openai-api-key-here
     OPENAI_MODEL=gpt-4o-mini
     PORT=8000
     ```

  4. Run the server:

     ```bash
     # Windows
     py run_server.py

     # Linux/Mac
     python run_server.py
     ```

  5. Test the endpoints: open http://localhost:8000/docs for the interactive Swagger UI, or send a GET request to http://localhost:8000/health for a quick liveness check.

☁️ Google Cloud Run Deployment

The server is designed for deployment on Google Cloud Run; see the repository's deployment guides for detailed instructions.

Quick Deploy

```
# Windows PowerShell
.\deploy-cloud-run.ps1 -ProjectId "your-project-id" -Region "us-central1"

# Linux/Mac
./deploy-cloud-run.sh your-project-id us-central1
```


📑 API Endpoints

MCP Endpoints

Get Manifest

GET /mcp/manifest

Returns the MCP manifest declaring available tools.

Response:

```json
{
  "name": "langchain-agent-mcp-server",
  "version": "1.0.0",
  "tools": [
    {
      "name": "agent_executor",
      "description": "Execute a complex, multi-step reasoning task...",
      "inputSchema": {
        "type": "object",
        "properties": {
          "query": {
            "type": "string",
            "description": "The user's query or task"
          }
        },
        "required": ["query"]
      }
    }
  ]
}
```
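As a quick client-side sanity check, the manifest can be parsed and its declared tools listed using only the Python standard library. This is an illustrative sketch (`tool_names` is not part of the repository):

```python
import json

# The manifest as returned by GET /mcp/manifest (abbreviated to the fields shown above).
manifest_json = """
{
  "name": "langchain-agent-mcp-server",
  "version": "1.0.0",
  "tools": [
    {
      "name": "agent_executor",
      "inputSchema": {
        "type": "object",
        "properties": {
          "query": {"type": "string", "description": "The user's query or task"}
        },
        "required": ["query"]
      }
    }
  ]
}
"""

def tool_names(manifest: dict) -> list[str]:
    """Return the names of all tools declared in an MCP manifest."""
    return [tool["name"] for tool in manifest.get("tools", [])]

manifest = json.loads(manifest_json)
print(tool_names(manifest))  # ['agent_executor']
```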

Invoke Tool

```
POST /mcp/invoke
Content-Type: application/json

{
  "tool": "agent_executor",
  "arguments": {
    "query": "What is the capital of France?"
  }
}
```

Response:

```json
{
  "content": [
    {
      "type": "text",
      "text": "The capital of France is Paris."
    }
  ],
  "isError": false
}
```
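For programmatic access, here is a minimal Python client sketch using only the standard library (the helper names are illustrative, not part of the repository):

```python
import json
import urllib.request

def build_invoke_payload(query: str) -> dict:
    """Build the JSON body expected by POST /mcp/invoke."""
    return {"tool": "agent_executor", "arguments": {"query": query}}

def invoke_agent(base_url: str, query: str) -> dict:
    """POST a query to the server's /mcp/invoke endpoint and return the parsed response."""
    request = urllib.request.Request(
        f"{base_url}/mcp/invoke",
        data=json.dumps(build_invoke_payload(query)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)

# With a server running (locally or on Cloud Run):
# result = invoke_agent("http://localhost:8000", "What is the capital of France?")
# print(result["content"][0]["text"])
print(build_invoke_payload("What is the capital of France?"))
```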

Other Endpoints

  • GET / - Server information

  • GET /health - Health check

  • GET /docs - Interactive API documentation (Swagger UI)

🔧 Configuration

Environment Variables

| Variable | Description | Default | Required |
|----------|-------------|---------|----------|
| `OPENAI_API_KEY` | OpenAI API key | - | ✅ Yes |
| `OPENAI_MODEL` | OpenAI model to use | `gpt-4o-mini` | No |
| `PORT` | Server port | `8000` | No |
| `API_KEY` | Optional API key for authentication | - | No |
| `MAX_ITERATIONS` | Maximum agent iterations | `10` | No |
| `VERBOSE` | Enable verbose logging | `false` | No |
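Putting the table together, a complete `.env` for local development might look like this (all values are placeholders; only `OPENAI_API_KEY` is required):

```
OPENAI_API_KEY=your-openai-api-key-here
OPENAI_MODEL=gpt-4o-mini
PORT=8000
API_KEY=optional-shared-secret
MAX_ITERATIONS=10
VERBOSE=false
```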

📚 Documentation

📖 Full Documentation Site - Complete documentation with examples (GitHub Pages)


Build Docs Locally:

```
# Windows
.\build-docs.ps1 serve

# Linux/Mac
./build-docs.sh serve
```


🧪 Testing

```powershell
# Test health endpoint
Invoke-WebRequest -Uri "https://langchain-agent-mcp-server-554655392699.us-central1.run.app/health"

# Test agent invocation
$body = @{
    tool = "agent_executor"
    arguments = @{
        query = "What is 2+2?"
    }
} | ConvertTo-Json

Invoke-WebRequest -Uri "https://langchain-agent-mcp-server-554655392699.us-central1.run.app/mcp/invoke" `
    -Method POST `
    -ContentType "application/json" `
    -Body $body
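On Linux/Mac the same checks can be run with `curl`; this is a sketch of the equivalent commands, with `BASE_URL` overridable for local testing:

```shell
#!/usr/bin/env bash
# Defaults to the deployed service; set BASE_URL=http://localhost:8000 to test locally.
BASE_URL="${BASE_URL:-https://langchain-agent-mcp-server-554655392699.us-central1.run.app}"

# Test health endpoint
curl -s --max-time 10 "$BASE_URL/health"

# Test agent invocation
BODY='{"tool": "agent_executor", "arguments": {"query": "What is 2+2?"}}'
curl -s --max-time 30 -X POST "$BASE_URL/mcp/invoke" \
  -H "Content-Type: application/json" \
  -d "$BODY"
```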

πŸ—οΈ Project Structure

```
.
├── src/
│   ├── main.py               # FastAPI application with MCP endpoints
│   ├── agent.py              # LangChain agent definition and tools
│   ├── mcp_manifest.json     # MCP manifest configuration
│   └── start.sh              # Cloud Run startup script
├── tests/
│   └── test_mcp_endpoints.py # Test suite
├── Dockerfile                # Container configuration
├── requirements.txt          # Python dependencies
├── deploy-cloud-run.ps1      # Windows deployment script
├── deploy-cloud-run.sh       # Linux/Mac deployment script
└── cloudbuild.yaml           # Cloud Build configuration
```

🚀 Deployment Options

Google Cloud Run (Recommended)

  • Scalable - Auto-scales based on traffic

  • Serverless - Pay only for what you use

  • Managed - No infrastructure to manage

  • Fast - Low-latency serving from Google's global network

See DEPLOY_CLOUD_RUN_WINDOWS.md for detailed instructions.

Docker (Local/Other Platforms)

```bash
docker build -t langchain-agent-mcp-server .
docker run -p 8000:8000 -e OPENAI_API_KEY=your-key langchain-agent-mcp-server
```

📊 Performance

  • P95 Latency: < 5 seconds for standard 3-step ReAct chains

  • Scalability: Horizontal scaling on Cloud Run

  • Uptime: 99.9% target (Cloud Run SLA)

  • Throughput: Handles concurrent requests efficiently

🔒 Security

  • API key authentication (optional)

  • Environment variable management

  • Secret Manager integration (Cloud Run)

  • HTTPS by default (Cloud Run)

  • CORS configuration

🤝 Contributing

We welcome contributions! Please see our contributing guidelines.

  1. Fork the repository

  2. Create a feature branch

  3. Make your changes

  4. Submit a pull request

📜 License

This project is licensed under the MIT License.


Status: ✅ Production-ready and deployed on Google Cloud Run
