langchain_mcp

This repository demonstrates a minimal working MCP (Model Context Protocol) setup using LangChain, with:

  • A dummy jobs and employee API (FastAPI)
  • Two MCP servers (jobs and employee feedback, each on its own port)
  • A Python client that can query both servers simultaneously (multi-server, multi-tool)

Requirements

  • Python 3.9+
  • pip
  • Node.js (optional, only if you want to build a frontend)
  • An OpenAI API key (for GPT-4o)

Setup

1. Clone the repository

git clone https://github.com/nishant-Tiwari24/mcp.git
cd mcp

2. Create and activate a virtual environment

python3 -m venv .venv
source .venv/bin/activate

Note: .venv is gitignored. You must create it yourself.

3. Install Python dependencies

pip install --upgrade pip
pip install -r requirements.txt

4. Set your OpenAI API key

Create a .env file in the project root (not tracked by git):

OPENAI_API_KEY=sk-...your-key-here...

Or export it in your shell before running the client:

export OPENAI_API_KEY=sk-...your-key-here...
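Either way works because the client presumably just reads the key from its process environment. A minimal sketch of a loader that checks the environment first and then falls back to a `.env` file (a hypothetical helper for illustration, not the repo's actual code):

```python
import os
from pathlib import Path
from typing import Optional


def load_openai_key(env_file: str = ".env") -> Optional[str]:
    """Return OPENAI_API_KEY from the environment, else from a .env file."""
    key = os.environ.get("OPENAI_API_KEY")
    if key:
        return key
    path = Path(env_file)
    if path.exists():
        for line in path.read_text().splitlines():
            line = line.strip()
            if line.startswith("OPENAI_API_KEY="):
                # Take everything after the first '=' so keys containing
                # '=' characters survive intact.
                return line.split("=", 1)[1].strip()
    return None
```

(`Optional[str]` rather than `str | None` keeps the sketch compatible with the Python 3.9 floor stated above.)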

Running the Demo

1. Start the dummy jobs/employee API

uvicorn mcp_server.jobs_api:app --port 8001 --host 127.0.0.1
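The dummy API presumably serves a small set of in-memory records. A sketch of the kind of filtering logic a jobs endpoint might wrap (record fields and the helper name are hypothetical; the real `jobs_api.py` may differ):

```python
from typing import Dict, List

# Hypothetical in-memory job records standing in for a real data source.
JOBS: List[Dict] = [
    {"title": "AI Engineer", "location": "San Jose, CA", "min_years": 4},
    {"title": "Data Scientist", "location": "Austin, TX", "min_years": 2},
]


def find_similar_jobs(role: str, location: str, years: int) -> List[Dict]:
    """Return jobs matching a role/location whose experience floor is met."""
    return [
        j for j in JOBS
        if role.lower() in j["title"].lower()
        and j["location"] == location
        and years >= j["min_years"]
    ]
```

In `jobs_api.py`, logic like this would sit behind a FastAPI route so uvicorn can serve it over HTTP.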

2. Start both MCP servers (in a new terminal)

python run_servers.py

This will start:

  • Jobs server on port 8000
  • Employee server on port 8002

Or, to run them manually in separate terminals:

python mcp_server/server.py jobs
python mcp_server/server.py employee
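A script like `run_servers.py` typically just launches both server processes and waits on them. A minimal sketch, assuming (as the manual commands above suggest) that `server.py` takes the server role as its single argument:

```python
import subprocess
import sys

SERVERS = ["jobs", "employee"]


def server_command(role: str) -> list:
    """Build the launch command for one MCP server role."""
    return [sys.executable, "mcp_server/server.py", role]


def main() -> None:
    # Start both servers, then block until they exit.
    procs = [subprocess.Popen(server_command(r)) for r in SERVERS]
    try:
        for p in procs:
            p.wait()
    except KeyboardInterrupt:
        # Ctrl+C tears both servers down together.
        for p in procs:
            p.terminate()


if __name__ == "__main__":
    main()
```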

3. Run the multi-server client (in a new terminal)

python langchain_mcp_client.py

  • The client will connect to both servers and can use tools from both in a single conversation.
  • Make sure your OpenAI API key is exported or in a .env file.
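The "multi-server, multi-tool" idea boils down to a registry mapping each tool to the server that hosts it, so one conversation can fan out across both. A toy sketch of that routing (the real client discovers tools over MCP and lets GPT-4o pick them; the entries and helper below are illustrative only):

```python
from typing import Dict

# Toy registry: tool name -> server base URL. The real client builds this
# dynamically from whatever each MCP server advertises.
TOOL_REGISTRY: Dict[str, str] = {
    "find_similar_jobs": "http://127.0.0.1:8000",
    "summarize_employee_feedback": "http://127.0.0.1:8002",
}


def route_tool(tool_name: str) -> str:
    """Return the base URL of the server hosting a given tool."""
    try:
        return TOOL_REGISTRY[tool_name]
    except KeyError:
        raise ValueError(f"No server registered for tool {tool_name!r}")
```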

File Structure

  • langchain_mcp_client.py — Python client for querying both MCP servers (multi-server, multi-tool)
  • mcp_server/server.py — MCP servers (jobs and employee feedback, each on its own port)
  • mcp_server/jobs_api.py — Dummy FastAPI backend for jobs and employee data
  • run_servers.py — Script to start both MCP servers at once
  • requirements.txt — Python dependencies
  • .gitignore — Excludes .venv, .env, and other environment files

Notes

  • The .venv directory and .env file are not included in the repo. You must create them locally.
  • Only the minimal, required files are tracked in git.
  • If you want to add a frontend, you can do so separately (not included in this repo).

Example Usage

  • Multi-server client:
    • Query: "I need to find similar jobs for an AI engineer position in San Jose, CA with 4-5 years of experience, and also get a feedback summary for Kalyan P. Can you help me with both?"
    • The client will use both tools: find_similar_jobs and summarize_employee_feedback.
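To make the second tool concrete, here is one hypothetical shape a `summarize_employee_feedback` implementation could take: a crude keyword tally over feedback comments. The real tool likely delegates summarization to GPT-4o; the keyword sets and return shape here are assumptions for illustration.

```python
from typing import Dict, List

# Illustrative sentiment keywords; a real implementation would use an LLM.
POSITIVE = {"great", "helpful", "reliable"}
NEGATIVE = {"late", "unresponsive"}


def summarize_employee_feedback(name: str, comments: List[str]) -> Dict:
    """Tally crude positive/negative sentiment over feedback comments."""
    pos = sum(any(w in c.lower() for w in POSITIVE) for c in comments)
    neg = sum(any(w in c.lower() for w in NEGATIVE) for c in comments)
    return {"employee": name, "positive": pos, "negative": neg,
            "total": len(comments)}
```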

License

MIT
