langchain_mcp
This repository demonstrates a minimal working MCP (Model Context Protocol) setup using LangChain, with:
A dummy jobs and employee API (FastAPI)
Two MCP servers (jobs and employee feedback, each on its own port)
A Python client that can query both servers simultaneously (multi-server, multi-tool)
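The wiring between the client and the two servers can be pictured with a minimal sketch like the one below. The `/sse` endpoint path and `"sse"` transport are assumptions for illustration (the repo's actual client code isn't reproduced here); the ports match the servers started by `run_servers.py`.

```python
# Minimal sketch of the connection details a multi-server MCP client needs.
# Endpoint path and transport are illustrative assumptions; the ports are
# the ones used by run_servers.py (jobs: 8000, employee: 8002).

def build_server_config(host="127.0.0.1"):
    """Map each MCP server name to the URL a client would connect to."""
    ports = {"jobs": 8000, "employee": 8002}
    return {
        name: {"url": f"http://{host}:{port}/sse", "transport": "sse"}
        for name, port in ports.items()
    }

print(build_server_config()["jobs"]["url"])  # → http://127.0.0.1:8000/sse
```

A config shaped like this is what multi-server MCP clients typically consume: one named entry per server, so tools from both can be used in a single conversation.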
Requirements
Python 3.9+
Node.js (optional, only if you want to build a frontend)
An OpenAI API key (for GPT-4o)
Setup
1. Clone the repository
git clone https://github.com/nishant-Tiwari24/mcp.git
cd mcp

2. Create and activate a virtual environment
python3 -m venv .venv
source .venv/bin/activate

Note: .venv is gitignored. You must create it yourself.
3. Install Python dependencies
pip install --upgrade pip
pip install -r requirements.txt

4. Set your OpenAI API key
Create a .env file in the project root (not tracked by git):
OPENAI_API_KEY=sk-...your-key-here...

Or export it in your shell before running the client:

export OPENAI_API_KEY=sk-...your-key-here...

Running the Demo
1. Start the dummy jobs/employee API
uvicorn mcp_server.jobs_api:app --port 8001 --host 127.0.0.1

2. Start both MCP servers (in a new terminal)
python run_servers.py

This will start:
Jobs server on port 8000
Employee server on port 8002
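A launcher in the style of run_servers.py can be sketched as follows. This is an illustrative sketch, not the repo's actual script; the per-server commands are taken from the manual alternative documented in this README.

```python
import subprocess
import sys

# The two server roles and the entry point come from the manual commands
# documented in the README; the launcher logic itself is an illustrative sketch.
SERVER_ROLES = ["jobs", "employee"]

def build_commands(roles=SERVER_ROLES):
    """One `python mcp_server/server.py <role>` command per server."""
    return [[sys.executable, "mcp_server/server.py", role] for role in roles]

def launch(commands):
    """Spawn each server as a subprocess and return the process handles."""
    return [subprocess.Popen(cmd) for cmd in commands]

if __name__ == "__main__":
    procs = launch(build_commands())
    try:
        for p in procs:
            p.wait()
    except KeyboardInterrupt:
        for p in procs:
            p.terminate()
```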
Or, to run them manually in separate terminals:
python mcp_server/server.py jobs
python mcp_server/server.py employee

3. Run the multi-server client (in a new terminal)
python langchain_mcp_client.py

The client will connect to both servers and can use tools from both in a single conversation. Make sure your OpenAI API key is exported or in a .env file.
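The key lookup can be sketched like this (an assumption about how a client might resolve the key; real projects commonly use python-dotenv instead of hand-parsing the file):

```python
import os

def resolve_openai_key(dotenv_path=".env"):
    """Prefer an exported OPENAI_API_KEY; fall back to a .env file.

    Hypothetical helper for illustration; returns None if no key is found.
    """
    key = os.environ.get("OPENAI_API_KEY")
    if key:
        return key
    try:
        with open(dotenv_path) as f:
            for line in f:
                line = line.strip()
                if line.startswith("OPENAI_API_KEY="):
                    return line.split("=", 1)[1]
    except FileNotFoundError:
        pass
    return None
```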
File Structure
langchain_mcp_client.py — Python client for querying both MCP servers (multi-server, multi-tool)
mcp_server/server.py — MCP servers (jobs and employee feedback, each on its own port)
mcp_server/jobs_api.py — Dummy FastAPI backend for jobs and employee data
run_servers.py — Script to start both MCP servers at once
requirements.txt — Python dependencies
.gitignore — Excludes .venv, .env, and other environment files
Notes
The .venv directory and .env file are not included in the repo. You must create them locally.
Only the minimal, required files are tracked in git.
If you want to add a frontend, you can do so separately (not included in this repo).
Example Usage
Multi-server client:
Query: "I need to find similar jobs for an AI engineer position in San Jose, CA with 4-5 years of experience, and also get a feedback summary for Kalyan P. Can you help me with both?"
The client will use both tools: find_similar_jobs and summarize_employee_feedback.
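Behind the scenes, the dummy backend's tool logic might look something like this. This is a hypothetical pure-Python sketch with made-up sample records; the real data and endpoints live in mcp_server/jobs_api.py.

```python
# Hypothetical sample data; the real records come from the dummy FastAPI backend.
JOBS = [
    {"title": "AI Engineer", "location": "San Jose, CA", "min_years": 4, "max_years": 6},
    {"title": "Data Scientist", "location": "New York, NY", "min_years": 3, "max_years": 5},
    {"title": "AI Engineer", "location": "Austin, TX", "min_years": 1, "max_years": 3},
]

FEEDBACK = {"Kalyan P": ["Great mentor", "Ships reliably"]}

def find_similar_jobs(title, location, years):
    """Match on title and location, requiring `years` inside the posted range."""
    return [
        j for j in JOBS
        if j["title"] == title
        and j["location"] == location
        and j["min_years"] <= years <= j["max_years"]
    ]

def summarize_employee_feedback(name):
    """Join an employee's feedback entries into one summary string."""
    notes = FEEDBACK.get(name, [])
    return (f"{name}: " + "; ".join(notes)) if notes else f"No feedback for {name}"
```

GPT-4o simply decides which of the two tools to call for each part of a combined query like the one above.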
License
MIT