

MCP Client Using LangChain / Python

This simple Model Context Protocol (MCP) client demonstrates the use of MCP server tools by a LangChain ReAct agent.

It leverages the utility function convert_mcp_to_langchain_tools() from langchain_mcp_tools.
This function initializes multiple specified MCP servers in parallel and converts their available tools into a list of LangChain-compatible tools (List[BaseTool]).
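
For illustration, here is a minimal sketch of how this conversion plugs into a LangGraph ReAct agent. The "fetch" server entry, model name, and query below are illustrative assumptions, not taken from this repo:

```python
# Minimal sketch (API per the langchain_mcp_tools docs; the "fetch"
# server, model name, and query are illustrative only).
import asyncio
from langchain_anthropic import ChatAnthropic
from langchain_mcp_tools import convert_mcp_to_langchain_tools
from langgraph.prebuilt import create_react_agent

async def main() -> None:
    mcp_servers = {
        "fetch": {
            "command": "uvx",
            "args": ["mcp-server-fetch"],
        },
    }
    # Starts the listed MCP servers in parallel and wraps their tools as
    # LangChain BaseTool instances; also returns a cleanup callback.
    tools, cleanup = await convert_mcp_to_langchain_tools(mcp_servers)
    try:
        agent = create_react_agent(
            ChatAnthropic(model="claude-3-5-haiku-latest"), tools
        )
        result = await agent.ainvoke(
            {"messages": [("user", "Fetch https://example.com and summarize it")]}
        )
        print(result["messages"][-1].content)
    finally:
        await cleanup()  # shut down the MCP server subprocesses

asyncio.run(main())
```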

LLMs from Anthropic, OpenAI, and Groq are currently supported.

A TypeScript version of this MCP client is available here.

Prerequisites

  • Python 3.11+
  • [optional] uv (uvx) installed to run Python package-based MCP servers
  • [optional] npm 7+ (npx) to run Node.js package-based MCP servers
  • API keys from Anthropic, OpenAI, and/or Groq as needed

Setup

  1. Install dependencies:
    make install
  2. Set up API keys:
    cp .env.template .env
    • Update .env as needed.
    • .gitignore is configured to ignore .env to prevent accidental commits of credentials.
  3. Configure the LLM and MCP server settings in llm_mcp_config.json5 as needed (see the sketches after this list).
    • The configuration file format for MCP servers follows the same structure as Claude for Desktop, with one difference: the key name mcpServers has been changed to mcp_servers to follow the snake_case convention commonly used in JSON configuration files.
    • The file format is JSON5, which allows comments and trailing commas.
    • The format is further extended to replace ${...} notations with the values of the corresponding environment variables.
    • Keep all credentials and private info in the .env file and refer to them with the ${...} notation as needed.
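
For reference, a hypothetical .env might look like the following. The variable names follow each provider's SDK convention; set only the ones you need:

```
# Illustrative .env -- these key names are assumptions, not from this repo
ANTHROPIC_API_KEY=sk-ant-...
OPENAI_API_KEY=sk-...
GROQ_API_KEY=gsk_...
```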
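
And here is a sketch of what llm_mcp_config.json5 might contain; the keys and server entries are illustrative assumptions, so consult the bundled file for the exact schema:

```json5
{
  // LLM selection (keys assumed for illustration)
  "llm": {
    "provider": "anthropic",
    "model": "claude-3-5-haiku-latest",
  },

  // Same structure as Claude for Desktop, but with a snake_case key
  "mcp_servers": {
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"],
    },
    "brave-search": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-brave-search"],
      "env": {
        // ${...} is replaced with the corresponding value from .env
        "BRAVE_API_KEY": "${BRAVE_API_KEY}",
      },
    },
  },

  // Queries offered when you press Enter at the prompt (hypothetical key)
  "example_queries": [
    "Summarize the latest news on AI.",
  ],
}
```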

Usage

Run the app:

make start

It takes a while on the first run.

Run in verbose mode:

make start-v

See command-line options:

make start-h

At the prompt, you can simply press Enter to use example queries that perform MCP server tool invocations.

Example queries can be configured in llm_mcp_config.json5.



