MCP AI Chat LangChain Example

by badrinathvm

A basic implementation of a Model Context Protocol (MCP) server that demonstrates core functionality, including tools and resources. This guide will walk you through the steps to initialize, inspect, and integrate the server.
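For orientation, here is a minimal sketch of what such a server could look like. It assumes the FastMCP helper from the official MCP Python SDK; the file name, the add tool, and the greeting://{name} resource are illustrative placeholders, not code from this repository.

# server.py - minimal MCP server sketch (illustrative; names are assumptions)
from mcp.server.fastmcp import FastMCP

# Create a named MCP server instance
mcp = FastMCP("ai-chat-demo")

@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers and return the result."""
    return a + b

@mcp.resource("greeting://{name}")
def greeting(name: str) -> str:
    """Return a personalized greeting resource."""
    return f"Hello, {name}!"

if __name__ == "__main__":
    # Serve over stdio so an MCP client can launch this as a subprocess
    mcp.run(transport="stdio")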

Getting Started

Before you begin, ensure you have the following installed:

  • Python (Version 3.8 or later)
  • uv CLI

To verify your installation, run:

python --version
uv --version

Initialization

To initialize the project, open a terminal (PowerShell or CMD) in a local folder of your choice and run:

uv init mcp-ai-chat-langchain

Create a virtual environment

uv venv --python 3.12.0

To add the required dependencies

uv add langchain-groq
uv add langchain-openai
uv add mcp-use

This will set up the project directory and install the necessary dependencies.

To execute the project

uv run app.py
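For reference, app.py might wire the installed pieces together along the following lines. This is a minimal sketch that assumes mcp-use's MCPClient/MCPAgent API and langchain-groq's ChatGroq; the server entry (server.py), model name, and prompt are hypothetical, and a GROQ_API_KEY environment variable is assumed to be set.

# app.py - sketch connecting a LangChain chat model to an MCP server via mcp-use
# (illustrative; server command, model name, and prompt are assumptions)
import asyncio

from langchain_groq import ChatGroq
from mcp_use import MCPAgent, MCPClient

async def main():
    # Describe how to launch the MCP server (hypothetical server.py over stdio)
    config = {
        "mcpServers": {
            "ai-chat-demo": {
                "command": "uv",
                "args": ["run", "server.py"],
            }
        }
    }

    client = MCPClient.from_dict(config)
    llm = ChatGroq(model="llama-3.1-8b-instant")  # requires GROQ_API_KEY in the environment

    # The agent discovers the server's tools and lets the LLM call them
    agent = MCPAgent(llm=llm, client=client, max_steps=10)
    result = await agent.run("Use the add tool to compute 2 + 3.")
    print(result)

if __name__ == "__main__":
    asyncio.run(main())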
