Model Context Protocol (MCP) Server

by hideya
README_DEV.md
# Building and Running from the Source

## Prerequisites

- Python 3.11+
- [`uv`](https://docs.astral.sh/uv/getting-started/installation/)
- make
- git

## Setup

1. Clone the repository:

    ```bash
    git clone https://github.com/hideya/mcp-client-langchain-py.git
    cd mcp-client-langchain-py
    ```

2. Install dependencies:

    ```bash
    make install
    ```

3. Set up API keys.

4. Configure the LLM and MCP server settings in `llm_mcp_config.json5` as needed.

## Test Execution

Run the app:

```bash
make start
```

Run in verbose mode:

```bash
make start -- -v
```

See command-line options:

```bash
make start -- -h
```
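Step 4 of the setup refers to `llm_mcp_config.json5`, but its schema is not shown here. A hypothetical sketch, assuming the file pairs an LLM section with a map of MCP servers to launch (every field name and value below is an assumption for illustration, not taken from the repository — consult the actual file shipped with the repo):

```json5
// llm_mcp_config.json5 — hypothetical sketch; field names are assumptions.
{
  llm: {
    provider: "openai",    // assumed: which LLM backend to use
    model: "gpt-4o-mini",  // assumed: model identifier
  },
  mcp_servers: {
    fetch: {
      command: "uvx",                // assumed: how to launch this server
      args: ["mcp-server-fetch"],    // assumed: launch arguments
    },
  },
}
```

JSON5 permits comments and unquoted keys, which is presumably why the project uses it for a hand-edited config file.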

MCP directory API

We provide all the information about MCP servers via our MCP API.

```bash
curl -X GET 'https://glama.ai/api/mcp/v1/servers/hideya/mcp-client-langchain-py'
```
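Beyond `curl`, the same endpoint can be queried from a script. A minimal Python sketch using only the standard library (the URL is taken from the example above; the shape of the JSON response is not assumed):

```python
import json
import urllib.request

# Endpoint from the curl example above; entries are addressed as
# /servers/<author>/<repo> in the Glama MCP directory API.
API_URL = "https://glama.ai/api/mcp/v1/servers/hideya/mcp-client-langchain-py"

def fetch_server_info(url: str = API_URL) -> dict:
    """GET the directory entry for a server and decode the JSON body."""
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # Pretty-print whatever the directory returns for this server.
    print(json.dumps(fetch_server_info(), indent=2))
```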

If you have feedback or need assistance with the MCP directory API, please join our Discord server.