# Library Docs MCP Server

This is an MCP (Model Context Protocol) server that lets you search and fetch documentation for popular libraries such as Langchain, Llama-Index, MCP, and OpenAI using the Serper API.

## Features

- Search library documentation with a natural-language query.
- Supports Langchain, Llama-Index, MCP, and OpenAI (update the code to add other libraries).
- Uses the `Serper API` to perform site-specific searches.
- Parses and returns the documentation using `BeautifulSoup`.
- **Provides up-to-date documentation** – useful for LLMs with knowledge cut-off dates.

## Why Use This Server with LLMs?

Many LLMs, including those behind **Claude Desktop** and similar platforms, have a knowledge cut-off date and may lack access to the latest library documentation. This MCP server solves that problem by:

- Fetching **real-time documentation** from official sources.
- Providing **up-to-date information** for development and troubleshooting.
- Improving the accuracy and relevance of responses when working with new library updates.

## Setting Up with Claude Desktop

To use this server with **Claude Desktop**, update the `claude_desktop_config.json` file with the following configuration:

```json
{
  "mcpServers": {
    "docs-mcp-server": {
      "command": "C:\\Users\\Vikram\\.local\\bin\\uv.exe",
      "args": [
        "run",
        "--with",
        "mcp[cli]",
        "mcp",
        "run",
        "F:\\My Projects\\AI\\docs-mcp-server\\server.py"
      ]
    }
  }
}
```
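## How It Works

The search-and-fetch flow described above can be sketched roughly as follows. This is an illustrative sketch, not the actual `server.py`: the `DOCS_URLS` mapping, the function names, and the `SERPER_API_KEY` environment variable are assumptions, and the real server's site list and parsing details may differ.

```python
import os

# Hypothetical library -> docs-site mapping; the real server may use
# different sites or names.
DOCS_URLS = {
    "langchain": "python.langchain.com/docs",
    "llama-index": "docs.llamaindex.ai",
    "openai": "platform.openai.com/docs",
    "mcp": "modelcontextprotocol.io",
}

def build_query(library: str, query: str) -> str:
    """Restrict the Serper search to the chosen library's docs site."""
    return f"site:{DOCS_URLS[library]} {query}"

def search_docs(library: str, query: str) -> str:
    """Search via the Serper API, fetch the top hit, and return its text."""
    # Third-party deps: pip install requests beautifulsoup4
    import requests
    from bs4 import BeautifulSoup

    resp = requests.post(
        "https://google.serper.dev/search",
        headers={"X-API-KEY": os.environ["SERPER_API_KEY"]},
        json={"q": build_query(library, query), "num": 2},
        timeout=30,
    )
    results = resp.json().get("organic", [])
    if not results:
        return "No results found."
    # Fetch the first result and strip it down to readable text.
    page = requests.get(results[0]["link"], timeout=30)
    soup = BeautifulSoup(page.text, "html.parser")
    return soup.get_text(separator="\n", strip=True)
```

Adding support for another library is then just a matter of adding its documentation site to the mapping.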
