
MCP AI Server

pyproject.toml

```toml
[project]
name = "cursor-mcp-rag"
version = "0.1.0"
description = "MCP RAG integration with Pinecone and Ollama"
readme = "README.md"
requires-python = ">=3.10"
dependencies = [
    "ipykernel>=6.29.5",
    "linkup-sdk>=0.2.4",
    "llama-index>=0.12.25",
    "llama-index-embeddings-huggingface>=0.5.2",
    "llama-index-llms-ollama>=0.5.3",
    "mcp[cli]>=1.5.0",
    "fastapi>=0.110.0",
    "uvicorn[standard]>=0.29.0",
    "pinecone>=3.0.0",
    "sentence-transformers>=2.6.1",
    "duckduckgo_search>=5.0",
    "tqdm>=4.66.4",
    "python-dotenv>=1.0.1",
]
```

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/MeetRathodNitsan/MCP1'
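The same endpoint can be queried from Python using only the standard library. This is a minimal sketch based on the curl example above; the helper names (`server_url`, `fetch_server`) are illustrative, and the response is assumed to be JSON, which this page does not document.

```python
import json
import urllib.request

API_BASE = "https://glama.ai/api/mcp/v1"


def server_url(owner: str, name: str) -> str:
    # Build the directory URL for a server, matching the curl example's path.
    return f"{API_BASE}/servers/{owner}/{name}"


def fetch_server(owner: str, name: str) -> dict:
    # Fetch and decode the server's metadata (assumed JSON) from the directory API.
    with urllib.request.urlopen(server_url(owner, name)) as resp:
        return json.load(resp)


if __name__ == "__main__":
    print(json.dumps(fetch_server("MeetRathodNitsan", "MCP1"), indent=2))
```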

If you have feedback or need assistance with the MCP directory API, please join our Discord server.