LocalMCP

by leolech14
MIT License
requirements.txt (884 B)
# Core dependencies
fastapi>=0.104.0
uvicorn>=0.24.0
pydantic>=2.4.0
python-multipart>=0.0.6

# MCP Protocol
websockets>=12.0
jsonrpc>=3.0
aiofiles>=23.2.0

# LLM Providers
openai>=1.3.0
anthropic>=0.8.0
google-generativeai>=0.3.0
transformers>=4.35.0
torch>=2.1.0

# Semantic Search
faiss-cpu>=1.7.4
sentence-transformers>=2.2.2
numpy>=1.24.0

# Caching
redis>=5.0.0
aiocache>=0.12.0

# Monitoring
prometheus-client>=0.19.0
opentelemetry-api>=1.21.0
opentelemetry-sdk>=1.21.0
opentelemetry-instrumentation-fastapi>=0.42b0

# Circuit Breaker
py-breaker>=0.7.0
tenacity>=8.2.0

# Database
sqlalchemy>=2.0.0
alembic>=1.12.0
asyncpg>=0.29.0

# Utilities
click>=8.1.0
rich>=13.7.0
python-dotenv>=1.0.0
pyyaml>=6.0.1
structlog>=23.2.0

# Testing
pytest>=7.4.0
pytest-asyncio>=0.21.0
pytest-cov>=4.1.0
httpx>=0.25.0

# Development
black>=23.11.0
flake8>=6.1.0
mypy>=1.7.0
pre-commit>=3.5.0
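As a rough illustration of how the "Semantic Search" dependencies above (sentence-transformers, faiss-cpu, numpy) are typically wired together, the sketch below embeds a few documents and runs a similarity query. The model name and sample documents are placeholders, not taken from this repository.

```python
import numpy as np
import faiss
from sentence_transformers import SentenceTransformer

# Embed a few example documents (placeholders, not project data).
model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed model choice
docs = ["restart the MCP server", "configure the Redis cache layer"]
embeddings = model.encode(docs, normalize_embeddings=True)

# Exact inner-product index; with normalized vectors this is cosine similarity.
index = faiss.IndexFlatIP(embeddings.shape[1])
index.add(np.asarray(embeddings, dtype="float32"))

# Embed a query and retrieve the single closest document.
query = model.encode(["how do I set up caching?"], normalize_embeddings=True)
scores, ids = index.search(np.asarray(query, dtype="float32"), 1)
print(docs[ids[0][0]], float(scores[0][0]))
```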

MCP directory API

We provide all the information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/leolech14/LocalMCP'
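The same request can be made from Python with httpx (which also appears in the requirements above); the JSON response shape is assumed here, not documented by this page.

```python
# Fetch this server's directory entry from the Glama MCP directory API.
import httpx

resp = httpx.get("https://glama.ai/api/mcp/v1/servers/leolech14/LocalMCP")
resp.raise_for_status()  # assumes a 200 response with a JSON body
print(resp.json())
```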

If you have feedback or need assistance with the MCP directory API, please join our Discord server.