Glama

Observe Community MCP Server

by rustomax
requirements.txt (877 B)
fastmcp>=0.1.0
fastapi>=0.95.0
uvicorn>=0.21.1
requests>=2.28.2
python-dotenv>=1.0.0
httpx>=0.24.0
pinecone>=7.0.0
tqdm>=4.65.0
typing-extensions>=4.0.0

# PostgreSQL for OPAL memory system
asyncpg>=0.29.0

# OpenAI embeddings for semantic search (lightweight)
openai>=1.0.0

# LangGraph dependencies for NLP query agent
langchain-core>=0.3.0
langgraph>=0.2.0
langchain-anthropic>=0.2.0
langchain-openai>=0.2.0

# OpenAI Agents SDK for investigator agent
openai-agents>=0.1.0

# OpenTelemetry instrumentation
opentelemetry-api>=1.21.0
opentelemetry-sdk>=1.21.0
opentelemetry-exporter-otlp>=1.21.0
opentelemetry-instrumentation>=0.42b0
opentelemetry-instrumentation-httpx>=0.42b0
opentelemetry-instrumentation-asyncpg>=0.42b0
opentelemetry-instrumentation-fastapi>=0.42b0

# Visualization dependencies
matplotlib>=3.7.0
pandas>=2.0.0
numpy>=1.24.0
pillow>=10.0.0
seaborn>=0.12.0

MCP directory API

We provide all the information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/rustomax/observe-community-mcp'
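As a minimal sketch, the same endpoint can also be queried from Python using only the standard library; this assumes the endpoint returns a JSON document describing the server (the URL is taken from the curl example above):

import json
import urllib.request

# Query the Glama MCP directory API for this server's metadata
url = "https://glama.ai/api/mcp/v1/servers/rustomax/observe-community-mcp"
with urllib.request.urlopen(url) as response:
    # Assumes the response body is JSON; the exact schema is not documented here
    server_info = json.loads(response.read().decode("utf-8"))

print(json.dumps(server_info, indent=2))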

If you have feedback or need assistance with the MCP directory API, please join our Discord server.