
Lila MCP Server

by lila-graph
fastmcp.json

{
  "$schema": "https://gofastmcp.com/public/schemas/fastmcp.json/v1.json",
  "source": {
    "type": "filesystem",
    "path": "lila_mcp_server.py",
    "entrypoint": "mcp"
  },
  "environment": {
    "type": "uv",
    "python": ">=3.12",
    "dependencies": [
      "fastmcp>=2.12.3",
      "neo4j>=5.15.0",
      "pydantic>=2.6.0",
      "pydantic-settings>=2.2.0",
      "python-dotenv>=1.0.0",
      "openai>=1.30.0",
      "anthropic>=0.25.0",
      "httpx>=0.27.0",
      "aiohttp>=3.9.0",
      "logfire>=0.28.0",
      "click>=8.1.0",
      "asyncio-mqtt>=0.16.0"
    ]
  },
  "deployment": {
    "transport": "http",
    "host": "127.0.0.1",
    "port": 8765,
    "log_level": "INFO"
  }
}

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/lila-graph/lila-mcp'
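The same lookup can be done from Python with httpx (already in the server's dependency list); the response schema is whatever the directory API returns, so it is simply printed here:

# Equivalent of the curl command above, using httpx.
import httpx

resp = httpx.get("https://glama.ai/api/mcp/v1/servers/lila-graph/lila-mcp")
resp.raise_for_status()
print(resp.json())  # server metadata as returned by the Glama MCP directory API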

If you have feedback or need assistance with the MCP directory API, please join our Discord server.