EdgeLake MCP Server

by tom342178
pyproject.toml
[build-system]
requires = ["setuptools>=65.0", "wheel"]
build-backend = "setuptools.build_meta"

[project]
name = "edgelake-mcp-server"
version = "1.0.7"
description = "MCP server for EdgeLake distributed database"
readme = "README.md"
requires-python = ">=3.10"
license = {text = "MPL-2.0"}
authors = [
    {name = "EdgeLake", email = "info@anylog.co"}
]

[project.scripts]
edgelake-mcp-server = "edgelake_mcp.server:main"

[tool.pytest.ini_options]
testpaths = ["tests"]
python_files = ["test_*.py"]
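
The [project.scripts] table maps the edgelake-mcp-server command to the main() function in edgelake_mcp/server.py. As a minimal sketch of how that mapping works once the package is installed, the same entry point can be located and invoked through the standard importlib.metadata selector API (only the module path declared above is assumed; no server internals are):

from importlib.metadata import entry_points

# Look up console-script entry points registered by installed packages
# (Python 3.10+ selector API, matching the requires-python constraint above).
scripts = entry_points(group="console_scripts")

for ep in scripts:
    if ep.name == "edgelake-mcp-server":
        # ep.value is "edgelake_mcp.server:main", exactly as declared in
        # [project.scripts]; load() imports the module and returns main.
        main = ep.load()
        main()  # hand control to the MCP server's entry point
        break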

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/tom342178/edgelake-mcp-server'
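
As an illustrative sketch only, the same lookup can be made from Python with the standard library; the endpoint URL is taken verbatim from the curl command above, and no particular response fields are assumed:

import json
from urllib.request import urlopen

URL = "https://glama.ai/api/mcp/v1/servers/tom342178/edgelake-mcp-server"

# Fetch this server's directory entry and decode the JSON payload.
with urlopen(URL) as response:
    entry = json.loads(response.read().decode("utf-8"))

print(json.dumps(entry, indent=2))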

If you have feedback or need assistance with the MCP directory API, please join our Discord server.