# Local Quickstart (Personal)

## Install

```bash
python -m venv .venv
source .venv/bin/activate  # Windows: .venv\Scripts\activate
pip install -r requirements.txt
# Optional: pip install -r requirements-dev.txt
```

## Run stdio (desktop Claude)

```bash
python -m server
# or: ex-mcp-server (after `pip install -e .`)
```

## Run remote (for tunnels)

```bash
pip install ".[remote]"  # quoted so zsh doesn't glob the brackets

# Generate token
python - <<'PY'
import secrets; print(secrets.token_urlsafe(32))
PY

export MCP_AUTH_TOKEN=<paste>
export MCP_BASE_PATH=/mcp
export CORS_ORIGINS='*'  # quoted so the shell doesn't expand *
uvicorn remote_server:app --host 0.0.0.0 --port 7800
```

Now your local server is ready at http://localhost:7800.
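Once the remote server is up, every client request has to carry the token. A minimal Python sketch of the client side, assuming standard `Authorization: Bearer` headers and that the MCP endpoint lives at `MCP_BASE_PATH` (both the header scheme and the URL construction here are assumptions, not taken from the server code):

```python
import os
import secrets

# Same token generation as the heredoc above.
token = secrets.token_urlsafe(32)
os.environ.setdefault("MCP_AUTH_TOKEN", token)

# Build the endpoint URL and auth header a client would send.
# Falls back to /mcp when MCP_BASE_PATH is not exported.
base_url = "http://localhost:7800" + os.environ.get("MCP_BASE_PATH", "/mcp")
headers = {"Authorization": f"Bearer {os.environ['MCP_AUTH_TOKEN']}"}

print(base_url)
print(headers["Authorization"][:12] + "...")
```

A quick smoke test is then `curl -H "Authorization: Bearer $MCP_AUTH_TOKEN" http://localhost:7800/mcp` against the running server.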
