MCP Fetch

by LangGPT
MIT License
Platforms:
  • Apple
  • Linux
.gitignore (935 B)

# Python
__pycache__/
*.py[cod]
*$py.class
*.so
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST.in
*.whl

# Virtual environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/

# UV package manager
.uv/

# IDEs and Editors
.vscode/
.idea/
.spyderproject
.spyproject
*.swp
*.swo
*~
.vim/

# Claude Code files
CLAUDE.md
.claude/

# OS files
.DS_Store
.DS_Store?
._*
.Spotlight-V100
.Trashes
ehthumbs.db
Thumbs.db

# Logs
*.log
logs/

# Temporary files
*.tmp
*.temp
.tmp/

# Testing and Development
.pytest_cache/
.coverage
.coverage.*
htmlcov/
.tox/
.nox/
.mypy_cache/
.dmypy.json
dmypy.json
.ruff_cache/
.pytype/

# Config files that might contain sensitive data
*config_debug.json
.env.local
.env.development
.env.test
.env.production

# Documentation builds
docs/_build/
site/

# Project specific
prompts/
.python-version

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/LangGPT/mcp-fetch'
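
The same request can also be made from Python. The sketch below is a minimal example, not part of the official documentation; it assumes the endpoint returns JSON and simply pretty-prints whatever it receives.

import json
import urllib.request

# Endpoint taken from the curl example above; the response is assumed to be
# a JSON document describing the LangGPT/mcp-fetch server.
URL = "https://glama.ai/api/mcp/v1/servers/LangGPT/mcp-fetch"

with urllib.request.urlopen(URL) as response:
    server_info = json.load(response)

# Pretty-print the returned server metadata.
print(json.dumps(server_info, indent=2))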

If you have feedback or need assistance with the MCP directory API, please join our Discord server.