
Llama 4 Maverick MCP Server

by YobieBen
.gitignore • 986 B
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class

# C extensions
*.so

# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST

# PyInstaller
*.manifest
*.spec

# Installer logs
pip-log.txt
pip-delete-this-directory.txt

# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.py,cover
.hypothesis/
.pytest_cache/
cover/

# Virtual environments
venv/
ENV/
env/
.venv/

# IDEs
.vscode/
.idea/
*.swp
*.swo
*~
.DS_Store

# Environment files
.env
.env.local
.env.*.local

# Logs
logs/
*.log

# Database
*.db
*.sqlite
*.sqlite3

# Temporary files
*.tmp
*.temp
.tmp/
temp/

# Ollama models (if stored locally)
models/

# OS files
Thumbs.db
Desktop.ini

# Project specific
.mypy_cache/
.dmypy.json
dmypy.json
.ruff_cache/

# Documentation builds
docs/_build/
docs/.doctrees/

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/YobieBen/llama4-maverick-mcp-python'
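The same lookup can be done from Python. A minimal sketch using only the standard library, assuming the endpoint returns JSON; since the response schema is not documented here, the result is returned as a plain dict rather than assuming any specific fields:

```python
import json
import urllib.request

# Directory API endpoint for this server, as shown in the curl example above.
URL = "https://glama.ai/api/mcp/v1/servers/YobieBen/llama4-maverick-mcp-python"

def fetch_server_info(url: str = URL) -> dict:
    """Fetch this server's directory entry and parse it as JSON.

    The response shape is an assumption (any JSON object); no specific
    fields are accessed here.
    """
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # Pretty-print whatever the directory returns for inspection.
    print(json.dumps(fetch_server_info(), indent=2))
```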

If you have feedback or need assistance with the MCP directory API, please join our Discord server.