
Ollama MCP Server

by etnlbck
railway.json • 554 B
{ "build": { "builder": "DOCKERFILE" }, "deploy": { "startCommand": null, "healthcheckPath": "/healthz", "healthcheckTimeout": 300 }, "volumes": [ { "name": "ollama-models", "mountPath": "/data/ollama" } ], "envVars": { "OLLAMA_HOST": { "value": "0.0.0.0:11434" }, "OLLAMA_BASE_URL": { "value": "http://127.0.0.1:11434" }, "MCP_TRANSPORT": { "value": "http" }, "MCP_HTTP_PORT": { "value": "8080" }, "PORT": { "value": "8080" } } }

MCP directory API

All information about the MCP servers listed in the directory is available via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/etnlbck/ollama-mcp'
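The same lookup can be scripted; a minimal Python sketch, assuming only the `servers/{owner}/{repo}` URL pattern visible in the curl command above (the shape of the JSON response is not assumed here):

```python
from urllib.parse import quote

# Base URL taken from the curl example above
API_BASE = "https://glama.ai/api/mcp/v1"

def server_url(owner: str, repo: str) -> str:
    """Build the directory lookup URL for a given MCP server."""
    return f"{API_BASE}/servers/{quote(owner)}/{quote(repo)}"

# Matches the curl example above
print(server_url("etnlbck", "ollama-mcp"))
# prints: https://glama.ai/api/mcp/v1/servers/etnlbck/ollama-mcp
```

The URL could then be fetched with any HTTP client (e.g. `urllib.request.urlopen`) exactly as curl does.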

If you have feedback or need assistance with the MCP directory API, please join our Discord server.