All information about MCP servers is available through our MCP API. For example, to retrieve this server's entry:

```bash
curl -X GET 'https://glama.ai/api/mcp/v1/servers/wx-b/long-context-mcp'
```
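The endpoint returns JSON; to inspect it locally, you can pretty-print the response with `jq` (this assumes `jq` is installed, and no particular response fields are relied on):

```bash
# Fetch this server's entry and pretty-print the JSON response.
# Requires jq; the response schema is whatever the MCP API returns,
# so no specific fields are assumed here.
curl -s 'https://glama.ai/api/mcp/v1/servers/wx-b/long-context-mcp' | jq .
```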
If you have feedback or need assistance with the MCP directory API, please join our Discord server.
ollama.env.example:

```dotenv
# Ollama Local Configuration
# Install Ollama: https://ollama.com/
# Pull models: ollama pull qwen2.5-coder:7b
# Start server: ollama serve

# Ollama typically doesn't require API keys for local usage
# OLLAMA_API_KEY=

# The base URL is automatically configured by the provider preset
# Default: http://localhost:11434/v1
# OLLAMA_BASE_URL=http://localhost:11434/v1
```
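Before pointing a client at the base URL, it can help to sanity-check the local server. A minimal sketch, assuming Ollama is running on the default port and the model above has been pulled; both routes below are part of Ollama's OpenAI-compatible API:

```bash
# Assumes Ollama is already running (ollama serve) and the model has
# been pulled (ollama pull qwen2.5-coder:7b), per the comments above.

# List available models via the OpenAI-compatible /v1/models route.
curl -s http://localhost:11434/v1/models

# Send a minimal chat completion request to the local server.
curl -s http://localhost:11434/v1/chat/completions \
  -H 'Content-Type: application/json' \
  -d '{
        "model": "qwen2.5-coder:7b",
        "messages": [{"role": "user", "content": "Say hello"}]
      }'
```

If both requests succeed, the `OLLAMA_BASE_URL` default shown above is the value a client should use.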