
MCP Sendmail Server

by maxyychen
.env.example
# MCP Server Configuration
MCP_SERVER_URL=http://localhost:8080

# Ollama Configuration
OLLAMA_BASE_URL=http://localhost:11434
OLLAMA_MODEL=gpt-oss:20b

# Optional: Override config.yaml settings
# OLLAMA_TEMPERATURE=0.7
# OLLAMA_NUM_CTX=4096
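
As a minimal sketch of how these variables might be consumed, the snippet below reads them from the environment in Python, assuming python-dotenv is installed and that unset optional values fall back to config.yaml defaults; the variable names come from the .env.example above, while the loading code itself is illustrative, not taken from this repository.

# Sketch: load the .env values shown above (assumes python-dotenv is installed).
import os
from dotenv import load_dotenv

load_dotenv()  # reads .env from the current working directory

mcp_server_url = os.getenv("MCP_SERVER_URL", "http://localhost:8080")
ollama_base_url = os.getenv("OLLAMA_BASE_URL", "http://localhost:11434")
ollama_model = os.getenv("OLLAMA_MODEL", "gpt-oss:20b")

# Optional overrides; defaults here mirror the commented values in .env.example.
ollama_temperature = float(os.getenv("OLLAMA_TEMPERATURE", "0.7"))
ollama_num_ctx = int(os.getenv("OLLAMA_NUM_CTX", "4096"))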

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/maxyychen/mcp-sendmail'
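
For illustration, the same request can be made from Python using only the standard library; the endpoint URL is the one shown in the curl example above, and the response is assumed to be JSON.

# Fetch the server's directory entry (same endpoint as the curl example above).
import json
import urllib.request

url = "https://glama.ai/api/mcp/v1/servers/maxyychen/mcp-sendmail"
with urllib.request.urlopen(url) as response:
    server_info = json.load(response)

print(server_info)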

If you have feedback or need assistance with the MCP directory API, please join our Discord server.