
MCP-Mem0

by coleam00
.env.example (1.63 kB)
```
# The transport for the MCP server - either 'sse' or 'stdio' (defaults to SSE if left empty)
TRANSPORT=

# Host to bind to if using sse as the transport (leave empty if using stdio)
HOST=

# Port to listen on if using sse as the transport (leave empty if using stdio)
PORT=

# The provider for your LLM
# Set this to either openai, openrouter, or ollama
# This is needed on top of the base URL for Mem0 (long term memory)
LLM_PROVIDER=

# Base URL for the OpenAI compatible instance (default is https://api.openai.com/v1)
# OpenAI: https://api.openai.com/v1
# Ollama (example): http://localhost:11434/v1
# OpenRouter: https://openrouter.ai/api/v1
LLM_BASE_URL=

# OpenAI: https://help.openai.com/en/articles/4936850-where-do-i-find-my-openai-api-key
# Open Router: Get your API Key here after registering: https://openrouter.ai/keys
# Ollama: No need to set this unless you specifically configured an API key
LLM_API_KEY=

# The LLM you want to use for processing memories.
# OpenAI example: gpt-4o-mini
# OpenRouter example: anthropic/claude-3.7-sonnet
# Ollama example: qwen2.5:14b-instruct-8k
LLM_CHOICE=

# The embedding model you want to use to store memories - this needs to be from the same provider as set above.
# OpenAI example: text-embedding-3-small
# Ollama example: nomic-embed-text
EMBEDDING_MODEL_CHOICE=

# Postgres DB URL used for mem0
# Format: postgresql://[user]:[password]@[host]:[port]/[database_name]
# Example: postgresql://postgres:mypassword@localhost:5432/mydb
# For Supabase Postgres connection, you can find this in "Connect" (top middle of Supabase dashboard) -> Transaction pooler
DATABASE_URL=
```
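For orientation, here is a minimal Python sketch of how these variables could be wired into Mem0's `Memory.from_config()` API. The mapping below is an assumption for illustration, not the repository's actual startup code; the `supabase` vector-store provider and the `openai_base_url` key are Mem0 config options, and the demo memory text is made up.

```python
# Hypothetical wiring sketch -- NOT the repository's actual startup code.
# Shows how the .env values above could feed Mem0's Memory.from_config().
import os

from mem0 import Memory  # pip install mem0ai

config = {
    "llm": {
        # OpenRouter and Ollama both expose OpenAI-compatible endpoints,
        # so one "openai" provider block can cover all three LLM_PROVIDER choices.
        "provider": "openai",
        "config": {
            "model": os.getenv("LLM_CHOICE", "gpt-4o-mini"),
            "api_key": os.getenv("LLM_API_KEY"),
            "openai_base_url": os.getenv("LLM_BASE_URL", "https://api.openai.com/v1"),
        },
    },
    "embedder": {
        # Must come from the same provider as the LLM, per the note in .env.example.
        "provider": "openai",
        "config": {"model": os.getenv("EMBEDDING_MODEL_CHOICE", "text-embedding-3-small")},
    },
    "vector_store": {
        # Assumption: a Postgres-backed store keyed off DATABASE_URL; Mem0's
        # "supabase" provider accepts a plain Postgres connection string.
        "provider": "supabase",
        "config": {"connection_string": os.getenv("DATABASE_URL")},
    },
}

memory = Memory.from_config(config)
memory.add("User prefers concise answers.", user_id="demo")
print(memory.search("What are the user's preferences?", user_id="demo"))
```

The server's own code may structure this differently; the point is that every variable in the file above has a natural slot in such a config, with `LLM_BASE_URL` doing the work of switching between OpenAI, OpenRouter, and Ollama.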

MCP directory API

We provide all the information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/coleam00/mcp-mem0'
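The same lookup from Python, using only the standard library; the response format isn't documented on this page, so the snippet simply assumes JSON and pretty-prints whatever comes back:

```python
# Fetch this server's entry from the Glama MCP directory API
# (equivalent to the curl command above).
import json
import urllib.request

url = "https://glama.ai/api/mcp/v1/servers/coleam00/mcp-mem0"
with urllib.request.urlopen(url) as resp:
    data = json.load(resp)  # assumption: the endpoint returns JSON
print(json.dumps(data, indent=2))
```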

If you have feedback or need assistance with the MCP directory API, please join our Discord server.