
MCP Memory LibSQL Go

.env.example

## .env.example - example environment variables for docker-compose / local runs

# Core
LIBSQL_URL=file:/data/libsql.db
LIBSQL_AUTH_TOKEN=

# Server
MODE=single # single | multi | voyageai
PORT=8080
METRICS_PORT=9090
TRANSPORT=sse
SSE_ENDPOINT=/sse

# Embeddings (choose one provider)
EMBEDDINGS_PROVIDER=
EMBEDDING_DIMS=1536

# OpenAI
OPENAI_API_KEY=
OPENAI_EMBEDDINGS_MODEL=text-embedding-3-small

# Ollama
OLLAMA_HOST=http://ollama:11434
OLLAMA_EMBEDDINGS_MODEL=nomic-embed-text

# LocalAI
LOCALAI_BASE_URL=http://localai:8080/v1
LOCALAI_EMBEDDINGS_MODEL=text-embedding-ada-002

# VoyageAI
VOYAGEAI_API_KEY=
VOYAGEAI_EMBEDDINGS_MODEL=voyage-3-lite

# Multi-project
PROJECTS_DIR=/data/projects
MULTI_PROJECT_AUTH_REQUIRED=true
MULTI_PROJECT_AUTO_INIT_TOKEN=false
MULTI_PROJECT_DEFAULT_TOKEN=

# Metrics
METRICS_PROMETHEUS=true

MCP directory API

We provide all information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/ZanzyTHEbar/mcp-memory-libsql-go'
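The same request can be made from Go. A small sketch, assuming only that the endpoint follows the `servers/{owner}/{name}` path shown in the curl example; `serverInfoURL` is a hypothetical helper introduced here for illustration:

```go
package main

import "fmt"

// serverInfoURL builds the MCP directory API endpoint for a server,
// following the servers/{owner}/{name} pattern from the curl example.
func serverInfoURL(owner, name string) string {
	return fmt.Sprintf("https://glama.ai/api/mcp/v1/servers/%s/%s", owner, name)
}

func main() {
	u := serverInfoURL("ZanzyTHEbar", "mcp-memory-libsql-go")
	fmt.Println(u)
	// A subsequent http.Get(u) would fetch the server metadata as JSON.
}
```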

If you have feedback or need assistance with the MCP directory API, please join our Discord server.