MemOS-MCP

by qinshu1109
Apache 2.0
tree_config.json (801 B)

```json
{
  "extractor_llm": {
    "backend": "ollama",
    "config": {
      "model_name_or_path": "qwen3:0.6b",
      "temperature": 0.0,
      "remove_think_prefix": true,
      "max_tokens": 8192
    }
  },
  "dispatcher_llm": {
    "backend": "ollama",
    "config": {
      "model_name_or_path": "qwen3:0.6b",
      "temperature": 0.0,
      "remove_think_prefix": true,
      "max_tokens": 8192
    }
  },
  "embedder": {
    "backend": "ollama",
    "config": {
      "model_name_or_path": "nomic-embed-text:latest"
    }
  },
  "graph_db": {
    "backend": "neo4j",
    "config": {
      "uri": "bolt://localhost:7687",
      "user": "neo4j",
      "password": "12345678",
      "db_name": "user08alice",
      "auto_create": true,
      "embedding_dimension": 768
    }
  },
  "reorganize": false
}
```
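Before handing this file to MemOS, it can be parsed and sanity-checked in a few lines of Python. This is an illustrative sketch, not part of MemOS itself; the keys are exactly those in the JSON above, and the 768-dimension check reflects that `nomic-embed-text` emits 768-dimensional vectors, which must match the Neo4j index dimension.

```python
import json

# tree_config.json from above, embedded so the example is self-contained.
CONFIG_TEXT = '''
{ "extractor_llm": { "backend": "ollama", "config": { "model_name_or_path": "qwen3:0.6b",
  "temperature": 0.0, "remove_think_prefix": true, "max_tokens": 8192 } },
  "dispatcher_llm": { "backend": "ollama", "config": { "model_name_or_path": "qwen3:0.6b",
  "temperature": 0.0, "remove_think_prefix": true, "max_tokens": 8192 } },
  "embedder": { "backend": "ollama", "config": { "model_name_or_path": "nomic-embed-text:latest" } },
  "graph_db": { "backend": "neo4j", "config": { "uri": "bolt://localhost:7687", "user": "neo4j",
  "password": "12345678", "db_name": "user08alice", "auto_create": true,
  "embedding_dimension": 768 } },
  "reorganize": false }
'''

config = json.loads(CONFIG_TEXT)

# Both LLM roles (extraction and dispatch) run on a local Ollama backend.
for role in ("extractor_llm", "dispatcher_llm"):
    assert config[role]["backend"] == "ollama", f"{role}: unexpected backend"

# The Neo4j index dimension must match the embedder's output dimension
# (768 for nomic-embed-text), or similarity search will fail.
assert config["graph_db"]["config"]["embedding_dimension"] == 768

print("config OK:", ", ".join(sorted(config)))
```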

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/qinshu1109/memos-MCP'
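The same request can be issued from Python. The endpoint is the one shown in the curl command above; the shape of the JSON response is not documented here, so this sketch simply decodes and pretty-prints whatever comes back.

```python
import json
import urllib.request

# Endpoint from the curl example above.
URL = "https://glama.ai/api/mcp/v1/servers/qinshu1109/memos-MCP"

def fetch_server_info(url: str = URL) -> dict:
    """GET the MCP directory entry and decode the JSON body."""
    req = urllib.request.Request(url, headers={"Accept": "application/json"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.load(resp)

if __name__ == "__main__":
    print(json.dumps(fetch_server_info(), indent=2))
```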

If you have feedback or need assistance with the MCP directory API, please join our Discord server.