# mnemos
by s60yucca

## Server Configuration

Describes the environment variables used to configure the server (all are optional).
| Name | Required | Description | Default |
|---|---|---|---|
| MNEMOS_LOG_LEVEL | No | The logging level for the server. Possible values: debug, info, warn, error. | info |
| MNEMOS_PROJECT_ID | No | Scope memories to a specific project. This allows mnemos to separate knowledge across different coding projects. | |
| MNEMOS_EMBEDDINGS_API_KEY | No | The API key required if using a cloud-based embedding provider like 'openai'. | |
| MNEMOS_EMBEDDINGS_PROVIDER | No | The embedding provider used for semantic search. Options are 'noop' (text-only search), 'ollama' (local), or 'openai'. | noop |
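A minimal configuration sketch for the variables above, assuming a POSIX shell; the values shown are illustrative examples, not recommended defaults:

```shell
# Illustrative mnemos environment configuration (values are examples, not defaults)
export MNEMOS_LOG_LEVEL=debug             # verbose logging while testing
export MNEMOS_PROJECT_ID=my-project       # hypothetical project scope
export MNEMOS_EMBEDDINGS_PROVIDER=ollama  # local embeddings via Ollama
# MNEMOS_EMBEDDINGS_API_KEY is only needed for cloud providers such as 'openai'
```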
## Capabilities

Features and capabilities supported by this server.
| Capability | Details |
|---|---|
| tools | `{"listChanged": true}` |
| prompts | `{"listChanged": true}` |
| resources | `{"subscribe": true, "listChanged": true}` |
## Tools

Functions exposed to the LLM to take actions.
| Name | Description |
|---|---|
| mnemos_context | Assemble relevant context for a query within a token budget |
| mnemos_delete | Soft-delete a memory |
| mnemos_get | Get a memory by ID |
| mnemos_maintain | Run decay, archival, and GC maintenance |
| mnemos_relate | Create a relation between two memories |
| mnemos_search | Search memories using hybrid text+semantic search |
| mnemos_store | Store a new memory in Mnemos |
| mnemos_update | Update a memory (PATCH semantics) |
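MCP tools are invoked over JSON-RPC with a `tools/call` request. The sketch below shows the shape of such a request for `mnemos_search`; the argument names (`query`, `limit`) are assumptions for illustration, since the tool's actual input schema is defined by the server.

```shell
# Sketch of the JSON-RPC request an MCP client would send to invoke mnemos_search.
# The "arguments" keys are hypothetical; consult the tool's input schema.
REQ='{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"mnemos_search","arguments":{"query":"auth flow decisions","limit":5}}}'
printf '%s\n' "$REQ"
```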
## Prompts

Interactive templates invoked by user choice.
| Name | Description |
|---|---|
| load_context | Load relevant context at session start |
| save_session | Save important learnings at session end |
## Resources

Contextual data attached and managed by the client.
| Name | Description |
|---|---|
| Storage statistics | Overall storage statistics |
## MCP directory API

We provide all the information about MCP servers via our MCP API.
```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/s60yucca/mnemos'
```
If you have feedback or need assistance with the MCP directory API, please join our Discord server.