widemem-ai
by remete618
Server Configuration
Environment variables used to configure the server. None are required; defaults are shown where applicable.
| Name | Required | Description | Default |
|---|---|---|---|
| QDRANT_URL | No | URL for remote Qdrant vector store | |
| OPENAI_API_KEY | No | API key for OpenAI LLM provider | |
| ANTHROPIC_API_KEY | No | API key for Anthropic LLM provider | |
| WIDEMEM_DATA_PATH | No | Storage directory for memory data | ~/.widemem/data |
| WIDEMEM_LLM_MODEL | No | LLM model name | llama3.2 |
| WIDEMEM_LLM_BASE_URL | No | LLM API base URL | http://localhost:11434 |
| WIDEMEM_LLM_PROVIDER | No | LLM provider (e.g., 'openai', 'anthropic', 'ollama') | ollama |
| WIDEMEM_EMBEDDING_PROVIDER | No | Embedding provider (e.g., 'openai', 'sentence-transformers') | sentence-transformers |
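For example, here is a minimal sketch of pointing the server at OpenAI instead of the local Ollama defaults. The variable names come from the table above; the model name and the `widemem-ai` launch command are assumptions and may differ for your installation.

```bash
# Sketch: use OpenAI for both the LLM and the embeddings.
# Only the variable names are documented above; the model name and the
# launch command are assumptions.
export WIDEMEM_LLM_PROVIDER=openai
export WIDEMEM_LLM_MODEL=gpt-4o-mini            # assumed model name
export WIDEMEM_EMBEDDING_PROVIDER=openai
export OPENAI_API_KEY=sk-your-key-here          # your OpenAI API key
export WIDEMEM_DATA_PATH="$HOME/.widemem/data"  # optional; matches the default
widemem-ai                                      # assumed entry point
```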
Capabilities
Server capabilities have not been inspected yet.
Tools
Functions exposed to the LLM to take actions
| Name | Description |
|---|---|
| No tools | |
Prompts
Interactive templates invoked by user choice
| Name | Description |
|---|---|
| No prompts | |
Resources
Contextual data attached and managed by the client
| Name | Description |
|---|---|
| No resources | |
MCP directory API
We provide all the information about MCP servers via our MCP API.
```bash
curl -X GET 'https://glama.ai/api/mcp/v1/servers/remete618/widemem-ai'
```
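Assuming the endpoint returns JSON (an assumption based on the API path rather than anything documented here), the response can be piped through `jq` for inspection:

```bash
# Assumes the API returns JSON; jq pretty-prints the full response.
curl -s 'https://glama.ai/api/mcp/v1/servers/remete618/widemem-ai' | jq .
```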
If you have feedback or need assistance with the MCP directory API, please join our Discord server.