# anythingllm-mcp

## Server Configuration

Describes the environment variables required to run the server.
| Name | Required | Description | Default |
|---|---|---|---|
| ANYTHING_LLM_BASE | No | Base URL for the AnythingLLM API | http://localhost:3001/api/v1 |
| ANYTHING_LLM_API_KEY | Yes | Your AnythingLLM API key | |
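The variables above can be exported in the shell before launching the server; a minimal sketch (the specific values are placeholders, and the base URL only needs to be set when it differs from the default):

```shell
# Required: an API key generated from your AnythingLLM instance
export ANYTHING_LLM_API_KEY="your-api-key-here"

# Optional: override the API base URL (defaults to http://localhost:3001/api/v1)
export ANYTHING_LLM_BASE="http://localhost:3001/api/v1"
```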
## Capabilities

Features and capabilities supported by this server.
| Capability | Details |
|---|---|
| tools | {} |
## Tools

Functions exposed to the LLM to take actions.
| Name | Description |
|---|---|
| check_token | Validate the current API token |
| generate_api_key | Generate a new API key (admin) |
| list_workspaces | List all workspaces |
| get_workspace | Get details of a specific workspace |
| create_workspace | Create a new workspace |
| update_workspace | Update workspace settings |
| delete_workspace | Delete a workspace |
| chat | Send a chat message to a workspace (mode: `chat` or `query`) |
| stream_chat | Stream a chat message to a workspace |
| upload_document | Upload a document to a workspace |
| update_embeddings | Add or remove documents from workspace embeddings |
| list_workspace_documents | List all documents in a workspace |
| list_threads | List all threads in a workspace |
| get_thread | Get details of a specific thread |
| delete_thread | Delete a thread from a workspace |
| get_system_env | Get system environment configuration |
| openai_chat_completion | OpenAI-compatible chat completion endpoint (use workspace as model) |
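For context on what the `chat` tool wraps, here is a hedged sketch of a direct call to AnythingLLM's workspace chat endpoint (the workspace slug `my-workspace` is a placeholder, and the exact request shape should be checked against your AnythingLLM version's API docs):

```shell
# Send a query-mode chat message to a workspace; requires a running
# AnythingLLM instance and the ANYTHING_LLM_API_KEY variable set.
curl -X POST "http://localhost:3001/api/v1/workspace/my-workspace/chat" \
  -H "Authorization: Bearer $ANYTHING_LLM_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"message": "Summarize the uploaded documents", "mode": "query"}'
```

`mode: "query"` restricts answers to embedded documents, while `mode: "chat"` allows general conversation.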
## Prompts

Interactive templates invoked by user choice.
| Name | Description |
|---|---|
| No prompts | |
## Resources

Contextual data attached and managed by the client.
| Name | Description |
|---|---|
| No resources | |
## MCP directory API

We provide all the information about MCP servers via our MCP API.

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/moliver28/anythingllm-mcp'
```

If you have feedback or need assistance with the MCP directory API, please join our Discord server.