# Server Configuration
Describes the environment variables used to configure the server. None are strictly required; which ones you set depends on the LLM provider you choose.
| Name | Required | Description | Default | 
|---|---|---|---|
| OPENAI_API_KEY | No | Your OpenAI API key for using GPT models | |
| MISTRAL_API_KEY | No | Your Mistral AI API key | |
| MCP_LLM_PROVIDER | No | The LLM provider to use (ollama, anthropic, openai, mistral, or custom) | |
| MCP_OLLAMA_MODEL | No | The Ollama model to use (e.g., llama2, mistral) | |
| ANTHROPIC_API_KEY | No | Your Anthropic API key for using Claude models | |
| MCP_CUSTOM_LLM_ENDPOINT | No | The endpoint URL for custom local models | |
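
Since none of the variables are strictly required, the server has to resolve a provider at runtime from `MCP_LLM_PROVIDER` and pick up the matching key, model, or endpoint. The sketch below illustrates that selection under stated assumptions: the `loadLlmConfig` helper, the fallback to `ollama`, and the `llama2` default model are illustrative and not part of the server's actual implementation.

```typescript
// Illustrative provider selection based on the variables above.
// Variable names match the configuration table; everything else is a sketch.

type Provider = "ollama" | "anthropic" | "openai" | "mistral" | "custom";

interface LlmConfig {
  provider: Provider;
  apiKey?: string;   // provider API key, when the provider requires one
  model?: string;    // Ollama model name, when provider is "ollama"
  endpoint?: string; // endpoint URL, when provider is "custom"
}

function loadLlmConfig(): LlmConfig {
  // Assumption: fall back to "ollama" when MCP_LLM_PROVIDER is unset.
  const provider = (process.env.MCP_LLM_PROVIDER ?? "ollama") as Provider;

  switch (provider) {
    case "openai":
      return { provider, apiKey: process.env.OPENAI_API_KEY };
    case "anthropic":
      return { provider, apiKey: process.env.ANTHROPIC_API_KEY };
    case "mistral":
      return { provider, apiKey: process.env.MISTRAL_API_KEY };
    case "custom":
      return { provider, endpoint: process.env.MCP_CUSTOM_LLM_ENDPOINT };
    default:
      // Assumption: "llama2" as a default Ollama model, taken from the table's example.
      return { provider: "ollama", model: process.env.MCP_OLLAMA_MODEL ?? "llama2" };
  }
}

console.log(loadLlmConfig());
```

For example, exporting `MCP_LLM_PROVIDER=anthropic` and `ANTHROPIC_API_KEY=...` before launching the server would route requests to Claude models under this scheme.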
## Schema

### Prompts
Interactive templates invoked by user choice
| Name | Description | 
|---|---|
| No prompts | |

### Resources
Contextual data attached and managed by the client
| Name | Description | 
|---|---|
| No resources | |

### Tools
Functions exposed to the LLM to take actions
| Name | Description | 
|---|---|
| No tools | |