## Server Configuration

The following environment variables configure the server. All are optional; each falls back to the default shown below.

| Name | Required | Description | Default |
|---|---|---|---|
| ENV | No | The environment mode (development/production) | development |
| PORT | No | The port on which the MCP server runs | 8000 |
| LOG_LEVEL | No | The logging level for the server | INFO |
| OLLAMA_MODEL | No | The Ollama model to use for LLM inference | llama3:latest |
| ALLOWED_ORIGINS | No | Allowed CORS origins for the server | * |
| OLLAMA_BASE_URL | No | The base URL of the Ollama API server | http://localhost:11434 |
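
For illustration, below is a minimal sketch of how these variables might be read at startup, assuming a Python server that loads its configuration with `os.getenv`. The variable names and defaults come from the table above; the `Settings` dataclass, its field names, and the `load_settings` helper are hypothetical and not taken from the actual codebase.

```python
# Illustrative sketch only: variable names and defaults match the table above;
# the Settings dataclass and load_settings helper are hypothetical.
import os
from dataclasses import dataclass


@dataclass
class Settings:
    env: str
    port: int
    log_level: str
    ollama_model: str
    allowed_origins: list[str]
    ollama_base_url: str


def load_settings() -> Settings:
    """Read each variable from the environment, falling back to the documented default."""
    return Settings(
        env=os.getenv("ENV", "development"),
        port=int(os.getenv("PORT", "8000")),
        log_level=os.getenv("LOG_LEVEL", "INFO"),
        ollama_model=os.getenv("OLLAMA_MODEL", "llama3:latest"),
        allowed_origins=os.getenv("ALLOWED_ORIGINS", "*").split(","),
        ollama_base_url=os.getenv("OLLAMA_BASE_URL", "http://localhost:11434"),
    )


if __name__ == "__main__":
    print(load_settings())
```

With a pattern like this, exporting a variable before launch (for example `PORT=9000`) overrides the corresponding default.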
## Schema

### Prompts

Interactive templates invoked by user choice

| Name | Description |
|---|---|
| No prompts | |
### Resources

Contextual data attached and managed by the client

| Name | Description |
|---|---|
| No resources | |
### Tools

Functions exposed to the LLM to take actions

| Name | Description |
|---|---|
| No tools | |