## Server Configuration

The server is configured via the following environment variables.
| Name | Required | Description | Default |
|---|---|---|---|
| MCP_HOST | No | The host address for the MCP server to bind to | localhost |
| MCP_PATH | No | The endpoint path for the MCP server | /mcp |
| MCP_PORT | No | The port number for the MCP server to listen on | 3000 |
| OLLAMA_HOST | No | The host URL where Ollama is running | http://localhost:11434 |
| OPENAI_API_KEY | No | OpenAI API key; required only when using OpenAI models, not needed with Ollama | |
| KEGG_MCP_SERVER_URL | No | The URL of the MCP server, used by the client for connection | http://localhost:3000/mcp |
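As a sketch, a local run against Ollama might export the variables from the table before starting the server. The values below are the documented defaults; the derived client URL simply recombines host, port, and path:

```shell
# Example environment for a local run with Ollama (values are the table defaults).
export MCP_HOST=localhost                   # bind address for the MCP server
export MCP_PORT=3000                        # listen port
export MCP_PATH=/mcp                        # endpoint path
export OLLAMA_HOST=http://localhost:11434   # local Ollama; OPENAI_API_KEY not needed
# URL the client uses to reach the server, assembled from the values above:
export KEGG_MCP_SERVER_URL="http://${MCP_HOST}:${MCP_PORT}${MCP_PATH}"

echo "$KEGG_MCP_SERVER_URL"   # prints http://localhost:3000/mcp
```

When using OpenAI models instead, `OPENAI_API_KEY` would also need to be exported.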
## Capabilities

Server capabilities have not been inspected yet.
### Tools

Functions exposed to the LLM to take actions.
| Name | Description |
|---|---|
| No tools | |
### Prompts

Interactive templates invoked by user choice.
| Name | Description |
|---|---|
| No prompts | |
### Resources

Contextual data attached and managed by the client.
| Name | Description |
|---|---|
| No resources | |