# MCP Code Expert System

by tomsiwik (Verified)

## Server Configuration

The environment variables used to configure the server. All are optional; unset variables fall back to the defaults shown.
| Name | Required | Description | Default |
| --- | --- | --- | --- |
| `OLLAMA_HOST` | No | URL of the Ollama server | `http://localhost:11434` |
| `OLLAMA_MODEL` | No | Ollama model to use for AI-powered code reviews | `llama3:8b` |
| `KNOWLEDGE_GRAPH_PATH` | No | Path to store the knowledge graph data | `data/knowledge_graph.json` |
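Assuming a POSIX shell, the variables in the table above could be set like this before launching the server. This is a minimal sketch; the values shown are simply the documented defaults, and the server's actual launch command is not specified here.

```shell
# Override the defaults from the table above before starting the server.
# All three variables are optional; if unset, the server uses the defaults shown.
export OLLAMA_HOST="http://localhost:11434"              # URL of the Ollama server
export OLLAMA_MODEL="llama3:8b"                          # model used for AI-powered code reviews
export KNOWLEDGE_GRAPH_PATH="data/knowledge_graph.json"  # where knowledge graph data is stored
```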
## Schema

### Prompts

Interactive templates invoked by user choice.

No prompts.
### Resources

Contextual data attached and managed by the client.

No resources.
### Tools

Functions exposed to the LLM to take actions.

No tools.