# MCP LLM
An MCP server that provides access to LLMs using the LlamaIndexTS library.
## Features

This MCP server provides the following tools:

- `generate_code`: Generate code based on a description
- `generate_code_to_file`: Generate code and write it directly to a file at a specific line number
- `generate_documentation`: Generate documentation for code
- `ask_question`: Ask a question to the LLM
## Installation

Update your MCP config to add the `mcp-llm` server:
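A minimal sketch of such an entry, assuming the server is published on npm as `mcp-llm` and can be launched with `npx` (the server key `llm` and the npm package name are assumptions; point `command`/`args` at your local build instead if you installed from source):

```json
{
  "mcpServers": {
    "llm": {
      "command": "npx",
      "args": ["-y", "mcp-llm"],
      "env": {
        "LLM_MODEL_PROVIDER": "ollama",
        "LLM_MODEL_NAME": "qwen2-32b:q6_k",
        "LLM_BASE_URL": "https://ollama.internal"
      }
    }
  }
}
```

The `env` values above are the same illustrative examples used in the Configuration section below.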
## Available Scripts

- `npm run build` - Build the project
- `npm run watch` - Watch for changes and rebuild
- `npm start` - Start the MCP server
- `npm run example` - Run the example script
- `npm run inspector` - Run the MCP inspector
## Configuration

The MCP server is configurable using environment variables:

### Required Environment Variables

- `LLM_MODEL_NAME`: The name of the model to use (e.g., `qwen2-32b:q6_k`, `anthropic.claude-3-7-sonnet-20250219-v1:0`)
- `LLM_MODEL_PROVIDER`: The model provider (e.g., `bedrock`, `ollama`, `openai`, `openai-compatible`)
### Optional Environment Variables

- `LLM_BASE_URL`: Base URL for the model provider (e.g., `https://ollama.internal`, `http://my-openai-compatible-server.com:3000/v1`)
- `LLM_TEMPERATURE`: Temperature parameter for the model (e.g., `0.2`)
- `LLM_NUM_CTX`: Context window size (e.g., `16384`)
- `LLM_TOP_P`: Top-p parameter for the model (e.g., `0.85`)
- `LLM_TOP_K`: Top-k parameter for the model (e.g., `40`)
- `LLM_MIN_P`: Min-p parameter for the model (e.g., `0.05`)
- `LLM_REPETITION_PENALTY`: Repetition penalty parameter for the model (e.g., `1.05`)
- `LLM_SYSTEM_PROMPT_GENERATE_CODE`: System prompt for the `generate_code` tool
- `LLM_SYSTEM_PROMPT_GENERATE_DOCUMENTATION`: System prompt for the `generate_documentation` tool
- `LLM_SYSTEM_PROMPT_ASK_QUESTION`: System prompt for the `ask_question` tool
- `LLM_TIMEOUT_S`: Timeout in seconds for LLM requests (e.g., `240` for 4 minutes)
- `LLM_ALLOW_FILE_WRITE`: Set to `true` to allow the `generate_code_to_file` tool to write to files (default: `false`)
- `OPENAI_API_KEY`: API key for OpenAI (required when using the OpenAI provider)
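As an illustration, a local run against an Ollama backend might look like this (all values are the examples from this section, not defaults):

```sh
# Illustrative values taken from the examples above; adjust to your environment
export LLM_MODEL_PROVIDER=ollama
export LLM_MODEL_NAME=qwen2-32b:q6_k
export LLM_BASE_URL=https://ollama.internal
export LLM_NUM_CTX=16384
export LLM_TEMPERATURE=0.2
export LLM_TIMEOUT_S=240
npm start
```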
## Manual Install From Source

- Clone the repository
- Install dependencies
- Build the project
- Update your MCP configuration
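The first three steps, in shell form (the repository URL below is a placeholder, not the project's actual location):

```sh
# Placeholder URL; substitute the actual repository
git clone https://github.com/<owner>/mcp-llm.git
cd mcp-llm
npm install
npm run build
```

Then point the `command` in your MCP config at the built entry point (the exact output path, e.g. `build/index.js`, depends on the project's build setup).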
## Using the Example Script

The repository includes an example script (`npm run example`) that demonstrates how to use the MCP server programmatically. This script starts the MCP server and sends requests to it using curl commands.
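As a rough sketch of the request shape, assuming the server is exposed over HTTP at a hypothetical `http://localhost:3000` endpoint, an MCP `tools/call` request could look like:

```sh
# Hypothetical endpoint and port; the example script sets up the actual transport
curl -s http://localhost:3000 \
  -H 'Content-Type: application/json' \
  -d '{
        "jsonrpc": "2.0",
        "id": 1,
        "method": "tools/call",
        "params": {
          "name": "ask_question",
          "arguments": { "question": "What is the LlamaIndexTS library?" }
        }
      }'
```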
## Examples

### Generate Code
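A hedged sketch of a `tools/call` payload (the `description` argument name is an assumption inferred from the tool's summary; check the actual schema with `npm run inspector`):

```json
{
  "name": "generate_code",
  "arguments": {
    "description": "A TypeScript function that debounces another function"
  }
}
```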
### Generate Code to File

The `generate_code_to_file` tool supports both relative and absolute file paths. If a relative path is provided, it is resolved relative to the current working directory of the MCP server.
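A hedged sketch of the arguments (the `description`, `filePath`, and `lineNumber` names are assumptions):

```json
{
  "name": "generate_code_to_file",
  "arguments": {
    "description": "A TypeScript function that validates email addresses",
    "filePath": "src/utils/validateEmail.ts",
    "lineNumber": 1
  }
}
```

Remember that `LLM_ALLOW_FILE_WRITE=true` must be set for this tool to write anything.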
### Generate Documentation
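A hedged sketch (the `code` argument name is an assumption):

```json
{
  "name": "generate_documentation",
  "arguments": {
    "code": "export function add(a: number, b: number): number { return a + b; }"
  }
}
```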
### Ask Question
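A hedged sketch (the `question` argument name is an assumption):

```json
{
  "name": "ask_question",
  "arguments": {
    "question": "What are the trade-offs between a larger context window and response latency?"
  }
}
```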
## License