The mcp-llm server provides tools to interact with LLMs for code generation, documentation, and answering questions:
- Generate Code: Create code based on a description in a specified programming language, with optional context.
- Generate Code to File: Generate code and insert it directly into a file at a specific line number, optionally replacing existing lines.
- Generate Documentation: Automatically generate documentation for provided code in a specified format.
- Ask Question: Pose general questions to the LLM with optional context for clarification.
MCP LLM
An MCP server that provides access to LLMs using the LlamaIndexTS library.
Features
This MCP server provides the following tools:
- generate_code: Generate code based on a description
- generate_code_to_file: Generate code and write it directly to a file at a specific line number
- generate_documentation: Generate documentation for code
- ask_question: Ask a question to the LLM
Installation
Installing via Smithery
To install LLM Server for Claude Desktop automatically via Smithery:
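The usual Smithery CLI invocation is sketched below; the package identifier is a placeholder assumption, so verify the exact name on the Smithery listing:

```bash
# Install the server into Claude Desktop via the Smithery CLI
# (the package identifier is assumed; check the Smithery listing for the exact name)
npx -y @smithery/cli install mcp-llm --client claude
```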
Manual Install From Source
- Clone the repository
- Install dependencies:
- Build the project:
- Update your MCP configuration (a sample configuration is sketched after this list)
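The shell commands below are a minimal sketch of the from-source install, assuming a standard Node.js/TypeScript project layout; the repository URL and script names are assumptions rather than confirmed values:

```bash
# Clone the repository (URL is an assumption; substitute the actual location)
git clone https://github.com/sammcj/mcp-llm.git
cd mcp-llm

# Install dependencies
npm install

# Build the project (assumes a standard "build" script in package.json)
npm run build
```

Your MCP client configuration then needs an entry pointing at the built server. A sketch, assuming the build output is dist/index.js and that provider credentials are passed through an environment variable (the key name is illustrative):

```json
{
  "mcpServers": {
    "llm": {
      "command": "node",
      "args": ["/path/to/mcp-llm/dist/index.js"],
      "env": {
        "OPENAI_API_KEY": "your-api-key"
      }
    }
  }
}
```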
Using the Example Script
The repository includes an example script that demonstrates how to use the MCP server programmatically:
This script starts the MCP server and sends requests to it using curl commands.
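A request of that shape might look like the following sketch, assuming the server is reachable over HTTP on a local port; the port, endpoint path, and tool argument names are assumptions:

```bash
# Example request shape for calling the generate_code tool over JSON-RPC
# (port, endpoint path, and argument names are assumptions)
curl -s http://localhost:3000/mcp \
  -H "Content-Type: application/json" \
  -d '{
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
      "name": "generate_code",
      "arguments": {
        "description": "A function that reverses a string",
        "language": "TypeScript"
      }
    }
  }'
```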
Examples
Generate Code
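A hedged sketch of the arguments a generate_code call might take; the parameter names (description, language, context) are assumptions:

```json
{
  "description": "Create a function that validates email addresses with a regular expression",
  "language": "TypeScript",
  "context": "The project uses ES modules and strict TypeScript settings."
}
```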
Generate Code to File
The generate_code_to_file tool supports both relative and absolute file paths. If a relative path is provided, it will be resolved relative to the current working directory of the MCP server.
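A sketch of a generate_code_to_file call follows; the parameter names (filePath, lineNumber, replaceLines) are assumptions chosen to illustrate the line-insertion behaviour described above:

```json
{
  "description": "Add a helper that formats a Date as YYYY-MM-DD",
  "language": "TypeScript",
  "filePath": "./src/utils.ts",
  "lineNumber": 10,
  "replaceLines": 0
}
```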
Generate Documentation
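Again as a sketch with assumed parameter names (code, language, format), a generate_documentation call might look like:

```json
{
  "code": "export function add(a: number, b: number): number { return a + b; }",
  "language": "TypeScript",
  "format": "JSDoc"
}
```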
Ask Question
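A minimal sketch of an ask_question call, assuming a question parameter and an optional context parameter:

```json
{
  "question": "What are the trade-offs between REST and GraphQL for a small internal API?",
  "context": "The team already maintains several Express-based REST services."
}
```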
License
Related MCP Servers
- An MCP server that connects any MCP client (like Claude or Cursor) with the browser using browser-use, allowing clients to utilize existing LLMs without requiring additional API keys. (Python, Apache 2.0)
- An MCP server that provides tools to load and fetch documentation from any llms.txt source, giving users full control over context retrieval for LLMs in IDE agents and applications. (Python, MIT License)
- A Filesystem MCP server that allows an LLM to read and list files from a specified directory on your local machine through the Model Context Protocol. (Python)
- An MCP server that fetches real-time documentation for popular libraries like Langchain, Llama-Index, MCP, and OpenAI, allowing LLMs to access updated library information beyond their knowledge cut-off dates. (Python)