# LibreModel MCP Server 🤖
A Model Context Protocol (MCP) server that bridges Claude Desktop with your local LLM instance running via llama-server.
## Features

- 💬 Full conversation support with LibreModel through Claude Desktop
- 🎛️ Complete parameter control (temperature, max_tokens, top_p, top_k)
- ✅ Health monitoring and server status checks
- 🧪 Built-in testing tools for different capabilities
- 📊 Performance metrics and token usage tracking
- 🔧 Easy configuration via environment variables
## Quick Start
### 1. Install Dependencies
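From the project root (assuming the standard npm layout with a `package.json`):

```bash
npm install
```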
### 2. Build the Server
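This uses the same build script listed under Development below:

```bash
npm run build
```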
### 3. Start Your LibreModel
Make sure llama-server is running with your model:
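For example (the model path is a placeholder; llama-server listens on port 8080 by default):

```bash
llama-server -m /path/to/your-model.gguf --port 8080
```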
### 4. Configure Claude Desktop
Add this to your Claude Desktop configuration (`~/.config/claude/claude_desktop_config.json`):
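A typical entry looks like the sketch below; the server name `libremodel` is a placeholder, and `args` must point at your built `index.js` with an absolute path:

```json
{
  "mcpServers": {
    "libremodel": {
      "command": "node",
      "args": ["/absolute/path/to/index.js"]
    }
  }
}
```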
### 5. Restart Claude Desktop
Claude will now have access to LibreModel through MCP!
## Usage
Once configured, you can use these tools in Claude Desktop:
- 💬 `chat` - Main conversation tool
- 🧪 `quick_test` - Test LibreModel capabilities
- 🏥 `health_check` - Monitor server status
## Configuration
Set environment variables to customize behavior:
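For example (the variable name below is hypothetical; check the server source for the names it actually reads):

```bash
# Hypothetical variable name - check index.js for the one the server reads.
export LLAMA_SERVER_URL="http://localhost:8080"
```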
## Available Tools
| Tool | Description | Parameters |
|------|-------------|------------|
| `chat` | Converse with LibreModel | `message`, `temperature`, `max_tokens`, `top_p`, `top_k`, `system_prompt` |
| `quick_test` | Run predefined capability tests | test name (`hello`/`math`/`creative`/`knowledge`) |
| `health_check` | Check server health and status | None |
## Resources

- **Configuration**: View current server settings
- **Instructions**: Detailed usage guide and setup instructions
## Development

```bash
# Development mode (auto-rebuild)
npm run dev

# Build for production
npm run build

# Start the server directly
npm start
```
## Architecture

```
Claude Desktop ←→ LibreModel MCP Server ←→ llama-server API ←→ Local Model
```

The MCP server acts as a bridge, translating MCP protocol messages into llama-server API calls and formatting responses for Claude Desktop.
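As a rough illustration, the core of that translation can be a single HTTP call. The sketch below assumes llama-server's `/completion` endpoint and a hypothetical `LLAMA_SERVER_URL` environment variable; the real handler in `index.js` may differ:

```typescript
// Minimal sketch of the bridge's chat path (not the actual implementation).
// It forwards MCP tool arguments to llama-server's /completion endpoint,
// whose field names follow llama.cpp's HTTP server (n_predict = max tokens).

const LLAMA_URL = process.env.LLAMA_SERVER_URL ?? "http://localhost:8080";

interface ChatArgs {
  message: string;
  temperature?: number;
  max_tokens?: number;
  top_p?: number;
  top_k?: number;
}

async function chat(args: ChatArgs): Promise<string> {
  const res = await fetch(`${LLAMA_URL}/completion`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      prompt: args.message,
      temperature: args.temperature ?? 0.7,
      n_predict: args.max_tokens ?? 512, // llama-server's name for max_tokens
      top_p: args.top_p ?? 0.9,
      top_k: args.top_k ?? 40,
    }),
  });
  if (!res.ok) throw new Error(`llama-server returned HTTP ${res.status}`);
  const data = (await res.json()) as { content: string };
  return data.content; // wrapped into an MCP tool result by the server
}
```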
## Troubleshooting

**"Cannot reach llama-server"**

- Ensure llama-server is running on the configured port
- Check that the model is loaded and responding
- Verify firewall/network settings

**"Tool not found in Claude Desktop"**

- Restart Claude Desktop after configuration changes
- Check that the path to `index.js` is correct and absolute
- Verify the MCP server builds without errors

**Poor response quality**

- Adjust temperature and sampling parameters
- Try different system prompts
## License
CC0-1.0 - Public Domain. Use freely!
Built with ❤️ for open-source AI and the LibreModel project, by Claude Sonnet 4.