# LibreModel MCP Server 🤖
A Model Context Protocol (MCP) server that bridges Claude Desktop with your local LLM instance running via llama-server.
## Features
- 💬 Full conversation support with LibreModel through Claude Desktop
- 🎛️ Complete parameter control (temperature, max_tokens, top_p, top_k)
- ✅ Health monitoring and server status checks
- 🧪 Built-in testing tools for different capabilities
- 📊 Performance metrics and token usage tracking
- 🔧 Easy configuration via environment variables
Related MCP server: Claude-LMStudio Bridge
## Quick Start
### 1. Install Dependencies
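The command block for this step did not survive formatting; for a Node.js MCP server like this one it is typically:

```shell
npm install
```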
### 2. Build the Server
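The project's Development commands include a production build script, so this step is:

```shell
npm run build
```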
### 3. Start Your LibreModel
Make sure llama-server is running with your model:
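For example (the model path and port below are placeholders; point them at your own model file, and use the same port the MCP server is configured to reach):

```shell
llama-server -m ./models/your-model.gguf --port 8080
```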
### 4. Configure Claude Desktop
Add this to your Claude Desktop configuration (`~/.config/claude/claude_desktop_config.json`):
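A typical entry looks like the following; the `libremodel` key and the path are placeholders to adapt to your checkout (the path to `index.js` must be absolute):

```json
{
  "mcpServers": {
    "libremodel": {
      "command": "node",
      "args": ["/absolute/path/to/libremodel-mcp/build/index.js"]
    }
  }
}
```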
### 5. Restart Claude Desktop
Claude will now have access to LibreModel through MCP!
## Usage
Once configured, you can use these tools in Claude Desktop:
- 💬 `chat` - Main conversation tool
- 🧪 `quick_test` - Test LibreModel capabilities
- 🏥 `health_check` - Monitor server status
## Configuration
Set environment variables to customize behavior:
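The exact variable names were not preserved here; the sketch below uses hypothetical names for illustration only — check the server source for the variables it actually reads:

```shell
# Hypothetical variable names - confirm against index.js
export LLAMA_SERVER_URL="http://localhost:8080"   # where llama-server listens
export LLAMA_TIMEOUT_MS="30000"                   # request timeout
```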
## Available Tools

| Tool | Description | Parameters |
|------|-------------|------------|
| `chat` | Converse with LibreModel | `message`, `temperature`, `max_tokens`, `top_p`, `top_k`, `system_prompt` |
| `quick_test` | Run predefined capability tests | `test_type` (hello/math/creative/knowledge) |
| `health_check` | Check server health and status | None |
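Under the hood, a `chat` call with these parameters is translated into a llama-server completion request. A minimal sketch of that translation — the defaults and the exact field mapping here are assumptions for illustration, not the project's actual code:

```typescript
// Sketch: turn MCP `chat` tool arguments into a llama-server
// /completion request body. Defaults are assumed, not authoritative.
interface ChatParams {
  message: string;
  temperature?: number;
  max_tokens?: number;
  top_p?: number;
  top_k?: number;
  system_prompt?: string;
}

function buildCompletionRequest(params: ChatParams) {
  // Prepend the system prompt, if any, to the user message.
  const prompt = params.system_prompt
    ? `${params.system_prompt}\n\n${params.message}`
    : params.message;
  return {
    prompt,
    temperature: params.temperature ?? 0.7,
    n_predict: params.max_tokens ?? 512, // llama-server calls this n_predict
    top_p: params.top_p ?? 0.9,
    top_k: params.top_k ?? 40,
  };
}

const body = buildCompletionRequest({ message: "Hello!", temperature: 0.5 });
console.log(body.n_predict); // 512 (default applied)
```

The actual bridge would POST this body to llama-server's `/completion` endpoint and reformat the response for Claude Desktop.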
## Resources

- Configuration: View current server settings
- Instructions: Detailed usage guide and setup instructions
## Development
```shell
# Development mode (auto-rebuild)
npm run dev

# Build for production
npm run build

# Start the server directly
npm start
```

## Architecture

```
Claude Desktop ←→ LibreModel MCP Server ←→ llama-server API ←→ Local Model
```

The MCP server acts as a bridge, translating MCP protocol messages into llama-server API calls and formatting responses for Claude Desktop.

## Troubleshooting

### "Cannot reach llama server"

- Ensure llama-server is running on the configured port
- Check that the model is loaded and responding
- Verify firewall/network settings

### "Tool not found in Claude Desktop"

- Restart Claude Desktop after configuration changes
- Check that the path to `index.js` is correct and absolute
- Verify the MCP server builds without errors

### Poor response quality

- Adjust temperature and sampling parameters
- Try different system prompts

## License

CC0-1.0 - Public Domain. Use freely!

Built with ❤️ for open-source AI and the LibreModel project, by Claude Sonnet 4.