Integrations
This project connects a local LLM (e.g. Qwen) to tools such as a calculator or a knowledge base via the MCP protocol. The assistant automatically detects and calls these tools to help answer user queries.
📦 Features
- 🔧 Tool execution through an MCP server
- 🧠 Local LLM integration via HTTP or the OpenAI SDK
- 📚 Knowledge base support (`data.json`)
- ⚡ Supports `stdio` and `sse` transports
🗂 Project Files

| File | Description |
|---|---|
| `server.py` | Registers tools and starts the MCP server |
| `client-http.py` | Uses `aiohttp` to communicate with the local LLM |
| `clientopenai.py` | Uses the OpenAI-compatible SDK for LLM + tool-call logic |
| `client-stdio.py` | MCP client using the `stdio` transport |
| `client-see.py` | MCP client using the SSE transport |
| `data.json` | Q&A knowledge base |
📥 Installation

Requirements: Python 3.8+

Install dependencies with `pip install -r requirements.txt`.
🚀 Getting Started

1. Run the MCP server

Start the tool server with `python server.py`. This launches your tool server with functions such as `add`, `multiply`, and `get_knowledge_base`.
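The tool logic itself is ordinary Python. Below is a minimal sketch of the three functions the server exposes; only the names come from this README, so the function bodies and the knowledge-base schema are assumptions:

```python
import json

def add(a: float, b: float) -> float:
    """Add two numbers."""
    return a + b

def multiply(a: float, b: float) -> float:
    """Multiply two numbers."""
    return a * b

def get_knowledge_base(path: str = "data.json") -> list:
    """Load the Q&A knowledge base from a JSON file."""
    with open(path, encoding="utf-8") as f:
        return json.load(f)
```

In `server.py`, these functions are additionally registered with the MCP server so that clients can discover and call them by name.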
2. Start a client

- Option A (HTTP client, local LLM via raw API): `python client-http.py`
- Option B (OpenAI SDK client): `python clientopenai.py`
- Option C (stdio transport): `python client-stdio.py`
- Option D (SSE transport): make sure `server.py` selects the `sse` transport, then run `python client-see.py`
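If `server.py` is built on the `FastMCP` class from the official `mcp` Python SDK (an assumption; this README does not name the framework), selecting SSE is a one-argument change:

```python
# server.py (fragment) -- assumes the FastMCP class from the official `mcp` SDK
mcp.run(transport="sse")  # instead of the default transport="stdio"
```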
💬 Example Prompts

Math Tool Call: ask the assistant an arithmetic question and it calls the `add` or `multiply` tool, then returns the computed result.

Knowledge Base Question: the response will include the relevant answer from `data.json`.
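Under the hood, when the model's reply contains a tool call, the client must execute it and feed the result back to the model. A minimal dispatch sketch; the registry and helper below are hypothetical, not taken from the repo:

```python
import json

# Hypothetical local mirror of the server's tools, keyed by tool name.
TOOLS = {
    "add": lambda a, b: a + b,
    "multiply": lambda a, b: a * b,
}

def execute_tool_call(name: str, arguments: str) -> str:
    """Run one tool call as an OpenAI-compatible API reports it:
    a tool name plus JSON-encoded arguments."""
    args = json.loads(arguments)
    return str(TOOLS[name](**args))
```

For example, `execute_tool_call("add", '{"a": 2, "b": 3}')` returns `"5"`, which the client appends to the conversation as a tool message before asking the model for its final answer.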
📁 Example: data.json
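The README does not show the file itself, so this schema is an assumption: a list of question/answer pairs that `get_knowledge_base` can return verbatim:

```json
[
  {
    "question": "What is MCP?",
    "answer": "The Model Context Protocol, an open standard for connecting LLMs to tools."
  },
  {
    "question": "Which transports are supported?",
    "answer": "stdio and SSE."
  }
]
```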
🔧 Configuration

In `client-http.py` or `clientopenai.py`, update the LLM endpoint settings (base URL and model name). Make sure your LLM is serving OpenAI-compatible API endpoints.
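The exact variable names are not shown in this README, so the ones below are illustrative; the values assume a local OpenAI-compatible server such as Ollama or vLLM:

```python
# Hypothetical settings near the top of client-http.py / clientopenai.py.
BASE_URL = "http://localhost:11434/v1"  # local OpenAI-compatible endpoint (Ollama's default port)
MODEL = "qwen2.5:7b"                    # model name as the local server knows it
API_KEY = "not-needed"                  # most local servers ignore the key, but SDKs require one
```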
🧹 Cleanup

Clients handle tool calls and responses automatically. Stop the server or client with `Ctrl+C`.
🪪 License
MIT License. See LICENSE file.