This project connects a local LLM (e.g., Qwen) to tools such as a calculator or a knowledge base via the Model Context Protocol (MCP). The assistant automatically detects and calls these tools to help answer user queries.
## 📦 Features

- 🔧 Tool execution through an MCP server
- 🧠 Local LLM integration via HTTP or the OpenAI SDK
- 📚 Knowledge base support (`data.json`)
- ⚡ Supports `stdio` and `sse` transports

Related MCP server: MCP Documentation Server
## 🗂 Project Files

| File | Description |
| --- | --- |
| `server.py` | Registers tools and starts the MCP server |
| `client-http.py` | Uses raw HTTP requests to communicate with the local LLM |
| `clientopenai.py` | Uses the OpenAI-compatible SDK for LLM + tool-call logic |
| `client-stdio.py` | MCP client using the stdio transport |
| `client-sse.py` | MCP client using the SSE transport |
| `data.json` | Q&A knowledge base |
## 📥 Installation

### Requirements

- Python 3.8+

Install dependencies:

```bash
pip install -r requirements.txt
```
## 🚀 Getting Started

### 1. Run the MCP server
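A minimal launch command, assuming the server entry point is `server.py` (the file referenced throughout this README):

```bash
python server.py
```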
This launches your tool server, which exposes functions like `add`, `multiply`, and `get_knowledge_base`.
### 2. Start a client
#### Option A: HTTP client (local LLM via raw API)
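Run the HTTP client:

```bash
python client-http.py
```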
#### Option B: OpenAI SDK client
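Run the SDK-based client:

```bash
python clientopenai.py
```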
#### Option C: stdio transport
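The file name below is inferred from the project files table:

```bash
python client-stdio.py
```

With the stdio transport, the client typically spawns the server process itself, so you do not need to start the server separately.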
#### Option D: SSE transport
Make sure `server.py` sets:
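A sketch assuming the server is built with `FastMCP` from the MCP Python SDK (adapt if your server is built differently):

```python
# Select the SSE transport instead of the default stdio.
mcp.run(transport="sse")
```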
Then run:
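For example (the client file name is inferred from the project files table):

```bash
python server.py       # terminal 1: start the SSE server
python client-sse.py   # terminal 2: connect the SSE client
```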
## 💬 Example Prompts
### Math Tool Call
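An illustrative prompt (any arithmetic the `add` or `multiply` tools can handle):

```
What is 12 multiplied by 8?
```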
Response:
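A representative reply; exact wording depends on your model, which calls the `multiply` tool and reports its result:

```
12 multiplied by 8 is 96.
```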
### Knowledge Base Question
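A hypothetical prompt, assuming your `data.json` contains a matching entry:

```
What is MCP?
```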
The response will include the relevant answer from `data.json`.
## 📁 Example: data.json
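A minimal sketch of the expected shape; the field names here are assumptions, so match whatever `get_knowledge_base` actually reads:

```json
[
  {
    "question": "What is MCP?",
    "answer": "The Model Context Protocol, an open standard for connecting LLMs to tools and data."
  },
  {
    "question": "Which transports does this project support?",
    "answer": "stdio and SSE."
  }
]
```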
## 🔧 Configuration
Inside `client-http.py` or `clientopenai.py`, update the following:
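A sketch of the typical settings; the variable names and URL below are assumptions, so use whatever your scripts actually define:

```python
# Hypothetical names: adjust to match the variables in the client scripts.
LLM_API_URL = "http://localhost:8000/v1"  # base URL of your OpenAI-compatible endpoint
MODEL_NAME = "qwen"                       # model identifier your server exposes
```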
Make sure your LLM is serving OpenAI-compatible API endpoints.
## 🧹 Cleanup
Clients handle tool calls and responses automatically. You can stop the server or a client at any time with `Ctrl+C`.
## 🪪 License

MIT License. See the `LICENSE` file.