This project connects a local LLM (e.g., Qwen) to tools such as a calculator or a knowledge base via the Model Context Protocol (MCP). The assistant automatically detects and calls these tools to help answer user queries.
## 📦 Features

- 🔧 Tool execution through MCP server
- 🧠 Local LLM integration via HTTP or OpenAI SDK
- 📚 Knowledge base support (`data.json`)
- ⚡ Supports `stdio` and `sse` transports
Related MCP server: MCP Documentation Server
## 📁 Project Files

| File | Description |
|------|-------------|
| `server.py` | Registers tools and starts the MCP server |
| `client-http.py` | Uses raw HTTP requests to communicate with the local LLM |
| `clientopenai.py` | Uses the OpenAI-compatible SDK for LLM + tool-call logic |
| `client-stdio.py` | MCP client using stdio |
| `client-sse.py` | MCP client using SSE |
| `data.json` | Q&A knowledge base |
## 📥 Installation

### Requirements

- Python 3.8+

Install dependencies from `requirements.txt`:
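```bash
pip install -r requirements.txt
```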
## 🚀 Getting Started
### 1. Run the MCP server
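```bash
python server.py
```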
This launches your tool server with functions like `add`, `multiply`, and `get_knowledge_base`.
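To see the shape of such a server, here is a minimal sketch using the FastMCP API from the official `mcp` Python SDK; it illustrates how these tools could be registered and is not the project's exact file:

```python
# server.py (illustrative sketch, assuming the FastMCP API from the `mcp` SDK)
import json

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("demo-tools")

@mcp.tool()
def add(a: float, b: float) -> float:
    """Add two numbers."""
    return a + b

@mcp.tool()
def multiply(a: float, b: float) -> float:
    """Multiply two numbers."""
    return a * b

@mcp.tool()
def get_knowledge_base() -> str:
    """Return the Q&A knowledge base from data.json as a JSON string."""
    with open("data.json", "r", encoding="utf-8") as f:
        return json.dumps(json.load(f), ensure_ascii=False)

if __name__ == "__main__":
    mcp.run(transport="stdio")  # switch to transport="sse" for the SSE clients
```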
### 2. Start a client
#### Option A: HTTP client (local LLM via raw API)
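```bash
python client-http.py
```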
#### Option B: OpenAI SDK client
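```bash
python clientopenai.py
```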
#### Option C: stdio transport
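Assuming the stdio client file is named `client-stdio.py` (as listed in the project files table):

```bash
python client-stdio.py
```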
#### Option D: SSE transport
Make sure `server.py` sets the transport to SSE:
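```python
mcp.run(transport="sse")  # assuming the FastMCP API; selects the SSE transport
```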
Then run:
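```bash
python server.py
python client-sse.py  # SSE client file name as listed in the project files table
```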
## 💬 Example Prompts
### Math Tool Call
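An illustrative prompt (any arithmetic question that maps onto the `add` or `multiply` tools will do):

```
What is 12 + 7?
```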
Response:
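The model's exact wording will vary; a typical reply after the `add` tool call looks like:

```
The result of 12 + 7 is 19.
```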
### Knowledge Base Question
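For instance, with the sample `data.json` shown below:

```
What is MCP?
```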
The response will include the relevant answer from `data.json`.
## 📄 Example: data.json
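The shipped file may differ; one plausible shape is a flat list of question/answer pairs (the entries below are illustrative):

```json
[
  {
    "question": "What is MCP?",
    "answer": "MCP (Model Context Protocol) is an open protocol for connecting LLMs to external tools and data sources."
  },
  {
    "question": "Which transports does this project support?",
    "answer": "stdio and SSE."
  }
]
```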
## 🔧 Configuration
Inside `client-http.py` or `clientopenai.py`, update the following:
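The exact variable names depend on the client code; typical settings look like this (the names and values below are illustrative):

```python
# Illustrative configuration; adjust to the names actually used in the client files
BASE_URL = "http://localhost:8000/v1"  # your local LLM's OpenAI-compatible endpoint
MODEL_NAME = "qwen"                    # model identifier served by your runtime
```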
Make sure your LLM is serving OpenAI-compatible API endpoints.
## 🧹 Cleanup
Clients handle tool calls and responses automatically. You can stop the server or client using `Ctrl+C`.
## 🪪 License

MIT License. See the `LICENSE` file.