1. Click on "Install Server".
2. Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
3. In the chat, type `@` followed by the MCP server name and your instructions, e.g., "@LLM Tool-Calling Assistant calculate the monthly payment for a $250,000 mortgage at 4.5% interest over 30 years".
4. That's it! The server will respond to your query, and you can continue using it as needed.
Here is a step-by-step guide with screenshots.
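As a sanity check on the example prompt above, the mortgage payment can be computed directly with the standard amortization formula. A minimal sketch in plain Python (the `monthly_payment` helper is illustrative, not part of this project):

```python
def monthly_payment(principal: float, annual_rate: float, years: int) -> float:
    """Standard amortization formula: M = P*r / (1 - (1+r)**-n)."""
    r = annual_rate / 12          # monthly interest rate
    n = years * 12                # total number of monthly payments
    if r == 0:
        return principal / n      # zero-interest edge case
    return principal * r / (1 - (1 + r) ** -n)

# $250,000 mortgage at 4.5% interest over 30 years
print(round(monthly_payment(250_000, 0.045, 30), 2))  # roughly 1266.71
```

A correct tool call from the assistant should land on the same figure of about $1,266.71 per month.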
This project connects a local LLM (e.g. Qwen) to tools such as a calculator or a knowledge base via the MCP protocol. The assistant automatically detects and calls these tools to help answer user queries.
## Features

- Tool execution through MCP server
- Local LLM integration via HTTP or OpenAI SDK
- Knowledge base support (`data.json`)
- Supports `stdio` and `sse` transports
Related MCP server: MCP Documentation Server
## Project Files

| File | Description |
|------|-------------|
| `server.py` | Registers tools and starts the MCP server |
| `client-http.py` | Uses raw HTTP calls to the local LLM + tool call logic |
| `client-openai.py` | Uses OpenAI-compatible SDK for LLM + tool call logic |
| `client-stdio.py` | MCP client using stdio |
| `client-sse.py` | MCP client using SSE |
| `data.json` | Q&A knowledge base |
## Installation

### Requirements

- Python 3.8+

Install dependencies:

```
pip install -r requirements.txt
```

### requirements.txt

```
aiohttp==3.11.18
nest_asyncio==1.6.0
python-dotenv==1.1.0
openai==1.77.0
mcp==1.6.0
```

## Getting Started
### 1. Run the MCP server

```
python server.py
```

This launches your tool server with functions like `add`, `multiply`, and `get_knowledge_base`.
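The tool functions themselves are plain Python; in `server.py` they would be registered with the MCP server (for example via the SDK's tool decorator). A hedged sketch of what `add`, `multiply`, and `get_knowledge_base` might look like — the bodies here are illustrative assumptions, not the project's actual code:

```python
import json

def add(a: float, b: float) -> float:
    """Add two numbers."""
    return a + b

def multiply(a: float, b: float) -> float:
    """Multiply two numbers."""
    return a * b

def get_knowledge_base(path: str = "data.json") -> str:
    """Render the Q&A knowledge base as one formatted string."""
    with open(path, encoding="utf-8") as f:
        entries = json.load(f)
    return "\n\n".join(f"Q: {e['question']}\nA: {e['answer']}" for e in entries)
```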
### 2. Start a client

**Option A: HTTP client (local LLM via raw API)**

```
python client-http.py
```

**Option B: OpenAI SDK client**

```
python client-openai.py
```

**Option C: stdio transport**

```
python client-stdio.py
```

**Option D: SSE transport**

Make sure `server.py` sets:

```
transport = "sse"
```

Then run:

```
python client-sse.py
```

## Example Prompts
### Math Tool Call

```
What is 8 times 3?
```

Response:

```
Eight times three is 24.
```

### Knowledge Base Question

```
What are the healthcare benefits available to employees in Singapore?
```

The response will include the relevant answer from `data.json`.
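One plausible way for the assistant to serve such a question is fuzzy-matching the query against the stored questions in `data.json`. A standard-library sketch with the data inlined (the matching strategy and `lookup` helper are assumptions, not necessarily what this project does):

```python
import difflib
from typing import Optional

# Entries mirror the data.json format used by this project.
KB = [
    {"question": "What is Singapore's public holiday schedule?",
     "answer": "Singapore observes several public holidays..."},
    {"question": "How do I apply for permanent residency in Singapore?",
     "answer": "Submit an online application via the ICA website..."},
]

def lookup(query: str) -> Optional[str]:
    """Return the answer whose stored question best matches the query."""
    questions = [e["question"] for e in KB]
    match = difflib.get_close_matches(query, questions, n=1, cutoff=0.3)
    if not match:
        return None
    return next(e["answer"] for e in KB if e["question"] == match[0])
```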
## Example: data.json

```json
[
  {
    "question": "What is Singapore's public holiday schedule?",
    "answer": "Singapore observes several public holidays..."
  },
  {
    "question": "How do I apply for permanent residency in Singapore?",
    "answer": "Submit an online application via the ICA website..."
  }
]
```

## Configuration
Inside `client-http.py` or `client-openai.py`, update the following:
```
LOCAL_LLM_URL = "..."
TOKEN = "your-api-token"
LOCAL_LLM_MODEL = "your-model"
```

Make sure your LLM is serving OpenAI-compatible API endpoints.
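"OpenAI-compatible" here means the endpoint accepts chat-completions requests in which tools are advertised as JSON Schema. A hedged sketch of the request body such a client might send (the `multiply` tool schema is a hypothetical example; field names follow the OpenAI chat-completions format):

```python
payload = {
    "model": "your-model",  # matches LOCAL_LLM_MODEL
    "messages": [{"role": "user", "content": "What is 8 times 3?"}],
    "tools": [{
        "type": "function",
        "function": {
            "name": "multiply",
            "description": "Multiply two numbers.",
            "parameters": {
                "type": "object",
                "properties": {
                    "a": {"type": "number"},
                    "b": {"type": "number"},
                },
                "required": ["a", "b"],
            },
        },
    }],
}
# The client would POST this JSON to the endpoint's chat-completions
# route, sending TOKEN as a bearer token.
```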
## Cleanup
Clients handle tool calls and responses automatically. You can stop the server or client using Ctrl+C.
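"Handling tool calls automatically" amounts to a small dispatch loop: when the LLM's reply contains a tool call, the client looks up the named function, runs it with the supplied arguments, and feeds the result back. A minimal sketch (the registry and `dispatch` helper are illustrative, not this project's actual code):

```python
# Hypothetical registry mapping tool names to local functions.
TOOLS = {
    "add": lambda a, b: a + b,
    "multiply": lambda a, b: a * b,
}

def dispatch(tool_call: dict):
    """Run one tool call shaped like {"name": ..., "arguments": {...}}."""
    func = TOOLS[tool_call["name"]]
    return func(**tool_call["arguments"])

# For the prompt "What is 8 times 3?" the LLM might emit:
result = dispatch({"name": "multiply", "arguments": {"a": 8, "b": 3}})
print(result)  # 24
```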
## License
MIT License. See LICENSE file.