LLM Tool-Calling Assistant

by o6-webwork

Integrations

  • Enables interaction with local LLMs running on the user's machine via an HTTP interface or OpenAI-compatible SDK.

  • Allows communication with OpenAI-compatible language models using the OpenAI SDK for tool-calling functionality.

This project connects a local LLM (e.g. Qwen) to tools such as a calculator or a knowledge base via the Model Context Protocol (MCP). The assistant automatically detects and calls these tools to answer user queries.


📦 Features

  • 🔧 Tool execution through MCP server
  • 🧠 Local LLM integration via HTTP or OpenAI SDK
  • 📚 Knowledge base support (data.json)
  • ⚡ Supports stdio and SSE transports

🗂 Project Files

| File | Description |
| --- | --- |
| `server.py` | Registers tools and starts the MCP server |
| `client-http.py` | Uses aiohttp to communicate with the local LLM |
| `client-openai.py` | Uses the OpenAI-compatible SDK for LLM + tool-call logic |
| `client-stdio.py` | MCP client using the stdio transport |
| `client-sse.py` | MCP client using the SSE transport |
| `data.json` | Q&A knowledge base |

📥 Installation

Requirements

Python 3.8+

Install dependencies:

pip install -r requirements.txt

requirements.txt

aiohttp==3.11.18
nest_asyncio==1.6.0
python-dotenv==1.1.0
openai==1.77.0
mcp==1.6.0

🚀 Getting Started

1. Run the MCP server

python server.py

This launches your tool server with functions like add, multiply, and get_knowledge_base.

2. Start a client

Option A: HTTP client (local LLM via raw API)
python client-http.py
Option B: OpenAI SDK client
python client-openai.py
Option C: stdio transport
python client-stdio.py
Option D: SSE transport

Make sure server.py sets:

transport = "sse"

Then run:

python client-sse.py

💬 Example Prompts

Math Tool Call

What is 8 times 3?

Response:

Eight times three is 24.

Knowledge Base Question

What are the healthcare benefits available to employees in Singapore?

Response will include the relevant answer from data.json.


📁 Example: data.json

[
  {
    "question": "What is Singapore's public holiday schedule?",
    "answer": "Singapore observes several public holidays..."
  },
  {
    "question": "How do I apply for permanent residency in Singapore?",
    "answer": "Submit an online application via the ICA website..."
  }
]
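A knowledge-base tool typically flattens these entries into plain text the model can read as context. The helper below is purely illustrative (its name and output format are assumptions, not the project's actual code):

```python
import json

# Illustrative helper: flatten data.json-style entries into text an LLM can
# consume as context. Function name and Q/A numbering are hypothetical.
def format_knowledge_base(entries: list[dict]) -> str:
    return "\n\n".join(
        f"Q{i}: {e['question']}\nA{i}: {e['answer']}"
        for i, e in enumerate(entries, start=1)
    )

entries = json.loads("""
[
  {"question": "What is Singapore's public holiday schedule?",
   "answer": "Singapore observes several public holidays..."},
  {"question": "How do I apply for permanent residency in Singapore?",
   "answer": "Submit an online application via the ICA website..."}
]
""")

print(format_knowledge_base(entries))
```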

🔧 Configuration

Inside client-http.py or client-openai.py, update the following:

LOCAL_LLM_URL = "..."
TOKEN = "your-api-token"
LOCAL_LLM_MODEL = "your-model"

Make sure your LLM is serving OpenAI-compatible API endpoints.


🧹 Cleanup

Clients handle tool calls and responses automatically. You can stop the server or client using Ctrl+C.


🪪 License

MIT License. See LICENSE file.
