Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type @ followed by the MCP server name and your instructions, e.g., "@Modular MCP Server with Python Tools create a note titled 'meeting notes' with category work".
That's it! The server will respond to your query, and you can continue using it as needed.
Modular MCP Server with Python Tools
This project is a test/demo implementation to understand how to build a modular MCP (Model Context Protocol) server in Python, integrate custom tools, and use prompts to interact with language models.
Purpose
Explore Python async programming and tool registration with MCP.
Practice building tools that can be invoked via chat input (e.g., note creation, searching, running commands).
Experiment with prompt design and using local LLMs (via Ollama) to parse natural language into structured commands.
Understand how to detect user intents from free text and map them to specific tools.
Learn how to handle tool invocation responses asynchronously and display results.
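The keyword-matching side of intent detection can be sketched in a few lines. This is a hypothetical illustration: the tool names and keyword table below are invented for the example, not taken from the project's actual code.

```python
from typing import Optional

# Illustrative keyword table mapping phrases to tool names.
# The real project may combine this with LLM-based prompt parsing.
INTENT_KEYWORDS = {
    "create_note": ["make a note", "create a note", "note titled"],
    "search_notes": ["search note", "find note"],
    "get_weather": ["weather"],
    "run_command": ["run command"],
}

def detect_intent(text: str) -> Optional[str]:
    """Return the first tool whose keywords appear in the free-text input."""
    lowered = text.lower()
    for tool, keywords in INTENT_KEYWORDS.items():
        if any(kw in lowered for kw in keywords):
            return tool
    return None  # no keyword hit: fall back to LLM parsing
```

When no keyword matches, the input can be handed to the local LLM for structured extraction instead.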
Related MCP server: mcp-server-llmling
Features
Tool examples:
Note creation and search with SQLite backend.
Weather fetching.
Mathematical calculation.
Time queries.
Running shell commands safely.
File operations.
Intent detection via keyword matching and prompt parsing.
Ollama local LLM integration for structured data extraction.
Modular design allowing easy addition of new tools.
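As an illustration of the note tools listed above, a minimal SQLite-backed create/search pair might look like the following. The schema and function names are assumptions for the sketch, not the project's actual implementation.

```python
import sqlite3
from typing import List, Tuple

# Hypothetical note store backing the create/search tools.
def init_db(path: str = ":memory:") -> sqlite3.Connection:
    conn = sqlite3.connect(path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS notes ("
        "id INTEGER PRIMARY KEY, title TEXT, category TEXT)"
    )
    return conn

def create_note(conn: sqlite3.Connection, title: str, category: str) -> int:
    """Insert a note and return its row id."""
    cur = conn.execute(
        "INSERT INTO notes (title, category) VALUES (?, ?)", (title, category)
    )
    conn.commit()
    return cur.lastrowid

def search_notes(conn: sqlite3.Connection, category: str) -> List[Tuple[int, str, str]]:
    """Return all notes in a category as (id, title, category) rows."""
    return conn.execute(
        "SELECT id, title, category FROM notes WHERE category = ?", (category,)
    ).fetchall()
```

Parameterized queries (the ? placeholders) keep user-supplied titles and categories from being interpreted as SQL.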
Usage
Start the MCP server (e.g., python simple_modular_server.py).
Run the client script to chat and interact with tools.
Use natural language commands like:
"make a note. title is shopping list. category groceries."
"search note groceries"
"run command 'pwd'"
"what's the weather in London?"
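Commands like these can be reduced to structured fields before a tool is invoked. Below is a minimal regex-based sketch; the patterns and field names are illustrative, and the project may instead delegate this parsing to a local LLM via Ollama.

```python
import re
from typing import Dict, Optional

def parse_note_command(text: str) -> Optional[Dict[str, str]]:
    """Extract title and category from a free-text note command.

    Hypothetical example input:
        "make a note. title is shopping list. category groceries."
    """
    title = re.search(r"title (?:is )?([^.]+)", text, re.IGNORECASE)
    category = re.search(r"category (?:is )?([^.]+)", text, re.IGNORECASE)
    if not title:
        return None  # can't build a note without a title
    return {
        "title": title.group(1).strip(),
        "category": category.group(1).strip() if category else "general",
    }
```

The resulting dict can then be passed as arguments to a note-creation tool.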
Notes
This project is purely for learning and testing.
It is not production-ready.
Designed to help understand MCP architecture, Python tooling, and prompt engineering with local LLMs.
Requirements
Python 3.8+
Ollama installed and running locally with your preferred models (e.g., phi, llama3.2:3b).
SQLite (built-in with Python).
Dependencies listed in requirements.txt (if created).
License
This is an educational project with no license.
Feel free to experiment and extend!