Modular MCP Server with Python Tools

by tunamsyar

This project is a test/demo implementation to understand how to build a modular MCP (Model Context Protocol) server in Python, integrate custom tools, and use prompts to interact with language models.


Purpose

  • Explore Python async programming and tool registration with MCP (a minimal registration sketch follows this list).

  • Practice building tools that can be invoked via chat input (e.g., note creation, searching, running commands).

  • Experiment with prompt design and use local LLMs (via Ollama) to parse natural language into structured commands.

  • Understand how to detect user intents from free text and map them to specific tools.

  • Learn how to handle tool invocation responses asynchronously and display results.
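
For orientation, here is a minimal sketch of what async tool registration can look like with the official MCP Python SDK's FastMCP helper. The server name, tool name, and in-memory note list are illustrative placeholders, not this project's actual code.

```python
# Minimal sketch: registering an async tool with FastMCP (MCP Python SDK).
# The server name, tool name, and in-memory storage are placeholders.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("simple-modular-server")

_notes = []  # stand-in for the project's SQLite backend

@mcp.tool()
async def add_note(title: str, category: str = "general") -> str:
    """Create a note and return a confirmation message."""
    _notes.append({"title": title, "category": category})
    return f"Created note '{title}' in category '{category}'"

if __name__ == "__main__":
    mcp.run()  # stdio transport by default
```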


Features

  • Tool examples:

    • Note creation and search with an SQLite backend.

    • Weather fetching.

    • Mathematical calculation.

    • Time queries.

    • Running shell commands safely.

    • File operations.

  • Intent detection via keyword matching and prompt parsing (see the keyword-matching sketch after this list).

  • Ollama local LLM integration for structured data extraction.

  • Modular design allowing easy addition of new tools.
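
As a rough illustration of the keyword-matching approach, the sketch below maps free-text input to a tool name before falling back to LLM prompt parsing. The keyword table and tool names are hypothetical, not the project's actual routing.

```python
# Sketch of keyword-based intent detection: map free-text input to a tool name.
# The keyword table and tool names are hypothetical examples.
from typing import Optional

INTENT_KEYWORDS = {
    "create_note": ("make a note", "new note", "add note"),
    "search_notes": ("search note", "find note"),
    "run_command": ("run command",),
    "get_weather": ("weather",),
}

def detect_intent(text: str) -> Optional[str]:
    """Return the first tool whose keywords appear in the input, else None."""
    lowered = text.lower()
    for tool_name, keywords in INTENT_KEYWORDS.items():
        if any(keyword in lowered for keyword in keywords):
            return tool_name
    return None  # fall back to prompt parsing with the local LLM

print(detect_intent("make a note. title is shopping list."))  # -> create_note
```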


Usage

  • Start the MCP server (e.g., python simple_modular_server.py).

  • Run the client script to chat and interact with tools (a minimal client sketch follows the example commands below).

  • Use natural language commands like:

    • "make a note. title is shopping list. category groceries."

    • "search note groceries"

    • "run command 'pwd'"

    • "what's the weather in London?"


Notes

  • This project is purely for learning and testing.

  • It is not production-ready.

  • Designed to help understand MCP architecture, Python tooling, and prompt engineering with local LLMs.


Requirements

  • Python 3.8+

  • Ollama installed and running locally with your preferred models (e.g., phi, llama3.2:3b); a minimal call against its API is sketched after this list.

  • SQLite (built-in with Python).

  • Dependencies listed in requirements.txt (if created).
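
To sanity-check the Ollama requirement, something like the sketch below can ask a local model to turn free text into JSON via Ollama's /api/generate endpoint. It assumes the requests package is installed; the prompt and model choice are illustrative, not the project's exact code.

```python
# Sketch: ask a local Ollama model to extract structured fields from free text.
# Uses Ollama's /api/generate endpoint; model name and prompt are illustrative.
import json

import requests

def extract_note_fields(text, model="llama3.2:3b"):
    prompt = (
        "Extract the note title and category from the input and reply with "
        'JSON only, e.g. {"title": "...", "category": "..."}.\n'
        f"Input: {text}"
    )
    response = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": prompt, "stream": False, "format": "json"},
        timeout=60,
    )
    response.raise_for_status()
    return json.loads(response.json()["response"])

print(extract_note_fields("make a note. title is shopping list. category groceries."))
```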


License

This is an educational project with no license.


Feel free to experiment and extend!
