
Leave Manager MCP Tool Server

by ahmad-act

Local AI with Ollama, WebUI & MCP on Windows

A self-hosted AI stack combining Ollama for running language models, Open WebUI for user-friendly chat interaction, and MCP for connecting the models to external tools, offering full control, privacy, and flexibility without relying on the cloud.

This sample project provides an MCP-based tool server for managing employee leave balances, applications, and history. It is exposed via OpenAPI using mcpo for easy integration with Open WebUI or other OpenAPI-compatible clients.


🚀 Features

  • ✅ Check employee leave balance
  • 📆 Apply for leave on specific dates
  • 📜 View leave history
  • 🙋 Personalized greeting functionality

📁 Project Structure

```
leave-manager/
├── main.py             # MCP server logic for leave management
├── requirements.txt    # Python dependencies for the MCP server
├── Dockerfile          # Docker image configuration for the leave manager
├── docker-compose.yml  # Docker Compose file to run leave manager and Open WebUI
└── README.md           # Project documentation (this file)
```

📋 Prerequisites

  1. Windows 10 or later (required for Ollama)
  2. Docker Desktop for Windows (required for Open WebUI and MCP)

🛠️ Workflow

  1. Install Ollama on Windows
  2. Pull the deepseek-r1 model
  3. Clone the repository and navigate to the project directory
  4. Run the docker-compose.yml file to launch services

Install Ollama

➤ Windows

  1. Download the Installer:
    • Download OllamaSetup.exe for Windows from https://ollama.com/download.
  2. Run the Installer:
    • Execute OllamaSetup.exe and follow the installation prompts.
    • After installation, Ollama runs as a background service, accessible at: http://localhost:11434.
    • Verify in your browser; you should see:
      Ollama is running


  3. Start Ollama Server (if not already running):
    ollama serve

Verify Installation

Check the installed version of Ollama:

ollama --version

Expected Output:

ollama version 0.7.1

Pull the deepseek-r1 Model

1. Pull the Default Model (7B):

Using PowerShell:
ollama pull deepseek-r1


To Pull Specific Versions:

ollama pull deepseek-r1:1.5b
ollama pull deepseek-r1:671b

2. List Installed Models:

ollama list

Expected Output:

NAME                 ID              SIZE
deepseek-r1:latest   xxxxxxxxxxxx    X.X GB


3. Alternative Check via API:

curl http://localhost:11434/api/tags

Expected Output: A JSON response listing installed models, including deepseek-r1:latest.


4. Test the API via PowerShell:

Invoke-RestMethod -Uri http://localhost:11434/api/generate -Method Post -Body '{"model": "deepseek-r1", "prompt": "Hello, world!", "stream": false}' -ContentType "application/json"

Expected Response: A JSON object containing the model's response to the "Hello, world!" prompt.


5. Run and Chat with the Model via PowerShell:

ollama run deepseek-r1
  • This opens an interactive chat session with the deepseek-r1 model.
  • Type /bye and press Enter to exit the chat session.



🐳 Run Open WebUI and MCP Server with Docker Compose

  1. Clone the Repository:
    git clone https://github.com/ahmad-act/Local-AI-with-Ollama-Open-WebUI-MCP-on-Windows.git
    cd Local-AI-with-Ollama-Open-WebUI-MCP-on-Windows
  2. To launch both the MCP tool and Open WebUI locally (on Docker Desktop):
    docker-compose up --build

This will:

  • Build the leave manager image and start the MCP tool server, exposed via mcpo at http://localhost:8000.
  • Start Open WebUI at http://localhost:3000.

🌐 Add MCP Tools to Open WebUI

The MCP tools are exposed via the OpenAPI specification at: http://localhost:8000/openapi.json.

  1. Open http://localhost:3000 in your browser.
  2. Click the Profile Icon and navigate to Settings.
  3. Select the Tools menu and click the Add (+) button.
  4. Add a new tool by entering the URL: http://localhost:8000/.

💬 Example Prompts

Use these prompts in Open WebUI to interact with the Leave Manager tool:

  • Check Leave Balance:
    Check how many leave days are left for employee E001
  • Apply for Leave:
    Apply leave for E001 on <YYYY-MM-DD>
  • View Leave History:
    What's the leave history of E001?
  • Personalized Greeting:
    Greet me as Alice

🛠️ Troubleshooting

  • Ollama not running: Ensure the service is active (ollama serve) and check http://localhost:11434.
  • Docker issues: Verify Docker Desktop is running and you have sufficient disk space.
  • Model not found: Confirm the deepseek-r1 model is listed with ollama list.
  • Port conflicts: Ensure ports 11434, 3000, and 8000 are free.
