
Leave Manager MCP Tool Server

by ahmad-act

Local AI with Ollama, WebUI & MCP on Windows

A self-hosted AI stack combining Ollama for running language models, Open WebUI for user-friendly chat interaction, and MCP for centralized model management—offering full control, privacy, and flexibility without relying on the cloud.

This sample project provides an MCP-based tool server for managing employee leave balances, applications, and history. It is exposed via OpenAPI using mcpo for easy integration with Open WebUI or other OpenAPI-compatible clients.
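
For orientation, here is a minimal sketch of how such a tool server can be written with the FastMCP helper from the official MCP Python SDK. The repository's actual main.py may differ; the tool names and in-memory data store below are illustrative, and the history and greeting tools would follow the same pattern.

# Illustrative sketch only; not the repository's actual main.py.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Leave Manager")

# Hypothetical in-memory store; a real server might use a database.
leave_db = {"E001": {"balance": 18, "history": ["2025-01-02"]}}

@mcp.tool()
def get_leave_balance(employee_id: str) -> str:
    """Return the remaining leave days for an employee."""
    record = leave_db.get(employee_id)
    if record is None:
        return f"No record found for employee {employee_id}."
    return f"{employee_id} has {record['balance']} leave day(s) left."

@mcp.tool()
def apply_leave(employee_id: str, dates: list[str]) -> str:
    """Apply for leave on the given ISO dates (e.g. '2025-08-15')."""
    record = leave_db.get(employee_id)
    if record is None or record["balance"] < len(dates):
        return "Request rejected: unknown employee or insufficient balance."
    record["balance"] -= len(dates)
    record["history"].extend(dates)
    return f"Leave applied for {employee_id} on {', '.join(dates)}."

if __name__ == "__main__":
    mcp.run()  # serves over stdio; mcpo wraps this as an OpenAPI endpoint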


🚀 Features

  • ✅ Check employee leave balance
  • 📆 Apply for leave on specific dates
  • 📜 View leave history
  • 🙋 Personalized greeting functionality

📁 Project Structure

leave-manager/
├── main.py              # MCP server logic for leave management
├── requirements.txt     # Python dependencies for the MCP server
├── Dockerfile           # Docker image configuration for the leave manager
├── docker-compose.yml   # Docker Compose file to run leave manager and Open WebUI
└── README.md            # Project documentation (this file)

📋 Prerequisites

  1. Windows 10 or later (required for Ollama)
  2. Docker Desktop for Windows (required for Open WebUI and MCP)

🛠️ Workflow

  1. Install Ollama on Windows
  2. Pull the deepseek-r1 model
  3. Clone the repository and navigate to the project directory
  4. Run the docker-compose.yml file to launch services

Install Ollama

➤ Windows

  1. Download the Installer:
    • Get OllamaSetup.exe from the official download page: https://ollama.com/download
  2. Run the Installer:
    • Execute OllamaSetup.exe and follow the installation prompts.
    • After installation, Ollama runs as a background service, accessible at: http://localhost:11434.
    • Verify in your browser; you should see:
      Ollama is running

    (Screenshots: Ollama installer window, setup progress, system-tray icon, and the browser check.)

  3. Start Ollama Server (if not already running):
    ollama serve
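
If you prefer a scripted check, a minimal Python probe (assuming the third-party requests package is installed) looks like this:

import requests

# Probe the local Ollama service; the root endpoint returns plain text.
resp = requests.get("http://localhost:11434", timeout=5)
print(resp.status_code, resp.text)  # expect: 200 Ollama is running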

Verify Installation

Check the installed version of Ollama:

ollama --version

Expected Output:

ollama version 0.7.1

Pull the deepseek-r1 Model

1. Pull the Default Model (7B):

Using PowerShell:

ollama pull deepseek-r1

To Pull Specific Versions:

ollama pull deepseek-r1:1.5b
ollama pull deepseek-r1:671b

2. List Installed Models:

ollama list

Expected Output:

NAME                  ID              SIZE
deepseek-r1:latest    xxxxxxxxxxxx    X.X GB

3. Alternative Check via API:

curl http://localhost:11434/api/tags

Expected Output: A JSON response listing installed models, including deepseek-r1:latest.


4. Test the API via PowerShell:

Invoke-RestMethod -Uri http://localhost:11434/api/generate -Method Post -Body '{"model": "deepseek-r1", "prompt": "Hello, world!", "stream": false}' -ContentType "application/json"

Expected Response: A JSON object containing the model's response to the "Hello, world!" prompt.
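
The same request can be scripted in Python (again assuming the requests package); this is equivalent to the PowerShell call above:

import requests

# Non-streaming generation request; the reply text is in the "response" field.
payload = {"model": "deepseek-r1", "prompt": "Hello, world!", "stream": False}
resp = requests.post("http://localhost:11434/api/generate", json=payload, timeout=120)
resp.raise_for_status()
print(resp.json()["response"])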


5. Run and Chat with the Model via PowerShell:

ollama run deepseek-r1
  • This opens an interactive chat session with the deepseek-r1 model.
  • Type /bye and press Enter to exit the chat session.



🐳 Run Open WebUI and MCP Server with Docker Compose

  1. Clone the Repository:
    git clone https://github.com/ahmad-act/Local-AI-with-Ollama-Open-WebUI-MCP-on-Windows.git
    cd Local-AI-with-Ollama-Open-WebUI-MCP-on-Windows
  2. To launch both the MCP tool and Open WebUI locally (on Docker Desktop):
    docker-compose up --build

This will:

  • Build the leave manager Docker image from the included Dockerfile
  • Start the MCP tool server, exposed via mcpo as an OpenAPI service at http://localhost:8000
  • Start Open WebUI at http://localhost:3000
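
Optionally, verify from Python that both services are reachable on their published ports (an illustrative check, not part of the repository):

import requests

# Check that both containers answer on their published ports.
for name, url in {
    "Leave Manager (mcpo)": "http://localhost:8000/openapi.json",
    "Open WebUI": "http://localhost:3000",
}.items():
    try:
        code = requests.get(url, timeout=5).status_code
        print(f"{name}: HTTP {code}")
    except requests.ConnectionError:
        print(f"{name}: not reachable")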


🌐 Add MCP Tools to Open WebUI

The MCP tools are exposed via the OpenAPI specification at: http://localhost:8000/openapi.json.

  1. Open http://localhost:3000 in your browser.
  2. Click the Profile Icon and navigate to Settings.
  3. Select the Tools menu and click the Add (+) Button.
  4. Add a new tool by entering the URL: http://localhost:8000/.
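
To see exactly which tool endpoints mcpo exposes, you can list the paths in the OpenAPI spec (a sketch; the actual path names depend on the tools defined in main.py):

import requests

# Fetch the OpenAPI spec and print the tool endpoints mcpo exposes.
spec = requests.get("http://localhost:8000/openapi.json", timeout=5).json()
for path in spec.get("paths", {}):
    print(path)  # e.g. /get_leave_balance (actual names depend on main.py)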

💬 Example Prompts

Use these prompts in Open WebUI to interact with the Leave Manager tool:

  • Check Leave Balance:
    Check how many leave days are left for employee E001
  • Apply for Leave:
    Apply leave for employee E001 on 2025-08-15
  • View Leave History:
    What's the leave history of E001?
  • Personalized Greeting:
    Greet me as Alice
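
The same tools can also be called directly over the mcpo OpenAPI bridge, bypassing the chat UI. The endpoint name and payload below are assumptions based on the sketch earlier in this README; consult the interactive docs at http://localhost:8000/docs for the real schema:

import requests

# Hypothetical endpoint and payload; verify against /docs before use.
resp = requests.post(
    "http://localhost:8000/get_leave_balance",
    json={"employee_id": "E001"},
    timeout=10,
)
print(resp.json())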

🛠️ Troubleshooting

  • Ollama not running: Ensure the service is active (ollama serve) and check http://localhost:11434.
  • Docker issues: Verify Docker Desktop is running and you have sufficient disk space.
  • Model not found: Confirm the deepseek-r1 model is listed with ollama list.
  • Port conflicts: Ensure ports 11434, 3000, and 8000 are free.
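
For the port-conflict case in the last bullet, a quick way to see which of the required ports are already taken (an illustrative helper):

import socket

# Report whether each required port is already bound on localhost.
for port in (11434, 3000, 8000):
    with socket.socket() as s:
        s.settimeout(1)
        in_use = s.connect_ex(("localhost", port)) == 0
        print(f"port {port}: {'in use' if in_use else 'free'}")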

📚 Additional Resources
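
  • Ollama: https://ollama.com
  • Open WebUI: https://github.com/open-webui/open-webui
  • mcpo (exposes MCP servers over OpenAPI): https://github.com/open-webui/mcpo
  • Model Context Protocol: https://modelcontextprotocol.io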
