
Leave Manager MCP Tool Server

by ahmad-act

Local AI with Ollama, WebUI & MCP on Windows

A self-hosted AI stack combining Ollama for running language models, Open WebUI for user-friendly chat interaction, and MCP for centralized model management, offering full control, privacy, and flexibility without relying on the cloud.

This sample project provides an MCP-based tool server for managing employee leave balances, applications, and history. It is exposed via OpenAPI using mcpo for easy integration with Open WebUI or other OpenAPI-compatible clients. A minimal sketch of such a server follows the feature list below.


🚀 Features

  • ✅ Check employee leave balance

  • 📆 Apply for leave on specific dates

  • 📜 View leave history

  • 🙋 Personalized greeting functionality
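
The four tools above map naturally onto MCP tool functions. Below is a minimal sketch of what a main.py like this could look like using the FastMCP class from the official MCP Python SDK; the function names, the in-memory store, and the sample record for E001 are illustrative assumptions, not the repository's exact code.

```python
# Hypothetical sketch of main.py built on the MCP Python SDK's FastMCP class.
# The in-memory store and tool bodies are illustrative, not the repo's code.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Leave Manager")

# Illustrative in-memory data; a real server might load this from a database.
employee_leaves = {"E001": {"balance": 18, "history": ["2025-01-02"]}}

@mcp.tool()
def get_leave_balance(employee_id: str) -> str:
    """Check how many leave days an employee has left."""
    record = employee_leaves.get(employee_id)
    if record is None:
        return f"No record found for employee {employee_id}."
    return f"{employee_id} has {record['balance']} leave day(s) remaining."

@mcp.tool()
def apply_leave(employee_id: str, dates: list[str]) -> str:
    """Apply for leave on specific dates (YYYY-MM-DD)."""
    record = employee_leaves.get(employee_id)
    if record is None or record["balance"] < len(dates):
        return "Cannot apply: unknown employee or insufficient balance."
    record["balance"] -= len(dates)
    record["history"].extend(dates)
    return f"Leave applied for {', '.join(dates)}. Remaining: {record['balance']}."

@mcp.tool()
def get_leave_history(employee_id: str) -> str:
    """View an employee's leave history."""
    record = employee_leaves.get(employee_id)
    return f"History for {employee_id}: {record['history']}" if record else "Unknown employee."

@mcp.tool()
def greet(name: str) -> str:
    """Return a personalized greeting."""
    return f"Hello, {name}! How can I help with leave management today?"

if __name__ == "__main__":
    mcp.run()
```

mcpo then wraps a server like this and exposes each tool as an OpenAPI endpoint.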



πŸ“ Project Structure

leave-manager/
├── main.py              # MCP server logic for leave management
├── requirements.txt     # Python dependencies for the MCP server
├── Dockerfile           # Docker image configuration for the leave manager
├── docker-compose.yml   # Docker Compose file to run leave manager and Open WebUI
└── README.md            # Project documentation (this file)

📋 Prerequisites

  1. Windows 10 or later (required for Ollama)

  2. Docker Desktop for Windows (required for Open WebUI and MCP)


πŸ› οΈ Workflow

  1. Install Ollama on Windows

  2. Pull the deepseek-r1 model

  3. Clone the repository and navigate to the project directory

  4. Run docker-compose up --build to launch the services


Install Ollama

➤ Windows

  1. Download the Installer: get OllamaSetup.exe from the official download page at https://ollama.com/download.

  2. Run the Installer:

    • Execute OllamaSetup.exe and follow the installation prompts.

    • After installation, Ollama runs as a background service, accessible at: http://localhost:11434.

    • Verify in your browser (or script the check, as sketched after these steps); you should see:

      Ollama is running

    (Screenshots: Ollama installer window, setup progress, system-tray icon, and the browser check.)

  3. Start Ollama Server (if not already running):

    ollama serve
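
If you would rather script the browser check, a minimal Python probe (standard library only; assumes Ollama's default port 11434) looks like this:

```python
# Probe the local Ollama service; it answers plain text on its root URL.
from urllib.request import urlopen

with urlopen("http://localhost:11434") as resp:
    print(resp.read().decode())  # Expected: "Ollama is running"
```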

Verify Installation

Check the installed version of Ollama:

ollama --version

Expected Output (your installed version may differ):

ollama version 0.7.1

Pull the deepseek-r1 Model

1. Pull the Default Model (7B):

Using PowerShell:

ollama pull deepseek-r1


To Pull Specific Versions:

ollama pull deepseek-r1:1.5b
ollama pull deepseek-r1:671b
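
Pulls can also be driven through Ollama's REST API rather than the CLI. A sketch using the documented /api/pull endpoint (standard library only; note that older Ollama releases used a "name" field instead of "model"):

```python
# Pull a model via the REST API. With "stream": False the server replies
# once, after the (potentially long) download completes.
import json
from urllib.request import Request, urlopen

payload = json.dumps({"model": "deepseek-r1", "stream": False}).encode()
req = Request("http://localhost:11434/api/pull", data=payload,
              headers={"Content-Type": "application/json"})
with urlopen(req) as resp:
    print(json.load(resp))  # e.g. {"status": "success"}
```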

2. List Installed Models:

ollama list

Expected Output:

NAME                 ID              SIZE
deepseek-r1:latest   xxxxxxxxxxxx    X.X GB

3. Alternative Check via API:

curl http://localhost:11434/api/tags

Expected Output: A JSON response listing installed models, including deepseek-r1:latest.
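
The same check can be scripted in Python; this sketch parses the JSON that /api/tags returns:

```python
# List installed models via /api/tags (the same data as `ollama list`).
import json
from urllib.request import urlopen

with urlopen("http://localhost:11434/api/tags") as resp:
    tags = json.load(resp)

for model in tags.get("models", []):
    print(model["name"])  # e.g. deepseek-r1:latest
```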


4. Test the API via PowerShell:

Invoke-RestMethod -Uri http://localhost:11434/api/generate -Method Post -Body '{"model": "deepseek-r1", "prompt": "Hello, world!", "stream": false}' -ContentType "application/json"

Expected Response: A JSON object containing the model's response to the "Hello, world!" prompt.
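
A Python equivalent of the Invoke-RestMethod call above (standard library only):

```python
# One-shot generation against /api/generate; "stream": False returns a
# single JSON object whose "response" field holds the model's answer.
import json
from urllib.request import Request, urlopen

payload = json.dumps({
    "model": "deepseek-r1",
    "prompt": "Hello, world!",
    "stream": False,
}).encode()
req = Request("http://localhost:11434/api/generate", data=payload,
              headers={"Content-Type": "application/json"})
with urlopen(req) as resp:
    print(json.load(resp)["response"])
```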


5. Run and Chat with the Model via PowerShell:

ollama run deepseek-r1
  • This opens an interactive chat session with the deepseek-r1 model (a scripted equivalent is sketched after these bullets).

  • Type /bye and press Enter to exit the chat session.
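
For reference, here is a rough scripted stand-in for the interactive session, built on Ollama's /api/chat endpoint; the loop and exit command mirror the CLI behaviour described above:

```python
# Minimal chat loop against /api/chat; type /bye to exit, as in `ollama run`.
import json
from urllib.request import Request, urlopen

messages = []  # running conversation history
while True:
    user_input = input(">>> ")
    if user_input.strip() == "/bye":
        break
    messages.append({"role": "user", "content": user_input})
    payload = json.dumps({"model": "deepseek-r1", "messages": messages,
                          "stream": False}).encode()
    req = Request("http://localhost:11434/api/chat", data=payload,
                  headers={"Content-Type": "application/json"})
    with urlopen(req) as resp:
        reply = json.load(resp)["message"]  # assistant message dict
    messages.append(reply)  # keep context for the next turn
    print(reply["content"])
```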



🐳 Run Open WebUI and MCP Server with Docker Compose

  1. Clone the Repository:

    git clone https://github.com/ahmad-act/Local-AI-with-Ollama-Open-WebUI-MCP-on-Windows.git
    cd Local-AI-with-Ollama-Open-WebUI-MCP-on-Windows
  2. To launch both the MCP tool and Open WebUI locally (on Docker Desktop):

    docker-compose up --build


This will:

  • Build and start the Leave Manager MCP tool server, exposed via mcpo at http://localhost:8000

  • Start Open WebUI at http://localhost:3000

🌐 Add MCP Tools to Open WebUI

The MCP tools are exposed via the OpenAPI specification at http://localhost:8000/openapi.json. A scripted sanity check is sketched after the steps below.

  1. Open http://localhost:3000 in your browser.

  2. Click the Profile Icon and navigate to Settings.

  3. Select the Tools menu and click the Add (+) Button.

  4. Add a new tool by entering the URL: http://localhost:8000/.
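
If the tool does not appear, you can confirm that mcpo is actually serving the spec with a short script; this sketch just prints the API title and the tool endpoints found in the document:

```python
# Fetch the OpenAPI spec that mcpo generates and list the exposed paths.
import json
from urllib.request import urlopen

with urlopen("http://localhost:8000/openapi.json") as resp:
    spec = json.load(resp)

print(spec["info"]["title"])
for path in spec["paths"]:
    print(path)  # one endpoint per MCP tool
```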


💬 Example Prompts

Use these prompts in Open WebUI to interact with the Leave Manager tool:

  • Check Leave Balance:

    Check how many leave days are left for employee E001


  • Apply for Leave:

    Apply leave for employee E001 on [date]
  • View Leave History:

    What's the leave history of E001?


  • Personalized Greeting:

    Greet me as Alice



πŸ› οΈ Troubleshooting

  • Ollama not running: Ensure the service is active (ollama serve) and check http://localhost:11434.

  • Docker issues: Verify Docker Desktop is running and you have sufficient disk space.

  • Model not found: Confirm the deepseek-r1 model is listed with ollama list.

  • Port conflicts: Ensure ports 11434, 3000, and 8000 are free (see the port-check sketch below).
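
A quick way to test those ports from Python (a successful connect means something is already listening there):

```python
# Report which of the stack's ports are already occupied on this machine.
import socket

for port in (11434, 3000, 8000):
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(0.5)
        in_use = s.connect_ex(("localhost", port)) == 0
        print(f"port {port}: {'in use' if in_use else 'free'}")
```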


📚 Additional Resources

  • Ollama: https://ollama.com

  • deepseek-r1 on the Ollama library: https://ollama.com/library/deepseek-r1

  • Open WebUI: https://github.com/open-webui/open-webui

  • mcpo (MCP-to-OpenAPI proxy): https://github.com/open-webui/mcpo

  • Model Context Protocol: https://modelcontextprotocol.io
