# Local AI with Ollama, WebUI & MCP on Windows

A self-hosted AI stack combining Ollama for running language models, Open WebUI for user-friendly chat interaction, and MCP for centralized model management, offering full control, privacy, and flexibility without relying on the cloud.

This sample project provides an **MCP-based tool server** for managing employee leave balances, applications, and history. It is exposed via OpenAPI using `mcpo` for easy integration with **Open WebUI** or other OpenAPI-compatible clients.

---

## 🚀 Features

- ✅ Check employee leave balance
- 📆 Apply for leave on specific dates
- 📜 View leave history
- 🙋 Personalized greeting functionality

---

## 📁 Project Structure

```text
leave-manager/
├── main.py              # MCP server logic for leave management
├── requirements.txt     # Python dependencies for the MCP server
├── Dockerfile           # Docker image configuration for the leave manager
├── docker-compose.yml   # Docker Compose file to run the leave manager and Open WebUI
└── README.md            # Project documentation (this file)
```

---

## 📋 Prerequisites

1. **Windows 10 or later** (required for Ollama)
2. **Docker Desktop for Windows** (required for Open WebUI and MCP)
   - Install from: [Docker Desktop for Windows](https://docs.docker.com/desktop/install/windows-install/)

---

## 🛠️ Workflow

1. Install Ollama on Windows
2. Pull the `deepseek-r1` model
3. Clone the repository and navigate to the project directory
4. Run the `docker-compose.yml` file to launch services

---

## Install Ollama

### ➤ Windows

1. **Download the Installer**:
   - Visit [Ollama Download](https://ollama.com/download/windows) and click **Download for Windows** to get `OllamaSetup.exe`.
   - Alternatively, download from [Ollama GitHub Releases](https://github.com/ollama/ollama/releases).

2. **Run the Installer**:
   - Execute `OllamaSetup.exe` and follow the installation prompts.
   - After installation, Ollama runs as a background service, accessible at: [http://localhost:11434](http://localhost:11434).
   - Verify in your browser; you should see:

     ```text
     Ollama is running
     ```

   ![Ollama Initial Window](readme-img/ollama-setup-steps-1.png)
   ![Ollama Setup Progress](readme-img/ollama-setup-steps-2.png)
   ![Ollama In System Tray](readme-img/ollama-setup-steps-3.png)
   ![Ollama On Browser](readme-img/ollama-setup-steps-4.png)

3. **Start the Ollama Server (if not already running)**:

   ```powershell
   ollama serve
   ```

   - Access the server at: [http://localhost:11434](http://localhost:11434).

### Verify Installation

Check the installed version of Ollama:

```powershell
ollama --version
```

**Expected Output**:

```
ollama version 0.7.1
```

---

## Pull the `deepseek-r1` Model

### 1. Pull the Default Model (7B):

#### Using PowerShell

```powershell
ollama pull deepseek-r1
```

![deepseek-r1](readme-img/deepseek-r1-setup-1.png)

*To pull specific versions:*

```powershell
ollama pull deepseek-r1:1.5b
ollama pull deepseek-r1:671b
```

### 2. List Installed Models:

```powershell
ollama list
```

**Expected Output**:

```text
NAME                  ID              SIZE
deepseek-r1:latest    xxxxxxxxxxxx    X.X GB
```

![deepseek-r1:latest](readme-img/deepseek-r1-setup-2.png)

### 3. Alternative Check via API:

```powershell
curl http://localhost:11434/api/tags
```

**Expected Output**: A JSON response listing installed models, including `deepseek-r1:latest`.

![alternative check](readme-img/deepseek-r1-setup-3.png)
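If you prefer to script this check, here is a minimal sketch that queries the same `/api/tags` endpoint with Python's standard library and confirms a `deepseek-r1` model is installed. It assumes the response carries a top-level `models` list with `name` fields, which is what current Ollama releases return; adjust if your version differs.

```python
import json
import urllib.request

# Query Ollama's local API for the list of installed models.
# Assumes Ollama is running on the default port 11434 (see above).
with urllib.request.urlopen("http://localhost:11434/api/tags") as resp:
    tags = json.load(resp)

# Expected shape: {"models": [{"name": "deepseek-r1:latest", ...}, ...]}
names = [m.get("name", "") for m in tags.get("models", [])]
print("Installed models:", names)

if any(name.startswith("deepseek-r1") for name in names):
    print("deepseek-r1 is installed.")
else:
    print("deepseek-r1 not found - run: ollama pull deepseek-r1")
```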
### 4. Test the API via PowerShell:

```powershell
Invoke-RestMethod -Uri http://localhost:11434/api/generate -Method Post -Body '{"model": "deepseek-r1", "prompt": "Hello, world!", "stream": false}' -ContentType "application/json"
```

**Expected Response**: A JSON object containing the model's response to the "Hello, world!" prompt.

![test the API](readme-img/deepseek-r1-setup-7.png)

### 5. Run and Chat with the Model via PowerShell:

```powershell
ollama run deepseek-r1
```

- This opens an interactive chat session with the `deepseek-r1` model.
- Type `/bye` and press `Enter` to exit the chat session.

![run and chat](readme-img/deepseek-r1-setup-4.png)
![run and chat with Hi](readme-img/deepseek-r1-setup-5.png)
![exit chat](readme-img/deepseek-r1-setup-6.png)

---

## 🐳 Run Open WebUI and MCP Server with Docker Compose

1. **Clone the Repository**:

   ```powershell
   git clone https://github.com/ahmad-act/Local-AI-with-Ollama-Open-WebUI-MCP-on-Windows.git
   cd Local-AI-with-Ollama-Open-WebUI-MCP-on-Windows
   ```

2. **Launch both the MCP tool and Open WebUI locally (on Docker Desktop)**:

   ```powershell
   docker-compose up --build
   ```

   ![Open WebUI & MCP setup](readme-img/openwebui-mcp-setup-1.png)
   ![Open WebUI & MCP setup](readme-img/openwebui-mcp-setup-2.png)
   ![Open WebUI & MCP setup](readme-img/openwebui-mcp-setup-3.png)
   ![Open WebUI & MCP setup](readme-img/openwebui-mcp-setup-4.png)
   ![Open WebUI & MCP setup](readme-img/openwebui-mcp-setup-5.png)
   ![Open WebUI & MCP setup](readme-img/openwebui-mcp-setup-6.png)
   ![Open WebUI & MCP setup](readme-img/openwebui-mcp-setup-7.png)

   This will:

   - Start the Leave Manager (MCP Server) tool on port `8000`
   - Launch Open WebUI at [http://localhost:3000](http://localhost:3000)

---

## 🌐 Add MCP Tools to Open WebUI

The MCP tools are exposed via the OpenAPI specification at: [http://localhost:8000/openapi.json](http://localhost:8000/openapi.json).

1. Open [http://localhost:3000](http://localhost:3000) in your browser.
2. Click the **Profile Icon** and navigate to **Settings**.

   ![Open WebUI settings](readme-img/add-mcp-tools-on-open-webui-1.png)

3. Select the **Tools** menu and click the **Add (+) Button**.

   ![Tools menu](readme-img/add-mcp-tools-on-open-webui-2.png)

4. Add a new tool by entering the URL: [http://localhost:8000/](http://localhost:8000/).

   ![Add MCP tool](readme-img/add-mcp-tools-on-open-webui-3.png)
   ![Add MCP tool](readme-img/add-mcp-tools-on-open-webui-4.png)
   ![Add MCP tool](readme-img/add-mcp-tools-on-open-webui-5.png)
   ![Add MCP tool](readme-img/add-mcp-tools-on-open-webui-6.png)
   ![Add MCP tool](readme-img/add-mcp-tools-on-open-webui-7.png)
   ![Add MCP tool](readme-img/add-mcp-tools-on-open-webui-8.png)
   ![Add MCP tool](readme-img/add-mcp-tools-on-open-webui-9.png)

---

## 💬 Example Prompts

Use these prompts in Open WebUI to interact with the Leave Manager tool:

- **Check Leave Balance**:

  ```
  Check how many leave days are left for employee E001
  ```

  ![Check leave balance](readme-img/add-mcp-tools-on-open-webui-10.png)
  ![Check leave balance result](readme-img/add-mcp-tools-on-open-webui-11.png)

- **Apply for Leave**:

  ```
  Apply
  ```

  ![Apply for leave](readme-img/add-mcp-tools-on-open-webui-12.png)

- **View Leave History**:

  ```
  What's the leave history of E001?
  ```

  ![Leave history](readme-img/add-mcp-tools-on-open-webui-13.png)

- **Personalized Greeting**:

  ```
  Greet me as Alice
  ```

  ![Personalized greeting](readme-img/add-mcp-tools-on-open-webui-14.png)
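Because `mcpo` publishes each MCP tool as a plain HTTP route described by `openapi.json`, the tools can also be called outside Open WebUI. The sketch below lists the generated tool routes and then calls one of them; the route name `/get_leave_balance` and its `employee_id` field are assumptions for illustration only, so check the actual paths and schemas printed from your `openapi.json`.

```python
import json
import urllib.request

BASE_URL = "http://localhost:8000"  # Leave Manager tool server started by docker-compose

# Discover the tool routes that mcpo generated from the MCP server.
with urllib.request.urlopen(f"{BASE_URL}/openapi.json") as resp:
    spec = json.load(resp)

print("Available tool routes:")
for path in spec.get("paths", {}):
    print(" ", path)

# Hypothetical example call: the route name and the "employee_id" field
# are assumptions; substitute the real paths and schemas listed above.
payload = json.dumps({"employee_id": "E001"}).encode("utf-8")
req = urllib.request.Request(
    f"{BASE_URL}/get_leave_balance",  # hypothetical route
    data=payload,
    headers={"Content-Type": "application/json"},
    method="POST",
)
with urllib.request.urlopen(req) as resp:
    print(json.load(resp))
```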
---

## 🛠️ Troubleshooting

- **Ollama not running**: Ensure the service is active (`ollama serve`) and check [http://localhost:11434](http://localhost:11434).
- **Docker issues**: Verify Docker Desktop is running and you have sufficient disk space.
- **Model not found**: Confirm the `deepseek-r1` model is listed with `ollama list`.
- **Port conflicts**: Ensure ports `11434`, `3000`, and `8000` are free (a quick check is sketched at the end of this README).

---

## 📚 Additional Resources

- [Ollama Documentation](https://github.com/ollama/ollama/tree/main/docs)
- [Open WebUI Documentation](https://docs.openwebui.com/)
- [Docker Desktop Documentation](https://docs.docker.com/desktop/)
- [MCP Documentation](https://modelcontextprotocol.io/introduction)
- [OpenAPI Tool Servers](https://github.com/open-webui/openapi-servers)
- [mcpo - Works with OpenAPI tools, SDKs, and UIs](https://github.com/open-webui/mcpo)
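As a final smoke test, here is a minimal sketch that checks whether the three services used in this guide respond on their default ports: Ollama on `11434`, Open WebUI on `3000`, and the Leave Manager on `8000`. These ports are the defaults assumed throughout this README; adjust them if you changed `docker-compose.yml`.

```python
import socket

# Default ports used in this guide (assumed; adjust if your compose file differs).
SERVICES = {
    "Ollama": 11434,
    "Open WebUI": 3000,
    "Leave Manager (mcpo)": 8000,
}

for name, port in SERVICES.items():
    # Try to open a TCP connection to localhost on the expected port.
    try:
        with socket.create_connection(("localhost", port), timeout=2):
            print(f"{name}: listening on port {port}")
    except OSError:
        print(f"{name}: nothing is listening on port {port}")
```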
