# Local AI with Ollama, WebUI & MCP on Windows
A self-hosted AI stack combining Ollama for running language models locally, Open WebUI for a user-friendly chat interface, and the Model Context Protocol (MCP) for connecting custom tools to your models, giving you full control, privacy, and flexibility without relying on the cloud.
This sample project provides an **MCP-based tool server** for managing employee leave balances, applications, and history. It is exposed over OpenAPI using `mcpo` for easy integration with **Open WebUI** or other OpenAPI-compatible clients.
---
## 🚀 Features
- ✅ Check employee leave balance
- 📆 Apply for leave on specific dates
- 📜 View leave history
- 🙋 Personalized greeting functionality
---
## 📁 Project Structure
```text
leave-manager/
├── main.py # MCP server logic for leave management
├── requirements.txt # Python dependencies for the MCP server
├── Dockerfile # Docker image configuration for the leave manager
├── docker-compose.yml # Docker Compose file to run leave manager and Open WebUI
└── README.md # Project documentation (this file)
```
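The leave-management logic lives in `main.py`. As a rough illustration of how such a tool server can be written with the MCP Python SDK's `FastMCP` helper, here is a minimal sketch; the tool names, the in-memory data, and the employee records are illustrative assumptions rather than the repo's exact code:
```python
# Minimal sketch of an MCP tool server (assumed shape; the repo's main.py may differ).
# Requires the MCP Python SDK: pip install mcp
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("LeaveManager")

# Illustrative in-memory store; a real server might use a file or database.
employee_leaves = {"E001": {"balance": 18, "history": ["2025-01-02"]}}

@mcp.tool()
def get_leave_balance(employee_id: str) -> str:
    """Return the remaining leave days for an employee."""
    record = employee_leaves.get(employee_id)
    if record is None:
        return f"No record found for employee {employee_id}."
    return f"{employee_id} has {record['balance']} leave day(s) remaining."

@mcp.tool()
def apply_leave(employee_id: str, leave_date: str) -> str:
    """Apply for leave on a specific date (YYYY-MM-DD)."""
    record = employee_leaves.get(employee_id)
    if record is None or record["balance"] < 1:
        return f"Cannot apply leave for {employee_id}."
    record["balance"] -= 1
    record["history"].append(leave_date)
    return f"Leave booked for {employee_id} on {leave_date}."

if __name__ == "__main__":
    mcp.run()  # stdio server; mcpo wraps this and serves it over HTTP
```
`mcpo` wraps a server like this and exposes each `@mcp.tool()` as an OpenAPI endpoint that Open WebUI can call.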
---
## 📋 Prerequisites
1. **Windows 10 or later** (required for Ollama)
2. **Docker Desktop for Windows** (required for Open WebUI and MCP)
- Install from: [Docker Desktop for Windows](https://docs.docker.com/desktop/install/windows-install/)
---
## 🛠️ Workflow
1. Install Ollama on Windows
2. Pull the `deepseek-r1` model
3. Clone the repository and navigate to the project directory
4. Run the `docker-compose.yml` file to launch services
---
## Install Ollama
### ➤ Windows
1. **Download the Installer**:
- Visit [Ollama Download](https://ollama.com/download/windows) and click **Download for Windows** to get `OllamaSetup.exe`.
- Alternatively, download from [Ollama GitHub Releases](https://github.com/ollama/ollama/releases).
2. **Run the Installer**:
- Execute `OllamaSetup.exe` and follow the installation prompts.
- After installation, Ollama runs as a background service, accessible at: [http://localhost:11434](http://localhost:11434).
- Verify in your browser; you should see:
```text
Ollama is running
```
3. **Start Ollama Server (if not already running)**:
```powershell
ollama serve
```
- Access the server at: [http://localhost:11434](http://localhost:11434).
### Verify Installation
Check the installed version of Ollama:
```powershell
ollama --version
```
**Expected Output** (your installed version may differ):
```
ollama version 0.7.1
```
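If you prefer to verify from a script rather than the browser, a quick Python check (assuming the `requests` package is installed) confirms the server is reachable; Ollama's root endpoint returns the same "Ollama is running" text shown earlier:
```python
# Programmatic check that the local Ollama service is reachable.
# Assumes: pip install requests
import requests

resp = requests.get("http://localhost:11434", timeout=5)
print(resp.status_code, resp.text)  # Expected: 200 Ollama is running
```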
---
## Pull the `deepseek-r1` Model
### 1. Pull the Default Model (7B):
#### Using PowerShell
```powershell
ollama pull deepseek-r1
```

*To Pull Specific Versions:*
```powershell
ollama pull deepseek-r1:1.5b
ollama pull deepseek-r1:671b
```
### 2. List Installed Models:
```powershell
ollama list
```
**Expected Output**:
```text
NAME                  ID              SIZE
deepseek-r1:latest    xxxxxxxxxxxx    X.X GB
```

### 3. Alternative Check via API:
```powershell
curl http://localhost:11434/api/tags
```
**Expected Output**:
A JSON response listing installed models, including `deepseek-r1:latest`.
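If you want to check for the model from a script instead, the same endpoint is easy to consume from Python (a small sketch, again assuming `requests` is installed):
```python
# List locally installed models via Ollama's /api/tags endpoint.
import requests

tags = requests.get("http://localhost:11434/api/tags", timeout=5).json()
for model in tags.get("models", []):
    print(model["name"])  # e.g. deepseek-r1:latest
```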

### 4. Test the API via PowerShell:
```powershell
Invoke-RestMethod -Uri http://localhost:11434/api/generate -Method Post -Body '{"model": "deepseek-r1", "prompt": "Hello, world!", "stream": false}' -ContentType "application/json"
```
**Expected Response**:
A JSON object containing the model's response to the "Hello, world!" prompt.
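The same request can be issued from Python; this sketch mirrors the PowerShell call above (assuming `requests` is installed):
```python
# Send a single non-streaming prompt to Ollama's /api/generate endpoint.
import requests

payload = {"model": "deepseek-r1", "prompt": "Hello, world!", "stream": False}
resp = requests.post("http://localhost:11434/api/generate", json=payload, timeout=120)
print(resp.json()["response"])  # The model's reply text
```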

### 5. Run and Chat with the Model via PowerShell:
```powershell
ollama run deepseek-r1
```
- This opens an interactive chat session with the `deepseek-r1` model.
- Type `/bye` and press `Enter` to exit the chat session.
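For scripted, multi-turn conversations you can call Ollama's `/api/chat` endpoint instead of the interactive CLI; a minimal non-streaming sketch (the system and user messages here are just examples, and `requests` is assumed to be installed):
```python
# Multi-turn chat with deepseek-r1 via Ollama's /api/chat endpoint.
import requests

messages = [
    {"role": "system", "content": "You are a concise assistant."},
    {"role": "user", "content": "Explain in one sentence what MCP is."},
]
resp = requests.post(
    "http://localhost:11434/api/chat",
    json={"model": "deepseek-r1", "messages": messages, "stream": False},
    timeout=120,
)
print(resp.json()["message"]["content"])
```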
---
## 🐳 Run Open WebUI and MCP Server with Docker Compose
1. **Clone the Repository**:
```powershell
git clone https://github.com/ahmad-act/Local-AI-with-Ollama-Open-WebUI-MCP-on-Windows.git
cd Local-AI-with-Ollama-Open-WebUI-MCP-on-Windows
```
2. **To launch both the MCP tool and Open WebUI locally (on Docker Desktop)**:
```powershell
docker-compose up --build
```
This will:
- Start the Leave Manager (MCP Server) tool on port `8000`
- Launch Open WebUI at [http://localhost:3000](http://localhost:3000)
---
## 🌐 Add MCP Tools to Open WebUI
The MCP tools are exposed via the OpenAPI specification at: [http://localhost:8000/openapi.json](http://localhost:8000/openapi.json).
1. Open [http://localhost:3000](http://localhost:3000) in your browser.
2. Click the **Profile Icon** and navigate to **Settings**.

3. Select the **Tools** menu and click the **Add (+) Button**.

4. Add a new tool by entering the URL: [http://localhost:8000/](http://localhost:8000/).
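To double-check which tool endpoints `mcpo` has published (and that the container is reachable) before adding the URL, you can inspect the spec with a short script, e.g.:
```python
# Print the tool endpoints exposed by the mcpo-wrapped leave manager.
import requests

spec = requests.get("http://localhost:8000/openapi.json", timeout=5).json()
print(spec["info"]["title"])
for path, methods in spec["paths"].items():
    for method, operation in methods.items():
        print(method.upper(), path, "-", operation.get("summary", ""))
```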
---
## 💬 Example Prompts
Use these prompts in Open WebUI to interact with the Leave Manager tool:
- **Check Leave Balance**:
```
Check how many leave days are left for employee E001
```


- **Apply for Leave**:
```
Apply leave for employee E001 on 2025-08-15
```

- **View Leave History**:
```
What's the leave history of E001?
```

- **Personalized Greeting**:
```
Greet me as Alice
```
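Behind the scenes, Open WebUI turns these prompts into HTTP calls against the tool endpoints served by `mcpo`. For debugging, you can call a tool directly; the sketch below assumes the leave-balance tool is exposed as a `POST` route named after the tool and takes an `employee_id` field (check `http://localhost:8000/docs` for the exact routes and schemas in your build):
```python
# Call a leave-manager tool directly through mcpo's OpenAPI interface.
# The route name and request body are assumptions; verify them at /docs.
import requests

resp = requests.post(
    "http://localhost:8000/get_leave_balance",
    json={"employee_id": "E001"},
    timeout=10,
)
print(resp.status_code, resp.json())
```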

---
## 🛠️ Troubleshooting
- **Ollama not running**: Ensure the service is active (`ollama serve`) and check [http://localhost:11434](http://localhost:11434).
- **Docker issues**: Verify Docker Desktop is running and you have sufficient disk space.
- **Model not found**: Confirm the `deepseek-r1` model is listed with `ollama list`.
- **Port conflicts**: Ensure ports `11434`, `3000`, and `8000` are free.
---
## 📚 Additional Resources
- [Ollama Documentation](https://github.com/ollama/ollama/tree/main/docs)
- [Open WebUI Documentation](https://docs.openwebui.com/)
- [Docker Desktop Documentation](https://docs.docker.com/desktop/)
- [MCP Documentation](https://modelcontextprotocol.io/introduction)
- [OpenAPI Tool Servers](https://github.com/open-webui/openapi-servers)
- [mcpo - Works with OpenAPI tools, SDKs, and UIs](https://github.com/open-webui/mcpo)