# 🌦️ Weather MCP Server Demo
A Model Context Protocol (MCP) compatible server that provides weather information using Ollama's LLM capabilities. The server exposes a `get-weather` tool that MCP clients can call to retrieve weather information for any city.
## 🛠 Prerequisites
- Node.js v18+
- Ollama installed and running locally
- Ollama model: `llama3` (or configure your preferred model)
## 🚀 Complete Setup & Installation
### ✅ Step 1: Clone and Install
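For example (the repository URL is a placeholder; substitute the actual clone URL):

```shell
# Clone the repository (URL is a placeholder)
git clone <your-repo-url> weather-mcp-server
cd weather-mcp-server

# Install project dependencies
npm install
```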
### ⚙️ Step 2: Set Up the Environment
Edit the `.env` file to contain:
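A minimal example, assuming the variable names `OLLAMA_URL` and `OLLAMA_MODEL` (the names are assumptions; the values match the defaults shown in the server's startup output):

```ini
# Ollama endpoint and model; variable names are illustrative,
# check the project source for the exact names it reads.
OLLAMA_URL=http://localhost:11434/api/generate
OLLAMA_MODEL=llama3
```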
### 🤖 Step 3: Install and Set Up Ollama
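After installing Ollama (see https://ollama.com for platform-specific installers), pull the model and start the local server:

```shell
# Download the llama3 model used by this server
ollama pull llama3

# Start the Ollama server (listens on localhost:11434 by default)
ollama serve
```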
### Step 4: Test the Ollama Connection
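One way to verify that Ollama is reachable is to call its standard `/api/generate` endpoint directly:

```shell
# Send a minimal prompt to the local Ollama API;
# a JSON response confirms the model is loaded and reachable.
curl http://localhost:11434/api/generate \
  -d '{"model": "llama3", "prompt": "Say hello", "stream": false}'
```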
### Step 5: Run Diagnostics
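The project ships a diagnostic script (referenced again in the success checklist below):

```shell
# Check Node, Ollama, and environment configuration in one pass
npm run diagnose
```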
## 🎯 Running the MCP Server
### Start the Server
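Start the server with the project's npm script:

```shell
npm start
```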
Expected output:
```
🚀 MCP Weather Server starting...
📡 Ollama URL: http://localhost:11434/api/generate
🤖 Model: llama3
✅ MCP Server connected and ready!
```
## 🔍 Testing with MCP Inspector
### Method 1: CLI Inspector
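For example, using the official Inspector CLI (the `build/index.js` entry point is an assumption; adjust it to this project's compiled output path):

```shell
# Launch the MCP Inspector against the server's entry point
npx @modelcontextprotocol/inspector node build/index.js
```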
## ⚙️ Configuration

### Environment Variables (`.env` file)

### Performance Tuning
The server is optimized for quick responses:
- 15-second timeout for HTTP requests
- Aggressive Ollama parameters for faster generation
- Fallback from the HTTP API to the CLI if needed
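The timeout behavior above can be sketched as follows. This is a minimal illustration, not the server's actual code: `withTimeout` is a hypothetical helper that races a request against a timer, which is one common way to implement a hard 15-second cap.

```typescript
// Hypothetical sketch: reject a pending request once the deadline passes.
// The real server wraps its Ollama HTTP call in something like this and
// falls back to the CLI when the HTTP path fails.
function withTimeout<T>(work: Promise<T>, ms: number): Promise<T> {
  return new Promise<T>((resolve, reject) => {
    const timer = setTimeout(
      () => reject(new Error(`Request timed out after ${ms} ms`)),
      ms,
    );
    work.then(
      (value) => { clearTimeout(timer); resolve(value); },
      (err) => { clearTimeout(timer); reject(err); },
    );
  });
}
```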
## 🔧 Troubleshooting
### Quick Diagnosis
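A few quick checks, using Ollama's standard CLI and `/api/tags` endpoint alongside the project's diagnostic script:

```shell
# Confirm Ollama is up and see which models are available locally
curl http://localhost:11434/api/tags
ollama list

# Run the project's own diagnostics
npm run diagnose
```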
### Common Issues & Solutions
- **Issue:** "Request timed out" errors
- **Issue:** "Model not found" errors
- **Issue:** Connection errors
- **Issue:** MCP Inspector connection fails
### Performance Tips

## 📁 Project Structure

## ✅ Success Checklist
Complete this checklist to ensure everything is working:
- [ ] Node.js v18+ installed
- [ ] Ollama installed and running (`ollama serve`)
- [ ] Model downloaded (`ollama pull llama3`)
- [ ] Project dependencies installed (`npm install`)
- [ ] Environment configured (`.env` file exists)
- [ ] Diagnostics pass (`npm run diagnose`)
- [ ] MCP server starts successfully (`npm start`)
- [ ] MCP Inspector connects successfully
- [ ] Weather tool responds to a test query
## 🚀 Quick Start Commands
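The full setup condensed into one sequence (the repository URL is a placeholder, and the `.env.example` copy step is hypothetical; create `.env` by hand if no example file exists):

```shell
git clone <your-repo-url> weather-mcp-server
cd weather-mcp-server
npm install
cp .env.example .env   # hypothetical; create .env manually if absent
ollama pull llama3
ollama serve &         # start Ollama in the background
npm start
```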
## 🤝 Contributing
- Fork the repository
- Create a feature branch: `git checkout -b feature-name`
- Make your changes
- Test with MCP Inspector
- Submit a pull request