# MCP Multi-API Server 🌉
A Model Context Protocol (MCP) server that bridges AI/LLMs with multiple real-world APIs including weather, finance, and news services. This server acts as a standardized interface, allowing any MCP-compatible AI application to seamlessly interact with external APIs without custom integration work.
## 🚀 Features
- Multi-API Support: Weather (OpenWeatherMap), Finance (Alpha Vantage), News (NewsAPI)
- MCP Protocol Compliant: Full implementation of Anthropic's MCP standard
- Intelligent Caching: Configurable TTL-based caching to reduce API calls
- Rate Limiting: Built-in rate limiting to respect API quotas
- Comprehensive Logging: Winston-based logging with multiple transports
- Error Handling: Robust error handling with meaningful error messages
- TypeScript: Fully typed for better developer experience
## 📋 Prerequisites
- Node.js 18+
- npm or yarn
- API keys for:
  - OpenWeatherMap
  - Alpha Vantage
  - NewsAPI
## 🛠️ Installation
- Clone the repository (a combined command sketch for all of these steps follows the list):
- Install dependencies:
- Copy the environment template:
- Edit `.env` and add your API keys:
- Build the project:
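Taken together, the steps above look roughly like this (a sketch: the repository URL, directory name, template filename, and `build` script are assumptions, so adjust them to match the actual project):

```bash
# 1. Clone the repository (URL and directory name are placeholders)
git clone <repository-url>
cd mcp-multi-api-server

# 2. Install dependencies (yarn works too)
npm install

# 3. Copy the environment template (assumed to be .env.example) and fill in your API keys
cp .env.example .env

# 4. Build the TypeScript sources (assumes a standard "build" script)
npm run build
```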
## 🚀 Usage

### Starting the Server
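Assuming the conventional scripts for a built Node/TypeScript project (the script name and the `dist/index.js` entry point are assumptions):

```bash
npm start
# or run the built entry point directly
node dist/index.js
```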
For development with auto-reload:
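A typical setup wires this to a `dev` script (the exact script name is an assumption):

```bash
npm run dev
```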
## 🏗️ Architecture

## ⚙️ Configuration

### Environment Variables
| Variable | Description | Default |
|---|---|---|
| `OPENWEATHER_API_KEY` | OpenWeatherMap API key | Required |
| `ALPHA_VANTAGE_API_KEY` | Alpha Vantage API key | Required |
| `NEWS_API_KEY` | NewsAPI key | Required |
| `MCP_SERVER_PORT` | Server port | 3000 |
| `LOG_LEVEL` | Logging level | info |
| `CACHE_TTL_WEATHER` | Weather cache TTL (seconds) | 300 |
| `CACHE_TTL_FINANCE` | Finance cache TTL (seconds) | 60 |
| `CACHE_TTL_NEWS` | News cache TTL (seconds) | 600 |
| `RATE_LIMIT_REQUESTS` | Maximum requests per rate-limit window | 100 |
| `RATE_LIMIT_WINDOW` | Rate-limit window (ms) | 60000 |
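For reference, a `.env` assembled from the table above might look like this (the key values are placeholders):

```env
OPENWEATHER_API_KEY=your_openweathermap_key
ALPHA_VANTAGE_API_KEY=your_alpha_vantage_key
NEWS_API_KEY=your_newsapi_key

# Optional overrides (defaults shown)
MCP_SERVER_PORT=3000
LOG_LEVEL=info
CACHE_TTL_WEATHER=300
CACHE_TTL_FINANCE=60
CACHE_TTL_NEWS=600
RATE_LIMIT_REQUESTS=100
RATE_LIMIT_WINDOW=60000
```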
## 🧪 Testing
Run the test script:
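Assuming the tests are wired to the standard npm hook:

```bash
npm test
```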
### Using MCP Inspector (Visual Testing)
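One way to launch the Inspector against the built server (a sketch: `dist/index.js` is assumed to be the built entry point):

```bash
npx @modelcontextprotocol/inspector node dist/index.js
```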
This opens a web interface at http://localhost:5173 where you can:
- See all available tools
- Test tool calls interactively
- View request/response logs
### Manual STDIO Testing
Run:
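A minimal sketch of a hand-driven session, assuming `dist/index.js` is the built entry point and the server speaks line-delimited JSON-RPC over STDIO:

```bash
(
  # Initialize the MCP session, acknowledge it, then list the available tools
  echo '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"manual-test","version":"0.0.1"}}}'
  echo '{"jsonrpc":"2.0","method":"notifications/initialized"}'
  echo '{"jsonrpc":"2.0","id":2,"method":"tools/list"}'
) | node dist/index.js
```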
#### Test Finance API
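A hypothetical `tools/call` request for a stock quote, sent in the same session after the handshake above (the tool name `get_stock_quote` and its arguments are illustrative, not necessarily the server's real tool names):

```bash
# Hypothetical tool name and arguments; check tools/list for the actual schema
echo '{"jsonrpc":"2.0","id":3,"method":"tools/call","params":{"name":"get_stock_quote","arguments":{"symbol":"AAPL"}}}'
```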
#### Test News API
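Likewise, a hypothetical call for headlines (tool name and arguments are illustrative):

```bash
# Hypothetical tool name and arguments; check tools/list for the actual schema
echo '{"jsonrpc":"2.0","id":4,"method":"tools/call","params":{"name":"get_news_headlines","arguments":{"query":"technology"}}}'
```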