Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type `@` followed by the MCP server name and your instructions, e.g., "@tes-mcp-server get the 7-day weather forecast for London".
That's it! The server will respond to your query, and you can continue using it as needed.
Here is a step-by-step guide with screenshots.
# tes-mcp-server MCP Server
This is an MCP (Model Context Protocol) server that provides access to the tes-mcp-server API. It enables AI agents and LLMs to interact with tes-mcp-server through standardized tools.
## Features
🔧 MCP Protocol: Built on the Model Context Protocol for seamless AI integration
🌐 Full API Access: Provides tools for interacting with tes-mcp-server endpoints
🐳 Docker Support: Easy deployment with Docker and Docker Compose
⚡ Async Operations: Built with FastMCP for efficient async handling
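The async handling mentioned above can be sketched with plain `asyncio` (no FastMCP dependency; `fetch_forecast` is a hypothetical stand-in for a real API call):

```python
import asyncio

async def fetch_forecast(city: str) -> str:
    """Hypothetical stand-in for an async call to the weather API."""
    await asyncio.sleep(0.1)  # simulate network latency
    return f"7-day forecast for {city}"

async def main() -> list[str]:
    # Async handling lets several tool calls overlap instead of running
    # serially, so two 0.1 s "API calls" finish in roughly 0.1 s total.
    return await asyncio.gather(
        fetch_forecast("London"),
        fetch_forecast("Paris"),
    )

results = asyncio.run(main())
print(results)
```

FastMCP manages its own event loop internally; this sketch only illustrates why async tool handlers keep the server responsive under concurrent requests.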
## API Documentation

- tes-mcp-server Website: https://date.nager.at
- API Documentation: https://weather-mcp.example.com/docs
## Available Tools

This server provides the following tools:

- `example_tool`: Placeholder tool (to be implemented)

Note: Replace `example_tool` with the real tool implementations as they are added.
## Installation

### Using Docker (Recommended)
1. Clone this repository:
```bash
git clone https://github.com/Traia-IO/tes-mcp-server-mcp-server.git
cd tes-mcp-server-mcp-server
```
2. Run with Docker:
```bash
./run_local_docker.sh
```
### Using Docker Compose
1. Create a `.env` file with your configuration:
```bash
PORT=8000
```
2. Start the server:
```bash
docker-compose up
```

### Manual Installation

1. Install dependencies using `uv`:
```bash
uv pip install -e .
```
2. Run the server:
```bash
uv run python -m server
```
## Usage
### Health Check
Test if the server is running:
```bash
python mcp_health_check.py
```

### Using with CrewAI
```python
from traia_iatp.mcp.traia_mcp_adapter import create_mcp_adapter

# Connect to the MCP server
with create_mcp_adapter(
    url="http://localhost:8000/mcp/"
) as tools:
    # Use the tools
    for tool in tools:
        print(f"Available tool: {tool.name}")

        # Example usage (tool calls are awaited, so run this inside an
        # async function)
        result = await tool.example_tool(query="test")
        print(result)
```

## Development
### Testing the Server

1. Start the server locally
2. Run the health check:
```bash
python mcp_health_check.py
```
3. Test individual tools using the CrewAI adapter
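As a rough idea of what the health check involves, here is a minimal stdlib sketch (hypothetical; the real `mcp_health_check.py` ships with the repo and may differ):

```python
import urllib.error
import urllib.request

def mcp_url(host: str = "localhost", port: int = 8000) -> str:
    """Build the endpoint URL the server exposes (see the Deployment section)."""
    return f"http://{host}:{port}/mcp/"

def is_alive(url: str, timeout: float = 2.0) -> bool:
    """Return True if anything answers HTTP at the given URL."""
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        return True   # the server responded; even a 4xx means it is up
    except (urllib.error.URLError, OSError):
        return False  # connection refused, DNS failure, or timeout

if __name__ == "__main__":
    url = mcp_url()
    print(f"{url} alive: {is_alive(url)}")
```

Note that a 4xx response still counts as alive: a streamable-HTTP MCP endpoint may reject a plain GET, but any HTTP response proves the process is listening.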
### Adding New Tools

To add new tools, edit `server.py` and:

1. Create API client functions for tes-mcp-server endpoints
2. Add `@mcp.tool()`-decorated functions
3. Update this README with the new tools
4. Update `deployment_params.json` with the tool names in the `capabilities` array
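As a rough illustration of the registration pattern, here is a plain-dict stand-in for the FastMCP `@mcp.tool()` decorator, so it runs without FastMCP installed (`get_weather_forecast` is a hypothetical example tool, not part of this repo):

```python
from typing import Callable

# Stand-in registry mimicking what a FastMCP instance does when you apply
# @mcp.tool(): the decorated function is registered under its own name.
TOOLS: dict[str, Callable] = {}

def tool() -> Callable:
    def register(fn: Callable) -> Callable:
        TOOLS[fn.__name__] = fn
        return fn
    return register

@tool()
def get_weather_forecast(location: str, days: int = 7) -> dict:
    """Hypothetical tool: a real version would call the weather API here."""
    return {"location": location, "days": days}

# These registered names are what belongs in the capabilities array
# of deployment_params.json.
print(sorted(TOOLS))
print(TOOLS["get_weather_forecast"]("London"))
```

The real decorator comes from the FastMCP instance in `server.py`; this sketch only shows why the function name you decorate and the name listed under `capabilities` must match.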
## Deployment

### Deployment Configuration

The `deployment_params.json` file contains the deployment configuration for this MCP server:
```json
{
  "github_url": "https://github.com/Traia-IO/tes-mcp-server-mcp-server",
  "mcp_server": {
    "name": "tes-mcp-server-mcp",
    "description": "This MCP server exposes public weather and climate data through a standardized MCP-compatible API. It allows AI agents to retrieve current weather conditions, 7-day forecasts, and historical climate data for supported locations worldwide. Use cases include: AI-powered travel planning, weather-aware automation, and data enrichment for conversational agents.",
    "server_type": "streamable-http",
    "capabilities": [
      "example_tool"
    ]
  },
  "deployment_method": "cloud_run",
  "gcp_project_id": "traia-mcp-servers",
  "gcp_region": "us-central1",
  "tags": ["tes-mcp-server", "api"],
  "ref": "main"
}
```

Important: Always update the `capabilities` array with all implemented tool names when you add or remove tools!
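To guard against the capabilities array drifting out of sync, a small check script can compare the declared capabilities against the tools you actually implemented (a hypothetical helper, not part of the repo; the implemented tool set is hard-coded here for illustration):

```python
import json

# Minimal slice of deployment_params.json, matching the example above.
params = json.loads("""
{
  "mcp_server": {
    "name": "tes-mcp-server-mcp",
    "capabilities": ["example_tool"]
  }
}
""")

# In practice this set would be collected from the @mcp.tool() functions
# defined in server.py; it is hard-coded for the sketch.
implemented = {"example_tool"}

declared = set(params["mcp_server"]["capabilities"])
missing = implemented - declared
stale = declared - implemented
assert not missing and not stale, f"out of sync: missing={missing}, stale={stale}"
print("capabilities in sync:", sorted(declared))
```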
### Google Cloud Run

This server is designed to be deployed on Google Cloud Run. The deployment will:

1. Build a container from the Dockerfile
2. Deploy to Cloud Run with the specified configuration
3. Expose the `/mcp` endpoint for client connections
## Environment Variables

- `PORT`: Server port (default: `8000`)
- `STAGE`: Environment stage (default: `MAINNET`; options: `MAINNET`, `TESTNET`)
- `LOG_LEVEL`: Logging level (default: `INFO`)
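For local runs, these variables can go in the `.env` file mentioned in the Docker Compose section (the values below are just the documented defaults):

```bash
PORT=8000
STAGE=MAINNET      # or TESTNET
LOG_LEVEL=INFO
```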
## Troubleshooting

1. Server not starting: Check the Docker logs with `docker logs <container-id>`
2. Connection errors: Ensure the server is running on the expected port
3. Tool errors: Check the server logs for detailed error messages
## Contributing

1. Fork the repository
2. Create a feature branch
3. Implement new tools or improvements
4. Update the README and `deployment_params.json`
5. Submit a pull request