1. Click on **Install Server**.
2. Wait a few minutes for the server to deploy. Once ready, it will show a **Started** state.
3. In the chat, type `@` followed by the MCP server name and your instructions, e.g., `@tes-mcp-server get the 7-day weather forecast for London`.

That's it! The server will respond to your query, and you can continue using it as needed.
Here is a step-by-step guide with screenshots.
# tes-mcp-server MCP Server
This is an MCP (Model Context Protocol) server that provides access to the tes-mcp-server API. It enables AI agents and LLMs to interact with tes-mcp-server through standardized tools.
## Features

- 🔧 **MCP Protocol**: Built on the Model Context Protocol for seamless AI integration
- 🌐 **Full API Access**: Provides tools for interacting with tes-mcp-server endpoints
- 🐳 **Docker Support**: Easy deployment with Docker and Docker Compose
- ⚡ **Async Operations**: Built with FastMCP for efficient async handling
## API Documentation

- tes-mcp-server Website: https://date.nager.at
- API Documentation: https://weather-mcp.example.com/docs
## Available Tools

This server provides the following tools:

- `example_tool`: Placeholder tool (to be implemented)

**Note:** Replace the placeholder `example_tool` with the tools you actually implement, and keep this list in sync with `server.py`.
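A tool here is just a decorated Python function. The sketch below shows the shape of a placeholder tool; the FastMCP import path and decorator are noted in comments as assumptions about `server.py`, not its actual contents:

```python
# Sketch of a placeholder tool as it might appear in server.py.
# Assumed registration (kept in comments so the sketch stays self-contained):
#
#   from mcp.server.fastmcp import FastMCP   # import path may differ
#   mcp = FastMCP("tes-mcp-server")
#
# Each tool function is then decorated with @mcp.tool().

def example_tool(text: str) -> str:
    """Placeholder tool: echoes its input until real tools are implemented."""
    return f"echo: {text}"
```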
## Installation

### Using Docker (Recommended)

1. Clone this repository:

   ```bash
   git clone https://github.com/Traia-IO/tes-mcp-server-mcp-server.git
   cd tes-mcp-server-mcp-server
   ```

2. Run with Docker:

   ```bash
   ./run_local_docker.sh
   ```
### Using Docker Compose

Create a `.env` file with your configuration:

```
PORT=8000
```
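The repository's compose file isn't shown in this README; a minimal `docker-compose.yml` along these lines would pick up that `.env` (the service name and port mapping are illustrative assumptions, not the project's actual file):

```yaml
# Hypothetical docker-compose.yml sketch
services:
  tes-mcp-server:
    build: .
    env_file: .env
    ports:
      - "${PORT:-8000}:8000"
```

Then start it with `docker compose up -d`.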
### Manual Installation

1. Install dependencies using `uv`:

   ```bash
   uv pip install -e .
   ```

2. Run the server:

   ```bash
   uv run python -m server
   ```
## Using with CrewAI
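This section is a stub upstream. As a sketch, connecting from CrewAI typically means pointing an MCP adapter at the server's `/mcp` endpoint; `MCPServerAdapter` and the parameter shape below are assumptions about `crewai-tools` — check the version you have installed before relying on them:

```python
def mcp_server_params(base_url: str) -> dict:
    """Connection parameters for a streamable-HTTP MCP endpoint (assumed shape)."""
    return {"url": f"{base_url.rstrip('/')}/mcp", "transport": "streamable-http"}

params = mcp_server_params("http://localhost:8000")

# Hypothetical usage with crewai-tools (left commented; the API may differ):
# from crewai_tools import MCPServerAdapter
# with MCPServerAdapter(params) as tools:
#     agent = Agent(role="Assistant", tools=tools, ...)
```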
## Development

### Testing the Server
1. Start the server locally.
2. Run the health check:

   ```bash
   python mcp_health_check.py
   ```

3. Test individual tools using the CrewAI adapter.
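`mcp_health_check.py` isn't reproduced here; a standard-library stand-in for the same idea might look like this (the `/mcp` path comes from the Deployment section; treating any non-5xx response as healthy is an assumption):

```python
import urllib.error
import urllib.request

def check_health(base_url: str = "http://localhost:8000", timeout: float = 5.0) -> bool:
    """Return True if the server answers on its /mcp endpoint without a 5xx error."""
    try:
        with urllib.request.urlopen(f"{base_url.rstrip('/')}/mcp", timeout=timeout) as resp:
            return resp.status < 500
    except urllib.error.HTTPError as err:
        return err.code < 500          # a 4xx still means the server is up
    except OSError:
        return False                   # connection refused, DNS failure, timeout
```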
### Adding New Tools

To add new tools, edit `server.py` and:

1. Create API client functions for tes-mcp-server endpoints
2. Add `@mcp.tool()`-decorated functions
3. Update this README with the new tools
4. Update `deployment_params.json` with the tool names in the `capabilities` array
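For example, a client function for the date.nager.at API linked above might look like the following; the function names, URL helper, and return shape are illustrative rather than the project's actual code, and the function would be registered in `server.py` with `@mcp.tool()`:

```python
import json
import urllib.request

def holidays_url(year: int, country_code: str) -> str:
    """Build the date.nager.at v3 public-holidays endpoint URL."""
    return f"https://date.nager.at/api/v3/PublicHolidays/{year}/{country_code.upper()}"

def get_public_holidays(year: int, country_code: str) -> list:
    """Fetch public holidays for a year and ISO country code (e.g. 2024, 'GB')."""
    with urllib.request.urlopen(holidays_url(year, country_code)) as resp:
        return json.load(resp)
```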
## Deployment

### Deployment Configuration

The `deployment_params.json` file contains the deployment configuration for this MCP server.

**Important:** Always update the `capabilities` array when you add or remove tools!
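The file's full schema isn't shown in this README; a hypothetical fragment, just to illustrate where the `capabilities` array lives (field names other than `capabilities` are assumptions):

```json
{
  "name": "tes-mcp-server",
  "capabilities": ["example_tool"]
}
```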
### Google Cloud Run

This server is designed to be deployed on Google Cloud Run. The deployment will:

1. Build a container from the `Dockerfile`
2. Deploy to Cloud Run with the specified configuration
3. Expose the `/mcp` endpoint for client connections
### Environment Variables

- `PORT`: Server port (default: `8000`)
- `STAGE`: Environment stage (default: `MAINNET`; options: `MAINNET`, `TESTNET`)
- `LOG_LEVEL`: Logging level (default: `INFO`)
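How `server.py` might read these variables — a sketch with the documented defaults, not the project's actual code:

```python
import logging
import os

PORT = int(os.environ.get("PORT", "8000"))
STAGE = os.environ.get("STAGE", "MAINNET")        # MAINNET or TESTNET
LOG_LEVEL = os.environ.get("LOG_LEVEL", "INFO")

# Fall back to INFO if LOG_LEVEL is not a valid logging level name.
logging.basicConfig(level=getattr(logging, LOG_LEVEL.upper(), logging.INFO))
```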
## Troubleshooting

1. **Server not starting**: Check Docker logs with `docker logs <container-id>`
2. **Connection errors**: Ensure the server is running on the expected port
3. **Tool errors**: Check the server logs for detailed error messages
## Contributing

1. Fork the repository
2. Create a feature branch
3. Implement new tools or improvements
4. Update the README and `deployment_params.json`
5. Submit a pull request