# Nikola TEST MCP MCP Server
This is an MCP (Model Context Protocol) server that provides access to the Nikola TEST MCP API. It enables AI agents and LLMs to interact with Nikola TEST MCP through standardized tools.
## Features

- 🔧 **MCP Protocol**: Built on the Model Context Protocol for seamless AI integration
- 🌐 **Full API Access**: Provides tools for interacting with Nikola TEST MCP endpoints
- 🐳 **Docker Support**: Easy deployment with Docker and Docker Compose
- ⚡ **Async Operations**: Built with FastMCP for efficient async handling
## API Documentation

- Nikola TEST MCP Website: https://petstore.swagger.io/
- API Documentation:
## Available Tools

This server provides the following tools:

- `example_tool`: Placeholder tool (to be implemented)
- `get_api_info`: Get information about the API service and authentication status

**Note**: Replace the placeholder `example_tool` with real tools for the Nikola TEST MCP API (see "Adding New Tools" below).
## Installation

### Using Docker (Recommended)

1. Clone this repository:

   ```bash
   git clone https://github.com/Traia-IO/nikola-test-mcp-mcp-server.git
   cd nikola-test-mcp-mcp-server
   ```

2. Run with Docker:

   ```bash
   ./run_local_docker.sh
   ```
### Using Docker Compose

1. Create a `.env` file with your configuration:

   ```bash
   PORT=8000
   ```

2. Start the service:

   ```bash
   docker compose up
   ```
### Manual Installation

1. Install dependencies using `uv`:

   ```bash
   uv pip install -e .
   ```

2. Run the server:

   ```bash
   uv run python -m server
   ```
## Using with CrewAI
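A minimal sketch of wiring this server's tools into a CrewAI agent. It assumes the `crewai` and `crewai-tools` packages are installed and the server is running locally on port 8000; the agent's role, goal, and backstory strings are illustrative, not part of the generated code.

```python
# Sketch only: assumes crewai and crewai-tools are installed and the
# MCP server is running locally on port 8000.

def build_server_params(host: str = "localhost", port: int = 8000) -> dict:
    """Connection parameters for this server's /mcp endpoint."""
    return {
        "url": f"http://{host}:{port}/mcp",
        "transport": "streamable-http",  # FastMCP's HTTP transport
    }

def run_agent():
    # Imported lazily so this module loads even without crewai installed.
    from crewai import Agent
    from crewai_tools import MCPServerAdapter

    with MCPServerAdapter(build_server_params()) as tools:
        agent = Agent(
            role="API assistant",
            goal="Answer questions using the Nikola TEST MCP tools",
            backstory="Talks to the MCP server over streamable HTTP.",
            tools=list(tools),
        )
        # ...add the agent to a Crew and kick off a task
        return agent
```

Using the adapter as a context manager ensures the connection to the server is closed when the agent is done.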
## Development

### Testing the Server

1. Start the server locally:

   ```bash
   uv run python -m server
   ```

2. Run the health check:

   ```bash
   python mcp_health_check.py
   ```

3. Test individual tools using the CrewAI adapter.
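The repository's `mcp_health_check.py` is not reproduced here; as a rough sketch, a minimal check might simply confirm that something is answering HTTP requests on the `/mcp` endpoint (any HTTP status, even an error code, counts as "up"):

```python
# Hypothetical sketch of a minimal health check; the repository's
# mcp_health_check.py may work differently.
import sys
import urllib.error
import urllib.request

def check(url: str = "http://localhost:8000/mcp", timeout: float = 5.0) -> bool:
    """Return True if the endpoint responds to an HTTP request at all."""
    try:
        with urllib.request.urlopen(url, timeout=timeout):
            return True
    except urllib.error.HTTPError:
        # The server responded with an error status, so it is up.
        return True
    except (urllib.error.URLError, OSError):
        # Connection refused, timeout, DNS failure, etc.
        return False

if __name__ == "__main__":
    sys.exit(0 if check() else 1)
```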
### Adding New Tools

To add new tools, edit `server.py` and:

1. Create API client functions for Nikola TEST MCP endpoints
2. Add `@mcp.tool()`-decorated functions
3. Update this README with the new tools
4. Update `deployment_params.json` with the tool names in the `capabilities` array
## Deployment

### Deployment Configuration

The `deployment_params.json` file contains the deployment configuration for this MCP server.

**Important**: Always update the `capabilities` array when you add or remove tools!
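The exact schema depends on the deployment pipeline; a hypothetical fragment (field names other than `capabilities` are illustrative) might look like:

```json
{
  "name": "nikola-test-mcp-mcp-server",
  "port": 8000,
  "capabilities": ["example_tool", "get_api_info"]
}
```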
### Google Cloud Run

This server is designed to be deployed on Google Cloud Run. The deployment will:

1. Build a container from the Dockerfile
2. Deploy to Cloud Run with the specified configuration
3. Expose the `/mcp` endpoint for client connections
## Environment Variables

- `PORT`: Server port (default: 8000)
- `STAGE`: Environment stage (default: MAINNET; options: MAINNET, TESTNET)
- `LOG_LEVEL`: Logging level (default: INFO)
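A minimal sketch of how the server might read these at startup; the variable names match the list above, and the fallback values are the documented defaults:

```python
import os

# Read configuration from the environment, falling back to the
# documented defaults when a variable is unset.
PORT = int(os.environ.get("PORT", "8000"))
STAGE = os.environ.get("STAGE", "MAINNET")
LOG_LEVEL = os.environ.get("LOG_LEVEL", "INFO")
```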
## Troubleshooting

1. **Server not starting**: Check Docker logs with `docker logs <container-id>`
2. **Connection errors**: Ensure the server is running on the expected port
3. **Tool errors**: Check the server logs for detailed error messages
## Contributing

1. Fork the repository
2. Create a feature branch
3. Implement new tools or improvements
4. Update the README and `deployment_params.json`
5. Submit a pull request
## License