# Testmo MCP Server

A **Model Context Protocol (MCP)** server that provides seamless integration with the [Testmo](https://www.testmo.com/) test management platform. Built with [FastMCP](https://github.com/jlowin/fastmcp), this server enables AI assistants to interact directly with your Testmo instance for test case management.

![Python](https://img.shields.io/badge/Python-3.10+-blue.svg) ![MCP](https://img.shields.io/badge/MCP-Compatible-green.svg) ![License](https://img.shields.io/badge/License-MIT-yellow.svg)

## Features

- 📋 **Test Case Management** - Create, read, update, and delete test cases
- 📁 **Folder Organization** - Create and manage folders to organize test cases
- 🔍 **Project Discovery** - List projects, templates, and milestones
- ✅ **Step-by-Step Tests** - Create detailed test cases with steps and expected results
- 🏃 **Test Run Support** - List and manage test runs
- 🔧 **Debug Tools** - Built-in API debugging capabilities

## Prerequisites

- Python 3.10+
- A [Testmo](https://www.testmo.com/) account with API access
- An API token from your Testmo instance

## Installation

### 1. Clone the Repository

```bash
git clone https://github.com/filipljoljic/Testmo-MCP.git
cd Testmo-MCP
```

### 2. Create a Virtual Environment

```bash
python3 -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
```

### 3. Install Dependencies

```bash
pip install -r requirements.txt
```

### 4. Configure Environment Variables

Set your Testmo credentials as environment variables:

```bash
export TESTMO_BASE_URL="https://your-instance.testmo.net"
export TESTMO_TOKEN="your_api_token_here"
```

**Getting your API token:**

1. Log into your Testmo instance
2. Go to **Settings** → **API**
3. Generate a new API token
4. Copy the token and use it in the environment variable

## Usage

### Running the MCP Server

```bash
python3 testmoMCP.py
```

The server will start and be ready to accept MCP connections from compatible AI assistants.
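Under the hood, `testmoMCP.py` is a [FastMCP](https://github.com/jlowin/fastmcp) application that wraps Testmo REST calls as MCP tools. The sketch below shows the general shape of such a tool; it is illustrative only, not the project's actual code, and the HTTP client (`httpx`), the bearer-token header, and the `result` response field are assumptions rather than guarantees.

```python
# Illustrative sketch only; the real implementation lives in testmoMCP.py.
import os

import httpx  # assumed HTTP client; the project may use a different one
from fastmcp import FastMCP

mcp = FastMCP("Testmo")

BASE_URL = os.environ["TESTMO_BASE_URL"]
TOKEN = os.environ["TESTMO_TOKEN"]


@mcp.tool()
def list_projects() -> str:
    """List all Testmo projects with their IDs."""
    # Bearer-token auth and the `result` envelope follow the Testmo API docs,
    # but treat both as assumptions for this sketch.
    resp = httpx.get(
        f"{BASE_URL}/api/v1/projects",
        headers={"Authorization": f"Bearer {TOKEN}"},
    )
    resp.raise_for_status()
    projects = resp.json().get("result", [])
    return "\n".join(f"ID: {p['id']} | Name: {p['name']}" for p in projects)


if __name__ == "__main__":
    mcp.run()  # stdio transport by default, which is what MCP clients expect
```

Every function registered this way is advertised to the connected MCP client as a callable tool, which is where the tool names in the section below come from.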
### Integrating with Claude Desktop

Add the following to your Claude Desktop configuration (`claude_desktop_config.json`):

```json
{
  "mcpServers": {
    "testmo": {
      "command": "python3",
      "args": ["/path/to/Testmo-MCP/testmoMCP.py"],
      "env": {
        "TESTMO_BASE_URL": "https://your-instance.testmo.net",
        "TESTMO_TOKEN": "your_api_token_here"
      }
    }
  }
}
```

### Integrating with Cursor

Add to your Cursor MCP settings:

```json
{
  "mcpServers": {
    "testmo": {
      "command": "python3",
      "args": ["/path/to/Testmo-MCP/testmoMCP.py"],
      "env": {
        "TESTMO_BASE_URL": "https://your-instance.testmo.net",
        "TESTMO_TOKEN": "your_api_token_here"
      }
    }
  }
}
```

## Available Tools

### Project Management

| Tool | Description |
|------|-------------|
| `list_projects` | List all projects to find project IDs |
| `get_project_templates` | Get available templates for a project |
| `get_project_folders` | List folders within a project |
| `get_milestones` | Get milestones for a project |

### Test Case Operations

| Tool | Description |
|------|-------------|
| `create_test_case` | Create a new test case with steps |
| `create_test_case_with_steps_json` | Create a test case with JSON-formatted steps |
| `create_test_case_raw` | Create a test case from a raw JSON payload |
| `get_test_case` | Get details of a specific test case |
| `list_test_cases` | List test cases in a project |
| `update_test_case` | Update an existing test case |
| `delete_test_cases` | Delete one or more test cases |

### Folder Management

| Tool | Description |
|------|-------------|
| `create_folder` | Create a new folder for organizing test cases |

### Test Run Operations

| Tool | Description |
|------|-------------|
| `list_test_runs` | List test runs in a project |
| `create_manual_test_run` | Create a manual test run (note: limited API support) |

### Debugging

| Tool | Description |
|------|-------------|
| `debug_api_test` | Test API connectivity and view configuration |

## Examples

### Creating a Test Case with Steps

Using natural language with an AI assistant:

> "Create a test case called 'User Login Flow' with steps to navigate to login, enter credentials, and verify successful login"

The assistant will use the `create_test_case` tool with:

```
project_id: 1
title: "User Login Flow"
description: "Verify user can successfully log in to the application"
steps: "1. Navigate to login page | Expected: Login page loads
2. Enter valid credentials | Expected: Fields accept input
3. Click Login button | Expected: User is logged in and redirected to dashboard"
```

### Listing Projects

> "Show me all available Testmo projects"

The assistant will call `list_projects` and return:

```
Found Projects:
ID: 1 | Name: Web Application | Active: True
ID: 2 | Name: Mobile App | Active: True
```

### Creating Organized Test Cases

> "Create a folder called 'Authentication' in project 1, then create a test case inside it"

```python
# First create the folder
create_folder(project_id=1, name="Authentication")

# Then create the test case in that folder
create_test_case(project_id=1, folder_id=<new_folder_id>, title="Login Test", ...)
```

## API Reference

This MCP server integrates with the [Testmo REST API v1](https://docs.testmo.com/docs/api/rest-api).
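For a quick sanity check of your credentials, you can call the projects endpoint directly with the same environment variables the server reads (bearer-token authentication is assumed here; consult the Testmo API docs if your instance is configured differently):

```bash
curl -H "Authorization: Bearer $TESTMO_TOKEN" \
     "$TESTMO_BASE_URL/api/v1/projects"
```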
Key endpoints used:

- `GET /api/v1/projects` - List projects
- `POST /api/v1/projects/{id}/cases` - Create test cases
- `GET /api/v1/projects/{id}/cases` - List test cases
- `PATCH /api/v1/projects/{id}/cases` - Update test cases
- `DELETE /api/v1/projects/{id}/cases` - Delete test cases
- `GET /api/v1/projects/{id}/folders` - List folders
- `POST /api/v1/projects/{id}/folders` - Create folders

## Priority Values

When creating test cases, use these priority IDs:

| ID | Priority |
|----|----------|
| 1 | Low |
| 2 | Medium |
| 3 | High |
| 4 | Critical |

## Time Estimates

Estimates are specified in **minutes**:

| Value | Duration |
|-------|----------|
| 15 | 15 minutes |
| 60 | 1 hour |
| 480 | 8 hours (1 day) |

## Troubleshooting

### API Connection Issues

1. Verify your `TESTMO_BASE_URL` is correct (it should look like `https://your-instance.testmo.net`)
2. Check that your API token is valid and has appropriate permissions
3. Use the `debug_api_test` tool to diagnose connectivity issues

### Test Case Creation Fails

1. Ensure you're using a valid `project_id` (use `list_projects` first)
2. For step-based test cases, get the correct `template_id` using `get_project_templates`
3. Verify the folder exists if specifying a `folder_id`

### Steps Not Showing

Make sure to:

1. Use the "Case (steps)" template - get its ID via `get_project_templates`
2. Format steps correctly: `"1. Step description | Expected: expected result"`

## Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

## License

This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.

## Acknowledgments

- [Testmo](https://www.testmo.com/) for providing a great test management platform
- [FastMCP](https://github.com/jlowin/fastmcp) for the MCP server framework
- [Anthropic](https://www.anthropic.com/) for the Model Context Protocol specification
