🚀 ⚡️ locust-mcp-server

A Model Context Protocol (MCP) server implementation for running Locust load tests. This server enables seamless integration of Locust load testing capabilities with AI-powered development environments.

✨ Features

  • Simple integration with Model Context Protocol framework
  • Support for headless and UI modes
  • Configurable test parameters (users, spawn rate, runtime)
  • Easy-to-use API for running Locust load tests
  • Real-time test execution output
  • HTTP/HTTPS protocol support out of the box
  • Custom task scenarios support

🔧 Prerequisites

Before you begin, ensure you have the following installed:

  • Python
  • uv (used below to install dependencies and run the server)
  • Git (to clone the repository)

📦 Installation

  1. Clone the repository:

```bash
git clone https://github.com/qainsights/locust-mcp-server.git
```

  2. Install the required dependencies:

```bash
uv pip install -r requirements.txt
```

  3. Set up environment variables (optional) by creating a .env file in the project root (a sketch of how these defaults might be consumed follows below):

```
LOCUST_HOST=http://localhost:8089  # Default host for your tests
LOCUST_USERS=3                     # Default number of users
LOCUST_SPAWN_RATE=1                # Default user spawn rate
LOCUST_RUN_TIME=10s                # Default test duration
```
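These variables supply defaults when a tool call omits the matching arguments. As a minimal sketch of how they might be consumed (assuming python-dotenv, which this README does not confirm):

```python
import os

from dotenv import load_dotenv  # assumption: python-dotenv reads the .env file

load_dotenv()  # load key=value pairs from .env in the project root

# Fall back to the documented defaults when a variable is unset.
host = os.getenv("LOCUST_HOST", "http://localhost:8089")
users = int(os.getenv("LOCUST_USERS", "3"))
spawn_rate = int(os.getenv("LOCUST_SPAWN_RATE", "1"))
run_time = os.getenv("LOCUST_RUN_TIME", "10s")
```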

🚀 Getting Started

  1. Create a Locust test script (e.g., hello.py):

```python
import time

from locust import HttpUser, task, between


class QuickstartUser(HttpUser):
    wait_time = between(1, 5)

    @task
    def hello_world(self):
        self.client.get("/hello")
        self.client.get("/world")

    @task(3)
    def view_items(self):
        for item_id in range(10):
            self.client.get(f"/item?id={item_id}", name="/item")
            time.sleep(1)

    def on_start(self):
        self.client.post("/login", json={"username": "foo", "password": "bar"})
```
  2. Configure the MCP server using the spec below in your favorite MCP client (Claude Desktop, Cursor, Windsurf, and more), adjusting the uv path and repository directory to match your machine:

```json
{
  "mcpServers": {
    "locust": {
      "command": "/Users/naveenkumar/.local/bin/uv",
      "args": [
        "--directory",
        "/Users/naveenkumar/Gits/locust-mcp-server",
        "run",
        "locust_server.py"
      ]
    }
  }
}
```
  3. Now ask the LLM to run the test, e.g. "run locust test for hello.py". The Locust MCP server will use the following tool to start the test:

  • run_locust: Run a test with configurable options for headless mode, host, runtime, users, and spawn rate (an illustrative tool call is sketched below)
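
For reference, the tool call an MCP client sends might carry a payload shaped like the following. This is a hedged sketch built from the documented run_locust signature (the field layout follows the MCP tools/call convention), not a capture from the server:

```json
{
  "name": "run_locust",
  "arguments": {
    "test_file": "hello.py",
    "headless": true,
    "host": "http://localhost:8089",
    "runtime": "30s",
    "users": 5,
    "spawn_rate": 2
  }
}
```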

📝 API Reference

Run Locust Test

```python
run_locust(
    test_file: str,
    headless: bool = True,
    host: str = "http://localhost:8089",
    runtime: str = "10s",
    users: int = 3,
    spawn_rate: int = 1
)
```

Parameters:

  • test_file: Path to your Locust test script
  • headless: Run in headless mode (True) or with UI (False)
  • host: Target host to load test
  • runtime: Test duration (e.g., "30s", "1m", "5m")
  • users: Number of concurrent users to simulate
  • spawn_rate: Rate at which users are spawned, in users per second (a usage sketch follows this list)
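
For illustration, a direct call using the documented signature might look like the sketch below; the target host here is a placeholder, not an endpoint from this project. The users, spawn_rate, and runtime parameters mirror Locust's own --users, --spawn-rate, and --run-time CLI options.

```python
# Hypothetical invocation of the documented run_locust tool.
run_locust(
    test_file="hello.py",            # the script created in Getting Started
    headless=True,                   # no web UI; stream results to the console
    host="https://api.example.com",  # placeholder target, substitute your own
    runtime="1m",                    # run for one minute
    users=10,                        # simulate 10 concurrent users
    spawn_rate=2,                    # start 2 users per second
)
```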

✨ Use Cases

  • LLM-powered analysis of load test results
  • More effective debugging of performance issues with LLM assistance

🤝 Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.
