# H3 CLI MCP Server

An MCP server that lets AI assistants and LLMs interact with the Horizon3.ai API using the official h3-cli tool.
## What is this?

This MCP server exposes the full power of the h3-cli to your AI coding assistant (Claude, Cursor, VS Code, etc.). It enables:
- Scheduling and running pentests
- Querying pentest results, weaknesses, impacts, hosts, credentials, and more
- Automating security workflows and reporting
- All via natural language and LLM tools
> Note: You must have a working `h3-cli` installed and authenticated on your system. This server is a thin wrapper and does not manage your API keys or CLI installation.
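Conceptually, a thin wrapper like this just shells out to the locally installed CLI and returns its output. The sketch below illustrates that idea only; the function name, signature, and error handling are assumptions, not the server's actual source:

```python
import shutil
import subprocess


def run_cli(args, binary="h3"):
    """Illustrative sketch: run a local CLI binary and return its output.

    NOTE: this is an assumption about how a thin wrapper works in general,
    not the real implementation of this MCP server.
    """
    path = shutil.which(binary)
    if path is None:
        # Mirrors the "h3 not found" failure mode covered in Troubleshooting.
        raise FileNotFoundError(f"{binary} not found on PATH; install h3-cli first")
    proc = subprocess.run([path, *args], capture_output=True, text=True)
    return proc.returncode, proc.stdout
```

Because the wrapper only shells out, your API key stays wherever h3-cli already stores it; nothing secret passes through the server's own configuration.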
## Quick Copy-Paste: Add to Your MCP Client

Add this to your MCP client configuration (e.g., Cursor, Claude Desktop, Windsurf, etc.):
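A typical `uvx`-based entry looks like the following sketch. The package name `h3-mcp-server` is a placeholder assumption here; substitute the actual published package name:

```json
{
  "mcpServers": {
    "h3-cli": {
      "command": "uvx",
      "args": ["h3-mcp-server"]
    }
  }
}
```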
- No need to clone or build this repo manually; `uvx` will fetch and run the latest version automatically.
- For advanced usage, see below.
## Features
- Full h3-cli API access: Everything you can do with the CLI, you can do via LLM tools.
- GraphQL documentation: Fetch up-to-date docs for all available queries and mutations.
- Parameter validation: Clear error messages and examples for all tool inputs.
- Prompt templates: Built-in guidance for pagination, pivots, and common workflows.
- Works with any MCP-compatible client: Claude, Cursor, Windsurf, VS Code, and more.
## Tools Provided

| Tool Name | Description |
| --- | --- |
| `run_h3_command` | Run any h3-cli command and return the output. |
| `fetch_graphql_docs` | Fetch GraphQL schema/docs for any query, mutation, or type. |
| `run_graphql_request` | Run a raw GraphQL query with variables and get the result. |
| `health_check` | Check h3-cli installation and API connectivity. |
See your client’s tool discovery UI for full parameter details and examples.
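Under the hood, an MCP client invokes these tools with a JSON-RPC `tools/call` request. The shape below follows the MCP specification, but the `command` argument name and the example h3 subcommand are assumptions for illustration:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "run_h3_command",
    "arguments": { "command": "hello-world" }
  }
}
```

Your client builds these requests for you from natural language; you normally never write them by hand.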
## Usage with VS Code, Cursor, Claude Desktop, etc.

- VS Code: Add the above config to your `.vscode/mcp.json` or User Settings (JSON).
- Cursor: Add to `~/.cursor/mcp.json` or your project's `.cursor/mcp.json`.
- Claude Desktop: Add to `claude_desktop_config.json`.
- Windsurf: Add to your Windsurf MCP config file.
For more details, see your client’s documentation on MCP server configuration.
## Troubleshooting

- If you see errors about `h3` not found, make sure you have installed and authenticated `h3-cli` (see below).
- If you see authentication errors, double-check your API key in the CLI.
- For more help, see the official h3-cli setup guide.
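A quick way to check the first failure mode is to ask the OS whether `h3` is on your PATH, for example from Python:

```python
import shutil

# Prints the resolved path if h3 is installed, otherwise a hint.
print(shutil.which("h3") or "h3 not found on PATH; install and authenticate h3-cli")
```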
## Setup
### 1. Install h3-cli

- Get your API key from the Horizon3.ai Portal under User → Account Settings.
- The install script will prompt you to update your shell profile. Follow the instructions, then restart your Terminal.
### 2. Test your h3-cli install

You should see the h3-cli help text.
### 3. Verify your API connection

You should see a response like: