BugcrowdMCP: Server & Agents for the Bugcrowd API
A high-performance MCP (Model Context Protocol) server that provides secure, tool-based access to the Bugcrowd API, allowing for natural language interaction through various AI agent platforms.
Features
- Broad API Coverage: Provides tools for interacting with Organizations, Programs, Submissions, Assets, and more.
- Multi-Agent Support: Includes ready-to-use agents for OpenAI, Anthropic (Claude), Google (Gemini), and FastMCP.
- Extensible & Customizable: Easily switch between AI providers, configure different models, and integrate with platform-specific CLIs.
- Secure: Uses environment variables for API credentials and performs input validation.
- Dynamic Help: Includes a `help()` tool that provides real-time documentation for all available tools.
Quick Start
This guide will get you up and running with the default agent (`openai`).
1. Prerequisites
- Python 3.10+
- `uv` for package installation.
2. Installation
Clone the repository, create a virtual environment, and install dependencies.
3. Configuration
Export your Bugcrowd and OpenAI API credentials as environment variables.
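A minimal sketch of this step, assuming the server reads a `BUGCROWD_API_TOKEN` variable (the name is an assumption; check the project docs) and the standard `OPENAI_API_KEY`:

```shell
# Bugcrowd credential (variable name assumed; Bugcrowd API tokens
# take the form "username:password").
export BUGCROWD_API_TOKEN="your-bugcrowd-token"

# Standard OpenAI credential for the default agent.
export OPENAI_API_KEY="your-openai-key"
```

Add these to your shell profile if you want them to persist across sessions.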
4. Run the Agent
Start the interactive agent.
You can now interact with the Bugcrowd API using natural language.
Example Prompts:
- "Show me available bug bounty programs"
- "What are the 5 most recent vulnerability submissions?"
- "Use the help tool to see all available commands"
Advanced Usage
Switching Agents
The true power of this server lies in its flexibility. You can easily switch between supported AI platforms by setting the `AGENT_PLATFORM` environment variable.
- Supported platforms: `openai` (default), `claude`, `gemini`, `fastmcp`.

Remember to set the appropriate API key for the agent you choose.
Example: Running the Gemini Agent
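A sketch of the environment setup, assuming `GEMINI_API_KEY` is the variable the Gemini agent expects:

```shell
export AGENT_PLATFORM=gemini
export GEMINI_API_KEY="your-gemini-key"   # variable name assumed
# Then launch the agent as in the Quick Start.
```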
Example: Running the Claude Agent
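A sketch of the environment setup, using Anthropic's conventional `ANTHROPIC_API_KEY` variable (assumed here):

```shell
export AGENT_PLATFORM=claude
export ANTHROPIC_API_KEY="your-anthropic-key"   # variable name assumed
# Then launch the agent as in the Quick Start.
```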
Using the FastMCP Agent
The `fastmcp` agent is a versatile client that can use different LLM backends. Configure it by setting the `FASTMCP_PROVIDER` environment variable.
- Supported providers: `anthropic` (default), `google`, `openai`.
Example: Running FastMCP with the Google (Gemini) Backend
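A sketch combining the two variables, assuming the Google backend reads `GEMINI_API_KEY`:

```shell
export AGENT_PLATFORM=fastmcp
export FASTMCP_PROVIDER=google
export GEMINI_API_KEY="your-gemini-key"   # variable name assumed
# Then launch the agent as in the Quick Start.
```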
Customizing Agent Models
You can override the default models for each agent by setting environment variables:
- Claude: `CLAUDE_MAIN_MODEL`, `CLAUDE_SUMMARY_MODEL`
- Gemini: `GEMINI_MAIN_MODEL`, `GEMINI_SUMMARY_MODEL`
- OpenAI: `OPENAI_MODEL`
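For example (the model identifiers below are illustrative, not the project's defaults):

```shell
# Override the Claude agent's main and summary models.
export CLAUDE_MAIN_MODEL="claude-3-5-sonnet-latest"
export CLAUDE_SUMMARY_MODEL="claude-3-5-haiku-latest"
# Override the OpenAI agent's model.
export OPENAI_MODEL="gpt-4o-mini"
```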
Available Tools
The server provides the following tools. For detailed parameter information, run the agent and use the `help()` tool (e.g., `help('get_submission')`).
| Category | Tool | Description |
|---|---|---|
| Organizations | `get_organizations` | List all accessible organizations |
| | `get_organization` | Get specific organization details |
| Programs | `get_programs` | List bug bounty programs |
| | `get_program` | Get specific program details |
| Submissions | `get_submissions` | List vulnerability submissions |
| | `get_submission` | Get specific submission details |
| | `create_submission` | Create a new vulnerability report |
| | `update_submission` | Update an existing submission |
| | `create_comment` | Add a comment to a submission |
| Teams | `get_teams` | List all teams in an organization |
| | `create_team` | Create a new team |
| | `delete_team` | Delete a team |
| Rewards | `get_monetary_rewards` | List bounty rewards |
| | `create_monetary_reward` | Create a new monetary reward |
| | `update_monetary_reward` | Update an existing monetary reward |
| Users | `get_users` | List users in an organization |
| | `get_user` | Get specific user details |
| Health | `server_health` | Check server and API connectivity |
| Help | `help` | Get detailed help for any tool |
Direct Integration
For integration with platform-specific CLIs (bypassing the included agent handlers), use the provided configuration templates.
- For OpenAI (`codex`): Use `docs/config.toml`.
- For Gemini, Claude, etc.: Use `docs/config.json`.
Instructions:
1. Copy the appropriate template file to your tool's configuration directory (e.g., `~/.codex/config.toml`).
2. In the copied file, update the `cwd` variable to the absolute path of the `Bugcrowd_MCP_Server` project directory.
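As an illustration, a Codex `config.toml` entry might look like the following sketch. Only the `cwd` requirement comes from the instructions above; the server name, launch command, and script name are assumptions, so copy the values from `docs/config.toml` rather than from here:

```toml
# Hypothetical MCP server entry for the Codex CLI.
[mcp_servers.bugcrowd]
command = "uv"                                 # assumed launch command
args = ["run", "python", "server.py"]          # entry-point name is an assumption
cwd = "/absolute/path/to/Bugcrowd_MCP_Server"  # must point at the project directory
```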
Documentation
- API Reference: A static reference for all tool and endpoint details.
- Architecture Diagram: An overview of the system architecture.
- Bugcrowd REST API: The official API documentation that this server is built upon.
For more detailed information on MCP server configuration, refer to the official documentation for your platform:
- OpenAI: Codex MCP Server Configuration
- Google Gemini: Configure MCP Servers
- Anthropic Claude: MCP for Claude
- FastMCP: JSON Configuration and Running a Server