
BugcrowdMCP: Server & Agents for the Bugcrowd API

A high-performance MCP (Model Context Protocol) server that provides secure, tool-based access to the Bugcrowd API, allowing for natural language interaction through various AI agent platforms.

Features

  • Broad API Coverage: Provides tools for interacting with Organizations, Programs, Submissions, Assets, and more.
  • Multi-Agent Support: Includes ready-to-use agents for OpenAI, Anthropic (Claude), Google (Gemini), and FastMCP.
  • Extensible & Customizable: Easily switch between AI providers, configure different models, and integrate with platform-specific CLIs.
  • Secure: Uses environment variables for API credentials and performs input validation.
  • Dynamic Help: Includes a help() tool that provides real-time documentation for all available tools.

Quick Start

This guide will get you up and running with the default agent (openai).

1. Prerequisites

  • Python 3.10+
  • uv for package installation.

2. Installation

Clone the repository, create a virtual environment, and install dependencies.

```shell
git clone https://github.com/unstrike/Bugcrowd_MCP_Server.git
cd Bugcrowd_MCP_Server

# Create and activate virtual environment
uv venv
source .venv/bin/activate

# Install dependencies
uv sync
```

3. Configuration

Export your Bugcrowd and OpenAI API credentials as environment variables.

```shell
export BUGCROWD_API_USERNAME="your-username"
export BUGCROWD_API_PASSWORD="your-password"
export OPENAI_API_KEY="your-openai-api-key"
```
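A quick way to catch missing credentials before launching is to check the environment first. This sketch uses the variable names from the step above; the helper itself is hypothetical and not part of the project:

```python
import os

# Variables required by the Quick Start (OpenAI is the default agent).
REQUIRED = ["BUGCROWD_API_USERNAME", "BUGCROWD_API_PASSWORD", "OPENAI_API_KEY"]

def missing_vars(env=os.environ):
    """Return the names of required variables that are unset or empty."""
    return [name for name in REQUIRED if not env.get(name)]
```

Running `missing_vars()` before starting the agent surfaces configuration problems immediately instead of at the first API call.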

4. Run the Agent

Start the interactive agent.

```shell
uv run python -m bugcrowd_agents.agent_orchestrator
```

You can now interact with the Bugcrowd API using natural language.

Example Prompts:

  • "Show me available bug bounty programs"
  • "What are the 5 most recent vulnerability submissions?"
  • "Use the help tool to see all available commands"

Advanced Usage

Switching Agents

You can switch between supported AI platforms by setting the AGENT_PLATFORM environment variable.

  • Supported platforms: openai (default), claude, gemini, fastmcp.

Remember to set the appropriate API key for the agent you choose.

Example: Running the Gemini Agent

```shell
# 1. Set the API key for Google
export GOOGLE_AI_API_KEY="your-gemini-api-key"

# 2. Run the orchestrator with the AGENT_PLATFORM variable
AGENT_PLATFORM=gemini uv run python -m bugcrowd_agents.agent_orchestrator
```

Example: Running the Claude Agent

```shell
export ANTHROPIC_API_KEY="your-claude-api-key"
AGENT_PLATFORM=claude uv run python -m bugcrowd_agents.agent_orchestrator
```

Using the FastMCP Agent

The fastmcp agent is a versatile client that can use different LLM backends. Configure it by setting the FASTMCP_PROVIDER environment variable.

  • Supported providers: anthropic (default), google, openai.

Example: Running FastMCP with the Google (Gemini) Backend

```shell
# 1. Set the API key for the desired backend
export GOOGLE_AI_API_KEY="your-gemini-api-key"

# 2. Set the platform and provider, then run
AGENT_PLATFORM=fastmcp FASTMCP_PROVIDER=google uv run python -m bugcrowd_agents.agent_orchestrator
```

Customizing Agent Models

You can override the default models for each agent by setting environment variables:

  • Claude: CLAUDE_MAIN_MODEL, CLAUDE_SUMMARY_MODEL
  • Gemini: GEMINI_MAIN_MODEL, GEMINI_SUMMARY_MODEL
  • OpenAI: OPENAI_MODEL
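Override resolution presumably follows the usual env-with-fallback pattern. A sketch with placeholder defaults (the project's real default model names are not shown here):

```python
import os

# Placeholder defaults; the project's actual default models may differ.
DEFAULT_MODELS = {
    "CLAUDE_MAIN_MODEL": "claude-main-default",
    "CLAUDE_SUMMARY_MODEL": "claude-summary-default",
    "GEMINI_MAIN_MODEL": "gemini-main-default",
    "GEMINI_SUMMARY_MODEL": "gemini-summary-default",
    "OPENAI_MODEL": "openai-default",
}

def model_for(var: str) -> str:
    """Return the environment override for `var`, or the built-in default."""
    return os.environ.get(var, DEFAULT_MODELS[var])
```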

Available Tools

The server provides the following tools. For detailed parameter information, run the agent and use the help() tool (e.g., help('get_submission')).

| Category | Tool | Description |
| --- | --- | --- |
| Organizations | `get_organizations` | List all accessible organizations |
| | `get_organization` | Get specific organization details |
| Programs | `get_programs` | List bug bounty programs |
| | `get_program` | Get specific program details |
| Submissions | `get_submissions` | List vulnerability submissions |
| | `get_submission` | Get specific submission details |
| | `create_submission` | Create a new vulnerability report |
| | `update_submission` | Update an existing submission |
| | `create_comment` | Add a comment to a submission |
| Teams | `get_teams` | List all teams in an organization |
| | `create_team` | Create a new team |
| | `delete_team` | Delete a team |
| Rewards | `get_monetary_rewards` | List bounty rewards |
| | `create_monetary_reward` | Create a new monetary reward |
| | `update_monetary_reward` | Update an existing monetary reward |
| Users | `get_users` | List users in an organization |
| | `get_user` | Get specific user details |
| Health | `server_health` | Check server and API connectivity |
| Help | `help` | Get detailed help for any tool |

Direct Integration

For integration with platform-specific CLIs (bypassing the included agent handlers), use the provided configuration templates.

  • For OpenAI (codex): Use docs/config.toml.
  • For Gemini, Claude, etc.: Use docs/config.json.

Instructions:

  1. Copy the appropriate template file to your tool's configuration directory (e.g., ~/.codex/config.toml).
  2. In the copied file, update the cwd variable to the absolute path of the Bugcrowd_MCP_Server project directory.

Documentation

For more detailed information on MCP server configuration, refer to the official documentation for your platform.

