FogBugz MCP Server
A Model Context Protocol (MCP) server for interacting with FogBugz through large language models (LLMs) such as Claude.
Overview
This server allows LLMs to perform various operations on FogBugz including:
Creating new issues/cases with optional attachments
Updating existing cases (changing project, area, milestone, priority)
Assigning cases to specific users
Listing a user's open cases
Getting direct links to specific cases
Searching for cases by various criteria
The server implements the Model Context Protocol (MCP) specification, allowing it to be used by any MCP-compatible LLM client.
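Under the hood, each MCP tool call is translated into a request against the classic FogBugz API, which is invoked via `api.asp` with a `cmd` parameter and an API token. The helper below is an illustrative sketch, not the server's actual implementation; the function and parameter names are assumptions.

```typescript
// Sketch: mapping a "create case" tool call onto a FogBugz API URL.
// buildCreateCaseUrl is hypothetical; cmd/token/sTitle follow the
// classic FogBugz API conventions.
function buildCreateCaseUrl(
  baseUrl: string,
  token: string,
  title: string
): string {
  const params = new URLSearchParams({
    cmd: "new",     // FogBugz command to create a case
    token,          // API token for authentication
    sTitle: title,  // case title
  });
  return `${baseUrl}/api.asp?${params.toString()}`;
}

// Example: the URL a create-case tool invocation might produce
const url = buildCreateCaseUrl(
  "https://yourcompany.fogbugz.com",
  "your-api-key",
  "Login page throws 500"
);
```

The server's job is essentially this translation layer: it exposes FogBugz operations as MCP tools, and turns each tool invocation into the corresponding API request.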
Project Background
This project was initiated with the help of OpenAI's o3-mini-high model, which generated a comprehensive development plan (see DEVELOPMENT-PLAN.md in the repository). The plan outlined the architecture, tools, and implementation details for building a FogBugz MCP server in TypeScript.
The detailed specification served as a blueprint for the development team, demonstrating how AI can effectively assist in the early phases of project design and planning. This project is both an example of AI-assisted development and a tool that enhances AI capabilities through the MCP protocol.
Installation
# Install from npm
npm install -g fogbugz-mcp
# Or use directly with npx
npx fogbugz-mcp <fogbugz-url> <api-key>

Usage
Basic Usage
# Run with command line arguments
fogbugz-mcp https://yourcompany.fogbugz.com your-api-key
# Or use environment variables
export FOGBUGZ_URL=https://yourcompany.fogbugz.com
export TEST_FOGBUGZ_API_KEY=your-api-key
fogbugz-mcp

Development
# Clone the repository
git clone https://github.com/yourusername/fogbugz-mcp.git
cd fogbugz-mcp
# Install dependencies
npm install
# Create a .env file with your FogBugz credentials
echo "FOGBUGZ_URL=https://yourcompany.fogbugz.com" > .env
echo "TEST_FOGBUGZ_API_KEY=your-api-key" >> .env
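Inside the server, these credentials can be resolved from either the command line or the environment. The helper below is a sketch of that fallback logic; the variable names match the README, but the function itself is hypothetical.

```typescript
// Sketch: resolve the FogBugz URL and API key, preferring CLI
// arguments and falling back to the environment variables from .env.
// resolveConfig is illustrative, not the server's actual code.
function resolveConfig(
  argv: string[],
  env: Record<string, string | undefined>
): { url: string; apiKey: string } {
  const url = argv[0] ?? env.FOGBUGZ_URL;
  const apiKey = argv[1] ?? env.TEST_FOGBUGZ_API_KEY;
  if (!url || !apiKey) {
    throw new Error("FogBugz URL and API key are required");
  }
  return { url, apiKey };
}

// With no CLI arguments, values come from the environment
const cfg = resolveConfig([], {
  FOGBUGZ_URL: "https://yourcompany.fogbugz.com",
  TEST_FOGBUGZ_API_KEY: "your-api-key",
});
```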
# Run API explorer to test FogBugz API
npm run explore
# Run the development version of the server
npm run dev
# Run tests
npm test
# Build the project
npm run build

API Explorer
The project includes an API explorer tool for testing FogBugz API endpoints directly:
# Run all API tests
npm run explore
# Run a specific test (by index)
npm run explore 0  # Run the first test

MCP Tools
This server provides the following MCP tools for LLMs:
fogbugz_create_case - Create a new FogBugz case
fogbugz_update_case - Update an existing case's fields
fogbugz_assign_case - Assign a case to a specific user
fogbugz_list_my_cases - List cases assigned to a specific user
fogbugz_search_cases - Search for cases using a query string
fogbugz_get_case_link - Get a direct link to a specific case
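An MCP client sends tool calls as JSON objects whose shape is defined by each tool's input schema. The types below are an illustrative guess at what the arguments for two of these tools might look like; the real server defines its own schemas.

```typescript
// Hypothetical argument shapes for fogbugz_create_case and
// fogbugz_search_cases. Field names are assumptions for illustration.
interface CreateCaseArgs {
  title: string;       // case title (required)
  project?: string;    // target FogBugz project
  priority?: number;   // FogBugz priority level
  assignee?: string;   // user to assign the case to
}

interface SearchCasesArgs {
  query: string;       // FogBugz search syntax, e.g. 'status:"Active" login'
  max?: number;        // cap on the number of results
}

// Arguments an LLM might pass when invoking fogbugz_create_case
const createArgs: CreateCaseArgs = {
  title: "Login page throws 500 on submit",
  project: "Website",
  priority: 2,
};
```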
License
ISC