- Uses Docker to run the CyberChef-server backend that performs the actual data transformation operations
- Integrates with the upstream CyberChef-server project (hosted on GitHub) to provide data transformation capabilities
- Uses Pydantic models to validate and enforce argument and return types for CyberChef operations
- Implemented in Python, with Python-based tooling for CyberChef integration
Cyberchef MCP Server
Pydantic-powered MCP server exposing most CyberChef operations as structured tools.
- Sources for operation metadata: see extract_operations.js
- Operations catalog JSON: utils/js/operations.json
What is this?
This project wraps the CyberChef-server HTTP API in an MCP (Model Context Protocol) server so AI agents and MCP-aware apps can:
- Discover CyberChef operations with fuzzy search
- Inspect the exact argument schema for any operation
- Execute single- or multi-step CyberChef recipes against text or binary data
- Validate and repair recipes programmatically
Prerequisites
You need a running CyberChef-server (the upstream API that performs the actual transforms):
By default this MCP server talks to http://localhost:3000/; you can override with --api-url.
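One way to get one running, as a sketch: build the image from the upstream gchq/CyberChef-server repository (check that project's README for its current build instructions) and expose it on port 3000 to match the default --api-url.

```bash
# Illustrative: build and run the upstream CyberChef-server in Docker
git clone https://github.com/gchq/CyberChef-server.git
cd CyberChef-server
docker build -t cyberchef-server .

# Assumes the upstream server listens on port 3000 inside the container
docker run -d --rm -p 3000:3000 cyberchef-server
```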
Install (local)
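The exact commands depend on how this repository is packaged; a typical sketch, assuming a standard Python layout with a requirements file (adjust to the project's actual layout):

```bash
# Illustrative only: create a virtual environment and install dependencies
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
```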
Run (local)
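The entrypoint name below is illustrative; substitute the actual module or script from this repository:

```bash
# Hypothetical entrypoint; pass the upstream API URL plus the bind host/port
python server.py --api-url http://localhost:3000/ --host 127.0.0.1 --port 3002
```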
This will start the MCP server using the streamable-http transport on the host/port you provide.
CLI flags:
- --api-url: Base URL of the upstream CyberChef-server (default http://localhost:3000/)
- --host: Interface to bind for the MCP server (default 127.0.0.1)
- --port: Port for the MCP server (default 3002)
Run with Docker
Builds a lightweight image and starts the MCP server on port 3002.
From this directory:
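For example (the image tag is illustrative):

```bash
# Build the MCP server image from the local Dockerfile
docker build -t cyberchef-mcp .
```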
Then run it (pointing to your CyberChef-server):
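A sketch, assuming the container's entrypoint accepts the CLI flags above; host.docker.internal reaches a CyberChef-server running on the Docker host (on Linux, use the host's IP or a shared Docker network instead):

```bash
# Bind to 0.0.0.0 so the MCP port is reachable from outside the container
docker run --rm -p 3002:3002 cyberchef-mcp \
  --api-url http://host.docker.internal:3000/ \
  --host 0.0.0.0 --port 3002
```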
MCP Tools exposed
These are the primary tools exported by the MCP server. Argument and return schemas are enforced with Pydantic models.
- search_operations(query: string, limit?: number=10, include_args?: boolean=false) → { total, items[], truncated? } Find relevant CyberChef operations by name/description with fuzzy matching. Optionally include argument lists.
- get_operation_args(op: string, compact?: boolean=true) → { ok, op, args[], error? } Return the exact argument schema for one operation; with compact=true, enum values are slugified.
- bake_recipe(input_data: string, recipe: [{op, args?}]) → { ok, output?, type?, errors[], warnings[] } Execute a single recipe for one input string.
- batch_bake_recipe(batch_input_data: string[], recipe: [...]) → { results: BakeRecipeResponse[] } Execute the same recipe for many inputs.
- validate_recipe(recipe: [{op, args?}]) → { ok, errors[], suggestions[], normalized? } Validate step names/args and suggest fixes or missing args.
- help_bake_recipe() → Cheat sheet with schema notes and examples for composing recipes.
- cyberchef_probe(raw_input: string) → ProbeOut Quick heuristics to guess encodings and propose a minimal recipe.
- perform_magic_operation(input_data: string, depth?: int=3, intensive_mode?: bool=false, extensive_language_support?: bool=false, crib_str?: string="") → dict Invoke CyberChef Magic; may be slow/approximate.
Tip: Operation names and argument keys are case-sensitive and must match CyberChef exactly. Use search_operations/get_operation_args first.
Example (agent integration)
See example/test-cyberchef.py for a full integration with Microsoft Autogen MCP workbench. It spins up this server and drives it strictly via tools. A minimal flow:
- search_operations("base64") to shortlist "From Base64".
- bake_recipe with:
  - input_data: "SGVsbG8gV29ybGQh"
  - recipe: [{"op":"From Base64","args":{}}]
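The bundled example drives the server through Autogen's MCP workbench; as a smaller illustration, here is a hedged sketch of the same flow using the generic MCP Python SDK client over streamable HTTP. The tool names and argument keys come from the list above; the URL, port, and /mcp path assume the local defaults and common conventions, so adjust them to your setup.

```python
# Sketch: call this MCP server's tools with the MCP Python SDK (streamable HTTP).
# Assumes the server is running locally on the default host/port (127.0.0.1:3002).
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client


async def main() -> None:
    # The /mcp path is a common default for streamable HTTP servers; adjust if needed.
    async with streamablehttp_client("http://127.0.0.1:3002/mcp") as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Shortlist candidate operations, then decode the Base64 sample
            hits = await session.call_tool("search_operations", {"query": "base64"})
            print(hits.content)

            baked = await session.call_tool(
                "bake_recipe",
                {
                    "input_data": "SGVsbG8gV29ybGQh",
                    "recipe": [{"op": "From Base64", "args": {}}],
                },
            )
            print(baked.content)  # expect "Hello World!"


if __name__ == "__main__":
    asyncio.run(main())
```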
Cross platform builds
Only once: docker buildx create --use --name xbuilder
Further updates:
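Something along these lines (the image name and platform list are illustrative):

```bash
# Build and push a multi-arch image using the xbuilder instance created above
docker buildx build --platform linux/amd64,linux/arm64 \
  -t <registry>/cyberchef-mcp:latest --push .
```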
Troubleshooting
- Connection errors when baking recipes: ensure CyberChef-server is running and --api-url points to it.
- Unknown op / bad args: call get_operation_args(op) and confirm exact key names and allowed enum values.
- Large search results: lower limit or narrow your query; the server truncates responses to stay under size caps.