MCP Server For Garak LLM Vulnerability Scanner

A lightweight MCP (Model Context Protocol) server for Garak.

Example:

https://github.com/user-attachments/assets/f6095d26-2b79-4ef7-a889-fd6be27bbbda


Tools Provided

Overview

Name               Description
list_model_types   List all available model types (ollama, openai, huggingface, ggml)
list_models        List all available models for a given model type
list_garak_probes  List all available Garak attacks/probes
get_report         Get the report of the last run
run_attack         Run an attack with a given model and probe

Detailed Description

  • list_model_types

    • List all available model types that can be used for attacks

    • Returns a list of supported model types (ollama, openai, huggingface, ggml)

  • list_models

    • List all available models for a given model type

    • Input parameters:

      • model_type (string, required): The type of model to list (ollama, openai, huggingface, ggml)

    • Returns a list of available models for the specified type

  • list_garak_probes

    • List all available Garak attacks/probes

    • Returns a list of available probes/attacks that can be run

  • get_report

    • Get the report of the last run

    • Returns the path to the report file

  • run_attack

    • Run an attack with the given model and probe

    • Input parameters:

      • model_type (string, required): The type of model to use

      • model_name (string, required): The name of the model to use

      • probe_name (string, required): The name of the attack/probe to use

    • Returns a list of vulnerabilities found (see the example call after this list)
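
A minimal sketch of a run_attack call as it travels over MCP's JSON-RPC tools/call method. The argument values here are illustrative, not tied to any particular setup; an MCP-compatible host (Claude Desktop, Cursor, etc.) constructs this request for you:

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "run_attack",
    "arguments": {
      "model_type": "ollama",
      "model_name": "llama2",
      "probe_name": "encoding"
    }
  }
}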



Prerequisites

  1. Python 3.11 or newer: Verify your interpreter version before installing.

    # Check your Python version
    python --version
  2. Install uv: A fast Python package installer and resolver.

    pip install uv

    Or use Homebrew:

    brew install uv
  3. Optional: Ollama: If you want to run attacks on Ollama models, make sure the Ollama server is running:

ollama serve
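
To confirm the server is reachable, you can query Ollama's local HTTP API (this assumes Ollama's default port, 11434):

curl http://localhost:11434/api/tags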

Installation

  1. Clone this repository:

git clone https://github.com/BIGdeadLock/Garak-MCP.git
  2. Configure your MCP host (Claude Desktop, Cursor, etc.):

{ "mcpServers": { "garak-mcp": { "command": "uv", "args": ["--directory", "path-to/Garak-MCP", "run", "garak-server"], "env": {} } } }

Tested on:

  • Cursor

  • Claude Desktop


Running Vulnerability Scans

You can run Garak vulnerability scans directly using the included CLI tool.

Prerequisites for Scanning

  1. Ollama must be running:

    ollama serve
  2. Pull a model to scan:

    ollama pull llama2

Using the CLI Scanner

After installation, you can use the garak-scan command:

# List available Ollama models
uv run garak-scan --list-models

# Scan a specific model with all probes
uv run garak-scan --model llama2

# Scan with specific probes
uv run garak-scan --model llama2 --probes encoding

# Scan with custom output directory
uv run garak-scan --model llama2 --output-dir ./my_scans

# Run multiple parallel attempts
uv run garak-scan --model llama2 --parallel-attempts 4
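
Under the hood, garak-scan drives the Garak CLI. If you prefer to bypass the wrapper, a roughly equivalent direct invocation looks like this (flag names follow Garak's documented CLI; check garak --help on your installed version):

uv run garak --model_type ollama --model_name llama2 --probes encoding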

Scan Results

Scan results are saved as JSONL files in the output/ directory (or the directory given with --output-dir). Each scan creates a timestamped report file:

output/scan_llama2_20250125_143022.report.jsonl
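
Each line of the report is a standalone JSON object, so ordinary JSONL tooling applies. For example, assuming Garak's usual report schema in which every record carries an entry_type field, you could filter out just the evaluation records with jq:

jq -c 'select(.entry_type == "eval")' output/scan_llama2_20250125_143022.report.jsonl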

GitHub Actions Integration

This repository includes a GitHub Actions workflow that automatically runs vulnerability scans:

  • Triggers: Push to main/master, pull requests, weekly schedule (Mondays at 2am UTC)

  • Manual runs: Go to Actions → Garak LLM Vulnerability Scan → Run workflow

  • Custom options: Specify model and probes when running manually

  • Results: Scan results are uploaded as workflow artifacts

To enable automated scanning:

  1. Ensure the workflow file exists at .github/workflows/garak-scan.yml

  2. Push to your repository

  3. Check the Actions tab to view scan results
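
For reference, here is a minimal sketch of what such a workflow might contain. The triggers mirror the description above; the step contents, action versions, and Ollama setup are illustrative assumptions, not a copy of the repository's actual file:

name: Garak LLM Vulnerability Scan

on:
  push:
    branches: [main, master]
  pull_request:
  schedule:
    - cron: "0 2 * * 1"  # Mondays at 2am UTC
  workflow_dispatch:
    inputs:
      model:
        description: "Model to scan"
        default: "llama2"
      probes:
        description: "Probes to run (empty = all)"
        default: ""

jobs:
  scan:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      # Illustrative Ollama setup; the real workflow may differ.
      - name: Install and start Ollama
        run: |
          curl -fsSL https://ollama.com/install.sh | sh
          ollama serve &
          sleep 5
          ollama pull llama2

      - name: Install uv
        run: pip install uv

      - name: Run scan
        run: uv run garak-scan --model "${{ inputs.model || 'llama2' }}"

      - name: Upload scan results
        uses: actions/upload-artifact@v4
        with:
          name: garak-scan-results
          path: output/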


Future Steps

  • Add support for Smithery AI: Docker and config

  • Improve Reporting

  • Test and validate OpenAI models (GPT-3.5, GPT-4)

  • Test and validate HuggingFace models

  • Test and validate local GGML models
