Garak-MCP

by EdenYavin

MCP Server For Garak LLM Vulnerability Scanner

A lightweight MCP (Model Context Protocol) server for Garak.

Example:

https://github.com/user-attachments/assets/f6095d26-2b79-4ef7-a889-fd6be27bbbda


Tools Provided

Overview

Name               Description
----               -----------
list_model_types   List all available model types (ollama, openai, huggingface, ggml)
list_models        List all available models for a given model type
list_garak_probes  List all available Garak attacks/probes
get_report         Get the report of the last run
run_attack         Run an attack with a given model and probe

Detailed Description

  • list_model_types

    • List all available model types that can be used for attacks

    • Returns a list of supported model types (ollama, openai, huggingface, ggml)

  • list_models

    • List all available models for a given model type

    • Input parameters:

      • model_type (string, required): The type of model to list (ollama, openai, huggingface, ggml)

    • Returns a list of available models for the specified type

  • list_garak_probes

    • List all available Garak attacks/probes

    • Returns a list of available probes/attacks that can be run

  • get_report

    • Get the report of the last run

    • Returns the path to the report file

  • run_attack

    • Run an attack with the given model and probe

    • Input parameters:

      • model_type (string, required): The type of model to use

      • model_name (string, required): The name of the model to use

      • probe_name (string, required): The name of the attack/probe to use

    • Returns a list of vulnerabilities found
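Under the hood, these parameters correspond to a Garak command-line run. As a rough sketch (the flag names follow garak's CLI, but the exact command this server assembles is an assumption, and the model and probe names below are illustrative):

```python
# Hypothetical sketch: how run_attack's inputs could map onto a garak CLI
# invocation. The flags mirror garak's documented CLI; the model and probe
# values are made-up examples, not output from this server.

def build_garak_command(model_type: str, model_name: str, probe_name: str) -> list:
    """Assemble an argument list for a garak run from run_attack's inputs."""
    return [
        "garak",
        "--model_type", model_type,   # e.g. "ollama"
        "--model_name", model_name,   # e.g. "llama3" (illustrative)
        "--probes", probe_name,       # e.g. "dan" (illustrative)
    ]

print(" ".join(build_garak_command("ollama", "llama3", "dan")))
# garak --model_type ollama --model_name llama3 --probes dan
```

Garak then writes its findings to a report file, which is what get_report points you at after the run.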


Prerequisites

  1. Python 3.11 or higher: This project requires Python 3.11 or newer.

    # Check your Python version
    python --version
  2. Install uv: A fast Python package installer and resolver.

    pip install uv

    Or use Homebrew:

    brew install uv
  3. Optional: Ollama: If you want to run attacks on Ollama models, make sure the Ollama server is running:

ollama serve

Installation

  1. Clone this repository:

git clone https://github.com/BIGdeadLock/Garak-MCP.git
  2. Configure your MCP host (Claude Desktop, Cursor, etc.):

{
  "mcpServers": {
    "garak-mcp": {
      "command": "uv",
      "args": ["--directory", "path-to/Garak-MCP", "run", "garak-server"],
      "env": {}
    }
  }
}
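Once configured, the host invokes the server's tools over MCP's JSON-RPC protocol. For illustration, a tools/call request for run_attack looks roughly like this (the argument values are made up; the request shape follows the MCP specification):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "run_attack",
    "arguments": {
      "model_type": "ollama",
      "model_name": "llama3",
      "probe_name": "dan"
    }
  }
}
```

In practice the host builds this request for you; you just pick the tool and arguments in the chat interface.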

Tested on:

  • Cursor

  • Claude Desktop


Future Steps

  • Add support for Smithery AI: Docker and config

  • Improve Reporting

  • Test and validate OpenAI models (GPT-3.5, GPT-4)

  • Test and validate HuggingFace models

  • Test and validate local GGML models

