Zeek-MCP

local-only server

The server can only run on the client's local machine because it depends on local resources.

A Model Context Protocol server that integrates Zeek network analysis capabilities with LLM chatbots, enabling them to analyze PCAP files and parse network logs through natural-language interaction.
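
As a rough illustration of the kind of workflow such a server can expose, here is a minimal Python sketch, not the repository's actual code: it assumes the `zeek` binary is installed and on PATH, and the function names, the `zeek-logs` output directory, and the `capture.pcap` input file are placeholders. It runs Zeek over a PCAP and parses the resulting tab-separated logs into records that a chatbot could reason over.

    import json
    import subprocess
    from pathlib import Path

    def run_zeek(pcap_path: str, out_dir: str = "zeek-logs") -> list[Path]:
        """Run Zeek against a PCAP and return the generated .log files."""
        out = Path(out_dir)
        out.mkdir(parents=True, exist_ok=True)
        # `zeek -r <pcap>` replays the capture and writes its logs into the
        # working directory, so run it inside the output directory.
        subprocess.run(["zeek", "-r", str(Path(pcap_path).resolve())],
                       cwd=out, check=True)
        return sorted(out.glob("*.log"))

    def parse_zeek_log(log_path: Path) -> list[dict]:
        """Parse a tab-separated Zeek log into one dict per record."""
        fields: list[str] = []
        records: list[dict] = []
        for line in log_path.read_text().splitlines():
            if line.startswith("#fields"):
                fields = line.split("\t")[1:]
            elif fields and not line.startswith("#"):
                records.append(dict(zip(fields, line.split("\t"))))
        return records

    if __name__ == "__main__":
        for log in run_zeek("capture.pcap"):  # hypothetical input capture
            print(log.name, json.dumps(parse_zeek_log(log)[:2], indent=2))

The parsed records (for example, the per-connection entries in conn.log) are the kind of structured data an LLM would draw on when answering natural-language questions about a capture.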

  1. Table of Contents
     1. Prerequisites
     2. Installation
        1. Clone the repository
        2. Install dependencies
     3. Usage
        3. Run the MCP server
        4. Use the MCP tools
     4. Examples
     5. License

Related MCP Servers

• A Model Context Protocol server that provides LLM Agents with a comprehensive toolset for IP geolocation, network diagnostics, system monitoring, cryptographic operations, and QR code generation. (Security: A, License: A, Quality: A; 16; TypeScript)
• A server implementing Model Context Protocol that enables LLMs to interact with the ZenML platform, providing access to pipeline data, stack information, and the ability to trigger new pipeline runs. (Security: -, License: A, Quality: -; Python)
• A Model Context Protocol server that provides LLMs with real-time network traffic analysis capabilities, enabling tasks like threat hunting, network diagnostics, and anomaly detection through Wireshark's tshark. (Security: A, License: A, Quality: A; 7; JavaScript)
• A Model Context Protocol server that allows AI assistants to execute and manage JMeter performance tests through natural language commands. (Security: A, License: F, Quality: A; 6; Python)

View all related MCP servers

MCP directory API

We provide all the information about MCP servers via our MCP API.

    curl -X GET 'https://glama.ai/api/mcp/v1/servers/Gabbo01/Zeek-MCP'
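
The same lookup can be made from Python's standard library; the sketch below only assumes the endpoint returns a JSON document and makes no assumptions about which fields it contains.

    import json
    import urllib.request

    # Same endpoint as the curl example above.
    URL = "https://glama.ai/api/mcp/v1/servers/Gabbo01/Zeek-MCP"

    with urllib.request.urlopen(URL) as resp:
        server_info = json.load(resp)

    # Pretty-print whatever metadata the directory returns for this server.
    print(json.dumps(server_info, indent=2))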

If you have feedback or need assistance with the MCP directory API, please join our Discord server.