Kibana MCP Server

A Model Context Protocol (MCP) server that enables AI assistants to interact with Kibana dashboards, visualizations, and Elasticsearch data through a standardized interface.

Features

  • Resources: Read-only access to Kibana dashboards, visualizations, data views, and saved searches

  • Tools: Execute searches, export dashboards, and query Elasticsearch data

  • Dual Transport: Supports both stdio (local) and HTTP/SSE (containerized) transports

  • Docker Support: Production-ready containerization with Docker and Podman

  • Authentication: API key and username/password authentication

  • Type-Safe: Built with TypeScript for enhanced reliability

Architecture

┌─────────────────┐
│   AI Assistant  │
│  (Claude, etc.) │
└────────┬────────┘
         │ MCP Protocol
         │
┌────────▼────────┐      ┌─────────────┐
│   MCP Server    │─────▶│   Kibana    │
│  (This Server)  │      │   REST API  │
└─────────────────┘      └──────┬──────┘
                                │
                         ┌──────▼──────┐
                         │Elasticsearch│
                         └─────────────┘

Quick Start

Docker Compose is the preferred way to run this server. Credentials are passed via shell environment variables so nothing is hard-coded.

  1. Export your Kibana credentials (API key or username/password):

    # Option A: API key
    export KIBANA_API_KEY=your_api_key_here
    
    # Option B: Username/password
    export KIBANA_USERNAME=your_username
    export KIBANA_PASSWORD=your_password
  2. Build and start:

    docker compose up --build -d
  3. Verify it's running:

    curl http://localhost:3000/health
  4. View logs / stop:

    docker compose logs -f
    docker compose down

KIBANA_URL defaults to https://localhost:5601 and can be overridden:

export KIBANA_URL=https://your-kibana-instance.com

Local Development

  1. Install dependencies:

    npm install
  2. Configure environment:

    cp .env.example .env
    # Edit .env with your Kibana credentials
  3. Run in development mode:

    # Stdio mode (for Claude Desktop)
    npm run dev
    
    # HTTP mode (for testing)
    npm run dev:http
  4. Build and run production:

    npm run build
    npm start        # stdio mode
    npm run start:http   # HTTP mode

Configuration

Environment Variables

Create a .env file based on .env.example:

# Kibana Configuration (required)
KIBANA_URL=https://your-kibana-instance.com
KIBANA_API_KEY=your_api_key_here

# Alternative: Username/Password Authentication
# KIBANA_USERNAME=your_username
# KIBANA_PASSWORD=your_password

# Server Configuration
MCP_TRANSPORT=http           # or stdio
HTTP_PORT=3000               # Port for HTTP server
LOG_LEVEL=info               # debug, info, warn, error
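To see how these variables fit together, here is a minimal startup-validation sketch (a hypothetical `loadConfig` helper, not the server's actual code, which loads `.env` via dotenv in its entry points):

```typescript
// Hypothetical sketch: validate environment variables at startup and apply
// the documented defaults (stdio transport, port 3000, info log level).
interface ServerConfig {
  kibanaUrl: string;
  transport: "stdio" | "http";
  httpPort: number;
  logLevel: "debug" | "info" | "warn" | "error";
}

function loadConfig(env: Record<string, string | undefined>): ServerConfig {
  if (!env.KIBANA_URL) throw new Error("KIBANA_URL is required");
  return {
    kibanaUrl: env.KIBANA_URL,
    transport: env.MCP_TRANSPORT === "http" ? "http" : "stdio",
    httpPort: Number(env.HTTP_PORT ?? 3000),
    logLevel: (env.LOG_LEVEL ?? "info") as ServerConfig["logLevel"],
  };
}
```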

Authentication Methods

API Key (Recommended):

KIBANA_URL=https://kibana.example.com
KIBANA_API_KEY=your_base64_encoded_api_key

Username/Password:

KIBANA_URL=https://kibana.example.com
KIBANA_USERNAME=admin
KIBANA_PASSWORD=your_password
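Both methods ultimately become an HTTP Authorization header on requests to Kibana: `ApiKey <key>` for API keys, or standard Basic auth. A minimal sketch of that mapping (hypothetical helper; the server's real logic lives in src/kibana/auth.ts and may differ):

```typescript
// Hypothetical helper: build the Authorization header value from whichever
// credential style is configured. API key takes precedence when both are set.
function buildAuthHeader(creds: {
  apiKey?: string;
  username?: string;
  password?: string;
}): string {
  if (creds.apiKey) {
    return `ApiKey ${creds.apiKey}`;
  }
  if (creds.username && creds.password) {
    // Basic auth: base64("username:password")
    const encoded = Buffer.from(`${creds.username}:${creds.password}`).toString("base64");
    return `Basic ${encoded}`;
  }
  throw new Error("Provide KIBANA_API_KEY or KIBANA_USERNAME/KIBANA_PASSWORD");
}
```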

MCP Capabilities

Resources (Read-Only Data)

  • kibana://dashboards - List all dashboards

  • kibana://dashboard/{id} - Get specific dashboard

  • kibana://visualizations - List all visualizations

  • kibana://data-views - List all data views

  • kibana://saved-searches - List saved searches
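The URIs above follow a simple `kibana://<type>[/<id>]` shape. A sketch of how a client or server could parse them (hypothetical; the actual resource handling is in src/resources/index.ts):

```typescript
// Hypothetical parser for kibana:// resource URIs:
//   kibana://dashboards          -> { type: "dashboards" }
//   kibana://dashboard/abc-123   -> { type: "dashboard", id: "abc-123" }
interface ParsedResource {
  type: string;
  id?: string;
}

function parseResourceUri(uri: string): ParsedResource {
  const match = /^kibana:\/\/([a-z-]+)(?:\/(.+))?$/.exec(uri);
  if (!match) throw new Error(`Not a kibana:// resource URI: ${uri}`);
  const [, type, id] = match;
  return id ? { type, id } : { type };
}
```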

Tools (Executable Functions)

list_dashboards

List dashboards with optional search and pagination.

{
  "search": "security",
  "page": 1,
  "perPage": 20
}

get_dashboard

Get detailed information about a specific dashboard.

{
  "id": "dashboard-id-here"
}

export_dashboard

Export dashboard with all dependencies.

{
  "id": "dashboard-id-here",
  "includeReferences": true
}

search_logs

Query Elasticsearch data through Kibana.

{
  "index": "logs-*",
  "query": {
    "match": {
      "message": "error"
    }
  },
  "size": 10,
  "sort": [{"@timestamp": "desc"}]
}
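Only `index` is required; the other fields can be defaulted before the query reaches Elasticsearch. A sketch of that normalization (hypothetical helper with an assumed cap of 100 results, not the server's actual code):

```typescript
// Hypothetical normalization of search_logs input: default to match_all,
// default size 10, and cap size at 100 to keep responses bounded.
interface SearchLogsInput {
  index: string;
  query?: object;
  size?: number;
  sort?: object[];
}

function buildSearchBody(input: SearchLogsInput) {
  return {
    query: input.query ?? { match_all: {} },
    size: Math.min(input.size ?? 10, 100),
    ...(input.sort ? { sort: input.sort } : {}),
  };
}
```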

Other Tools

  • list_visualizations - List visualizations

  • get_visualization - Get visualization details

  • list_data_views - List available data views

Connecting to AI Assistants

This server supports two transports. They share the same core server logic (src/server.ts) but differ in how the client communicates with it:

|              | HTTP/SSE (src/http-server.ts) | stdio (src/index.ts) |
| ------------ | ----------------------------- | -------------------- |
| How it works | Long-running HTTP server. Clients connect via Server-Sent Events (SSE) and send JSON-RPC over POST requests. | Client spawns the server as a child process. JSON-RPC messages flow over stdin/stdout. |
| When to use  | Remote/containerized deployments, Claude Code, any network-based MCP client | Local-only usage, Claude Desktop app |
| Run with     | docker compose up -d or npm run dev:http | npm run dev or npm start |
| Entry point  | src/http-server.ts | src/index.ts |

Claude Code (HTTP/SSE transport)

Claude Code connects to MCP servers over SSE. Start the HTTP server first, then register it with Claude Code using either option below.

Option 1: User config (CLI)

# Start the server
docker compose up -d

# Add as a user-scoped MCP server
claude mcp add --scope user --transport sse kibana http://localhost:3000/sse

Option 2: Project config (.mcp.json)

Create .mcp.json in your project root (shared with the team via version control):

{
  "mcpServers": {
    "kibana": {
      "type": "sse",
      "url": "http://localhost:3000/sse"
    }
  }
}

Verification: In Claude Code, type /mcp to see available servers. You should see "kibana" listed with its resources and tools.

Claude Desktop (stdio transport)

For the Claude Desktop app, use stdio transport.

Add to your Claude Desktop configuration (~/Library/Application Support/Claude/claude_desktop_config.json on macOS):

{
  "mcpServers": {
    "kibana": {
      "command": "node",
      "args": ["/path/to/jb-kibana-mcp/dist/index.js"],
      "env": {
        "KIBANA_URL": "https://your-kibana.com",
        "KIBANA_API_KEY": "your-api-key"
      }
    }
  }
}

Generic MCP Clients (SSE)

Any MCP client that supports SSE transport can connect to:

http://localhost:3000/sse

The SSE handshake flow:

  1. Client opens GET /sse — receives an endpoint event with a session-specific message URL

  2. Client sends JSON-RPC messages via POST /message?sessionId=<id>

  3. Server streams responses back over the SSE connection

Additional endpoints:

  • GET /health — Health check (returns JSON status)

  • GET /info — Server metadata and capabilities

Docker Deployment

Build Image

docker build -t kibana-mcp:latest .

Run Container

docker run -d \
  --name kibana-mcp \
  -p 3000:3000 \
  -e KIBANA_URL=https://your-kibana.com \
  -e KIBANA_API_KEY=your-api-key \
  kibana-mcp:latest

Docker Compose

# Start
docker compose up -d

# View logs
docker compose logs -f

# Stop
docker compose down

Development

Project Structure

jb-kibana-mcp/
├── src/
│   ├── index.ts              # Stdio transport entry point (Claude Desktop)
│   ├── http-server.ts        # HTTP/SSE transport entry point (Claude Code, Docker)
│   ├── server.ts             # Core MCP server logic
│   ├── kibana/
│   │   ├── client.ts         # Kibana API client
│   │   ├── types.ts          # TypeScript types
│   │   └── auth.ts           # Authentication
│   ├── resources/
│   │   └── index.ts          # MCP resources
│   └── tools/
│       └── index.ts          # MCP tools
├── Dockerfile
├── docker-compose.yml
└── package.json

Adding New Tools

  1. Define the tool schema in src/tools/index.ts

  2. Implement the handler in the tools/call request handler

  3. Add corresponding Kibana client method if needed
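The three steps above can be sketched as a small registry plus a dispatcher (hypothetical shape with an invented "ping" tool for illustration; the real registry and handlers live in src/tools/index.ts):

```typescript
// Hypothetical tool registry: declare a schema, register a handler,
// and dispatch tools/call requests by name.
interface ToolDef {
  name: string;
  description: string;
  inputSchema: object;
  handler: (args: Record<string, unknown>) => Promise<unknown>;
}

const tools = new Map<string, ToolDef>();

function registerTool(def: ToolDef): void {
  tools.set(def.name, def);
}

function callTool(name: string, args: Record<string, unknown>): Promise<unknown> {
  const tool = tools.get(name);
  if (!tool) throw new Error(`Unknown tool: ${name}`);
  return tool.handler(args);
}

// Example registration (hypothetical "ping" tool):
registerTool({
  name: "ping",
  description: "Health-check tool",
  inputSchema: { type: "object", properties: {} },
  handler: async () => ({ ok: true }),
});
```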

Testing

Unit tests (no external dependencies, mocked Kibana):

npm test                       # run once
npm run test:watch             # watch mode
npm run test:coverage          # with coverage report

Integration tests (require a live Kibana instance):

Integration tests start an in-process MCP server, connect over SSE, and exercise every tool and resource against real Kibana. They are kept separate from unit tests so npm test stays fast and offline.

  1. Set environment variables — the tests load .env via dotenv, so values already in .env (like KIBANA_URL) are picked up automatically. Shell environment variables take precedence. You need:

    # Already in .env:
    KIBANA_URL=https://your-kibana-instance.com
    
    # Set in your shell (or add to .env):
    export KIBANA_API_KEY=your-api-key
    # — or —
    export KIBANA_USERNAME=you@example.com
    export KIBANA_PASSWORD=your-password
  2. Run:

    npm run test:integration

    If KIBANA_URL or credentials are missing, the tests skip automatically (no failures).
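That skip condition boils down to: a Kibana URL plus either credential style must be present. A sketch of the guard (hypothetical helper; the real check lives in the integration test setup):

```typescript
// Hypothetical guard: integration tests run only when KIBANA_URL is set
// and either an API key or a full username/password pair is available.
function shouldRunIntegrationTests(env: Record<string, string | undefined>): boolean {
  const hasUrl = Boolean(env.KIBANA_URL);
  const hasApiKey = Boolean(env.KIBANA_API_KEY);
  const hasBasicAuth = Boolean(env.KIBANA_USERNAME && env.KIBANA_PASSWORD);
  return hasUrl && (hasApiKey || hasBasicAuth);
}
```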

What the integration tests cover:

| Area                | Tests                                             |
| ------------------- | ------------------------------------------------- |
| MCP handshake       | SSE connect, initialize, initialized notification |
| tools/list          | All 7 tools registered                            |
| resources/list      | All 4 resources registered                        |
| list_dashboards     | Pagination, search filtering                      |
| get_dashboard       | Fetch by ID                                       |
| export_dashboard    | NDJSON export with references                     |
| list_visualizations | Listing                                           |
| get_visualization   | Fetch by ID                                       |
| list_data_views     | Listing                                           |
| search_logs         | match_all, size limits, sort                      |
| resources/read      | Read dashboards, data-views, dashboard by ID      |
| Error handling      | Nonexistent dashboard, unknown tool               |

Manual testing:

# Health check
curl http://localhost:3000/health

# Server info
curl http://localhost:3000/info

# Test with MCP Inspector
npx @modelcontextprotocol/inspector dist/index.js

Security

  • Container Isolation: Runs as non-root user (mcpuser)

  • Minimal Base Image: Uses node:20-slim to reduce attack surface

  • Secret Management: Environment variables for credentials

  • API Authentication: Supports API keys and basic auth

  • RBAC: Respects Kibana's role-based access control

Troubleshooting

Connection Issues

# Check if Kibana is accessible
curl -I https://your-kibana.com/api/status

# Verify authentication
curl -H "Authorization: ApiKey YOUR_KEY" \
     -H "kbn-xsrf: true" \
     https://your-kibana.com/api/status

Container Issues

# View logs
docker logs kibana-mcp-server

# Shell into container
docker exec -it kibana-mcp-server /bin/sh

# Rebuild without cache
docker compose build --no-cache

CI

A GitHub Actions workflow runs on every pull request targeting main and on pushes to main. It builds the project and runs unit tests across Node.js 20 and 22. See .github/workflows/ci.yml.

Contributing

Contributions are welcome! Please follow these guidelines:

  1. Use TypeScript for all new code

  2. Follow existing code style

  3. Add tests for new features

  4. Update documentation

  5. Ensure CI passes — the build and unit tests must succeed before merging

License

MIT
