Kibana MCP Server
A Model Context Protocol (MCP) server that enables AI assistants to interact with Kibana dashboards, visualizations, and Elasticsearch data through a standardized interface.
Features
Resources: Read-only access to Kibana dashboards, visualizations, data views, and saved searches
Tools: Execute searches, export dashboards, and query Elasticsearch data
Dual Transport: Supports both stdio (local) and HTTP/SSE (containerized) transports
Docker Support: Production-ready containerization with Docker and Podman
Authentication: API key and username/password authentication
Type-Safe: Built with TypeScript for enhanced reliability
Architecture
┌─────────────────┐
│ AI Assistant │
│ (Claude, etc.) │
└────────┬────────┘
│ MCP Protocol
│
┌────────▼────────┐ ┌─────────────┐
│ MCP Server │─────▶│ Kibana │
│ (This Server) │ │ REST API │
└─────────────────┘ └──────┬──────┘
│
┌──────▼──────┐
│Elasticsearch│
└─────────────┘
Quick Start
Using Docker Compose (Recommended)
Docker Compose is the preferred way to run this server. Credentials are passed via shell environment variables so nothing is hard-coded.
Export your Kibana credentials (API key or username/password):
# Option A: API key
export KIBANA_API_KEY=your_api_key_here

# Option B: Username/password
export KIBANA_USERNAME=your_username
export KIBANA_PASSWORD=your_password
Build and start:
docker compose up --build -d
Verify it's running:
curl http://localhost:3000/health
View logs / stop:
docker compose logs -f
docker compose down
The KIBANA_URL defaults to https://localhost:5601 and can be overridden:
export KIBANA_URL=https://your-kibana-instance.com
Local Development
Install dependencies:
npm install
Configure environment:
cp .env.example .env
# Edit .env with your Kibana credentials
Run in development mode:
# Stdio mode (for Claude Desktop)
npm run dev

# HTTP mode (for testing)
npm run dev:http
Build and run production:
npm run build
npm start            # stdio mode
npm run start:http   # HTTP mode
Configuration
Environment Variables
Create a .env file based on .env.example:
# Kibana Configuration (required)
KIBANA_URL=https://your-kibana-instance.com
KIBANA_API_KEY=your_api_key_here
# Alternative: Username/Password Authentication
# KIBANA_USERNAME=your_username
# KIBANA_PASSWORD=your_password
# Server Configuration
MCP_TRANSPORT=http # or stdio
HTTP_PORT=3000 # Port for HTTP server
LOG_LEVEL=info # debug, info, warn, error
Authentication Methods
API Key (Recommended):
KIBANA_URL=https://kibana.example.com
KIBANA_API_KEY=your_base64_encoded_api_key
Username/Password:
KIBANA_URL=https://kibana.example.com
KIBANA_USERNAME=admin
KIBANA_PASSWORD=your_password
MCP Capabilities
Resources (Read-Only Data)
kibana://dashboards - List all dashboards
kibana://dashboard/{id} - Get specific dashboard
kibana://visualizations - List all visualizations
kibana://data-views - List all data views
kibana://saved-searches - List saved searches
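As a sketch of how such kibana:// URIs can be routed to handlers (this helper is illustrative, not the server's actual code):

```typescript
type KibanaResource =
  | { type: "dashboards" | "visualizations" | "data-views" | "saved-searches" }
  | { type: "dashboard"; id: string };

// Parses a kibana:// resource URI into a routable descriptor.
function parseResourceUri(uri: string): KibanaResource {
  const match = /^kibana:\/\/([a-z-]+)(?:\/(.+))?$/.exec(uri);
  if (!match) throw new Error(`Not a kibana:// URI: ${uri}`);
  const [, type, id] = match;
  if (type === "dashboard") {
    // The only parameterized resource: kibana://dashboard/{id}
    if (!id) throw new Error("kibana://dashboard/{id} requires an id");
    return { type: "dashboard", id };
  }
  if (["dashboards", "visualizations", "data-views", "saved-searches"].includes(type)) {
    return { type } as KibanaResource;
  }
  throw new Error(`Unknown resource type: ${type}`);
}
```

A resources/read handler can then switch on the parsed `type` and delegate to the matching Kibana client call.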
Tools (Executable Functions)
list_dashboards
List dashboards with optional search and pagination.
{
"search": "security",
"page": 1,
"perPage": 20
}
get_dashboard
Get detailed information about a specific dashboard.
{
"id": "dashboard-id-here"
}
export_dashboard
Export dashboard with all dependencies.
{
"id": "dashboard-id-here",
"includeReferences": true
}
search_logs
Query Elasticsearch data through Kibana.
{
"index": "logs-*",
"query": {
"match": {
"message": "error"
}
},
"size": 10,
"sort": [{"@timestamp": "desc"}]
}
Other Tools
list_visualizations - List visualizations
get_visualization - Get visualization details
list_data_views - List available data views
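The search_logs payload shown earlier follows Elasticsearch's query DSL. A small builder sketch showing how defaults might be filled in before the request is forwarded to Kibana (names are illustrative, not this repo's code):

```typescript
interface SearchLogsParams {
  index: string;                                 // e.g. "logs-*"
  query?: object;                                // Elasticsearch query DSL
  size?: number;                                 // max hits to return
  sort?: Array<Record<string, "asc" | "desc">>;  // e.g. [{ "@timestamp": "desc" }]
}

// Applies the defaults a search tool might use: match_all when no
// query is given, and a conservative result size of 10.
function buildSearchBody(params: SearchLogsParams) {
  return {
    index: params.index,
    body: {
      query: params.query ?? { match_all: {} },
      size: params.size ?? 10,
      ...(params.sort ? { sort: params.sort } : {}),
    },
  };
}
```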
Connecting to AI Assistants
This server supports two transports. They share the same core server logic (src/server.ts) but differ in how the client communicates with it:
|  | HTTP/SSE | stdio |
| --- | --- | --- |
| How it works | Long-running HTTP server. Clients connect via Server-Sent Events (SSE) and send JSON-RPC over POST requests. | Client spawns the server as a child process. JSON-RPC messages flow over stdin/stdout. |
| When to use | Remote/containerized deployments, Claude Code, any network-based MCP client | Local-only usage, Claude Desktop app |
| Run with | npm run start:http | npm start |
| Entry point | src/http-server.ts | src/index.ts |
Claude Code (HTTP/SSE transport)
Claude Code connects to MCP servers over SSE. Start the HTTP server first, then register it with Claude Code.
Option 1: CLI (Recommended)
# Start the server
docker compose up -d
# Add as a user-scoped MCP server
claude mcp add --scope user --transport sse kibana http://localhost:3000/sse
Option 2: Project config (.mcp.json)
Create .mcp.json in your project root (shared with the team via version control):
{
"mcpServers": {
"kibana": {
"type": "sse",
"url": "http://localhost:3000/sse"
}
}
}
Verification: In Claude Code, type /mcp to see available servers. You should see "kibana" listed with its resources and tools.
Claude Desktop (stdio transport)
For the Claude Desktop app, use stdio transport.
Add to your Claude Desktop configuration (~/Library/Application Support/Claude/claude_desktop_config.json on macOS):
{
"mcpServers": {
"kibana": {
"command": "node",
"args": ["/path/to/jb-kibana-mcp/dist/index.js"],
"env": {
"KIBANA_URL": "https://your-kibana.com",
"KIBANA_API_KEY": "your-api-key"
}
}
}
}
Generic MCP Clients (SSE)
Any MCP client that supports SSE transport can connect to:
http://localhost:3000/sse
The SSE handshake flow:
1. Client opens GET /sse and receives an endpoint event with a session-specific message URL
2. Client sends JSON-RPC messages via POST /message?sessionId=<id>
3. Server streams responses back over the SSE connection
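For step 2, the message URL delivered in the endpoint event carries the session id, which a client must echo back on every POST. Extracting it is straightforward (helper name is illustrative):

```typescript
// Extracts the sessionId from the message URL sent in the SSE
// "endpoint" event, e.g. "/message?sessionId=abc123". The base URL
// only matters for resolving relative endpoint paths.
function sessionIdFromEndpoint(endpoint: string, base = "http://localhost:3000"): string {
  const url = new URL(endpoint, base);
  const id = url.searchParams.get("sessionId");
  if (!id) throw new Error(`No sessionId in endpoint URL: ${endpoint}`);
  return id;
}
```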
Additional endpoints:
GET /health — Health check (returns JSON status)
GET /info — Server metadata and capabilities
Docker Deployment
Build Image
docker build -t kibana-mcp:latest .
Run Container
docker run -d \
--name kibana-mcp \
-p 3000:3000 \
-e KIBANA_URL=https://your-kibana.com \
-e KIBANA_API_KEY=your-api-key \
kibana-mcp:latest
Docker Compose
# Start
docker compose up -d
# View logs
docker compose logs -f
# Stop
docker compose down
Development
Project Structure
jb-kibana-mcp/
├── src/
│ ├── index.ts # Stdio transport entry point (Claude Desktop)
│ ├── http-server.ts # HTTP/SSE transport entry point (Claude Code, Docker)
│ ├── server.ts # Core MCP server logic
│ ├── kibana/
│ │ ├── client.ts # Kibana API client
│ │ ├── types.ts # TypeScript types
│ │ └── auth.ts # Authentication
│ ├── resources/
│ │ └── index.ts # MCP resources
│ └── tools/
│ └── index.ts # MCP tools
├── Dockerfile
├── docker-compose.yml
└── package.json
Adding New Tools
1. Define the tool schema in src/tools/index.ts
2. Implement the handler in the tools/call request handler
3. Add a corresponding Kibana client method if needed
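A sketch of what those three steps produce. The shapes follow the MCP tools/list and tools/call conventions, but the tool name and wiring here are hypothetical examples, not this repo's actual code:

```typescript
// Step 1: tool schema as advertised via tools/list
const pingKibanaTool = {
  name: "ping_kibana", // hypothetical example tool
  description: "Check that Kibana is reachable",
  inputSchema: {
    type: "object",
    properties: { verbose: { type: "boolean" } },
  },
};

// Step 3: the Kibana client method the handler delegates to
// (real code would call GET /api/status on the configured instance)
async function getStatus(): Promise<{ status: string }> {
  return { status: "available" };
}

// Step 2: dispatch inside the tools/call request handler
async function handleToolCall(name: string, args: Record<string, unknown>) {
  switch (name) {
    case "ping_kibana": {
      const status = await getStatus();
      return { content: [{ type: "text", text: JSON.stringify(status) }] };
    }
    default:
      throw new Error(`Unknown tool: ${name}`);
  }
}
```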
Testing
Unit tests (no external dependencies, mocked Kibana):
npm test # run once
npm run test:watch # watch mode
npm run test:coverage # with coverage report
Integration tests (require a live Kibana instance):
Integration tests start an in-process MCP server, connect over SSE, and exercise
every tool and resource against real Kibana. They are kept separate from unit
tests so npm test stays fast and offline.
Set environment variables. The tests load .env via dotenv, so values already in .env (like KIBANA_URL) are picked up automatically; shell environment variables take precedence. You need:
# Already in .env:
KIBANA_URL=https://your-kibana-instance.com

# Set in your shell (or add to .env):
export KIBANA_API_KEY=your-api-key
# or:
export KIBANA_USERNAME=you@example.com
export KIBANA_PASSWORD=your-password
Run:
npm run test:integration
If KIBANA_URL or credentials are missing, the tests skip automatically (no failures).
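The skip logic amounts to a simple environment check; a sketch under that assumption (the function name is illustrative):

```typescript
// Returns true when enough configuration exists to reach a live Kibana:
// a URL plus either an API key or a username/password pair.
function hasKibanaCredentials(env: Record<string, string | undefined>): boolean {
  if (!env.KIBANA_URL) return false;
  if (env.KIBANA_API_KEY) return true;
  return Boolean(env.KIBANA_USERNAME && env.KIBANA_PASSWORD);
}
```

Test suites commonly gate on this with something like `const maybeDescribe = hasKibanaCredentials(process.env) ? describe : describe.skip;`.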
What the integration tests cover:
| Area | Tests |
| --- | --- |
| MCP handshake | SSE connect, initialize, initialized notification |
| Tool registration | All 7 tools registered |
| Resource registration | All 4 resources registered |
| list_dashboards | Pagination, search filtering |
| get_dashboard | Fetch by ID |
| export_dashboard | NDJSON export with references |
| list_visualizations | Listing |
| get_visualization | Fetch by ID |
| list_data_views | Listing |
| search_logs | match_all, size limits, sort |
| Resources | Read dashboards, data-views, dashboard by ID |
| Error handling | Nonexistent dashboard, unknown tool |
Manual testing:
# Health check
curl http://localhost:3000/health
# Server info
curl http://localhost:3000/info
# Test with MCP Inspector
npx @modelcontextprotocol/inspector dist/index.js
Security
Container Isolation: Runs as non-root user (mcpuser)
Minimal Base Image: Uses node:20-slim to reduce attack surface
Secret Management: Environment variables for credentials
API Authentication: Supports API keys and basic auth
RBAC: Respects Kibana's role-based access control
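For reference, the two supported authentication methods translate into different Authorization header values. A minimal sketch (the helper name is illustrative, not from this codebase):

```typescript
// Builds the Authorization header for Kibana requests.
// An API key takes precedence when both methods are configured.
function buildAuthHeader(opts: {
  apiKey?: string;
  username?: string;
  password?: string;
}): string {
  if (opts.apiKey) {
    // Kibana expects the base64-encoded key verbatim after "ApiKey"
    return `ApiKey ${opts.apiKey}`;
  }
  if (opts.username && opts.password) {
    const encoded = Buffer.from(`${opts.username}:${opts.password}`).toString("base64");
    return `Basic ${encoded}`;
  }
  throw new Error("Set KIBANA_API_KEY or KIBANA_USERNAME/KIBANA_PASSWORD");
}
```

Note that Kibana API calls also need the `kbn-xsrf: true` header, as shown in the Troubleshooting section.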
Troubleshooting
Connection Issues
# Check if Kibana is accessible
curl -I https://your-kibana.com/api/status
# Verify authentication
curl -H "Authorization: ApiKey YOUR_KEY" \
-H "kbn-xsrf: true" \
https://your-kibana.com/api/status
Container Issues
# View logs
docker logs kibana-mcp-server
# Shell into container
docker exec -it kibana-mcp-server /bin/sh
# Rebuild without cache
docker compose build --no-cache
CI
A GitHub Actions workflow runs on every pull request targeting main and on pushes to main. It builds the project and runs unit tests across Node.js 20 and 22. See .github/workflows/ci.yml.
Contributing
Contributions are welcome! Please follow these guidelines:
Use TypeScript for all new code
Follow existing code style
Add tests for new features
Update documentation
Ensure CI passes — the build and unit tests must succeed before merging
License
MIT