Enables introspection of Dremio system metrics and workload analysis by connecting to Prometheus setups, enhancing insights with Dremio-related performance data.
Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type @ followed by the MCP server name and your instructions, e.g., "@Dremio MCP Server show me the top 5 customers by total sales last quarter".
That's it! The server will respond to your query, and you can continue using it as needed.
Here is a step-by-step guide with screenshots.
Dremio MCP server
Table of Contents
Introduction
Installation
Initial setup
Configuration details
Logging
Further Documentation
Additional Information
Testing
Contributing
Introduction
This repo provides a Model Context Protocol (MCP) server that eases LLM integration with Dremio. If you are new to MCP and MCP servers, take our Dremio MCP Server course on Dremio University (DremioU). If you are already familiar with these concepts, please proceed below.
Installation
The Dremio MCP server can be deployed in two ways:
Remote / Streaming HTTP Deployment
For production deployments in Kubernetes environments, use the Helm chart:
Quick Start with Helm
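A minimal sketch of an install from a checkout of this repository; the chart path, release name, namespace, and values file below are assumptions, and the Helm Chart README linked under Documentation is the authoritative guide:

```bash
# Sketch only: the chart path, release name, namespace, and values file
# are assumptions; follow the Helm Chart README for the real procedure.
git clone <this repository> dremio-mcp && cd dremio-mcp
helm install dremio-mcp ./charts/dremio-mcp \
    --namespace dremio-mcp --create-namespace \
    --values my-values.yaml   # your OAuth / ingress / autoscaling overrides
```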
Key Features
✅ OAuth + External Token Provider authentication (recommended for production)
✅ Streaming HTTP mode for web-based deployments
✅ Horizontal Pod Autoscaling for scalability
✅ Prometheus metrics integration
✅ Ingress support with TLS/SSL
✅ Security best practices (non-root, read-only filesystem)
Documentation
Helm Chart README - Complete installation and configuration guide
Authentication Guide - OAuth + External Token Provider implementation
Example Configurations - Production and development examples
Local Installation (Desktop/Development)
The MCP server runs locally on the machine that runs the LLM frontend (e.g., Claude). The installation steps are simple:
Clone or download this repository.
Install the uv package manager (note that the MCP server requires python 3.11 or later)
If you are installing uv for the first time, restart your terminal at the end of the install.
Ensure that you have Python installed by running the command below; it should show Python 3.11 or later. (If you don't have Python installed, follow the instructions here, or simply run uv python install.)
Do a sanity check by running the command below and validating the output.
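A sketch of these steps on macOS/Linux; the uv install command is taken from the uv documentation, and the sanity check assumes the CLI entry point is named dremio-mcp-server and is run from the cloned repository:

```bash
# Install uv (command from the uv documentation), then restart your terminal.
curl -LsSf https://astral.sh/uv/install.sh | sh

# Ensure a suitable Python is available; it should report 3.11 or later.
uv python install      # only needed if no suitable Python is present
python3 --version

# Sanity check, run from the cloned repository directory
# (assumes the entry point is named dremio-mcp-server).
uv run dremio-mcp-server --help
```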
Initial setup
There are two configurations necessary before the MCP server can be invoked.
The server config file: This will cover the details of connecting and communicating with Dremio
The LLM config file: This covers configuring the LLM desktop app (Claude for now) to make it aware of the MCP server
Quick start
The quickest way to do this setup is:
Create the dremio config file as outlined below, and be prepared with these values:
Note: the uri is the API endpoint associated with your environment:
For Dremio Cloud in the US region (https://app.dremio.cloud), use https://api.dremio.cloud or the shorthand prod
For Dremio Cloud in the EMEA region (https://app.eu.dremio.cloud), use https://api.eu.dremio.cloud or the shorthand prodemea
For SW/K8S deployments, use https://<coordinator-host>:<9047 or custom port>
Note: For security purposes, if you don't want the PAT to leak into your shell history, create a file with your PAT in it and pass that file as the argument in the dremio config.
Example:
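One way to do this is sketched below; the file name and location are arbitrary, and how the config references the file is covered in the configuration details section later in this document:

```bash
# Write the PAT into a file without putting it on the command line:
# run the two commands, paste the token, press Enter, then Ctrl-D.
umask 077                      # so the new file is readable only by you
cat > "$HOME/.dremio-pat"
# Reference $HOME/.dremio-pat from the dremio config instead of the raw token.
```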
Download and install Claude Desktop (Claude)
Note: Claude has system requirements, such as Node.js; please validate your system against Claude's official documentation.
Create the Claude config file.
Validate the config files. Both steps are sketched below.
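A sketch of both steps, assuming the CLI exposes config create and config list subcommands (these subcommand names are assumptions; confirm them with the CLI's help output):

```bash
# Assumed subcommands; confirm with `uv run dremio-mcp-server --help`.
# Create the Claude desktop config entry for this MCP server:
uv run dremio-mcp-server config create claude

# Validate that both config files are present and well-formed:
uv run dremio-mcp-server config list
```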
You are done! You can start Claude and begin using the MCP server.
Demo (Local install)

The rest of the documentation below provides details of the config files.
Configuration details
MCP server config file
This file is located by default at $HOME/.config/dremioai/config.yaml but can be overridden at runtime using the --cfg option of dremio-mcp-server.
Format
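The full format reference is in the Settings documentation linked under Further Documentation; the sketch below assumes a dremio section (uri, pat) and a tools section (server_mode), so treat the key names as assumptions:

```bash
# Assumed key names; verify against the Settings reference.
mkdir -p "$HOME/.config/dremioai"
cat > "$HOME/.config/dremioai/config.yaml" <<'EOF'
dremio:
  uri: https://api.dremio.cloud     # or prod / prodemea / https://<coordinator-host>:9047
  pat: /path/to/pat-file            # the PAT itself, or the file holding it
tools:
  server_mode: FOR_DATA_PATTERNS    # see Modes below
EOF
```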
Modes
There are three modes:
FOR_DATA_PATTERNS - the normal mode, in which the MCP server allows the LLM to look at tables and data for pattern discovery and other use cases
FOR_SELF - a mode that allows the MCP server to introspect the Dremio system, including workload analysis and so on
FOR_PROMETHEUS - a mode that allows the MCP server to connect to your Prometheus setup, if one exists, to enhance insights with Dremio-related metrics
Multiple modes can be specified, separated by commas, as shown below.
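For example, continuing with the key names assumed in the format sketch above:

```bash
# In $HOME/.config/dremioai/config.yaml (key names as assumed above),
# enable data-pattern discovery and Dremio self-introspection together:
#   tools:
#     server_mode: FOR_DATA_PATTERNS,FOR_SELF
```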
The LLM (Claude) config file
Note: This is applicable only for local installs
To set up the Claude config file (refer to this as an example), edit the Claude desktop config file:
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json
And then add this section
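A sketch of that section, assuming the standard Claude Desktop mcpServers format and a uv-based launch of the server from the repository checkout; the server name, directory path, and exact args are assumptions to check against the repo's examples:

```bash
# macOS path shown; on Windows use %APPDATA%\Claude\claude_desktop_config.json.
# This overwrites the file: if you already have other MCP servers configured,
# merge the "mcpServers" entry by hand instead.
cat > "$HOME/Library/Application Support/Claude/claude_desktop_config.json" <<'EOF'
{
  "mcpServers": {
    "Dremio": {
      "command": "uv",
      "args": [
        "run",
        "--directory", "/path/to/dremio-mcp",
        "dremio-mcp-server", "run"
      ]
    }
  }
}
EOF
```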
This will pick up the default location of the MCP server config file. A custom location can also be passed in the args section above as "--config-file", "<custom config file>" after run.
Logging
The Dremio MCP server automatically writes log files to platform-specific directories following operating system conventions. This helps with troubleshooting and monitoring the server's operation.
Log File Locations
The log files are stored in the following locations based on your operating system:
Linux
Directory: ~/.local/share/dremioai/logs/
Full path: ~/.local/share/dremioai/logs/dremioai.log
XDG compliance: respects the $XDG_DATA_HOME environment variable if set
macOS
Directory: ~/Library/Logs/dremioai/
Full path: ~/Library/Logs/dremioai/dremioai.log
Windows
Directory: %LOCALAPPDATA%\dremioai\logs\
Full path: %LOCALAPPDATA%\dremioai\logs\dremioai.log
Typical location: C:\Users\<username>\AppData\Local\dremioai\logs\dremioai.log
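For example, to follow the live log on macOS (substitute the Linux or Windows location from the list above):

```bash
# Follow the server log as it is written (macOS path shown).
tail -f ~/Library/Logs/dremioai/dremioai.log
```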
Controlling File Logging
By default, the MCP server logs to the log file mentioned above. To control logging further, you can use the following environment variables and command-line options:
Use JSON format: set JSON_LOGGING=1 or pass --enable-json-logging for structured JSON logs
Disable file logging: pass --no-log-to-file to disable writing logs to a file
Example:
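For instance (the env var and flags are the ones listed above; placing the flags after the run subcommand is an assumption):

```bash
# Structured JSON logs via the environment variable...
JSON_LOGGING=1 dremio-mcp-server run

# ...or via the command-line flag:
dremio-mcp-server run --enable-json-logging

# Disable writing logs to a file entirely:
dremio-mcp-server run --no-log-to-file
```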
The log directory is automatically created if it doesn't exist, so no manual setup is required.
Further Documentation
Architecture: Detailed overview of the Dremio MCP server architecture, including component interactions and data flows.
Tools: Comprehensive guide to available tools, including:
Tool categories and types
Usage examples
Development guidelines
Integration support
Settings: Complete configuration reference covering:
Dremio connection settings
Tool configurations
Framework integrations
Environment variables
Additional Information
This repository is intended to be open source software that encourages contributions of any kind, such as adding features, reporting issues, and contributing fixes. It is not part of Dremio product support.
Testing
The project uses pytest for testing. To run the tests:
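A sketch, assuming the test dependencies are managed with uv as in the installation steps above:

```bash
# Install the project (including dev/test dependencies) and run the suite.
uv sync
uv run pytest
```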
GitHub Actions automatically runs tests on pull requests and pushes to the main branch.
Contributing
Please see our Contributing Guide for details on:
Setting up your development environment
Making contributions
Code style guidelines
Documentation requirements
Running tests