Building, Deploying & Using Fabric RTI MCP Server
Written by Om-Shree-0709.
In mid‑2025, Microsoft released an official open‑source MCP server for Fabric Real‑Time Intelligence (RTI). This server enables AI agents to interact with live Fabric data sources like Eventhouse and Azure Data Explorer using natural‑language prompts, without requiring direct programming knowledge. The tool serves as the bridge between agents and real‑time analytics.
Setting Up the Fabric RTI MCP Server
You can install the server on your local machine using pip. In Visual Studio Code, you can also add it via the Copilot Chat command palette, which simplifies configuration:
Installation via pip:
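Using the package name given later in the article, the install is a single command (verify the exact name on PyPI before running):

```shell
# Install the Fabric RTI MCP server package from PyPI
pip install microsoft-fabric-rti-mcp
```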
Or inside VS Code:
- Open the command palette (Ctrl+Shift+P)
- Run `MCP: Add Server`
- Choose "install from Pip"
- Enter the package name `microsoft-fabric-rti-mcp`
- Provide your Fabric Eventhouse or ADX endpoint and authentication token when prompted
You can also clone the GitHub repository and install manually using `uvx` or pip.
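A manual install would look roughly like the following. The repository path here is an assumption based on the package name, so confirm the actual URL on the official GitHub organization:

```shell
# Repository URL is illustrative -- check the official GitHub org for the real one
git clone https://github.com/microsoft/fabric-rti-mcp.git
cd fabric-rti-mcp
# Install the server in editable mode from the cloned source
pip install -e .
```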
Code / Implementation
Once installed, configure the MCP server in your `settings.json` or `mcp.json` configuration:
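A minimal `mcp.json` entry might look like the sketch below. The server key, environment variable name, and endpoint URI are illustrative placeholders rather than the official schema; consult your client's MCP documentation for the exact fields it expects:

```json
{
  "servers": {
    "fabric-rti": {
      "command": "uvx",
      "args": ["microsoft-fabric-rti-mcp"],
      "env": {
        "FABRIC_RTI_EVENTHOUSE_URI": "https://<your-cluster>.kusto.fabric.microsoft.com"
      }
    }
  }
}
```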
On Windows, you can create a batch script (`run_mcp.bat`) to launch the server:
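A minimal sketch of such a script, assuming the package is already installed and that the endpoint is passed via an environment variable (the variable name is an illustrative assumption):

```bat
@echo off
REM run_mcp.bat -- launch the Fabric RTI MCP server
REM The endpoint variable name below is illustrative, not an official setting
set FABRIC_RTI_EVENTHOUSE_URI=https://<your-cluster>.kusto.fabric.microsoft.com
uvx microsoft-fabric-rti-mcp
```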
With Visual Studio Code configured, you can now interact with Fabric from Copilot or Cursor. For example, you could ask the agent "List workspaces in Fabric" or "Sample 10 rows from table StormEvents", and the agent will automatically use the `list_databases`, `list_tables`, or `sample_rows` tools to complete the request.
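Under the hood, a prompt like "Sample 10 rows from table StormEvents" would typically be translated by the server into a KQL query along these lines:

```kql
StormEvents
| take 10
```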
Behind the Scenes
When running, the MCP server connects to Fabric RTI backends (Eventhouse or Azure Data Explorer). It fetches schema metadata and defines tool interfaces for listing databases, tables, sampling rows, and executing arbitrary KQL queries. Agents send natural language or structured requests via JSON-RPC. The server translates these into KQL queries, submits them to Fabric services, and returns results as JSON. Authentication is handled through Azure tokens or service principals, and logs are captured for auditing and debugging. Parameter suggestions and error messages are structured to help agents and developers understand failures or misconfigurations.
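The request cycle described above can be sketched as a plain JSON-RPC payload. The tool name matches the article's examples; the database name and the exact argument schema are illustrative assumptions, not the server's documented interface:

```python
import json

# Illustrative MCP-style JSON-RPC request an agent might send to call a tool.
# "sample_rows" is named in the article; "EventsDB" and the argument keys
# are hypothetical placeholders.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "sample_rows",
        "arguments": {"database": "EventsDB", "table": "StormEvents", "count": 10},
    },
}

# Serialize for transport; the server would translate this into a KQL query.
payload = json.dumps(request)
print(payload)
```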
My Thoughts
This server makes live data querying accessible to AI agents and lowers the barrier for developers and analysts. Natural‑language prompts map to real-time analytics features, which is useful for interactive dashboards and monitoring workflows. Developers should pay close attention to authentication setup and parameter validation to prevent misuse. Performance can suffer on large datasets, so consider query limits or caching. In enterprise environments, governance around MCP tool usage is essential. Despite these caveats, this server brings powerful real-time data access to agent-based workflows.