Building, Deploying & Using Fabric RTI MCP Server


Tags: servers, mcp, microsoft, fabric, rti

1. Setting Up the Fabric RTI MCP Server
2. Code / Implementation
3. Behind the Scenes
4. My Thoughts
5. References

In mid-2025, Microsoft released an official open-source MCP server for Fabric Real-Time Intelligence (RTI). The server lets AI agents interact with live Fabric data sources such as Eventhouse and Azure Data Explorer through natural-language prompts, without requiring direct programming knowledge. It serves as the bridge between agents and real-time analytics.[1][2]

Setting Up the Fabric RTI MCP Server

You can install the server on your local machine using pip, or add it from Visual Studio Code via the Copilot Chat command palette, which simplifies configuration.

Installation via pip:

pip install microsoft-fabric-rti-mcp

Or inside VS Code:

1. Open the command palette (Ctrl+Shift+P)
2. Run MCP: Add Server
3. Choose "Install from Pip"
4. Enter the package name microsoft-fabric-rti-mcp
5. Provide your Fabric Eventhouse or ADX endpoint and authentication token when prompted [1]

You can also clone the GitHub repository and install manually using uvx or pip.[1]

Code / Implementation

Once installed, configure the MCP server in your settings.json or mcp.json:

{
  "mcp": {
    "servers": {
      "fabric-rti": {
        "command": "uvx",
        "args": ["microsoft-fabric-rti-mcp"],
        "env": {
          "KUSTO_SERVICE_URI": "https://<your-cluster>.kusto.windows.net/",
          "KUSTO_SERVICE_DEFAULT_DB": "Samples",
          "AZ_OPENAI_EMBEDDING_ENDPOINT": "https://<your-azure-openai-endpoint>"
        }
      }
    }
  }
}

On Windows, you can create a batch script (run_mcp.bat) to launch the server:

@echo off
SET PATH=%USERPROFILE%\.local\bin;%PATH%
cd C:\path\to\fabric-rti-mcp\
uvx run fabric_rti_mcp.server

With Visual Studio Code configured, you can now interact with Fabric from Copilot or Cursor. For example, you could ask the agent "List workspaces in Fabric" or "Sample 10 rows from table StormEvents", and the agent will automatically call the list_databases, list_tables, or sample_rows tools to complete the request.[1][3]
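Under MCP, a prompt like "Sample 10 rows from table StormEvents" is ultimately turned into a JSON-RPC `tools/call` request on the agent's behalf. A minimal sketch of what such a request could look like; the argument names (`table`, `count`) are illustrative assumptions, not the server's documented schema:

```python
import json

def make_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Build a JSON-RPC 2.0 `tools/call` request as an MCP client would."""
    request = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    }
    return json.dumps(request)

# Illustrative call; the argument names are assumptions for this sketch.
payload = make_tool_call(1, "sample_rows", {"table": "StormEvents", "count": 10})
print(payload)
```

The agent framework handles this plumbing for you; the sketch only shows the wire shape that travels between the agent and the server.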

Behind the Scenes

When running, the MCP server connects to Fabric RTI backends (Eventhouse or Azure Data Explorer). It fetches schema metadata and exposes tool interfaces for listing databases and tables, sampling rows, and executing arbitrary KQL queries. Agents send natural-language or structured requests via JSON-RPC; the server translates them into KQL queries, submits them to Fabric services, and returns the results as JSON. Authentication is handled through Azure tokens or service principals, and logs are captured for auditing and debugging. Parameter suggestions and error messages are structured to help agents and developers diagnose failures or misconfigurations.[1][3]
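The translation step can be pictured as a small dispatcher that maps each tool invocation to a KQL string before submitting it to the Eventhouse/ADX backend. This is a simplified sketch, not the server's actual implementation; the tool names follow those mentioned above, and the KQL uses standard management-command and operator syntax:

```python
def tool_to_kql(tool: str, args: dict) -> str:
    """Map an MCP tool invocation to a KQL string (simplified sketch)."""
    if tool == "list_databases":
        return ".show databases"   # KQL management command
    if tool == "list_tables":
        return ".show tables"
    if tool == "sample_rows":
        # `sample` returns rows at random; `take` would return an arbitrary page
        return f"{args['table']} | sample {args.get('count', 10)}"
    if tool == "execute_query":
        return args["query"]       # pass arbitrary KQL through
    raise ValueError(f"unknown tool: {tool}")

print(tool_to_kql("sample_rows", {"table": "StormEvents", "count": 10}))
# → StormEvents | sample 10
```

The real server also threads authentication, database selection, and result serialization through this path; the sketch isolates only the tool-to-query mapping.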


My Thoughts

This server makes live data querying accessible to AI agents and lowers the barrier for developers and analysts. Natural-language prompts map directly to real-time analytics features, which is useful for interactive dashboards and monitoring workflows. Developers should pay attention to authentication setup and parameter validation to prevent misuse. Performance may suffer on large datasets, so consider query limits or caching. In enterprise environments, governance around MCP tool usage is essential. Despite these caveats, the server brings powerful real-time data access to agent-based workflows.
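One pragmatic guardrail is to cap row counts before a query reaches the backend, for example by appending a KQL `take` limit to any query that does not already end in one. A rough sketch; the helper name and default limit are my own choices, not part of the server:

```python
def cap_query(kql: str, max_rows: int = 1000) -> str:
    """Append a `take` limit unless the query already ends with one (sketch)."""
    tail = kql.strip().rsplit("|", 1)[-1].strip().lower()
    # `limit` is a KQL alias for `take`; `sample` is already bounded
    if tail.startswith(("take", "limit", "sample")):
        return kql
    return f"{kql} | take {max_rows}"

print(cap_query("StormEvents | where State == 'TEXAS'"))
# → StormEvents | where State == 'TEXAS' | take 1000
```

A wrapper like this could sit between the agent-facing tool layer and query submission, so even an unbounded `execute_query` request stays within a predictable result size.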

References

Footnotes

1. Microsoft Fabric blog: Introducing MCP Support for Real-Time Intelligence (RTI)
2. Microsoft Learn: What's New in Microsoft Fabric, noting MCP RTI support
3. GitHub repository: microsoft/fabric-rti-mcp, showing install and configuration details

Written by Om-Shree-0709 (@Om-Shree-0709)