Microsoft Fabric RTI MCP Server


1. Core Features and Architecture
2. Code / Implementation
3. Behind the Scenes
4. My Thoughts
5. References

In June 2025, Microsoft released an open-source MCP server for Fabric Real-Time Intelligence (RTI). This server lets AI agents query real-time data in Microsoft Fabric, including Eventhouse and Azure Data Explorer, using natural-language prompts through the MCP interface, simplifying integration between agents and live data analytics services 1 2.

            Core Features and Architecture

            The Fabric RTI MCP server exposes a suite of data tools to agents: listing databases, listing tables, retrieving table schemas, sampling rows, and executing arbitrary Kusto Query Language (KQL) commands 1. Agents send requests in plain English, which the MCP server translates into KQL under the hood. The server includes intelligent parameter suggestion and consistent error handling to produce clear, user-friendly responses 1.
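As a rough sketch, the tool surface described above can be modeled as a registry of tool names and their permitted parameters. The parameter lists below are assumptions for illustration, not the server's published schemas:

```python
# Illustrative tool registry; names follow the tools listed above,
# parameter lists are assumptions for this sketch.
TOOLS = {
    "list_databases": [],
    "list_tables": ["database"],
    "get_table_schema": ["database", "table"],
    "sample_rows": ["table", "limit"],
    "execute_query": ["query"],
}

def describe_tool(name: str) -> str:
    """Render a discovery hint such as 'sample_rows(table, limit)'."""
    params = ", ".join(TOOLS[name])
    return f"{name}({params})"
```

An agent performing tool discovery could use such hints to learn which arguments each tool accepts before invoking it.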


            The server connects to Fabric RTI backends—Eventhouse or Azure Data Explorer (ADX)—via APIs. It translates the agent’s request into appropriate KQL and returns results as JSON. Future enhancements may add support for RTI components like Eventstreams and Activator functions 1.
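The translation step can be sketched as a mapping from tool calls to KQL text, with results serialized back to JSON. The mapping rules below are assumptions; the real server's translation logic is more involved:

```python
import json

def to_kql(tool: str, args: dict) -> str:
    # Hypothetical mapping from tool invocations to KQL strings.
    if tool == "list_tables":
        return ".show tables"
    if tool == "sample_rows":
        return f"{args['table']} | take {int(args['limit'])}"
    if tool == "execute_query":
        return args["query"]
    raise ValueError(f"unknown tool: {tool}")

def rows_to_json(rows: list) -> str:
    # Query results are returned to the agent as JSON.
    return json.dumps({"rows": rows})
```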

            Code / Implementation

            Below is a simplified example demonstrating how to install and start the server, and how the MCP tools might be used:

pip install microsoft-fabric-rti-mcp

{
  "tools": [
    {"name": "list_databases"},
    {"name": "list_tables", "database": "IotData"},
    {"name": "sample_rows", "table": "IotData", "limit": 10},
    {"name": "execute_query", "query": "StormEvents | count"}
  ]
}

            In Python:

from fabric_rti_mcp import RTIMcpServer

server = RTIMcpServer(
    fabric_endpoint="https://fabric.azuredomain.com",
    auth_token="YOUR_AUTH_TOKEN"
)
server.start_tools()

            Agents can invoke tools such as list_databases, sample_rows, or execute_query, which the server translates into KQL and runs against Fabric RTI 1.

            Behind the Scenes

            When an agent connects, the server performs schema introspection by querying Fabric APIs. Each tool definition includes parameter metadata and permitted arguments. When a request comes in, the server maps agent inputs to KQL syntax, potentially completing parameters. It handles errors in a structured way so that agents receive meaningful feedback rather than generic failure messages 1.
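The structured error handling might look like the following sketch, which checks agent inputs against a tool's required parameters and returns a machine-readable error rather than a generic failure. The response shape is an assumption, not the server's actual format:

```python
def check_args(required: list, args: dict):
    """Return None if args are complete, else a structured error dict."""
    missing = [p for p in required if p not in args]
    if not missing:
        return None
    return {
        "error": {
            "code": "missing_parameter",
            "message": "missing required parameter(s): " + ", ".join(missing),
            "hint": "inspect the tool's parameter metadata and retry",
        }
    }
```

A code, message, and hint give the agent enough context to correct its call and retry.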


            Authentication is handled via Azure tokens or service principal flows. The server runs as a Python process and leverages standard JSON-RPC transport. Tool definitions enforce field shapes, and introspection ensures agents understand what parameters go with each tool. Requests and responses are logged for traceability and debugging.
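The JSON-RPC transport can be illustrated with a minimal request builder. MCP exposes tool invocation via the `tools/call` method; the id counter and helper name here are illustrative:

```python
import itertools
import json

_request_ids = itertools.count(1)

def frame_tool_call(tool_name: str, arguments: dict) -> str:
    """Frame an MCP tool invocation as a JSON-RPC 2.0 request string."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": next(_request_ids),
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })
```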

            My Thoughts

            This Fabric RTI MCP server is a powerful tool for bridging live data analytics with agents, particularly for BI and monitoring use cases. The translation from natural language to KQL allows less technical users to query live data without learning a query language. The introspection and structured error handling improve usability and developer experience.

            However, developers should consider performance and latency, particularly with large datasets. Tool definitions must enforce strict parameter validation to avoid malformed queries. Managing authentication securely is essential. Overall, this server makes Fabric RTI accessible to AI agents in a clean, extensible way.
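One concrete validation a deployment might add: rejecting KQL control commands (which begin with a dot) so agents can only run read-only queries. This guard is a sketch under that assumption, not the server's documented policy:

```python
def is_read_only_kql(query: str) -> bool:
    """Reject management/control commands, which start with '.' in KQL."""
    return not query.lstrip().startswith(".")
```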

            References

            Footnotes

1. Manufacturing blog: Introducing MCP Support for Real-Time Intelligence (RTI)

            2. What’s New? Microsoft Fabric blog noting MCP support now available

            Written by Om-Shree-0709 (@Om-Shree-0709)