Supports containerized deployment with health checks, monitoring, and Docker Compose orchestration for running the MCP server in production environments.
Enables workspace-level Git integration for Microsoft Fabric workspaces, supporting connection to GitHub repositories for version control of notebooks, pipelines, and environments.
Supports CI/CD workflows for automated workspace provisioning, testing, and deployment pipelines with rollback capabilities.
Enables GitHub Copilot to manage Microsoft Fabric workspaces, execute Spark jobs, and provision analytics resources through the built-in terminal using Azure CLI authentication.
Integrates with Grafana dashboards for monitoring and observability of Microsoft Fabric resources, Spark applications, and MCP server metrics.
Supports creation and management of Jupyter notebooks in Microsoft Fabric with predefined templates for analytics and machine learning workflows.
Provides enterprise deployment on Azure Kubernetes Service (AKS) with horizontal pod autoscaling, load balancing, and comprehensive monitoring capabilities.
Runtime environment for the MCP server, enabling server-side execution of Microsoft Fabric operations and API integrations.
Distributed as an npm package (mcp-for-microsoft-fabric-analytics) for Node.js-based integrations and JavaScript/TypeScript development.
Exposes Prometheus metrics endpoints for custom monitoring of Fabric operations, job execution, and server performance.
Available as a Python package (fabric-analytics-mcp) for easy installation and integration with Python-based workflows and tools.
Provides Python client libraries and test scripts for interacting with the Livy API and Microsoft Fabric Spark workloads.
Built with TypeScript for type-safe development of Microsoft Fabric integrations with comprehensive schema validation.
Uses Zod for schema validation and input sanitization of all MCP tool parameters and API requests.
Microsoft Fabric Analytics MCP Server
A comprehensive Model Context Protocol (MCP) server that provides analytics capabilities and tools for interacting with the Microsoft Fabric data platform. This server enables AI assistants like Claude to seamlessly access, analyze, and monitor Microsoft Fabric resources through standardized MCP protocols, bringing the power of Microsoft Fabric directly to your AI conversations.
📋 Table of Contents
🔒 Security
🌟 Key Features
🏗️ Complete Workspace Management - Create, delete, and manage Fabric workspaces with capacity assignment
🔄 Enhanced CRUD Operations - Create, read, update, and delete all Fabric items (notebooks, lakehouses, datasets, reports)
📓 Advanced Notebook Management - Create, execute, and manage Fabric notebooks with 5 predefined templates
⚡ Livy API Integration - Full Spark session and batch job management with real-time monitoring
📊 Comprehensive Spark Monitoring - Real-time monitoring across workspaces, items, and applications
🤖 Multi-AI Assistant Support - Works with Claude Desktop, GitHub Copilot, and other MCP-compatible AI tools
🔐 Enhanced Azure CLI Authentication - Zero-config setup with automatic token management
🛡️ Enterprise Authentication - Multiple auth methods (Bearer, Service Principal, Device Code, Interactive, Azure CLI)
📈 Analytics & Insights - Generate comprehensive monitoring dashboards with real-time metrics
🧪 End-to-End Testing - Complete test suite with real workspace creation and job execution
🔄 Advanced Token Management - Automatic token validation, refresh, and expiration handling
"Create a new workspace called 'Analytics-Q1-2025' and assign it to our premium capacity"
"List all workspaces in our tenant and show their capacity assignments"
"Add user john.doe@company.com as Admin to the Analytics workspace"
"Create a development environment in the Analytics workspace with Python and R libraries"
"Connect the Analytics workspace to our GitHub repository for version control"
🔀 Synapse to Fabric Migration Tools
🆕 Automated Spark Workload Migration
The MCP server now includes 4 specialized migration tools that automate the migration of Spark notebooks and pipelines from Azure Synapse Analytics to Microsoft Fabric:
🔍 Migration Discovery Tools
fabric_list_synapse_workspaces - List all Synapse workspaces in your Azure subscription
fabric_discover_synapse_workspace - Inventory notebooks, pipelines, linked services, and Spark jobs from Synapse
🔄 Transformation & Migration Tools
fabric_transform_notebooks - Transform Synapse notebooks to Fabric format (mssparkutils → notebookutils)
fabric_migrate_synapse_to_fabric - Complete end-to-end migration with discovery, transformation, and provisioning
✨ Key Migration Features
Automatic Code Transformation - Converts Synapse-specific code to Fabric equivalents:
mssparkutils → notebookutils
Synapse magic commands → Fabric magic commands
ABFSS path rewriting to OneLake
Spark pool configuration cleanup
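The rewrites above can be sketched as simple source transformations. The function below is an illustrative assumption showing two of the rules, not the server's actual `fabric_transform_notebooks` implementation:

```typescript
// Illustrative sketch of two of the rewrite rules above (assumed shapes,
// not the server's actual implementation).
function transformCell(source: string): string {
  return source
    // mssparkutils → notebookutils (Fabric's notebook utility namespace)
    .replace(/\bmssparkutils\b/g, "notebookutils")
    // Rewrite an ABFSS storage-account URI to a OneLake-style URI
    // (simplified; real path mapping is more involved)
    .replace(
      /abfss:\/\/([^@]+)@[^.]+\.dfs\.core\.windows\.net/g,
      "abfss://$1@onelake.dfs.fabric.microsoft.com"
    );
}

const cell = 'mssparkutils.fs.ls("abfss://data@mystore.dfs.core.windows.net/raw")';
console.log(transformCell(cell));
// → notebookutils.fs.ls("abfss://data@onelake.dfs.fabric.microsoft.com/raw")
```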
Comprehensive Asset Discovery - Inventories all migratable assets:
Jupyter notebooks (ipynb format)
Data pipelines and workflows
Linked services and connections
Spark job definitions
Safe Testing with Dry Run - Preview all changes before applying:
Test transformations without provisioning
Validate transformed code
Review change reports
End-to-End Automation - Complete migration pipeline:
Discovery → Transformation → Provisioning → Validation
Automatic lakehouse creation
OneLake shortcut provisioning
Comprehensive migration reports
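The Discovery → Transformation → Provisioning → Validation pipeline with a dry-run switch can be sketched as follows; every name here is illustrative, not the server's API:

```typescript
// Stub stages sketching the migration pipeline and dry-run behavior
// described above (assumed structure, not the real implementation).
interface MigrationReport {
  discovered: string[];
  transformed: string[];
  provisioned: boolean;
}

const discover = (workspace: string): string[] =>
  [`${workspace}/etl.ipynb`, `${workspace}/report.ipynb`]; // stub inventory

const transform = (path: string): string =>
  path.replace(/\.ipynb$/, ".fabric.ipynb"); // stub transformation

function migrate(workspace: string, dryRun = true): MigrationReport {
  const discovered = discover(workspace);
  const transformed = discovered.map(transform);
  // Dry-run mode stops here: transformations are previewed, nothing is
  // provisioned in Fabric.
  const provisioned = !dryRun;
  return { discovered, transformed, provisioned };
}

console.log(migrate("synapse-ws").provisioned); // → false (dry run by default)
```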
🎯 Migration Scenarios
📋 Explore Before Migrating:
🔄 Preview Transformations:
🚀 Complete Migration:
📊 Detailed Migration Guide: See MIGRATION.md for comprehensive migration documentation including:
Step-by-step workflows
Transformation rule details
Best practices and troubleshooting
Complete examples
🎯 End-to-End Testing with Real Workspaces
The MCP server now includes comprehensive end-to-end testing that creates real workspaces, assigns them to capacities, and executes actual jobs to validate the complete workflow:
What it tests:
✅ Workspace Creation - Creates real Fabric workspaces
✅ Capacity Assignment - Attaches workspaces to your Fabric capacity
✅ Item Creation - Creates notebooks, lakehouses, and other items
✅ Job Execution - Runs actual Spark jobs and monitors completion
✅ Resource Cleanup - Automatically removes all test resources
🚀 Deployment Options
🤖 Claude Desktop Integration
Recommended for AI Assistant Usage:
💡 Get Bearer Token: Visit Power BI Embed Setup to generate tokens
⚠️ Important: Tokens expire after ~1 hour and need to be refreshed
🔧 Claude Desktop Authentication Fix
If you experience 60-second timeouts during startup, this is due to interactive authentication flows blocking Claude Desktop's sandboxed environment. Solution:
Use Bearer Token Method (Recommended):
Set FABRIC_AUTH_METHOD: "bearer_token" in your config
Provide FABRIC_TOKEN with a valid bearer token
This bypasses interactive authentication entirely
Alternative - Per-Tool Authentication:
Provide the token directly in tool calls: bearerToken: "your_token_here"
Or use simulation mode: bearerToken: "simulation"
Troubleshooting:
Server now has 10-second timeout protection to prevent hanging
Falls back to simulation mode if authentication fails
Enhanced error messages provide clear guidance
🎯 Quick Fix: The server automatically prioritizes the FABRIC_TOKEN environment variable over interactive authentication flows, preventing Claude Desktop timeouts.
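The precedence just described can be sketched as a simple check; this is assumed logic for illustration, not the server's actual selection code:

```typescript
// Sketch of the startup precedence described above: a pre-supplied
// FABRIC_TOKEN short-circuits any interactive flow. (Assumed logic;
// the server's real selection code may differ.)
type AuthMethod = "bearer_token" | "interactive";

function chooseAuthMethod(env: Record<string, string | undefined>): AuthMethod {
  // A non-empty FABRIC_TOKEN wins, so the client never blocks on a
  // browser-based flow in a sandboxed environment.
  if (env.FABRIC_TOKEN && env.FABRIC_TOKEN.length > 0) return "bearer_token";
  return "interactive"; // only reached when no token is provided
}

console.log(chooseAuthMethod({ FABRIC_TOKEN: "eyJ0eXAi..." })); // → bearer_token
```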
📱 Local Development
🐳 Docker Deployment
☸️ Azure Kubernetes Service (AKS)
🌐 Azure MCP Server (Preview)
📚 Detailed Guides:
🛠️ Tools & Capabilities
🔍 CRUD Operations for Fabric Items
Tool: list-fabric-items
Description: List items in a Microsoft Fabric workspace (Lakehouses, Notebooks, etc.)
Parameters:
bearerToken: Microsoft Fabric bearer token
workspaceId: Microsoft Fabric workspace ID
itemType: Filter by item type (optional)
Tool: create-fabric-item
Description: Create new items in a Microsoft Fabric workspace
Parameters:
bearerToken: Microsoft Fabric bearer token
workspaceId: Microsoft Fabric workspace ID
itemType: Type of item (Lakehouse, Notebook, Dataset, Report, Dashboard)
displayName: Display name for the new item
description: Optional description
Tool: get-fabric-item
Description: Get detailed information about a specific Microsoft Fabric item
Parameters:
bearerToken: Microsoft Fabric bearer token
workspaceId: Microsoft Fabric workspace ID
itemId: ID of the item to retrieve
Tool: update-fabric-item
Description: Update existing items in a Microsoft Fabric workspace
Parameters:
bearerToken: Microsoft Fabric bearer token
workspaceId: Microsoft Fabric workspace ID
itemId: ID of the item to update
displayName: New display name (optional)
description: New description (optional)
Tool: delete-fabric-item
Description: Delete items from a Microsoft Fabric workspace
Parameters:
bearerToken: Microsoft Fabric bearer token
workspaceId: Microsoft Fabric workspace ID
itemId: ID of the item to delete
🔍 Query Fabric Dataset (Enhanced)
Tool: query-fabric-dataset
Description: Execute SQL or KQL queries against Microsoft Fabric datasets
Parameters:
bearerToken: Microsoft Fabric bearer token (optional - uses simulation if not provided)
workspaceId: Microsoft Fabric workspace ID
datasetName: Name of the dataset to query
query: SQL or KQL query to execute
🚀 Execute Fabric Notebook
Tool: execute-fabric-notebook
Description: Execute a notebook in a Microsoft Fabric workspace
Parameters:
bearerToken: Microsoft Fabric bearer token
workspaceId: Microsoft Fabric workspace ID
notebookId: ID of the notebook to execute
parameters: Optional parameters to pass to the notebook
📊 Get Analytics Metrics
Tool: get-fabric-metrics
Description: Retrieve performance and usage metrics for Microsoft Fabric items
Parameters:
workspaceId: Microsoft Fabric workspace ID
itemId: Item ID (dataset, report, etc.)
timeRange: Time range for metrics (1h, 24h, 7d, 30d)
metrics: List of metrics to analyze
🔧 Analyze Data Model
Tool: analyze-fabric-model
Description: Analyze a Microsoft Fabric data model and get optimization recommendations
Parameters:
workspaceId: Microsoft Fabric workspace ID
itemId: Item ID to analyze
📋 Generate Analytics Report
Tool: generate-fabric-report
Description: Generate comprehensive analytics reports for Microsoft Fabric workspaces
Parameters:
workspaceId: Microsoft Fabric workspace ID
reportType: Type of report (performance, usage, health, summary)
🚀 Livy API Integration (Sessions & Batch Jobs)
Session Management
Tool: create-livy-session
Description: Create a new Livy session for interactive Spark/SQL execution
Parameters:
bearerToken: Microsoft Fabric bearer token
workspaceId: Microsoft Fabric workspace ID
lakehouseId: Microsoft Fabric lakehouse ID
sessionConfig: Optional session configuration
Tool: get-livy-session
Description: Get details of a Livy session
Parameters:
bearerToken: Microsoft Fabric bearer token
workspaceId: Microsoft Fabric workspace ID
lakehouseId: Microsoft Fabric lakehouse ID
sessionId: Livy session ID
Tool: list-livy-sessions
Description: List all Livy sessions in a lakehouse
Parameters:
bearerToken: Microsoft Fabric bearer token
workspaceId: Microsoft Fabric workspace ID
lakehouseId: Microsoft Fabric lakehouse ID
Tool: delete-livy-session
Description: Delete a Livy session
Parameters:
bearerToken: Microsoft Fabric bearer token
workspaceId: Microsoft Fabric workspace ID
lakehouseId: Microsoft Fabric lakehouse ID
sessionId: Livy session ID
Statement Execution
Tool: execute-livy-statement
Description: Execute SQL or Spark statements in a Livy session
Parameters:
bearerToken: Microsoft Fabric bearer token
workspaceId: Microsoft Fabric workspace ID
lakehouseId: Microsoft Fabric lakehouse ID
sessionId: Livy session ID
code: SQL or Spark code to execute
kind: Statement type (sql, spark, etc.)
Tool: get-livy-statement
Description: Get status and results of a Livy statement
Parameters:
bearerToken: Microsoft Fabric bearer token
workspaceId: Microsoft Fabric workspace ID
lakehouseId: Microsoft Fabric lakehouse ID
sessionId: Livy session ID
statementId: Statement ID
Batch Job Management
Tool: create-livy-batch
Description: Create a new Livy batch job for long-running operations
Parameters:
bearerToken: Microsoft Fabric bearer token
workspaceId: Microsoft Fabric workspace ID
lakehouseId: Microsoft Fabric lakehouse ID
batchConfig: Batch job configuration
Tool: get-livy-batch
Description: Get details of a Livy batch job
Parameters:
bearerToken: Microsoft Fabric bearer token
workspaceId: Microsoft Fabric workspace ID
lakehouseId: Microsoft Fabric lakehouse ID
batchId: Batch job ID
Tool: list-livy-batches
Description: List all Livy batch jobs in a lakehouse
Parameters:
bearerToken: Microsoft Fabric bearer token
workspaceId: Microsoft Fabric workspace ID
lakehouseId: Microsoft Fabric lakehouse ID
Tool: delete-livy-batch
Description: Delete a Livy batch job
Parameters:
bearerToken: Microsoft Fabric bearer token
workspaceId: Microsoft Fabric workspace ID
lakehouseId: Microsoft Fabric lakehouse ID
batchId: Batch job ID
📊 Spark Application Monitoring
Workspace-Level Monitoring
Tool: get-workspace-spark-applications
Description: Get all Spark applications in a Microsoft Fabric workspace
Parameters:
bearerToken: Microsoft Fabric bearer token
workspaceId: Microsoft Fabric workspace ID
continuationToken: Optional token for pagination
Item-Specific Monitoring
Tool: get-notebook-spark-applications
Description: Get all Spark applications for a specific notebook
Parameters:
bearerToken: Microsoft Fabric bearer token
workspaceId: Microsoft Fabric workspace ID
notebookId: Notebook ID
continuationToken: Optional token for pagination
Tool: get-lakehouse-spark-applications
Description: Get all Spark applications for a specific lakehouse
Parameters:
bearerToken: Microsoft Fabric bearer token
workspaceId: Microsoft Fabric workspace ID
lakehouseId: Lakehouse ID
continuationToken: Optional token for pagination
Tool: get-spark-job-definition-applications
Description: Get all Spark applications for a specific Spark Job Definition
Parameters:
bearerToken: Microsoft Fabric bearer token
workspaceId: Microsoft Fabric workspace ID
sparkJobDefinitionId: Spark Job Definition ID
continuationToken: Optional token for pagination
Application Management
Tool: get-spark-application-details
Description: Get detailed information about a specific Spark application
Parameters:
bearerToken: Microsoft Fabric bearer token
workspaceId: Microsoft Fabric workspace ID
livyId: Livy session ID
Tool: cancel-spark-application
Description: Cancel a running Spark application
Parameters:
bearerToken: Microsoft Fabric bearer token
workspaceId: Microsoft Fabric workspace ID
livyId: Livy session ID
Monitoring Dashboard
Tool: get-spark-monitoring-dashboard
Description: Generate a comprehensive monitoring dashboard with analytics
Parameters:
bearerToken: Microsoft Fabric bearer token
workspaceId: Microsoft Fabric workspace ID
📓 Notebook Management
The MCP server provides comprehensive notebook management capabilities with predefined templates and custom notebook support.
Create Notebook from Template
Tool: create-fabric-notebook
Description: Create new Fabric notebooks from predefined templates or custom definitions
Parameters:
bearerToken: Microsoft Fabric bearer token
workspaceId: Microsoft Fabric workspace ID
displayName: Display name for the new notebook
template: Template type (blank, sales_analysis, nyc_taxi_analysis, data_exploration, machine_learning, custom)
customNotebook: Custom notebook definition (required if template is 'custom')
environmentId: Optional environment ID to attach
lakehouseId: Optional default lakehouse ID
lakehouseName: Optional default lakehouse name
Available Templates:
blank: Basic notebook with minimal setup
sales_analysis: Comprehensive sales data analysis with sample dataset
nyc_taxi_analysis: NYC taxi trip data analysis with sample dataset
data_exploration: Structured data exploration template
machine_learning: Complete ML workflow template
custom: Use your own notebook definition
Get Notebook Definition
Tool: get-fabric-notebook-definition
Description: Retrieve the notebook definition (cells, metadata) from an existing Fabric notebook
Parameters:
bearerToken: Microsoft Fabric bearer token
workspaceId: Microsoft Fabric workspace ID
notebookId: ID of the notebook to retrieve
format: Format to return (ipynb or fabricGitSource)
Update Notebook Definition
Tool: update-fabric-notebook-definition
Description: Update the notebook definition (cells, metadata) of an existing Fabric notebook
Parameters:
bearerToken: Microsoft Fabric bearer token
workspaceId: Microsoft Fabric workspace ID
notebookId: ID of the notebook to update
notebookDefinition: Updated notebook definition object
Execute Notebook
Tool: run-fabric-notebook
Description: Execute a Fabric notebook on-demand with optional parameters and configuration
Parameters:
bearerToken: Microsoft Fabric bearer token
workspaceId: Microsoft Fabric workspace ID
notebookId: ID of the notebook to run
parameters: Optional notebook parameters (key-value pairs with types)
configuration: Optional execution configuration (environment, lakehouse, pools, etc.)
Features:
📓 Base64 encoded notebook payload support
🔧 Comprehensive metadata management
🌐 Environment and lakehouse integration
🎛️ Parameterized notebook execution
⚡ Spark configuration support
🔤 Support for multiple programming languages (Python, Scala, SQL, R)
🚀 Quick Start
🎯 Installation Methods
Choose your preferred installation method:
Option 1: Python Package (PyPI) ⭐ Recommended
Option 2: NPM Package
Option 3: Universal Installation Script
For automated setup with environment configuration:
Unix/Linux/macOS:
Windows (PowerShell):
Option 4: Docker
📖 See
Option 5: From Source (Development)
⚙️ Configuration
Set up your environment variables:
🔧 Claude Desktop Setup
Add to your Claude Desktop config:
Windows: %APPDATA%\Claude\claude_desktop_config.json
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
For PyPI Installation:
For NPM Installation:
For Source Installation:
🚀 Start Using
Restart Claude Desktop and try these queries:
"List all workspaces I have access to"
"Find workspace named 'Analytics'"
"List all items in my Fabric workspace [your-workspace-id]"
"Create a new lakehouse called 'Analytics Hub'"
"Show me all running Spark applications"
"Execute this SQL query: SELECT * FROM my_table LIMIT 10"
🧪 Development & Testing
Running the Server
Testing Livy API Integration
For comprehensive testing of Spark functionality, install Python dependencies:
Available Test Scripts:
livy_api_test.ipynb - Interactive notebook for step-by-step testing
comprehensive_livy_test.py - Full-featured test with error handling
spark_monitoring_test.py - Spark application monitoring tests
mcp_spark_monitoring_demo.py - MCP server integration demo
Claude Desktop Integration
Add this configuration to your Claude Desktop config file:
Windows: %APPDATA%\Claude\claude_desktop_config.json
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
🎉 You're ready! Restart Claude Desktop and start asking questions about your Microsoft Fabric data!
Livy API Testing Setup
For testing the Livy API functionality, additional Python dependencies are required:
Available Test Scripts:
livy_api_test.ipynb - Interactive Jupyter notebook for step-by-step testing
comprehensive_livy_test.py - Full-featured test with error handling
simple_livy_test.py - Simple test following example patterns
livy_batch_test.py - Batch job testing capabilities
spark_monitoring_test.py - Spark application monitoring tests
Usage
Running the Server
Development Mode
Testing with Claude Desktop
Add the following configuration to your Claude Desktop config file:
Windows: %APPDATA%\Claude\claude_desktop_config.json
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
💬 Example Queries
Once connected to Claude Desktop, you can ask natural language questions like:
CRUD Operations:
"List all Lakehouses in my workspace"
"Create a new Notebook called 'Data Analysis'"
"Update the description of my lakehouse"
"Delete the test notebook from my workspace"
Notebook Management:
"Create a sales analysis notebook with sample data"
"Generate a new NYC taxi analysis notebook"
"Create a machine learning notebook template"
"Get the definition of my existing notebook"
"Run my notebook with specific parameters"
"Update my notebook with new cells"
Data Operations:
"Query the sales dataset to get total revenue by region"
"Execute my analytics notebook with today's date"
Analytics:
"Get performance metrics for the last 24 hours"
"Analyze my data model and provide optimization recommendations"
"Generate a usage report for my workspace"
Livy API Operations:
"Create a Livy session for interactive Spark analysis"
"Execute SQL query 'SELECT * FROM my_table LIMIT 10'"
"Run Spark code to show all tables"
"Monitor my batch job progress"
Spark Application Monitoring:
"Show me all Spark applications in my workspace"
"What's the status of my notebook Spark jobs?"
"Generate a comprehensive Spark monitoring dashboard"
"Show me recent failed applications"
"Cancel the problematic Spark application"
Capacity Management:
"List all Fabric capacities I can use"
"Assign workspace 1234abcd-abcd-1234-abcd-123456789000 to capacity f9998888-7777-6666-5555-444433332222"
"Show all workspaces in capacity f9998888-7777-6666-5555-444433332222"
"Unassign workspace 1234abcd-abcd-1234-abcd-123456789000 from its capacity"
🧩 Capacity Management Tools
Manage Microsoft Fabric capacity assignments directly from your AI assistant. These tools let you inspect available capacities, attach/detach workspaces, and audit capacity usage.
Available Tools
fabric_list_capacities - Enumerate all capacities you can access (ID, SKU, region, state)
fabric_assign_workspace_to_capacity - Attach a workspace to a dedicated capacity
fabric_unassign_workspace_from_capacity - Return a workspace to shared capacity
fabric_list_capacity_workspaces - List all workspaces currently hosted on a given capacity
Notes
If authentication fails or you're in simulation mode, capacity responses are simulated.
Real capacity operations require appropriate Fabric / Power BI admin permissions.
You can provide a bearer token per call (bearerToken field) or rely on global auth.
Minimal Parameter Reference
| Tool | Required Parameters | Optional |
| --- | --- | --- |
| fabric_list_capacities | (none) | bearerToken |
| fabric_assign_workspace_to_capacity | capacityId, workspaceId | bearerToken |
| fabric_unassign_workspace_from_capacity | workspaceId | bearerToken |
| fabric_list_capacity_workspaces | capacityId | bearerToken |
❗ Troubleshooting
JSON Parse Errors (e.g., Unexpected token 'P')
If Claude Desktop or another MCP client reports an error like:
This matches the exact issue reported on GitHub, where a user saw "Unexpected token 'P', 'Please set'..." errors.
This almost always means something wrote plain text to STDOUT (which must contain ONLY JSON-RPC frames). Common causes:
console.log debug statements added in server code
A dependency emitting warnings to STDOUT
Early logging before transport initialization
Fixes Implemented in This Server
✅ A startup guard now redirects console.log/console.info to STDERR automatically
✅ Debug output has been consolidated behind the DEBUG_MCP_RUN=1 flag
✅ All diagnostic messages go to STDERR, keeping STDOUT clean for the JSON-RPC protocol
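A minimal sketch of such a startup guard is shown below; the server's actual guard may differ in detail:

```typescript
// Reroute console.log/console.info to STDERR so STDOUT carries only
// JSON-RPC frames. (Illustrative sketch of the guard described above.)
function installStdoutGuard(): void {
  const toStderr = (...args: unknown[]): void => {
    process.stderr.write(args.map(String).join(" ") + "\n");
  };
  console.log = toStderr;
  console.info = toStderr;
}

installStdoutGuard();
console.log("debug: this lands on STDERR, not STDOUT");
```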
What You Can Do
Avoid adding raw console.log statements; prefer console.error (goes to STDERR)
If you must allow stdout logging temporarily (NOT recommended), set ALLOW_UNSAFE_STDOUT=true and remove it after debugging
Regenerate the build (npm run build) after changes to ensure compiled output matches source
Capacity Tools Not Appearing?
If the capacity tools don't show up when the client lists tools:
Ensure you rebuilt after pulling changes: npm run build
Confirm you're launching the server from build/index.js and not an older snapshot
Verify no MCP client-side allowlist is filtering tool names
Run a quick enumeration test: ask the assistant "List all available tools"
If still missing, delete the build/ folder and rebuild to clear stale artifacts
Authentication Edge Cases
Azure CLI auth may fail silently without an active az login session
Bearer tokens expire (~1 hour); refresh if operations suddenly fail
For local testing, falling back to simulation still lets you prototype tool flows
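The ~1 hour expiry above can be detected client-side by decoding the token's exp claim. The helper below is an illustrative sketch, not part of the server's public API:

```typescript
// Decode a JWT's exp claim to detect an expired bearer token.
// (Illustrative helper; assumes a standard three-part JWT.)
function isTokenExpired(jwt: string, nowMs: number = Date.now()): boolean {
  const parts = jwt.split(".");
  if (parts.length !== 3) return true; // not a well-formed JWT: treat as expired
  try {
    const payload = JSON.parse(
      Buffer.from(parts[1], "base64url").toString("utf8")
    );
    // exp is seconds since the epoch; a missing exp is treated as expired
    return typeof payload.exp !== "number" || payload.exp * 1000 <= nowMs;
  } catch {
    return true; // undecodable payload: treat as expired
  }
}
```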
Getting Verbose Diagnostics
Set the following (sent to STDERR, safe for MCP framing):
Optionally add structured auth tracing:
Need a new troubleshooting topic? Open an issue or PR so others benefit from the resolution.
🔐 Authentication
This MCP server supports multiple authentication methods powered by Microsoft Authentication Library (MSAL):
🤖 For Claude Desktop: Use Bearer Token Authentication (Method #1) for the best experience and compatibility.
🔧 Claude Desktop Fix: Recent updates prevent authentication timeouts by prioritizing bearer tokens and adding timeout protection for interactive authentication flows.
🎫 1. Bearer Token Authentication (Recommended for Claude Desktop)
Perfect for AI assistants and interactive usage:
For Claude Desktop:
Visit Power BI Embed Setup
Generate a bearer token for your workspace
Add to your claude_desktop_config.json
No timeout issues - bypasses interactive authentication entirely
For Testing:
🤖 2. Service Principal Authentication (Recommended for Production)
Use Azure AD application credentials:
Client ID (Application ID)
Client Secret
Tenant ID (Directory ID)
Environment Variables Setup:
Claude Desktop Configuration:
📱 3. Device Code Authentication
Sign in with browser on another device (great for headless environments):
🌐 4. Interactive Authentication
Automatic browser-based authentication:
🔧 5. Azure CLI Authentication ⭐ (Recommended for Local Development)
Use your existing Azure CLI login for seamless local testing:
Prerequisites:
Install Azure CLI: winget install Microsoft.AzureCLI (Windows) or download the installer
Log in to Azure: az login
Set the active subscription: az account set --subscription "your-subscription-name"
Benefits:
✅ Zero Configuration - Uses your existing Azure login
✅ Instant Setup - No app registration or client secrets needed
✅ Multi-Account Support - Switch Azure accounts easily
✅ Perfect for Development - Seamless local testing experience
Quick Test:
💡 Pro Tip: Azure CLI authentication is perfect for developers who want to quickly test the MCP server without complex Azure AD app setup. Just az login and you're ready to go!
🔧 Complete Authentication Setup
📚 Detailed Guides:
Authentication Setup Guide - Complete Azure AD setup
Claude Desktop Config Examples - Ready-to-use configurations
🔍 Authentication Testing
Check your authentication status:
🔒 Security Best Practices
Never commit authentication tokens to version control
Use Service Principal authentication for production deployments
Device Code flow is perfect for CI/CD and headless environments
Interactive authentication is ideal for development and testing
All tokens are automatically validated and include expiration checking
Note: The MCP server seamlessly handles token validation and provides clear error messages for authentication issues.
☸️ Azure Kubernetes Service (AKS) Deployment
Deploy the MCP server as a scalable service on Azure Kubernetes Service for enterprise production use.
🚀 Quick AKS Deployment
Prerequisites
Azure CLI installed and configured
Docker installed
kubectl installed
Azure subscription with AKS permissions
1. Build and Push Docker Image
2. Deploy to AKS
3. Access the MCP Server
Once deployed, your MCP server will be available at:
🏗️ Architecture Overview
The AKS deployment includes:
Horizontal Pod Autoscaler (3-10 pods based on CPU/memory)
Azure Load Balancer for high availability
SSL/TLS termination with Azure Application Gateway
ConfigMaps for environment configuration
Secrets for secure credential storage
Health checks and readiness probes
Resource limits and quality of service guarantees
📁 Deployment Files
All Kubernetes manifests are located in the /k8s directory:
namespace.yaml - Dedicated namespace
deployment.yaml - Application deployment with scaling
service.yaml - Load balancer service
ingress.yaml - External access and SSL
configmap.yaml - Configuration management
secret.yaml - Secure credential storage
hpa.yaml - Horizontal Pod Autoscaler
🔧 Configuration
Configure the deployment by setting these environment variables:
🔐 Production Security
The AKS deployment includes enterprise-grade security:
Non-root container execution
Read-only root filesystem
Secret management via Azure Key Vault integration
Network policies for traffic isolation
RBAC with minimal required permissions
Pod security standards enforcement
📊 Monitoring & Scaling
Azure Monitor integration for logs and metrics
Application Insights for performance monitoring
Prometheus metrics endpoint for custom monitoring
Auto-scaling based on CPU (70%) and memory (80%) thresholds
Health checks for automatic pod restart
🔄 CI/CD Integration
The deployment scripts support:
Azure DevOps pipelines
GitHub Actions workflows
Automated testing before deployment
Blue-green deployments for zero downtime
Rollback capabilities for quick recovery
📚 Detailed Guide: See AKS_DEPLOYMENT.md for complete setup instructions.
🌐 Azure Model Context Protocol Server (Preview)
Microsoft Azure now offers a preview service for hosting MCP servers natively. This eliminates the need for custom infrastructure management.
🚀 Azure MCP Server Deployment
Prerequisites
Azure subscription with MCP preview access
Azure CLI with MCP extensions
Deploy to Azure MCP Service
Configure Authentication
Access Your MCP Server
🔧 Azure MCP Server Features
Automatic scaling based on usage
Built-in monitoring and logging
Integrated security with Azure AD
Zero infrastructure management
Global CDN for low latency
Automatic SSL/TLS certificates
💰 Cost Optimization
Azure MCP Server offers:
Pay-per-request pricing model
Automatic hibernation during idle periods
Resource sharing across multiple clients
No minimum infrastructure costs
📚 Learn More: Azure MCP Server Documentation
Note: Azure MCP Server is currently in preview. Check Azure Preview Terms for service availability and limitations.
🏗️ Architecture
This MCP server is built with:
TypeScript for type-safe development
MCP SDK for Model Context Protocol implementation
Zod for schema validation and input sanitization
Node.js runtime environment
⚙️ Configuration
The server uses the following configuration files:
tsconfig.json - TypeScript compiler configuration
package.json - Node.js package configuration
.vscode/mcp.json - MCP server configuration for VS Code
🔧 Development
Project Structure
Adding New Tools
To add new tools to the server:
Define the input schema using Zod
Implement the tool using server.tool()
Add error handling and validation
Update documentation
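The steps above can be sketched with a minimal stand-in for the SDK's tool registration. The real server uses @modelcontextprotocol/sdk with Zod schemas; this stub only illustrates the shape, and the tool name is hypothetical:

```typescript
// Minimal stand-in for server.tool() registration: schema check,
// implementation, and error handling in one place. (Illustrative
// sketch, not the actual MCP SDK API.)
type ToolResult = { text: string };
type ToolHandler = (args: Record<string, unknown>) => ToolResult;

const tools = new Map<string, { description: string; handler: ToolHandler }>();

function registerTool(name: string, description: string, handler: ToolHandler): void {
  tools.set(name, { description, handler });
}

registerTool("echo-workspace", "Echo a workspace ID (illustrative tool)", (args) => {
  // Validation step (Zod in the real server; a plain check here)
  if (typeof args.workspaceId !== "string") {
    throw new Error("workspaceId is required and must be a string");
  }
  return { text: `workspace: ${args.workspaceId}` };
});

console.log(tools.get("echo-workspace")!.handler({ workspaceId: "abc" }).text);
// → workspace: abc
```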
API Integration
This server includes:
✅ Production Ready:
Full Microsoft Fabric Livy API integration
Spark session lifecycle management
Statement execution with SQL and Spark support
Batch job management for long-running operations
Comprehensive error handling and retry logic
Real-time polling and result retrieval
🧪 Demonstration Features:
CRUD operations (configurable for real APIs)
Analytics and metrics (extensible framework)
Data model analysis (template implementation)
🧪 Testing
🚀 End-to-End Testing
The MCP server includes comprehensive end-to-end testing that creates real workspaces, items, and jobs to validate complete functionality using Azure CLI authentication.
Quick Setup for E2E Testing
What the E2E Test Does
The end-to-end test creates a complete workflow in your Microsoft Fabric tenant:
🔐 Validates Azure CLI Authentication - Uses your existing az login session
🏗️ Creates a Test Workspace - New workspace with unique naming
⚡ Attaches to Capacity - Links workspace to your Fabric capacity (optional)
📓 Creates Notebooks & Lakehouses - Test items for validation
🏃 Runs Real Jobs - Executes notebook with actual Spark code
📊 Monitors Execution - Tracks job status and completion
🧹 Cleans Up Resources - Removes all created test resources
E2E Test Configuration
The setup script creates a .env.e2e configuration file:
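A `.env.e2e` file might look like the following. Only `FABRIC_CAPACITY_ID` is referenced elsewhere in this README; the other variable names and all values are placeholder assumptions:

```bash
# .env.e2e — values shown are placeholders
FABRIC_AUTH_METHOD=azure_cli        # assumed variable name
FABRIC_CAPACITY_ID=your-capacity-id # optional; enables capacity-attachment tests
E2E_CLEANUP=true                    # assumed flag for automatic cleanup
```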
E2E Test Features
✅ Real Resource Creation - Creates actual Fabric workspaces and items
✅ Azure CLI Integration - Uses your existing Azure authentication
✅ Capacity Assignment - Tests workspace-to-capacity attachment
✅ Job Execution - Runs real Spark jobs and monitors completion
✅ Automatic Cleanup - Removes all test resources automatically
✅ Comprehensive Logging - Detailed logging of all operations
✅ Error Handling - Robust error handling and recovery
Prerequisites for E2E Testing
Azure CLI installed and logged in: `az login`
Microsoft Fabric Access with permissions to:
Create workspaces
Create notebooks and lakehouses
Run Spark jobs
(Optional) Assign workspaces to capacity
Fabric Capacity (optional but recommended):
Set `FABRIC_CAPACITY_ID` in `.env.e2e` for capacity testing
Without capacity, the workspace will use shared capacity
Running E2E Tests
E2E Test Output
The test provides comprehensive output including:
⚠️ Important Notes for E2E Testing
Creates Real Resources: The test creates actual workspaces and items in your Fabric tenant
Requires Permissions: Ensure you have necessary Fabric permissions
Uses Capacity: Jobs may consume capacity units if using dedicated capacity
Automatic Cleanup: All resources are automatically deleted after testing
Network Dependent: Requires stable internet connection for API calls
🧪 Unit & Integration Testing
Prerequisites
Available Test Scripts
`livy_api_test.ipynb` - Interactive Jupyter notebook for step-by-step testing
`comprehensive_livy_test.py` - Full-featured test with error handling
`simple_livy_test.py` - Simple test following example patterns
`livy_batch_test.py` - Batch job testing capabilities
`spark_monitoring_test.py` - Spark application monitoring tests
Quick Testing
Interactive Testing: `jupyter notebook livy_api_test.ipynb`
Command Line Testing: `python simple_livy_test.py` and `python spark_monitoring_test.py`
Comprehensive Testing: `python comprehensive_livy_test.py`
🤝 Contributing
We welcome contributions! Here's how to get started:
Fork the repository
Create a feature branch (`git checkout -b feature/amazing-feature`)
Make your changes and add tests if applicable
Commit your changes (`git commit -m 'Add amazing feature'`)
Push to the branch (`git push origin feature/amazing-feature`)
Open a Pull Request
Development Guidelines
Follow TypeScript best practices
Add JSDoc comments for new functions
Update tests for any new functionality
Update documentation as needed
See CONTRIBUTING.md for detailed guidelines
🔒 Security
Never commit authentication tokens to version control
Use environment variables for sensitive configuration
Follow Microsoft Fabric security best practices
Report security issues privately via GitHub security advisories
See SECURITY.md for our full security policy
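As a sketch of the environment-variable approach, a server can refuse to start when a required secret is missing rather than falling back to a hardcoded value. The variable name below is illustrative:

```typescript
// Read a secret from the environment; never hardcode tokens in source.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Usage: fail fast at startup if the token was not provided.
// const token = requireEnv("FABRIC_AUTH_TOKEN"); // variable name is an assumption
```

Failing fast at startup surfaces misconfiguration immediately instead of producing confusing authentication errors later.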
📝 License
This project is licensed under the MIT License - see the LICENSE file for details.
Support
For issues and questions:
📖 Check the MCP documentation
📚 Review Microsoft Fabric API documentation
🐛 Open an issue in this repository
💬 Join the community discussions
Acknowledgments
Microsoft Fabric Analytics team for the comprehensive data platform and analytics capabilities
Microsoft Fabric Platform teams for the robust API platform and infrastructure
Bogdan Crivat and Chris Finlan for the inspiring brainstorming conversation that gave me the idea to open-source this project
Anthropic for the Model Context Protocol specification
This project began as my weekend hack exploring AI integration with Microsoft Fabric. The idea to open-source it came from a casual conversation with Chris and Bogdan about making AI tooling more accessible. What started as a personal experiment over a weekend is now available for everyone to build upon.