# Fabric Data Engineering MCP Server
A Model Context Protocol (MCP) server for Microsoft Fabric Data Engineering that provides full execution access to Fabric Notebooks, Pipelines, Lakehouses, and Spark jobs.
## Why This Exists
Currently available MCP servers don't cover Fabric Data Engineering execution:
- `@microsoft/fabric-mcp` → API specs only, no execution
- `@azure/mcp` → Azure management, not Fabric-specific
- `powerbi-modeling-mcp` → Semantic models only
- `@bytebase/dbhub` → SQL queries only
This MCP server fills the gap by providing execution capabilities for Fabric Data Engineering workloads.
## Features

### Notebook Operations

- `notebook_list` - List all notebooks in a workspace
- `notebook_run` - Execute a notebook (with optional parameters)
- `notebook_run_status` - Check the status of a running notebook
- `notebook_run_cancel` - Cancel a running notebook

### Pipeline Operations

- `pipeline_list` - List all data pipelines in a workspace
- `pipeline_run` - Execute a pipeline (with optional parameters)
- `pipeline_run_status` - Check the status of a running pipeline
- `pipeline_run_cancel` - Cancel a running pipeline

### Lakehouse Operations

- `lakehouse_list` - List all Lakehouses in a workspace
- `lakehouse_get` - Get Lakehouse details (including SQL endpoint info)
- `lakehouse_create` - Create a new Lakehouse
- `lakehouse_delete` - Delete a Lakehouse
- `lakehouse_tables_list` - List all tables in a Lakehouse
- `lakehouse_table_load` - Load data from OneLake into a table

### Spark Job Operations

- `spark_job_list` - List all Spark job definitions
- `spark_job_run` - Execute a Spark job definition
- `spark_job_status` - Check the status of a Spark job run
- `spark_job_cancel` - Cancel a running Spark job

### Workspace Operations

- `workspace_list` - List all accessible workspaces
- `workspace_get` - Get workspace details
- `workspace_items_list` - List all items in a workspace (with optional type filter)

### Scheduler Operations

- `schedule_list` - List all schedules for an item
- `schedule_create` - Create a schedule (Daily, Weekly, or Cron)
- `schedule_delete` - Delete a schedule
- `schedule_enable` - Enable a schedule
- `schedule_disable` - Disable a schedule
## Installation

```bash
npm install fabric-data-engineering-mcp
```

Or run directly with npx:

```bash
npx fabric-data-engineering-mcp
```

## Authentication
The server supports multiple authentication methods via Azure Identity:
### 1. Azure CLI (Recommended for Development)

No configuration needed! Just run:

```bash
az login
```

Then use the MCP server; it will automatically use your Azure CLI credentials.
### 2. Environment Variables (Service Principal)

Set these environment variables:

```bash
export AZURE_TENANT_ID="your-tenant-id"
export AZURE_CLIENT_ID="your-client-id"
export AZURE_CLIENT_SECRET="your-client-secret"
```

### 3. Managed Identity (Azure Hosted)
When running in Azure (App Service, Functions, VMs, AKS), the server automatically uses Managed Identity.
### 4. VS Code Azure Extension
If you have the Azure extension installed and signed in, the server can use those credentials.
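The fallback order above mirrors the chained-credential pattern from Azure Identity (`DefaultAzureCredential`). As a rough, illustrative sketch of how such method selection can work — this is not the server's actual code, and the managed-identity environment variables shown are assumptions based on common Azure hosting environments:

```typescript
// Illustrative credential-method selection, mirroring the fallback order
// described above. Hypothetical helper, not taken from the server source.
type AuthMethod = "service-principal" | "managed-identity" | "azure-cli";

function pickAuthMethod(env: Record<string, string | undefined>): AuthMethod {
  // Service principal wins when all three AZURE_* variables are present.
  if (env.AZURE_TENANT_ID && env.AZURE_CLIENT_ID && env.AZURE_CLIENT_SECRET) {
    return "service-principal";
  }
  // Azure-hosted environments typically expose a managed identity endpoint.
  if (env.IDENTITY_ENDPOINT || env.MSI_ENDPOINT) {
    return "managed-identity";
  }
  // Otherwise fall back to local developer credentials (az login).
  return "azure-cli";
}
```

In practice the chain also covers the VS Code credential; the sketch just shows why no configuration is needed for local development.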
## MCP Configuration

### For Claude Desktop / VS Code
Add to your MCP settings:
Using Azure CLI auth (no credentials needed):
```json
{
  "mcpServers": {
    "fabric-data-engineering": {
      "command": "npx",
      "args": ["-y", "fabric-data-engineering-mcp"]
    }
  }
}
```

Using Service Principal:
```json
{
  "mcpServers": {
    "fabric-data-engineering": {
      "command": "npx",
      "args": ["-y", "fabric-data-engineering-mcp"],
      "env": {
        "AZURE_TENANT_ID": "your-tenant-id",
        "AZURE_CLIENT_ID": "your-client-id",
        "AZURE_CLIENT_SECRET": "your-secret"
      }
    }
  }
}
```

## Required Permissions
Your Azure identity needs the following permissions in Microsoft Fabric:
- **Workspace**: At least Contributor role on target workspaces
- **Items**: Execute permissions on notebooks, pipelines, and Spark jobs
- **Lakehouses**: Read/Write permissions for Lakehouse operations
## Usage Examples

### List Workspaces
```
User: List all my Fabric workspaces
Assistant: [calls workspace_list]
```

### Run a Notebook
```
User: Run the "Daily ETL" notebook in my "Analytics" workspace
Assistant: [calls workspace_list to find workspace ID]
           [calls notebook_list to find notebook ID]
           [calls notebook_run with workspace and notebook IDs]
```

### Check Job Status
```
User: What's the status of my notebook run?
Assistant: [calls notebook_run_status with run ID]
```

### Create a Lakehouse
```
User: Create a new Lakehouse called "SalesData" in my workspace
Assistant: [calls lakehouse_create with displayName "SalesData"]
```

### Schedule a Pipeline
```
User: Schedule my "Nightly Refresh" pipeline to run every day at 2am
Assistant: [calls schedule_create with Daily schedule type]
```

## Complementary MCP Servers
This server is designed to work alongside:
- `@bytebase/dbhub` → SQL queries against Fabric Warehouse/Lakehouse SQL endpoints
- `powerbi-modeling-mcp` → Semantic model operations via XMLA
- `@azure/mcp` → General Azure resource management
- `@microsoft/fabric-mcp` → API documentation and OneLake file operations
## Development

### Build from Source

```bash
git clone https://github.com/your-repo/fabric-data-engineering-mcp
cd fabric-data-engineering-mcp
npm install
npm run build
```

### Run in Development Mode

```bash
npm run dev
```

### Type Check

```bash
npm run typecheck
```

## API Reference
### Long-Running Operations

Notebook runs, pipeline runs, and Spark job runs are asynchronous operations. The `*_run` tools return immediately with a `runId` that you can use with `*_run_status` to poll for completion.
Status values:
- `NotStarted` - Job is queued but hasn't started
- `InProgress` - Job is currently running
- `Completed` - Job finished successfully
- `Failed` - Job failed (check `failureReason`)
- `Cancelled` - Job was cancelled
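Since `*_run` returns before the job finishes, a caller typically polls the matching `*_run_status` tool until one of the terminal states above is reached. A minimal polling sketch, where `getStatus` is a placeholder for whatever performs the actual status call:

```typescript
// Minimal polling loop for a long-running Fabric job.
// `getStatus` is a hypothetical stand-in for a real *_run_status call.
type JobStatus = "NotStarted" | "InProgress" | "Completed" | "Failed" | "Cancelled";

const TERMINAL: JobStatus[] = ["Completed", "Failed", "Cancelled"];

async function waitForRun(
  getStatus: () => Promise<JobStatus>,
  intervalMs = 5_000,
  timeoutMs = 3_600_000,
): Promise<JobStatus> {
  const deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    const status = await getStatus();
    // Stop as soon as the job reaches a terminal state, successful or not.
    if (TERMINAL.includes(status)) return status;
    // Wait before polling again to avoid hammering the API.
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error("Timed out waiting for run to finish");
}
```

The 5-second interval and 1-hour timeout are illustrative defaults, not values mandated by the server.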
### Error Handling
The server provides detailed error messages:
```json
{
  "errorCode": "ItemNotFound",
  "message": "The specified item was not found"
}
```

Common error codes:
- `ItemNotFound` - Workspace, notebook, pipeline, or Lakehouse doesn't exist
- `Unauthorized` - Missing permissions
- `InvalidRequest` - Invalid parameters
- `TooManyRequests` - Rate limited (server auto-retries)
### Environment Variables

| Variable | Description | Required |
| --- | --- | --- |
| `AZURE_TENANT_ID` | Azure AD tenant ID | For service principal auth |
| `AZURE_CLIENT_ID` | Application (client) ID | For service principal auth |
| `AZURE_CLIENT_SECRET` | Client secret | For service principal auth |
| | Auth method | No (default: ) |
## Troubleshooting

### "No valid Azure credentials found"

Run `az login` to authenticate with Azure CLI, or set the service principal environment variables.

### "Application not found in tenant"

Verify your `AZURE_CLIENT_ID` and `AZURE_TENANT_ID` are correct.

### "Multi-factor authentication required"

Use Azure CLI auth (`az login`), which handles MFA, or configure your app registration for MFA.
### Rate Limiting
The server automatically retries on HTTP 429 responses with exponential backoff. If you're still seeing rate limit errors, reduce the frequency of your requests.
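The retry behavior described above can be approximated as follows. This is a sketch of the general technique, not the server's actual implementation; `doRequest` stands in for any Fabric API call, and the error's `status` field is an assumed shape:

```typescript
// Retry with exponential backoff on HTTP 429 rate-limit errors.
// `doRequest` is a hypothetical stand-in for a real Fabric REST call.
async function withBackoff<T>(
  doRequest: () => Promise<T>,
  maxRetries = 5,
  baseDelayMs = 1_000,
): Promise<T> {
  for (let attempt = 0; ; attempt++) {
    try {
      return await doRequest();
    } catch (err: any) {
      // Only retry rate-limit errors, and only up to maxRetries times.
      if (err?.status !== 429 || attempt >= maxRetries) throw err;
      // Double the delay on each attempt: 1s, 2s, 4s, ...
      const delayMs = baseDelayMs * 2 ** attempt;
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
}
```

A production version would usually also honor the `Retry-After` header when the API provides one.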
## License
MIT
## Contributing
Contributions welcome! Please read our contributing guidelines first.