# Microsoft Fabric MCP Server

A comprehensive Python-based MCP (Model Context Protocol) server for interacting with Microsoft Fabric APIs, featuring advanced PySpark notebook development, testing, and optimization capabilities with LLM integration.
## 🚀 Features

### Core Fabric Operations

- ✅ Workspace, lakehouse, warehouse, and table management
- ✅ Delta table schema and metadata retrieval
- ✅ SQL query execution and data loading
- ✅ Report and semantic model operations
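The schema-retrieval tools return column metadata that can also be rendered as Markdown documentation. A minimal sketch of that rendering step (the dict layout shown is an assumption for illustration, not necessarily the server's actual return format):

```python
def schema_to_markdown(table_name: str, columns: list[dict]) -> str:
    """Render a Delta table schema as a Markdown table.

    `columns` is assumed to be a list of {"name", "type", "nullable"}
    dicts; the real tool's return shape may differ.
    """
    lines = [
        f"## {table_name}",
        "",
        "| Column | Type | Nullable |",
        "| --- | --- | --- |",
    ]
    for col in columns:
        lines.append(
            f"| {col['name']} | {col['type']} | {'yes' if col['nullable'] else 'no'} |"
        )
    return "\n".join(lines)

doc = schema_to_markdown(
    "sales",
    [
        {"name": "order_id", "type": "bigint", "nullable": False},
        {"name": "amount", "type": "decimal(18,2)", "nullable": True},
    ],
)
```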
### Advanced PySpark Development

- 📓 Intelligent notebook creation with 6 specialized templates
- 🔧 Smart code generation for common PySpark operations
- ✅ Comprehensive validation with syntax and best-practice checking
- 🎯 Fabric-specific optimizations and compatibility checks
- 📊 Performance analysis with scoring and optimization recommendations
- 🚀 Real-time monitoring and execution insights

### LLM Integration

- 🤖 Natural-language interface for PySpark development
- 🧠 Context-aware assistance with conversation memory
- 🎨 Intelligent code formatting and explanations
- 📈 Smart optimization suggestions based on project patterns
## 🏗️ Architecture

### Interaction Flow

1. The developer requests assistance in the IDE.
2. The IDE communicates with an LLM (Claude/GPT).
3. The LLM analyzes the request using context and reasoning.
4. The LLM calls MCP server tools as needed.
5. The MCP tools interact with the Fabric API.
6. Results flow back through the LLM with intelligent formatting.
7. The developer receives contextual, smart responses.
## 📋 Requirements

- Python 3.12+
- Azure credentials for authentication
- uv (from Astral)
- Azure CLI
- Optional: Node.js (for the MCP Inspector)
## 🔧 Installation

1. Clone the repository.
2. Set up a virtual environment.
3. Install dependencies.
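The steps above can be sketched as shell commands. The repository URL and directory name are placeholders (substitute the actual values), and `uv` is assumed per the requirements:

```shell
# 1. Clone the repository (substitute the actual repository URL)
git clone <repository-url>
cd <repository-directory>

# 2. Create a virtual environment with uv
uv venv

# 3. Install the project's dependencies
uv sync
```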
## 🚀 Usage

### Using STDIO

Connect to Microsoft Fabric.

**Running with MCP Inspector.** This starts the server with the inspector at http://localhost:6274.

**VSCode Integration.** Add the server entry to your launch.json.
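One possible shape for an MCP server entry over STDIO (the server name, command, and script path are placeholders; adjust to match this project's actual entry point and wherever your editor expects MCP configuration):

```json
{
  "servers": {
    "fabric-mcp": {
      "type": "stdio",
      "command": "uv",
      "args": ["run", "fabric_mcp.py"]
    }
  }
}
```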
### Using HTTP

Start the MCP server.

**VSCode Integration.** Add the server entry to your launch.json.
## 🛠️ Complete Tool Reference

### 1. Workspace Management

- `list_workspaces`: List all available Fabric workspaces.
- `set_workspace`: Set the current workspace context for the session.

### 2. Lakehouse Operations

- `list_lakehouses`: List all lakehouses in a workspace.
- `create_lakehouse`: Create a new lakehouse.
- `set_lakehouse`: Set the current lakehouse context.

### 3. Warehouse Operations

- `list_warehouses`: List all warehouses in a workspace.
- `create_warehouse`: Create a new warehouse.
- `set_warehouse`: Set the current warehouse context.

### 4. Table Operations

- `list_tables`: List all tables in a lakehouse.
- `get_lakehouse_table_schema`: Get the schema for a specific table.
- `get_all_lakehouse_schemas`: Get schemas for all tables in a lakehouse.
- `set_table`: Set the current table context.

### 5. SQL Operations

- `get_sql_endpoint`: Get the SQL endpoint for a lakehouse or warehouse.
- `run_query`: Execute SQL queries.
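Once `get_sql_endpoint` returns a server address, it can be plugged into a standard ODBC connection string for querying. A hypothetical sketch of that step (the driver version, authentication mode, and endpoint value are assumptions, not values prescribed by this server):

```python
def build_connection_string(sql_endpoint: str, database: str) -> str:
    """Build an ODBC connection string for a Fabric SQL endpoint.

    Assumes ODBC Driver 18 for SQL Server and Azure AD interactive
    authentication; adjust both to your environment.
    """
    return (
        "Driver={ODBC Driver 18 for SQL Server};"
        f"Server={sql_endpoint},1433;"
        f"Database={database};"
        "Encrypt=yes;"
        "Authentication=ActiveDirectoryInteractive;"
    )

conn_str = build_connection_string(
    "example.datawarehouse.fabric.microsoft.com",  # placeholder endpoint
    "my_lakehouse",
)
```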
### 6. Data Loading

- `load_data_from_url`: Load data from a URL into tables.

### 7. Reports & Models

- `list_reports`: List all reports in a workspace.
- `get_report`: Get details for a specific report.
- `list_semantic_models`: List semantic models in a workspace.
- `get_semantic_model`: Get a specific semantic model.

### 8. Basic Notebook Operations

- `list_notebooks`: List all notebooks in a workspace.
- `get_notebook_content`: Retrieve notebook content.
- `update_notebook_cell`: Update specific notebook cells.
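Notebooks follow the Jupyter `.ipynb` layout, so a cell update amounts to replacing the `source` of one cell. A simplified local sketch of what a tool like `update_notebook_cell` does conceptually (the real tool works through the Fabric API, not on local dicts):

```python
def update_cell(notebook: dict, index: int, new_source: str) -> dict:
    """Replace the source of one cell in an ipynb-style notebook dict."""
    cells = notebook["cells"]
    if not 0 <= index < len(cells):
        raise IndexError(f"notebook has {len(cells)} cells, no index {index}")
    # ipynb stores source as a list of lines, each keeping its newline
    cells[index]["source"] = new_source.splitlines(keepends=True)
    return notebook

nb = {
    "nbformat": 4,
    "cells": [
        {"cell_type": "code", "source": ["df = spark.read.table('sales')\n"]},
    ],
}
nb = update_cell(nb, 0, "df = spark.read.table('orders')\ndf.show()")
```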
### 9. Advanced PySpark Notebook Creation

- `create_pyspark_notebook`: Create notebooks from basic templates.
- `create_fabric_notebook`: Create Fabric-optimized notebooks.

### 10. PySpark Code Generation

- `generate_pyspark_code`: Generate code for common operations.
- `generate_fabric_code`: Generate Fabric-specific code.
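Code generation of this kind typically fills parameterized templates. A toy sketch of the idea (the operation names and template text here are invented for illustration, not the server's actual templates):

```python
# Hypothetical template registry mapping operation names to snippets
TEMPLATES = {
    "read_delta": "df = spark.read.format('delta').load('{path}')",
    "write_delta": (
        "df.write.format('delta').mode('{mode}').saveAsTable('{table}')"
    ),
}

def generate_pyspark_snippet(operation: str, **params: str) -> str:
    """Fill a named template with the given parameters."""
    try:
        template = TEMPLATES[operation]
    except KeyError:
        raise ValueError(f"unknown operation: {operation!r}") from None
    return template.format(**params)

snippet = generate_pyspark_snippet("read_delta", path="Tables/sales")
```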
### 11. Code Validation & Analysis

- `validate_pyspark_code`: Validate PySpark code syntax and best practices.
- `validate_fabric_code`: Validate Fabric compatibility.
- `analyze_notebook_performance`: Run a comprehensive performance analysis.
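Syntax validation can be built on Python's `ast` module, with best-practice lint rules layered on top. A minimal sketch (the single rule shown is illustrative, not the server's actual rule set):

```python
import ast

def validate_pyspark(code: str) -> list[str]:
    """Return findings: syntax errors plus simple lint warnings."""
    try:
        tree = ast.parse(code)
    except SyntaxError as exc:
        return [f"syntax error: {exc.msg} (line {exc.lineno})"]
    findings = []
    for node in ast.walk(tree):
        # Illustrative rule: collect() pulls every row to the driver,
        # which is a common PySpark performance pitfall.
        if (
            isinstance(node, ast.Call)
            and isinstance(node.func, ast.Attribute)
            and node.func.attr == "collect"
        ):
            findings.append("warning: collect() pulls all rows to the driver")
    return findings

issues = validate_pyspark("rows = df.collect()")
```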
### 12. Context Management

- `clear_context`: Clear the current session context.
## 📊 PySpark Templates

### Basic Templates

- `basic`: fundamental PySpark operations and DataFrame usage
- `etl`: complete ETL pipeline with data cleaning and Delta Lake
- `analytics`: advanced analytics with aggregations and window functions
- `ml`: machine learning pipeline with MLlib and feature engineering

### Advanced Templates

- `fabric_integration`: lakehouse connectivity and Fabric-specific utilities
- `streaming`: real-time processing with Structured Streaming
## 🎯 Best Practices

Guidance covers three areas: Fabric optimization, performance optimization, and code quality.
## 🔄 Example LLM-Enhanced Workflows

Typical workflows include natural-language requests and performance analysis.
## 🔍 Troubleshooting

### Common Issues

- **Authentication**: ensure `az login` has been run with the correct scope.
- **Context**: use `clear_context()` to reset session state.
- **Workspace**: verify workspace names and permissions.
- **Templates**: check the available template types in the documentation.
### Getting Help

- Use the validation tools for code issues.
- Check the performance analysis for optimization opportunities.
- Leverage the LLM natural-language interface for guidance.
## 📈 Performance Metrics

The analysis tools provide:

- Operation counts per notebook cell
- Detection and flagging of performance issues
- Identification of optimization opportunities
- A 0-100 scoring system for code quality
- Fabric compatibility assessment
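As an illustration of how a 0-100 score might be assembled from such signals (the weights and rules below are invented for the sketch, not the server's actual formula):

```python
def score_code_quality(operation_count: int, issues: int,
                       optimizations_missed: int) -> int:
    """Toy 0-100 quality score: start at 100 and deduct per finding."""
    score = 100 - 10 * issues - 5 * optimizations_missed
    # Cells with very many operations get a small extra penalty
    if operation_count > 20:
        score -= 5
    return max(0, min(100, score))
```

With these invented weights, a cell with 2 issues and 1 missed optimization scores 75, and a cell dense with findings bottoms out at 0.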
## 🤝 Contributing

This project welcomes contributions! Please see the contributing guidelines for details.
## 📄 License

This project is licensed under the MIT License. See the LICENSE file for details.
## 🙏 Acknowledgments

Inspired by https://github.com/Augustab/microsoft_fabric_mcp/tree/main.

Ready to supercharge your Microsoft Fabric development with intelligent PySpark assistance! 🚀