livy_run_statement

Execute PySpark, Scala, or SparkR code in an existing Livy session to process data in Microsoft Fabric. Supports immediate or asynchronous execution with result tracking.

Instructions

Execute code in a Livy session.

Executes PySpark, Scala, or SparkR code in an existing Livy session. The session must be in 'idle' state to accept new statements.

Important Notes:

  • Use df.show() or df.printSchema() to inspect DataFrames before accessing columns

  • SHOW TABLES returns 'namespace' column, not 'database' in Fabric

  • Avoid direct Row attribute access without schema verification

  • When with_wait=False, returns immediately with statement ID - check status separately
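
Illustrating the first two notes above, a PySpark sketch of the kind of code you might pass to the tool — it inspects schemas before touching columns; the table name `my_lakehouse_table` is a hypothetical example:

```python
# List tables and check column names first; in Fabric the column is 'namespace', not 'database'
tables = spark.sql("SHOW TABLES")
tables.printSchema()
tables.select("namespace", "tableName").show()

# Verify a DataFrame's schema before accessing Row attributes
df = spark.table("my_lakehouse_table")  # hypothetical table name
df.printSchema()
first = df.limit(1).collect()
if first:
    print(first[0].asDict())  # safer than relying on direct attribute access
```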

Parameters:

  • workspace_id: Fabric workspace ID.
  • lakehouse_id: Fabric lakehouse ID.
  • session_id: Livy session ID (must be in 'idle' state).
  • code: Code to execute (PySpark, Scala, or SparkR).
  • kind: Statement kind - 'pyspark' (default), 'scala', or 'sparkr'.
  • with_wait: If True (default), wait for statement completion before returning.
  • timeout_seconds: Maximum time to wait for statement completion (default: from config).

Returns: Dictionary with statement details including id, state, output, and execution details.
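
Based on the standard Livy statement response, the returned dictionary looks roughly like the sketch below; exact fields and values are illustrative and may vary by Livy/Fabric version:

```python
# Illustrative shape of the returned statement dictionary
example_result = {
    "id": 0,
    "state": "available",        # e.g. waiting, running, available, error, cancelled
    "output": {
        "status": "ok",          # or "error"
        "execution_count": 0,
        "data": {"text/plain": "10"},
    },
}
```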

Example:

```python
# Execute PySpark code
result = livy_run_statement(
    workspace_id="12345678-1234-1234-1234-123456789abc",
    lakehouse_id="87654321-4321-4321-4321-210987654321",
    session_id="0",
    code="df = spark.range(10)\ndf.count()",
    kind="pyspark",
    with_wait=True
)

if result.get("state") == "available":
    output = result.get("output", {})
    if output.get("status") == "ok":
        print(f"Result: {output.get('data', {}).get('text/plain')}")
```
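
For asynchronous use, a minimal sketch assuming the same workspace, lakehouse, and session as above; the follow-up status check is done with whatever statement-status tool the server exposes, which is not shown here:

```python
# Submit without waiting; only the statement details (including its ID) come back immediately
pending = livy_run_statement(
    workspace_id="12345678-1234-1234-1234-123456789abc",
    lakehouse_id="87654321-4321-4321-4321-210987654321",
    session_id="0",
    code="spark.range(1_000_000).count()",
    kind="pyspark",
    with_wait=False
)
statement_id = pending.get("id")
# Poll the statement's state separately (via the server's statement-status tool)
# until it reaches 'available', then read its 'output' as in the example above.
```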

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| workspace_id | Yes | Fabric workspace ID | |
| lakehouse_id | Yes | Fabric lakehouse ID | |
| session_id | Yes | Livy session ID (must be in 'idle' state) | |
| code | Yes | Code to execute (PySpark, Scala, or SparkR) | |
| kind | No | Statement kind: 'pyspark', 'scala', or 'sparkr' | pyspark |
| with_wait | No | Wait for statement completion before returning | True |
| timeout_seconds | No | Maximum time to wait for statement completion | from config |
