avivsinai/langfuse-mcp

get_prompt_unresolved

Fetch raw prompt content with dependency tags preserved for analyzing prompt composition and debugging dependency chains in Langfuse.

Instructions

Fetch a specific prompt by name WITHOUT resolving dependencies.

Returns raw prompt content with dependency tags intact (e.g., @@@langfusePrompt:name=xxx@@@) when the SDK supports resolve=false. Otherwise returns the resolved prompt and marks metadata.resolved=True. Useful for analyzing prompt composition and debugging dependency chains.

Args:
- ctx: Context object containing lifespan context with Langfuse client
- name: The name of the prompt to fetch
- label: Optional label to fetch. Cannot be used with version.
- version: Optional specific version number. Cannot be used with label.

Returns:
A dictionary containing the raw prompt details with dependency tags preserved.

Raises:
- ValueError: If both label and version are specified
- LookupError: If prompt not found

Input Schema

- name (required): The name of the prompt to fetch
- label (optional): Label to fetch (e.g., 'production', 'staging'). Mutually exclusive with version.
- version (optional): Specific version number to fetch. Mutually exclusive with label.
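To illustrate the schema's mutual-exclusion rule and the shape of the result, here is a minimal sketch of the tool's argument handling. The function body and return shape are assumptions for illustration, not the server's actual implementation; only the label/version constraint, the dependency-tag format, and metadata.resolved come from the documentation above.

```python
from typing import Optional


def get_prompt_unresolved(name: str,
                          label: Optional[str] = None,
                          version: Optional[int] = None) -> dict:
    """Hypothetical sketch of the tool's validation and result shape."""
    # label and version cannot be combined, per the input schema.
    if label is not None and version is not None:
        raise ValueError("label and version are mutually exclusive")
    # A real implementation would fetch the prompt from Langfuse here,
    # leaving dependency tags such as @@@langfusePrompt:name=xxx@@@ intact.
    return {
        "name": name,
        "label": label,
        "version": version,
        "metadata": {"resolved": False},  # raw content, tags preserved
    }
```

Calling it with both selectors raises ValueError immediately; with at most one, it returns the (sketched) prompt dictionary.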

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/avivsinai/langfuse-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.