
langfuse-mcp

get_prompt

Retrieve a specific prompt by name with resolved dependencies from Langfuse. Use optional label or version parameters to fetch the desired prompt version.

Instructions

Fetch a specific prompt by name with resolved dependencies.

Retrieves a prompt from Langfuse with all dependency tags resolved. Uses the SDK's built-in caching for optimal performance.

Args:
- ctx: Context object containing lifespan context with the Langfuse client
- name: The name of the prompt to fetch
- label: Optional label to fetch (e.g., 'production'). Cannot be used with version.
- version: Optional specific version number. Cannot be used with label.

Returns:
A dictionary containing the prompt details:
- id: Unique prompt identifier
- name: Prompt name
- version: Version number
- type: 'text' or 'chat'
- prompt: The prompt content (string for text, list for chat)
- labels: List of labels assigned to this version
- tags: List of tags
- config: Model configuration (temperature, model, etc.)

Raises:
- ValueError: If both label and version are specified
- LookupError: If the prompt is not found
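The label/version rule above can be sketched as a small guard. This is a hypothetical helper for illustration only, not the server's actual code; the real tool delegates the fetch to the Langfuse SDK.

```python
def validate_prompt_selector(label=None, version=None):
    # Enforce the mutual exclusion described above: a prompt version
    # may be selected by label OR by number, never both.
    if label is not None and version is not None:
        raise ValueError("Specify either label or version, not both")
    if label is not None:
        return {"label": label}
    if version is not None:
        return {"version": version}
    # Neither given: fall back to the server's default selection.
    return {}
```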

Input Schema

- name (required): The name of the prompt to fetch
- label (optional): Label to fetch (e.g., 'production', 'staging'). Mutually exclusive with version.
- version (optional): Specific version number to fetch. Mutually exclusive with label.
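For illustration, argument payloads matching this schema might look like the following. The prompt name "support-reply" is invented for the example; only one of label/version may appear in a call.

```python
import json

# Hypothetical get_prompt argument payloads; "support-reply" is an
# invented prompt name used only for illustration.
by_label = {"name": "support-reply", "label": "production"}
by_version = {"name": "support-reply", "version": 7}

print(json.dumps(by_label))    # select the version currently labeled 'production'
print(json.dumps(by_version))  # pin an exact version number
```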


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/avivsinai/langfuse-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.