get_prompt
Fetch a specific prompt by name from Langfuse, optionally pinning a version number or a label such as 'production', to retrieve prompt content, configuration, and metadata for LLM application observability.
Instructions
Fetch a specific prompt by name. Optionally pin to a version number or a label (e.g. 'production', 'staging'). Returns: name, version, type (text|chat), prompt content, labels, tags, config. Read-only.
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| name | Yes | Prompt name (exact match) | |
| version | No | Version number; omit for latest | |
| label | No | Label, e.g. 'production' or 'staging' | |
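A minimal sketch of how a client might validate arguments before invoking the tool. The `validate_get_prompt_args` helper is illustrative, not part of Langfuse; the rule that a call should pin at most one of `version` or `label` is an assumption, since pinning both would be ambiguous.

```python
def validate_get_prompt_args(args: dict) -> bool:
    """Check args against the get_prompt input schema:
    name is required; version and label are optional pins.
    (Illustrative helper, not part of the Langfuse tool itself.)"""
    # name: required, non-empty string
    if not isinstance(args.get("name"), str) or not args["name"]:
        return False
    # version: optional integer
    if "version" in args and not isinstance(args["version"], int):
        return False
    # label: optional string
    if "label" in args and not isinstance(args["label"], str):
        return False
    # Assumption: pinning both a version and a label is ambiguous,
    # so allow at most one of the two.
    return not ("version" in args and "label" in args)

# Fetch the latest version:
print(validate_get_prompt_args({"name": "qa-answer"}))                      # True
# Pin to a specific version:
print(validate_get_prompt_args({"name": "qa-answer", "version": 3}))        # True
# Pin to a label:
print(validate_get_prompt_args({"name": "qa-answer", "label": "production"}))  # True
# Missing the required name:
print(validate_get_prompt_args({"version": 3}))                             # False
```

The prompt name assumed here (`qa-answer`) is a placeholder; substitute the name of a prompt that exists in your Langfuse project.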