prompt_from_file
Generate responses from multiple LLMs by sending a prompt stored in a file. Specify the absolute file path and, optionally, the models to query, which streamlines model testing and integration.
Instructions
Send a prompt from a file to multiple LLM models. IMPORTANT: You MUST provide an absolute file path (e.g., /path/to/file or C:\path\to\file), not a relative path.
Input Schema
| Name | Required | Description | Default |
|------|----------|-------------|---------|
| abs_file_path | Yes | Absolute path to the file containing the prompt (must be an absolute path, not relative) | |
| models_prefixed_by_provider | No | List of models with provider prefixes (e.g., `openai:gpt-4o` or `o:gpt-4o`). If not provided, uses default models. | |
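
As a sketch of how the schema above maps onto a tool call, the arguments object might look like the following. The file path and model names are illustrative placeholders, not values mandated by the tool:

```json
{
  "abs_file_path": "/home/user/prompts/summarize.txt",
  "models_prefixed_by_provider": ["openai:gpt-4o", "o:gpt-4o"]
}
```

Omitting `models_prefixed_by_provider` would fall back to the default models, per the schema; a relative path such as `prompts/summarize.txt` would be rejected.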