prompt_from_file_to_file
Send a prompt from a file to multiple LLM models and store each model's response in a specified output directory, using absolute paths for both the input file and the output directory.
Instructions
Send a prompt from a file to multiple LLM models and save each response to a file. IMPORTANT: You MUST provide absolute paths (e.g., /path/to/file or C:\path\to\file) for both the prompt file and the output directory, not relative paths.
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| abs_file_path | Yes | Absolute path to the file containing the prompt (must be an absolute path, not relative) | |
| abs_output_dir | No | Absolute directory path to save the response files to (must be an absolute path, not relative) | `.` (current directory) |
| models_prefixed_by_provider | No | List of models with provider prefixes (e.g., `openai:gpt-4o` or `o:gpt-4o`); if not provided, the default models are used | |
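A minimal sketch of an arguments payload that satisfies this schema. The paths are placeholders and the local path check is only an illustration of the absolute-path requirement, not part of the tool itself.

```python
import os

# Hypothetical arguments for a prompt_from_file_to_file call; replace the
# placeholder paths with real absolute paths on your machine.
arguments = {
    "abs_file_path": "/home/user/prompts/summarize.txt",
    "abs_output_dir": "/home/user/responses",          # optional; defaults to "."
    "models_prefixed_by_provider": ["openai:gpt-4o"],  # optional; default models used if omitted
}

# The tool requires absolute paths, so a quick local sanity check helps
# catch relative paths before sending the request.
for key in ("abs_file_path", "abs_output_dir"):
    assert os.path.isabs(arguments[key]), f"{key} must be an absolute path"
```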