
Just Prompt

by disler

prompt_from_file_to_file

Send prompts from a file to multiple LLM models and store their responses in specified directories, using absolute paths for file input and output.

Instructions

Send a prompt from a file to multiple LLM models and save responses to files. IMPORTANT: You MUST provide absolute paths (e.g., /path/to/file or C:\path\to\file) for both file and output directory, not relative paths.
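Because the tool rejects relative paths, a caller can verify its path arguments before invoking the tool. This is a minimal sketch using only the Python standard library; the function name is hypothetical, and the argument names mirror the input schema below (the bare `"."` default is allowed because the schema itself defaults `abs_output_dir` to the current directory):

```python
import os

def check_tool_args(abs_file_path: str, abs_output_dir: str = ".") -> list[str]:
    """Return a list of problems with the path arguments; empty means OK."""
    problems = []
    if not os.path.isabs(abs_file_path):
        problems.append(f"abs_file_path is not absolute: {abs_file_path!r}")
    # "." is the schema's documented default, so it is exempt from the check.
    if abs_output_dir != "." and not os.path.isabs(abs_output_dir):
        problems.append(f"abs_output_dir is not absolute: {abs_output_dir!r}")
    return problems
```

Running such a check client-side gives a clearer error than waiting for the tool to fail on a relative path.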

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| abs_file_path | Yes | Absolute path to the file containing the prompt (must be an absolute path, not relative) | — |
| abs_output_dir | No | Absolute directory path to save the response files to (must be an absolute path, not relative) | `.` (current directory) |
| models_prefixed_by_provider | No | List of models with provider prefixes (e.g., 'openai:gpt-4o' or 'o:gpt-4o'). If not provided, uses default models. | default models |
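Each `models_prefixed_by_provider` entry combines a provider prefix and a model name separated by a colon. A sketch of how such an entry could be split apart; note the shorthand map contains only the `'o'` → `'openai'` pairing implied by the example above, and any further entries would be assumptions:

```python
# Shorthand expansions; only "o" -> "openai" is implied by the tool's
# own example, so treat any additional entries as assumptions.
PROVIDER_SHORTHANDS = {"o": "openai"}

def parse_model_spec(spec: str) -> tuple[str, str]:
    """Split 'provider:model' into (provider, model), expanding shorthands."""
    provider, sep, model = spec.partition(":")
    if not sep or not provider or not model:
        raise ValueError(f"expected 'provider:model', got {spec!r}")
    return PROVIDER_SHORTHANDS.get(provider, provider), model
```

For example, both `parse_model_spec("openai:gpt-4o")` and `parse_model_spec("o:gpt-4o")` resolve to the same provider/model pair.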

Input Schema (JSON Schema)

{
  "properties": {
    "abs_file_path": {
      "description": "Absolute path to the file containing the prompt (must be an absolute path, not relative)",
      "title": "Abs File Path",
      "type": "string"
    },
    "abs_output_dir": {
      "default": ".",
      "description": "Absolute directory path to save the response files to (must be an absolute path, not relative. Default: current directory)",
      "title": "Abs Output Dir",
      "type": "string"
    },
    "models_prefixed_by_provider": {
      "anyOf": [
        {
          "items": { "type": "string" },
          "type": "array"
        },
        { "type": "null" }
      ],
      "default": null,
      "description": "List of models with provider prefixes (e.g., 'openai:gpt-4o' or 'o:gpt-4o'). If not provided, uses default models.",
      "title": "Models Prefixed By Provider"
    }
  },
  "required": ["abs_file_path"],
  "title": "PromptFromFileToFileSchema",
  "type": "object"
}
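A request payload can be checked against this schema before sending. The sketch below hand-checks the schema's core constraints (the single required property, the string types, and the list-of-strings-or-null shape) using only the standard library, rather than pulling in a full JSON Schema validator such as the third-party jsonschema package; the function name is hypothetical:

```python
def validate_payload(payload: dict) -> list[str]:
    """Check a payload against PromptFromFileToFileSchema's core constraints."""
    errors = []
    if "abs_file_path" not in payload:  # the only required property
        errors.append("missing required property: abs_file_path")
    elif not isinstance(payload["abs_file_path"], str):
        errors.append("abs_file_path must be a string")
    if not isinstance(payload.get("abs_output_dir", "."), str):
        errors.append("abs_output_dir must be a string")
    # anyOf: array of strings, or null (field may also be omitted entirely)
    models = payload.get("models_prefixed_by_provider")
    if models is not None and not (
        isinstance(models, list) and all(isinstance(m, str) for m in models)
    ):
        errors.append("models_prefixed_by_provider must be a list of strings or null")
    return errors
```

An empty list means the payload satisfies these constraints; anything else names the offending field.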


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/disler/just-prompt'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.