
Just Prompt

by disler

prompt_from_file

Generate responses from multiple LLMs by sending them a prompt stored in a file. Specify the absolute path to the prompt file and, optionally, the models to query, to streamline model testing and integration.

Instructions

Send a prompt from a file to multiple LLMs. IMPORTANT: You MUST provide an absolute file path (e.g., /path/to/file or C:\path\to\file), not a relative path.
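As a rough sketch, an MCP client invokes this tool with a standard JSON-RPC tools/call request. The file path and model list below are illustrative values, not defaults:

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "prompt_from_file",
    "arguments": {
      "abs_file_path": "/home/user/prompts/review.txt",
      "models_prefixed_by_provider": ["openai:gpt-4o"]
    }
  }
}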

Input Schema

Name                        | Required | Description                                                                                                        | Default
abs_file_path               | Yes      | Absolute path to the file containing the prompt (must be an absolute path, not relative)                          | —
models_prefixed_by_provider | No       | List of models with provider prefixes (e.g., 'openai:gpt-4o' or 'o:gpt-4o'). If not provided, uses default models. | null

Input Schema (JSON Schema)

{ "properties": { "abs_file_path": { "description": "Absolute path to the file containing the prompt (must be an absolute path, not relative)", "title": "Abs File Path", "type": "string" }, "models_prefixed_by_provider": { "anyOf": [ { "items": { "type": "string" }, "type": "array" }, { "type": "null" } ], "default": null, "description": "List of models with provider prefixes (e.g., 'openai:gpt-4o' or 'o:gpt-4o'). If not provided, uses default models.", "title": "Models Prefixed By Provider" } }, "required": [ "abs_file_path" ], "title": "PromptFromFileSchema", "type": "object" }


MCP directory API

All information about MCP servers in the directory is available via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/disler/just-prompt'
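Assuming jq is installed, you can pipe the response through it to pretty-print the returned JSON (the -s flag silences curl's progress output):

curl -s 'https://glama.ai/api/mcp/v1/servers/disler/just-prompt' | jq .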

If you have feedback or need assistance with the MCP directory API, please join our Discord server.