
prompt_from_file_tool

Read a prompt from a file and send it to multiple LLM models in a single call, making it easy to compare responses across providers.

Instructions

Read a prompt from a file and send it to multiple LLM models.

Args:
    file_path: Path to the file containing the prompt text.
    models_prefixed_by_provider: List of models in "provider:model" format (e.g., "openai:gpt-4"). If None, defaults to ["openai:gpt-4o-mini"].

Returns:
    List of responses, one from each specified model.

Input Schema

Name                          Required  Description                                                Default
file_path                     Yes       Path to the file containing the prompt text
models_prefixed_by_provider   No        Models in "provider:model" format (e.g., "openai:gpt-4")  ["openai:gpt-4o-mini"]
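
The sketch below shows how a client might call this tool over stdio using the official MCP Python SDK. The server launch command ("agile-team-mcp-server"), the prompt file path ("prompt.txt"), and the model list are illustrative assumptions; substitute the values from your own setup.

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    # Assumed launch command for the server; adjust to your installation.
    params = StdioServerParameters(command="agile-team-mcp-server", args=[])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Call the tool with a hypothetical prompt file and two models
            # to compare; omitting the models argument would use the
            # documented default ["openai:gpt-4o-mini"].
            result = await session.call_tool(
                "prompt_from_file_tool",
                arguments={
                    "file_path": "prompt.txt",
                    "models_prefixed_by_provider": ["openai:gpt-4o-mini", "openai:gpt-4"],
                },
            )
            print(result.content)

asyncio.run(main())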


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/danielscholl/agile-team-mcp-server'
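
For programmatic access, here is a minimal Python equivalent of the curl call above, assuming the endpoint returns a JSON body:

import requests

# Fetch this server's metadata from the Glama MCP directory API.
url = "https://glama.ai/api/mcp/v1/servers/danielscholl/agile-team-mcp-server"
resp = requests.get(url, timeout=10)
resp.raise_for_status()
print(resp.json())  # Assumes a JSON response body.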

If you have feedback or need assistance with the MCP directory API, please join our Discord server.