create_prompt_template

Generate structured prompt templates and context schemas for AI models based on user requirements or existing prompts, enabling testing and evaluation of AI-enabled applications.

Instructions

ABOUT THIS TOOL:

  • This tool is part of a toolchain that generates and provides test cases for a prompt template.

  • This tool helps an AI assistant to generate a prompt template based on one of the following:

    1. feature requirements defined by a user - in which case the tool will generate a new prompt template based on the feature requirements.

    2. a pre-existing prompt or prompt template that a user wants to test, evaluate, or modify - in which case the tool will convert it into a more structured and testable prompt template while leaving the original prompt language relatively unchanged.

  • This tool will return a structured prompt template (e.g. template) along with a context schema (e.g. contextSchema) that defines the expected input parameters for the prompt template.

  • In some cases, a user will want to add test coverage for ALL of the prompts in a given application. In these cases, the AI agent should use this tool to generate a prompt template for each prompt in the application, and should check the entire application for AI prompts that are not already covered by a prompt template in the ./prompts directory.
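To make the output shape concrete, here is a hypothetical sketch of what a returned template and contextSchema might look like for the bedtime-story example used later in this document. The exact structure is determined by the tool; the `render` helper is illustrative only and not part of the toolchain.

```typescript
// Hypothetical sketch of a tool result: a template with {{placeholders}}
// and a contextSchema describing the expected inputs.
const template =
  "Write a one-minute bedtime story about {{topic}} for a {{age}}-year-old reader.";

const contextSchema = {
  type: "object",
  properties: {
    topic: { type: "string" },
    age: { type: "number" },
  },
  required: ["topic", "age"],
};

// Minimal {{placeholder}} substitution, for demonstration only.
function render(tpl: string, context: Record<string, string | number>): string {
  return tpl.replace(/\{\{(\w+)\}\}/g, (_, key) => String(context[key]));
}

console.log(render(template, { topic: "dragons", age: 6 }));
// Write a one-minute bedtime story about dragons for a 6-year-old reader.
```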

WHEN SHOULD THIS TOOL BE TRIGGERED?

  • This tool should be triggered whenever the user provides requirements for a new AI-enabled application or a new AI-enabled feature of an existing application (i.e. one that requires a prompt request to an LLM or any AI model).

  • This tool should also be triggered if the user provides a pre-existing prompt or prompt template from their codebase that they want to test, evaluate, or modify.

  • This tool should be triggered even if there are pre-existing files in the ./prompts directory with the <relevant-name>.prompt.yml convention (e.g. bedtime-story-generator.prompt.yml, plant-care-assistant.prompt.yml, customer-support-chatbot.prompt.yml, etc.). Similar files should NEVER be generated directly by the AI agent. Instead, the AI agent should use this tool to first generate a new prompt template.
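As a rough sketch of the `<relevant-name>.prompt.yml` naming convention, a check like the following could identify which files in the ./prompts directory are existing prompt templates. The `isPromptTemplateFile` helper is hypothetical, not part of the tool.

```typescript
// Hypothetical check for the <relevant-name>.prompt.yml naming convention
// (kebab-case name followed by the .prompt.yml suffix).
function isPromptTemplateFile(fileName: string): boolean {
  return /^[a-z0-9]+(-[a-z0-9]+)*\.prompt\.yml$/.test(fileName);
}

const files = [
  "bedtime-story-generator.prompt.yml",
  "plant-care-assistant.prompt.yml",
  "README.md",
];

const promptFiles = files.filter(isPromptTemplateFile);
console.log(promptFiles.length); // 2
```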

PARAMETERS:

  • params: object

    • prompt: string (the feature requirements or pre-existing prompt/prompt template that will be used to generate a prompt template. Can be a multi-line string.)

    • promptOrigin: "codebase" | "requirements" (indicates whether the prompt comes from an existing codebase or from new requirements)

    • model: string (the model that the prompt template will be tested against. Explicitly specify the model if it can be inferred from the codebase. Otherwise, defaults to gpt-4.1-mini.)

    • temperature: number (the temperature of the prompt template. Explicitly specify the temperature if it can be inferred from the codebase. Otherwise, defaults to 1.)

EXAMPLE USAGE (from new requirements): { "params": { "prompt": "Create an app that takes any topic and an age (in years), then renders a 1-minute bedtime story for a person of that age.", "promptOrigin": "requirements", "model": "gpt-4.1-mini", "temperature": 1.0 } }

EXAMPLE USAGE (from pre-existing prompt/prompt template in codebase): { "params": { "prompt": "The user wants a bedtime story about {{topic}} for a person of age {{age}} years old. Please craft a captivating tale that captivates their imagination and provides a delightful bedtime experience.", "promptOrigin": "codebase", "model": "claude-3-5-sonnet-latest", "temperature": 0.7 } }

TOOL OUTPUT INSTRUCTIONS:

  • The tool will return...

    • a template that reformulates the user's prompt into a more structured format.

    • a contextSchema that defines the expected input parameters for the template.

    • a promptOrigin that indicates whether the prompt comes from an existing prompt or prompt template in the user's codebase or from new requirements.

  • The tool output (the template, contextSchema, and promptOrigin) will also be used as input to the recommend_prompt_template_tests tool to generate a list of recommended tests for the prompt template.
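The chaining described above can be sketched as a plain object that maps this tool's output onto the next tool's input. The field names follow the keys named in this document; the `buildRecommendTestsInput` helper itself is illustrative, not part of the toolchain.

```typescript
// Illustrative: build the input for recommend_prompt_template_tests
// from create_prompt_template output plus the model/temperature settings.
interface CreatePromptTemplateOutput {
  template: string;
  contextSchema: Record<string, unknown>;
  promptOrigin: "codebase" | "requirements";
}

function buildRecommendTestsInput(
  output: CreatePromptTemplateOutput,
  model: string,
  temperature: number,
) {
  return {
    params: {
      template: output.template,
      contextSchema: output.contextSchema,
      promptOrigin: output.promptOrigin,
      model,
      temperature,
    },
  };
}

const next = buildRecommendTestsInput(
  { template: "...", contextSchema: {}, promptOrigin: "requirements" },
  "gpt-4.1-mini",
  1,
);
console.log(next.params.promptOrigin); // "requirements"
```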

Input Schema

Name     Required   Description   Default
params   No

Input Schema (JSON Schema)

{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "additionalProperties": false,
  "properties": {
    "params": {
      "additionalProperties": false,
      "properties": {
        "model": {
          "default": "gpt-4.1-mini",
          "description": "The model that the prompt template will be tested against. Explicitly specify the model if it can be inferred from the codebase. Otherwise, defaults to `gpt-4.1-mini`.",
          "type": "string"
        },
        "prompt": {
          "description": "The user's application, feature, or product requirements that will be used to generate a prompt template. Alternatively, a pre-existing prompt or prompt template can be provided if a user wants to test, evaluate, or modify it. (Can be a multi-line string.)",
          "type": "string"
        },
        "promptOrigin": {
          "description": "The origin of the prompt - either \"codebase\" for existing prompts from the codebase, or \"requirements\" for new prompts from requirements.",
          "enum": ["codebase", "requirements"],
          "type": "string"
        },
        "temperature": {
          "default": 1,
          "description": "The temperature of the prompt template. Explicitly specify the temperature if it can be inferred from the codebase. Otherwise, defaults to 1.",
          "type": "number"
        }
      },
      "required": ["prompt", "promptOrigin"],
      "type": "object"
    }
  },
  "type": "object"
}
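A minimal sketch of how a caller might apply the schema's required-field checks and defaults before invoking the tool. This is hand-rolled for illustration; a real client would typically use a JSON Schema validator, and the `withDefaults` helper is hypothetical.

```typescript
// Sketch: validate params against the schema above and fill in defaults.
interface Params {
  prompt: string;
  promptOrigin: "codebase" | "requirements";
  model?: string;
  temperature?: number;
}

function withDefaults(params: Params): Required<Params> {
  if (!params.prompt) throw new Error("prompt is required");
  if (params.promptOrigin !== "codebase" && params.promptOrigin !== "requirements") {
    throw new Error('promptOrigin must be "codebase" or "requirements"');
  }
  return {
    prompt: params.prompt,
    promptOrigin: params.promptOrigin,
    model: params.model ?? "gpt-4.1-mini", // schema default
    temperature: params.temperature ?? 1,  // schema default
  };
}

const filled = withDefaults({ prompt: "Tell a story", promptOrigin: "requirements" });
console.log(filled.model, filled.temperature); // gpt-4.1-mini 1
```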

Implementation Reference

  • The handler function that implements the core logic of the 'create_prompt_template' tool. It creates a prompt template using CircletClient and returns a structured text response with keys for prompt origin, template, context schema, model, and next steps.
    export const createPromptTemplate: ToolCallback<{
      params: typeof createPromptTemplateInputSchema;
    }> = async (args) => {
      const { prompt, promptOrigin, model } = args.params;
      const circlet = new CircletClient();
      const promptObject = await circlet.circlet.createPromptTemplate(
        prompt,
        promptOrigin,
      );
      return {
        content: [
          {
            type: 'text',
            text: `${promptOriginKey}: ${promptOrigin}

    ${promptTemplateKey}: ${promptObject.template}

    ${contextSchemaKey}: ${JSON.stringify(promptObject.contextSchema, null, 2)}

    ${modelKey}: ${model}

    NEXT STEP:
    - Immediately call the \`${PromptWorkbenchToolName.recommend_prompt_template_tests}\` tool with:
      - template: the \`${promptTemplateKey}\` above
      - ${contextSchemaKey}: the \`${contextSchemaKey}\` above
      - ${promptOriginKey}: the \`${promptOriginKey}\` above
      - ${modelKey}: the \`${modelKey}\` above
      - ${temperatureKey}: the \`${temperatureKey}\` above
    `,
          },
        ],
      };
    };
  • Zod input schema defining the parameters for the 'create_prompt_template' tool: prompt, promptOrigin, model, and temperature.
    export const createPromptTemplateInputSchema = z.object({
      prompt: z
        .string()
        .describe(
          "The user's application, feature, or product requirements that will be used to generate a prompt template. Alternatively, a pre-existing prompt or prompt template can be provided if a user wants to test, evaluate, or modify it. (Can be a multi-line string.)",
        ),
      promptOrigin: z
        .nativeEnum(PromptOrigin)
        .describe(
          `The origin of the prompt - either "${PromptOrigin.codebase}" for existing prompts from the codebase, or "${PromptOrigin.requirements}" for new prompts from requirements.`,
        ),
      model: z
        .string()
        .default(defaultModel)
        .describe(
          `The model that the prompt template will be tested against. Explicitly specify the model if it can be inferred from the codebase. Otherwise, defaults to \`${defaultModel}\`.`,
        ),
      temperature: z
        .number()
        .default(defaultTemperature)
        .describe(
          `The temperature of the prompt template. Explicitly specify the temperature if it can be inferred from the codebase. Otherwise, defaults to ${defaultTemperature}.`,
        ),
    });
  • Tool object registration defining the name 'create_prompt_template', detailed description, and input schema reference.
    export const createPromptTemplateTool = {
      name: PromptWorkbenchToolName.create_prompt_template,
      description: `
    ABOUT THIS TOOL:
    - This tool is part of a toolchain that generates and provides test cases for a prompt template.
    - This tool helps an AI assistant to generate a prompt template based on one of the following:
      1. feature requirements defined by a user - in which case the tool will generate a new prompt template based on the feature requirements.
      2. a pre-existing prompt or prompt template that a user wants to test, evaluate, or modify - in which case the tool will convert it into a more structured and testable prompt template while leaving the original prompt language relatively unchanged.
    - This tool will return a structured prompt template (e.g. \`${templateKey}\`) along with a context schema (e.g. \`${contextSchemaKey}\`) that defines the expected input parameters for the prompt template.
    - In some cases, a user will want to add test coverage for ALL of the prompts in a given application. In these cases, the AI agent should use this tool to generate a prompt template for each prompt in the application, and should check the entire application for AI prompts that are not already covered by a prompt template in the \`${promptsOutputDirectory}\` directory.

    WHEN SHOULD THIS TOOL BE TRIGGERED?
    - This tool should be triggered whenever the user provides requirements for a new AI-enabled application or a new AI-enabled feature of an existing application (i.e. one that requires a prompt request to an LLM or any AI model).
    - This tool should also be triggered if the user provides a pre-existing prompt or prompt template from their codebase that they want to test, evaluate, or modify.
    - This tool should be triggered even if there are pre-existing files in the \`${promptsOutputDirectory}\` directory with the \`${fileNameTemplate}\` convention (e.g. \`${fileNameExample1}\`, \`${fileNameExample2}\`, \`${fileNameExample3}\`, etc.). Similar files should NEVER be generated directly by the AI agent. Instead, the AI agent should use this tool to first generate a new prompt template.

    PARAMETERS:
    - ${paramsKey}: object
      - ${promptKey}: string (the feature requirements or pre-existing prompt/prompt template that will be used to generate a prompt template. Can be a multi-line string.)
      - ${promptOriginKey}: "${PromptOrigin.codebase}" | "${PromptOrigin.requirements}" (indicates whether the prompt comes from an existing codebase or from new requirements)
      - ${modelKey}: string (the model that the prompt template will be tested against. Explicitly specify the model if it can be inferred from the codebase. Otherwise, defaults to \`${defaultModel}\`.)
      - ${temperatureKey}: number (the temperature of the prompt template. Explicitly specify the temperature if it can be inferred from the codebase. Otherwise, defaults to ${defaultTemperature}.)

    EXAMPLE USAGE (from new requirements):
    {
      "${paramsKey}": {
        "${promptKey}": "Create an app that takes any topic and an age (in years), then renders a 1-minute bedtime story for a person of that age.",
        "${promptOriginKey}": "${PromptOrigin.requirements}",
        "${modelKey}": "${defaultModel}",
        "${temperatureKey}": 1.0
      }
    }

    EXAMPLE USAGE (from pre-existing prompt/prompt template in codebase):
    {
      "${paramsKey}": {
        "${promptKey}": "The user wants a bedtime story about {{topic}} for a person of age {{age}} years old. Please craft a captivating tale that captivates their imagination and provides a delightful bedtime experience.",
        "${promptOriginKey}": "${PromptOrigin.codebase}",
        "${modelKey}": "claude-3-5-sonnet-latest",
        "${temperatureKey}": 0.7
      }
    }

    TOOL OUTPUT INSTRUCTIONS:
    - The tool will return...
      - a \`${templateKey}\` that reformulates the user's prompt into a more structured format.
      - a \`${contextSchemaKey}\` that defines the expected input parameters for the template.
      - a \`${promptOriginKey}\` that indicates whether the prompt comes from an existing prompt or prompt template in the user's codebase or from new requirements.
    - The tool output (the \`${templateKey}\`, \`${contextSchemaKey}\`, and \`${promptOriginKey}\`) will also be used as input to the \`${PromptWorkbenchToolName.recommend_prompt_template_tests}\` tool to generate a list of recommended tests that can be used to test the prompt template.
    `,
      inputSchema: createPromptTemplateInputSchema,
    };
  • Top-level registration of the 'createPromptTemplateTool' in the CCI_TOOLS array and mapping of the 'create_prompt_template' handler in CCI_HANDLERS for MCP integration.
    export const CCI_TOOLS = [
      getBuildFailureLogsTool,
      getFlakyTestLogsTool,
      getLatestPipelineStatusTool,
      getJobTestResultsTool,
      configHelperTool,
      createPromptTemplateTool,
      recommendPromptTemplateTestsTool,
      runPipelineTool,
      listFollowedProjectsTool,
      runEvaluationTestsTool,
      rerunWorkflowTool,
      analyzeDiffTool,
      runRollbackPipelineTool,
    ];

    // Extract the tool names as a union type
    type CCIToolName = (typeof CCI_TOOLS)[number]['name'];

    export type ToolHandler<T extends CCIToolName> = ToolCallback<{
      params: Extract<(typeof CCI_TOOLS)[number], { name: T }>['inputSchema'];
    }>;

    // Create a type for the tool handlers that directly maps each tool to its appropriate input schema
    type ToolHandlers = {
      [K in CCIToolName]: ToolHandler<K>;
    };

    export const CCI_HANDLERS = {
      get_build_failure_logs: getBuildFailureLogs,
      find_flaky_tests: getFlakyTestLogs,
      get_latest_pipeline_status: getLatestPipelineStatus,
      get_job_test_results: getJobTestResults,
      config_helper: configHelper,
      create_prompt_template: createPromptTemplate,
      recommend_prompt_template_tests: recommendPromptTemplateTests,
      run_pipeline: runPipeline,
      list_followed_projects: listFollowedProjects,
      run_evaluation_tests: runEvaluationTests,
      rerun_workflow: rerunWorkflow,
      analyze_diff: analyzeDiff,
      run_rollback_pipeline: runRollbackPipeline,
    } satisfies ToolHandlers;
