recommend_prompt_template_tests
Generates a structured array of recommended test cases for a prompt template, based on its context schema, its origin (codebase or requirements), and the target model.
Instructions
About this tool:
- This tool is part of a toolchain that generates and provides test cases for a prompt template.
- This tool generates an array of recommended tests for a given prompt template.

Parameters:
- params: object
  - promptTemplate: string (the prompt template to be tested)
  - contextSchema: object (the context schema that defines the expected input parameters for the prompt template)
  - promptOrigin: "codebase" | "requirements" (indicates whether the prompt comes from an existing codebase or from new requirements)
  - model: string (the model that the prompt template will be tested against)

Example usage:

```json
{
  "params": {
    "promptTemplate": "The user wants a bedtime story about {{topic}} for a person of age {{age}} years old. Please craft a captivating tale that captivates their imagination and provides a delightful bedtime experience.",
    "contextSchema": {
      "topic": "string",
      "age": "number"
    },
    "promptOrigin": "codebase"
  }
}
```

The tool will return a structured array of test cases that can be used to test the prompt template.
Tool output instructions:
- The tool will return a recommendedTests array that can be used to test the prompt template.
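To make the input and output shapes concrete, here is a minimal sketch of a call and a plausible result. The field names mirror the Example usage above; the `toolCall` and `recommendedTests` names and the sample test descriptions are illustrative placeholders, not actual tool output.

```typescript
// Illustrative only: mirrors the Example usage above. The test descriptions
// below are hypothetical placeholders, not real output from the tool.
const toolCall = {
  name: "recommend_prompt_template_tests",
  params: {
    promptTemplate:
      "The user wants a bedtime story about {{topic}} for a person of age {{age}} years old. " +
      "Please craft a captivating tale that captivates their imagination and provides a delightful bedtime experience.",
    contextSchema: { topic: "string", age: "number" },
    promptOrigin: "codebase",
  },
};

// The tool responds with a recommendedTests array of natural-language test
// descriptions, which downstream steps turn into structured test files.
const recommendedTests: string[] = [
  "Produces an age-appropriate story when age is a small number such as 4",
  "Keeps the story focused on an unusual or highly specific topic string",
];
```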
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| params | No | | |
Input Schema (JSON Schema)
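The generated JSON Schema body is not reproduced on this page. The sketch below is reconstructed by hand from the Zod input schema shown in the Implementation Reference, so field names follow that schema (note it uses `template` rather than `promptTemplate`); the concrete defaults behind `defaultModel` and `defaultTemperature` are not shown here, and the exact generated artifact may differ.

```json
{
  "type": "object",
  "properties": {
    "params": {
      "type": "object",
      "properties": {
        "template": { "type": "string" },
        "contextSchema": {
          "type": "object",
          "additionalProperties": { "type": "string" }
        },
        "promptOrigin": { "type": "string", "enum": ["codebase", "requirements"] },
        "model": { "type": "string" },
        "temperature": { "type": "number" }
      },
      "required": ["template", "contextSchema", "promptOrigin"]
    }
  }
}
```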
Implementation Reference
- The primary handler function that implements the tool logic. It invokes the Circlet client to recommend tests for the given prompt template and context schema, then returns detailed text instructions for saving the results into structured files.

```typescript
export const recommendPromptTemplateTests: ToolCallback<{
  params: typeof recommendPromptTemplateTestsInputSchema;
}> = async (args) => {
  const { template, contextSchema, promptOrigin } = args.params;
  const circlet = new CircletClient();
  const result = await circlet.circlet.recommendPromptTemplateTests({
    template,
    contextSchema,
  });

  const baseInstructions = `${recommendedTestsKey}: ${JSON.stringify(result, null, 2)}

NEXT STEP:
- Immediately save the \`${promptTemplateKey}\`, \`${contextSchemaKey}\`, and \`${recommendedTestsKey}\` to a single file containing the prompt template, context schema, and tests in a simple structured format (e.g. YAML, JSON, or whatever is most appropriate for the language of the current repository).
- The ${fileExtension} file should be named in the format '${fileNameTemplate}' (e.g. '${fileNameExample1}', '${fileNameExample2}', '${fileNameExample3}', etc.)
- The file should have the following keys:
  - \`name\`: string (the name of the prompt template)
  - \`description\`: string (a description of the prompt template)
  - \`version\`: string (the semantic version of the prompt template, e.g. "1.0.0")
  - \`${promptOriginKey}\`: string (the origin of the prompt template, e.g. "${PromptOrigin.codebase}" or "${PromptOrigin.requirements}")
  - \`${modelKey}\`: string (the model used for generating the prompt template and tests)
  - \`${temperatureKey}\`: number (the temperature used for generating the prompt template and tests)
  - \`template\`: multi-line string (the prompt template)
  - \`${contextSchemaKey}\`: object (the \`${contextSchemaKey}\`)
  - \`tests\`: array of objects (based on the \`${recommendedTestsKey}\`)
    - \`name\`: string (a relevant "Title Case" name for the test, based on the content of the \`${recommendedTestsKey}\` array item)
    - \`description\`: string (taken directly from string array item in \`${recommendedTestsKey}\`)
    - \`sampleInputs\`: object[] (the sample inputs for the \`${promptTemplateKey}\` and any tests within \`${recommendedTestsKey}\`)

RULES FOR SAVING FILES:
- The files should be saved in the \`${promptsOutputDirectory}\` directory at the root of the project.
- Files should be written with respect to the prevailing conventions of the current repository.
- The prompt files should be documented with a README description of what they do, and how they work.
  - If a README already exists in the \`${promptsOutputDirectory}\` directory, update it with the new prompt template information.
  - If a README does not exist in the \`${promptsOutputDirectory}\` directory, create one.
- The files should be formatted using the user's preferred conventions.
- Only save the following files (and nothing else):
  - \`${fileNameTemplate}\`
  - \`README.md\``;

  const integrationInstructions =
    promptOrigin === PromptOrigin.codebase
      ? `

FINALLY, ONCE ALL THE FILES ARE SAVED:
1. Ask user if they want to integrate the new templates into their app as a more tested and trustworthy replacement for their pre-existing prompt implementations. (Yes/No)
2. If yes, import the \`${promptsOutputDirectory}\` files into their app, following codebase conventions
3. Only use existing dependencies - no new imports
4. Ensure integration is error-free and builds successfully`
      : '';

  return {
    content: [
      {
        type: 'text',
        text: baseInstructions + integrationInstructions,
      },
    ],
  };
};
```
- Zod schema defining the input parameters for the tool, including template, contextSchema, promptOrigin, model, and temperature.

```typescript
export const recommendPromptTemplateTestsInputSchema = z.object({
  template: z
    .string()
    .describe(
      `The prompt template to be tested. Use the \`promptTemplate\` from the latest \`${PromptWorkbenchToolName.create_prompt_template}\` tool output (if available).`,
    ),
  contextSchema: z
    .record(z.string(), z.string())
    .describe(
      `The context schema that defines the expected input parameters for the prompt template. Use the \`contextSchema\` from the latest \`${PromptWorkbenchToolName.create_prompt_template}\` tool output.`,
    ),
  promptOrigin: z
    .nativeEnum(PromptOrigin)
    .describe(
      `The origin of the prompt template, indicating where it came from (e.g. "${PromptOrigin.codebase}" or "${PromptOrigin.requirements}").`,
    ),
  model: z
    .string()
    .default(defaultModel)
    .describe(
      `The model to use for generating actual prompt outputs for testing. Defaults to ${defaultModel}.`,
    ),
  temperature: z
    .number()
    .default(defaultTemperature)
    .describe(
      `The temperature of the prompt template. Explicitly specify the temperature if it can be inferred from the codebase. Otherwise, defaults to ${defaultTemperature}.`,
    ),
});
```
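  As a usage sketch (assuming the schema, `PromptOrigin`, and the default constants above are in scope), Zod fills in the `model` and `temperature` defaults when those fields are omitted from the input:

```typescript
// Hypothetical parse call, not taken from the repository's tests: fields
// declared with .default() are populated by Zod during parsing when omitted.
const parsed = recommendPromptTemplateTestsInputSchema.parse({
  template:
    'The user wants a bedtime story about {{topic}} for a person of age {{age}} years old.',
  contextSchema: { topic: 'string', age: 'number' },
  promptOrigin: PromptOrigin.codebase,
});

// parsed.model === defaultModel, parsed.temperature === defaultTemperature
```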
- src/tools/recommendPromptTemplateTests/tool.ts:11-43 (registration): Definition of the tool object with name, description, and a reference to the input schema, used for registration.

```typescript
export const recommendPromptTemplateTestsTool = {
  name: PromptWorkbenchToolName.recommend_prompt_template_tests,
  description: `
  About this tool:
  - This tool is part of a toolchain that generates and provides test cases for a prompt template.
  - This tool generates an array of recommended tests for a given prompt template.

  Parameters:
  - ${paramsKey}: object
    - ${promptTemplateKey}: string (the prompt template to be tested)
    - ${contextSchemaKey}: object (the context schema that defines the expected input parameters for the prompt template)
    - ${promptOriginKey}: "${PromptOrigin.codebase}" | "${PromptOrigin.requirements}" (indicates whether the prompt comes from an existing codebase or from new requirements)
    - ${modelKey}: string (the model that the prompt template will be tested against)

  Example usage:
  {
    "${paramsKey}": {
      "${promptTemplateKey}": "The user wants a bedtime story about {{topic}} for a person of age {{age}} years old. Please craft a captivating tale that captivates their imagination and provides a delightful bedtime experience.",
      "${contextSchemaKey}": {
        "topic": "string",
        "age": "number"
      },
      "${promptOriginKey}": "${PromptOrigin.codebase}"
    }
  }

  The tool will return a structured array of test cases that can be used to test the prompt template.

  Tool output instructions:
  - The tool will return a ${recommendedTestsVar} array that can be used to test the prompt template.
  `,
  inputSchema: recommendPromptTemplateTestsInputSchema,
};
```
- src/circleci-tools.ts:58-72 (registration): Registration of the tool handler in the central CCI_HANDLERS map under the key 'recommend_prompt_template_tests'.

```typescript
export const CCI_HANDLERS = {
  get_build_failure_logs: getBuildFailureLogs,
  find_flaky_tests: getFlakyTestLogs,
  get_latest_pipeline_status: getLatestPipelineStatus,
  get_job_test_results: getJobTestResults,
  config_helper: configHelper,
  create_prompt_template: createPromptTemplate,
  recommend_prompt_template_tests: recommendPromptTemplateTests,
  run_pipeline: runPipeline,
  list_followed_projects: listFollowedProjects,
  run_evaluation_tests: runEvaluationTests,
  rerun_workflow: rerunWorkflow,
  analyze_diff: analyzeDiff,
  run_rollback_pipeline: runRollbackPipeline,
} satisfies ToolHandlers;
```
- src/circleci-tools.ts:30-44 (registration): Registration of the tool object in the central CCI_TOOLS array.

```typescript
export const CCI_TOOLS = [
  getBuildFailureLogsTool,
  getFlakyTestLogsTool,
  getLatestPipelineStatusTool,
  getJobTestResultsTool,
  configHelperTool,
  createPromptTemplateTool,
  recommendPromptTemplateTestsTool,
  runPipelineTool,
  listFollowedProjectsTool,
  runEvaluationTestsTool,
  rerunWorkflowTool,
  analyzeDiffTool,
  runRollbackPipelineTool,
];
```
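  For orientation, the sketch below shows one way a server could dispatch an incoming tool call through these registries. `dispatchToolCall` is a hypothetical helper, not code from the repository, and the actual MCP server wiring may differ.

```typescript
// Hypothetical dispatch helper (not from the repository): looks up the handler
// registered for a tool name in CCI_HANDLERS and forwards the arguments to it.
type CciToolName = keyof typeof CCI_HANDLERS;

async function dispatchToolCall(name: string, args: unknown, extra: unknown) {
  if (!(name in CCI_HANDLERS)) {
    throw new Error(`Unknown tool: ${name}`);
  }
  const handler = CCI_HANDLERS[name as CciToolName];
  // Each handler follows the ToolCallback shape used above: (args, extra) => result.
  return handler(args as never, extra as never);
}
```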