recommend_prompt_template_tests
Generate test cases for prompt templates to validate functionality against specified models and context schemas.
Instructions
About this tool:

- This tool is part of a toolchain that generates and provides test cases for a prompt template.
- This tool generates an array of recommended tests for a given prompt template.

Parameters:

- params: object
  - promptTemplate: string (the prompt template to be tested)
  - contextSchema: object (the context schema that defines the expected input parameters for the prompt template)
  - promptOrigin: "codebase" | "requirements" (indicates whether the prompt comes from an existing codebase or from new requirements)
  - model: string (the model that the prompt template will be tested against)

Example usage:

```json
{
  "params": {
    "promptTemplate": "The user wants a bedtime story about {{topic}} for a person of age {{age}} years old. Please craft a captivating tale that captivates their imagination and provides a delightful bedtime experience.",
    "contextSchema": {
      "topic": "string",
      "age": "number"
    },
    "promptOrigin": "codebase"
  }
}
```

The tool will return a structured array of test cases that can be used to test the prompt template.

Tool output instructions:

- The tool will return a `recommendedTests` array that can be used to test the prompt template.
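For illustration, a minimal sketch of the result shape a caller can expect; only the `recommendedTests` array is documented above, so the surrounding type name is an assumption:

```typescript
// Hypothetical shape, for illustration only: per the handler below, each entry
// in `recommendedTests` is a plain-language description of a suggested test case.
interface RecommendPromptTemplateTestsResult {
  recommendedTests: string[];
}
```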
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| params | No | Object wrapping the tool's input fields: `promptTemplate`, `contextSchema`, `promptOrigin`, and `model` (see Parameters above). | |
Implementation Reference
- The main handler function for the 'recommend_prompt_template_tests' tool. It calls the Circlet client to recommend tests for the given prompt template and context schema, then returns detailed instructions for saving the prompt template, schema, tests, and README in the prompts directory.

```typescript
export const recommendPromptTemplateTests: ToolCallback<{
  params: typeof recommendPromptTemplateTestsInputSchema;
}> = async (args) => {
  const { template, contextSchema, promptOrigin } = args.params ?? {};

  const circlet = new CircletClient();
  const result = await circlet.circlet.recommendPromptTemplateTests({
    template,
    contextSchema,
  });

  const baseInstructions = `${recommendedTestsKey}: ${JSON.stringify(result, null, 2)}

NEXT STEP:
- Immediately save the \`${promptTemplateKey}\`, \`${contextSchemaKey}\`, and \`${recommendedTestsKey}\` to a single file containing the prompt template, context schema, and tests in a simple structured format (e.g. YAML, JSON, or whatever is most appropriate for the language of the current repository).
- The ${fileExtension} file should be named in the format '${fileNameTemplate}' (e.g. '${fileNameExample1}', '${fileNameExample2}', '${fileNameExample3}', etc.)
- The file should have the following keys:
  - \`name\`: string (the name of the prompt template)
  - \`description\`: string (a description of the prompt template)
  - \`version\`: string (the semantic version of the prompt template, e.g. "1.0.0")
  - \`${promptOriginKey}\`: string (the origin of the prompt template, e.g. "${PromptOrigin.codebase}" or "${PromptOrigin.requirements}")
  - \`${modelKey}\`: string (the model used for generating the prompt template and tests)
  - \`${temperatureKey}\`: number (the temperature used for generating the prompt template and tests)
  - \`template\`: multi-line string (the prompt template)
  - \`${contextSchemaKey}\`: object (the \`${contextSchemaKey}\`)
  - \`tests\`: array of objects (based on the \`${recommendedTestsKey}\`)
    - \`name\`: string (a relevant "Title Case" name for the test, based on the content of the \`${recommendedTestsKey}\` array item)
    - \`description\`: string (taken directly from string array item in \`${recommendedTestsKey}\`)
    - \`sampleInputs\`: object[] (the sample inputs for the \`${promptTemplateKey}\` and any tests within \`${recommendedTestsKey}\`)

RULES FOR SAVING FILES:
- The files should be saved in the \`${promptsOutputDirectory}\` directory at the root of the project.
- Files should be written with respect to the prevailing conventions of the current repository.
- The prompt files should be documented with a README description of what they do, and how they work.
- If a README already exists in the \`${promptsOutputDirectory}\` directory, update it with the new prompt template information.
- If a README does not exist in the \`${promptsOutputDirectory}\` directory, create one.
- The files should be formatted using the user's preferred conventions.
- Only save the following files (and nothing else):
  - \`${fileNameTemplate}\`
  - \`README.md\``;

  const integrationInstructions =
    promptOrigin === PromptOrigin.codebase
      ? `

FINALLY, ONCE ALL THE FILES ARE SAVED:
1. Ask user if they want to integrate the new templates into their app as a more tested and trustworthy replacement for their pre-existing prompt implementations. (Yes/No)
2. If yes, import the \`${promptsOutputDirectory}\` files into their app, following codebase conventions
3. Only use existing dependencies - no new imports
4. Ensure integration is error-free and builds successfully`
      : '';

  return {
    content: [
      {
        type: 'text',
        text: baseInstructions + integrationInstructions,
      },
    ],
  };
};
```
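To make the file layout described in `baseInstructions` concrete, here is a hedged TypeScript sketch of the keys the saved prompt file is asked to contain; the type name is invented for illustration, and the field list simply mirrors the instructions above:

```typescript
// Invented type name; fields mirror the keys listed in `baseInstructions`.
interface SavedPromptTemplateFile {
  name: string;                 // name of the prompt template
  description: string;          // what the template does
  version: string;              // semantic version, e.g. "1.0.0"
  promptOrigin: 'codebase' | 'requirements';
  model: string;                // model used for generating the template and tests
  temperature: number;          // temperature used for generating the template and tests
  template: string;             // the multi-line prompt template itself
  contextSchema: Record<string, string>;
  tests: Array<{
    name: string;               // "Title Case" test name
    description: string;        // taken from the recommendedTests array item
    sampleInputs: Array<Record<string, unknown>>;
  }>;
}
```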
- Zod input schema defining the parameters for the tool: template (string), contextSchema (record), promptOrigin (enum), model (string, with default), temperature (number, with default).

```typescript
export const recommendPromptTemplateTestsInputSchema = z.object({
  template: z
    .string()
    .describe(
      `The prompt template to be tested. Use the \`promptTemplate\` from the latest \`${PromptWorkbenchToolName.create_prompt_template}\` tool output (if available).`,
    ),
  contextSchema: z
    .record(z.string(), z.string())
    .describe(
      `The context schema that defines the expected input parameters for the prompt template. Use the \`contextSchema\` from the latest \`${PromptWorkbenchToolName.create_prompt_template}\` tool output.`,
    ),
  promptOrigin: z
    .nativeEnum(PromptOrigin)
    .describe(
      `The origin of the prompt template, indicating where it came from (e.g. "${PromptOrigin.codebase}" or "${PromptOrigin.requirements}").`,
    ),
  model: z
    .string()
    .default(defaultModel)
    .describe(
      `The model to use for generating actual prompt outputs for testing. Defaults to ${defaultModel}.`,
    ),
  temperature: z
    .number()
    .default(defaultTemperature)
    .describe(
      `The temperature of the prompt template. Explicitly specify the temperature if it can be inferred from the codebase. Otherwise, defaults to ${defaultTemperature}.`,
    ),
});
```
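A minimal validation sketch, assuming the schema and `PromptOrigin` are imported; it shows the schema filling in the `model` and `temperature` defaults when those fields are omitted (the literal values are illustrative, not taken from the source):

```typescript
// Illustrative input; `model` and `temperature` fall back to the schema defaults.
const parsed = recommendPromptTemplateTestsInputSchema.parse({
  template: 'Summarize {{document}} for a {{audience}} audience.',
  contextSchema: { document: 'string', audience: 'string' },
  promptOrigin: PromptOrigin.codebase,
});

console.log(parsed.model, parsed.temperature); // schema defaults applied
```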
- src/tools/recommendPromptTemplateTests/tool.ts:11-43 (registration): Local tool registration defining the MCP tool object with name, description, and input schema reference.

```typescript
export const recommendPromptTemplateTestsTool = {
  name: PromptWorkbenchToolName.recommend_prompt_template_tests,
  description: `
  About this tool:
  - This tool is part of a toolchain that generates and provides test cases for a prompt template.
  - This tool generates an array of recommended tests for a given prompt template.

  Parameters:
  - ${paramsKey}: object
    - ${promptTemplateKey}: string (the prompt template to be tested)
    - ${contextSchemaKey}: object (the context schema that defines the expected input parameters for the prompt template)
    - ${promptOriginKey}: "${PromptOrigin.codebase}" | "${PromptOrigin.requirements}" (indicates whether the prompt comes from an existing codebase or from new requirements)
    - ${modelKey}: string (the model that the prompt template will be tested against)

  Example usage:
  {
    "${paramsKey}": {
      "${promptTemplateKey}": "The user wants a bedtime story about {{topic}} for a person of age {{age}} years old. Please craft a captivating tale that captivates their imagination and provides a delightful bedtime experience.",
      "${contextSchemaKey}": {
        "topic": "string",
        "age": "number"
      },
      "${promptOriginKey}": "${PromptOrigin.codebase}"
    }
  }

  The tool will return a structured array of test cases that can be used to test the prompt template.

  Tool output instructions:
  - The tool will return a ${recommendedTestsVar} array that can be used to test the prompt template.
  `,
  inputSchema: recommendPromptTemplateTestsInputSchema,
};
```
- src/circleci-tools.ts:37-54 (registration): Registration of the tool in the main CCI_TOOLS array export.

```typescript
export const CCI_TOOLS = [
  getBuildFailureLogsTool,
  getFlakyTestLogsTool,
  getLatestPipelineStatusTool,
  getJobTestResultsTool,
  configHelperTool,
  createPromptTemplateTool,
  recommendPromptTemplateTestsTool,
  runPipelineTool,
  listFollowedProjectsTool,
  runEvaluationTestsTool,
  rerunWorkflowTool,
  downloadUsageApiDataTool,
  findUnderusedResourceClassesTool,
  analyzeDiffTool,
  runRollbackPipelineTool,
  listComponentVersionsTool,
];
```
- src/circleci-tools.ts:68-85 (registration): Registration of the tool handler in the CCI_HANDLERS object with key 'recommend_prompt_template_tests'.

```typescript
export const CCI_HANDLERS = {
  get_build_failure_logs: getBuildFailureLogs,
  find_flaky_tests: getFlakyTestLogs,
  get_latest_pipeline_status: getLatestPipelineStatus,
  get_job_test_results: getJobTestResults,
  config_helper: configHelper,
  create_prompt_template: createPromptTemplate,
  recommend_prompt_template_tests: recommendPromptTemplateTests,
  run_pipeline: runPipeline,
  list_followed_projects: listFollowedProjects,
  run_evaluation_tests: runEvaluationTests,
  rerun_workflow: rerunWorkflow,
  download_usage_api_data: downloadUsageApiData,
  find_underused_resource_classes: findUnderusedResourceClasses,
  analyze_diff: analyzeDiff,
  run_rollback_pipeline: runRollbackPipeline,
  list_component_versions: listComponentVersions,
} satisfies ToolHandlers;
```
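A hedged sketch of how an incoming tool name could be resolved against `CCI_HANDLERS`; the helper below is an assumption for illustration and is not part of the source:

```typescript
// Hypothetical helper (assumption, not in the source): check that an incoming
// tool name is registered before the server invokes the matching callback.
function resolveHandler(name: string) {
  if (name in CCI_HANDLERS) {
    return CCI_HANDLERS[name as keyof typeof CCI_HANDLERS];
  }
  throw new Error(`Unknown tool: ${name}`);
}

// e.g. resolveHandler('recommend_prompt_template_tests') returns the
// `recommendPromptTemplateTests` callback shown earlier; the MCP server then
// calls it with the validated `params` plus request context.
```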