CircleCI-Public

mcp-server-circleci

Official

create_prompt_template

Generate structured prompt templates from feature requirements or existing prompts to enable systematic testing and evaluation of AI applications.

Instructions

ABOUT THIS TOOL:

  • This tool is part of a toolchain that generates and provides test cases for a prompt template.

  • This tool helps an AI assistant to generate a prompt template based on one of the following:

    1. feature requirements defined by a user - in which case the tool will generate a new prompt template based on the feature requirements.

    2. a pre-existing prompt or prompt template that a user wants to test, evaluate, or modify - in which case the tool will convert it into a more structured and testable prompt template while leaving the original prompt language relatively unchanged.

  • This tool will return a structured prompt template (e.g. template) along with a context schema (e.g. contextSchema) that defines the expected input parameters for the prompt template.

  • In some cases, a user will want to add test coverage for ALL of the prompts in a given application. In these cases, the AI agent should use this tool to generate a prompt template for each prompt in the application, and should check the entire application for AI prompts that are not already covered by a prompt template in the ./prompts directory.
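The coverage check described in the last bullet can be sketched as a small helper. Note this is an illustrative sketch, not part of the tool's API: the `listPromptTemplates` name and the `appRoot` argument are hypothetical. It simply lists which templates following the `<relevant-name>.prompt.yml` convention already exist under `./prompts`, so the agent can see which prompts remain uncovered:

```typescript
import * as fs from 'node:fs';
import * as path from 'node:path';

// List the prompt templates already present in the ./prompts directory,
// following the <relevant-name>.prompt.yml naming convention.
// (Illustrative helper -- not part of the MCP tool's API.)
export function listPromptTemplates(appRoot: string): string[] {
  const promptsDir = path.join(appRoot, 'prompts');
  if (!fs.existsSync(promptsDir)) return [];
  return fs
    .readdirSync(promptsDir)
    .filter((file) => file.endsWith('.prompt.yml'));
}
```

Any application prompt without a matching entry in this list is a candidate to pass through `create_prompt_template`.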

WHEN SHOULD THIS TOOL BE TRIGGERED?

  • This tool should be triggered whenever the user provides requirements for a new AI-enabled application or a new AI-enabled feature of an existing application (i.e. one that requires a prompt request to an LLM or any AI model).

  • This tool should also be triggered if the user provides a pre-existing prompt or prompt template from their codebase that they want to test, evaluate, or modify.

  • This tool should be triggered even if there are pre-existing files in the ./prompts directory with the <relevant-name>.prompt.yml convention (e.g. bedtime-story-generator.prompt.yml, plant-care-assistant.prompt.yml, customer-support-chatbot.prompt.yml, etc.). Similar files should NEVER be generated directly by the AI agent. Instead, the AI agent should use this tool to first generate a new prompt template.

PARAMETERS:

  • params: object

    • prompt: string (the feature requirements or pre-existing prompt/prompt template that will be used to generate a prompt template. Can be a multi-line string.)

    • promptOrigin: "codebase" | "requirements" (indicates whether the prompt comes from an existing codebase or from new requirements)

    • model: string (the model that the prompt template will be tested against. Explicitly specify the model if it can be inferred from the codebase. Otherwise, defaults to gpt-4.1-mini.)

    • temperature: number (the temperature of the prompt template. Explicitly specify the temperature if it can be inferred from the codebase. Otherwise, defaults to 1.)

EXAMPLE USAGE (from new requirements): { "params": { "prompt": "Create an app that takes any topic and an age (in years), then renders a 1-minute bedtime story for a person of that age.", "promptOrigin": "requirements", "model": "gpt-4.1-mini", "temperature": 1.0 } }

EXAMPLE USAGE (from pre-existing prompt/prompt template in codebase): { "params": { "prompt": "The user wants a bedtime story about {{topic}} for a person of age {{age}} years old. Please craft a captivating tale that captivates their imagination and provides a delightful bedtime experience.", "promptOrigin": "codebase", "model": "claude-3-5-sonnet-latest", "temperature": 0.7 } }

TOOL OUTPUT INSTRUCTIONS:

  • The tool will return...

    • a template that reformulates the user's prompt into a more structured format.

    • a contextSchema that defines the expected input parameters for the template.

    • a promptOrigin that indicates whether the prompt comes from an existing prompt or prompt template in the user's codebase or from new requirements.

  • The tool output -- the template, contextSchema, and promptOrigin -- will also be used as input to the recommend_prompt_template_tests tool to generate a list of recommended tests that can be used to test the prompt template.
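Because the tool returns these values as labeled plain text (as shown in the handler's output format), a client that wants to forward them to `recommend_prompt_template_tests` needs to pull the sections back out. A hypothetical sketch follows; the `parseToolOutput` function and its parsing approach are illustrative, not an official API, though the key labels match those documented above:

```typescript
// Hypothetical sketch: extract the labeled sections (promptOrigin, template,
// contextSchema, model) from the tool's plain-text output so they can be
// forwarded to recommend_prompt_template_tests. Illustrative only.
const OUTPUT_KEYS = ['promptOrigin', 'template', 'contextSchema', 'model'];

export function parseToolOutput(text: string): Record<string, string> {
  const result: Record<string, string> = {};
  let currentKey: string | null = null;
  for (const line of text.split('\n')) {
    const match = line.match(/^(\w+): ?(.*)$/);
    if (match && OUTPUT_KEYS.includes(match[1])) {
      // A new labeled section begins.
      currentKey = match[1];
      result[currentKey] = match[2];
    } else if (currentKey !== null) {
      // Continuation of a multi-line section value.
      result[currentKey] += '\n' + line;
    }
  }
  // Trim the blank separator lines between sections.
  for (const key of Object.keys(result)) result[key] = result[key].trim();
  return result;
}
```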

Input Schema

  Name    Required  Description
  params  No

Implementation Reference

  • The ToolCallback handler function that implements the core logic for the 'create_prompt_template' MCP tool. It invokes the CircletClient to generate a prompt template and returns a structured text response with next steps.
    export const createPromptTemplate: ToolCallback<{
      params: typeof createPromptTemplateInputSchema;
    }> = async (args) => {
      const { prompt, promptOrigin, model, temperature } = args.params ?? {};
    
      const circlet = new CircletClient();
      const promptObject = await circlet.circlet.createPromptTemplate(
        prompt,
        promptOrigin,
      );
    
      return {
        content: [
          {
            type: 'text',
            text: `${promptOriginKey}: ${promptOrigin}
    
    ${promptTemplateKey}: ${promptObject.template}
    
    ${contextSchemaKey}: ${JSON.stringify(promptObject.contextSchema, null, 2)}
    
    ${modelKey}: ${model}
    
    ${temperatureKey}: ${temperature}
    
    NEXT STEP:
    - Immediately call the \`${PromptWorkbenchToolName.recommend_prompt_template_tests}\` tool with:
      - template: the \`${promptTemplateKey}\` above
      - ${contextSchemaKey}: the \`${contextSchemaKey}\` above
      - ${promptOriginKey}: the \`${promptOriginKey}\` above
      - ${modelKey}: the \`${modelKey}\` above
      - ${temperatureKey}: the \`${temperatureKey}\` above
    `,
          },
        ],
      };
    };
  • Zod schema defining the input parameters (prompt, promptOrigin, model, temperature) for the create_prompt_template tool.
    export const createPromptTemplateInputSchema = z.object({
      prompt: z
        .string()
        .describe(
          "The user's application, feature, or product requirements that will be used to generate a prompt template. Alternatively, a pre-existing prompt or prompt template can be provided if a user wants to test, evaluate, or modify it. (Can be a multi-line string.)",
        ),
      promptOrigin: z
        .nativeEnum(PromptOrigin)
        .describe(
          `The origin of the prompt - either "${PromptOrigin.codebase}" for existing prompts from the codebase, or "${PromptOrigin.requirements}" for new prompts from requirements.`,
        ),
      model: z
        .string()
        .default(defaultModel)
        .describe(
          `The model that the prompt template will be tested against. Explicitly specify the model if it can be inferred from the codebase. Otherwise, defaults to \`${defaultModel}\`.`,
        ),
      temperature: z
        .number()
        .default(defaultTemperature)
        .describe(
          `The temperature of the prompt template. Explicitly specify the temperature if it can be inferred from the codebase. Otherwise, defaults to ${defaultTemperature}.`,
        ),
    });
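The `.default()` calls above mean callers may omit `model` and `temperature` and still get `gpt-4.1-mini` and `1`. A hand-rolled sketch of what the schema effectively does on parse; the `applyDefaults` helper and the `PromptParams` interface are illustrative names, not part of the package:

```typescript
// Illustrative stand-in for what createPromptTemplateInputSchema's
// .default() calls accomplish: fill in model/temperature when omitted.
// Helper name and types are hypothetical, not part of the package.
const defaultModel = 'gpt-4.1-mini';
const defaultTemperature = 1;

interface PromptParams {
  prompt: string;
  promptOrigin: 'codebase' | 'requirements';
  model?: string;
  temperature?: number;
}

export function applyDefaults(params: PromptParams): Required<PromptParams> {
  return {
    ...params,
    model: params.model ?? defaultModel,
    temperature: params.temperature ?? defaultTemperature,
  };
}
```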
  • Defines the tool registration object with name 'create_prompt_template', detailed description, and references the input schema.
    export const createPromptTemplateTool = {
      name: PromptWorkbenchToolName.create_prompt_template,
      description: `
      ABOUT THIS TOOL:
      - This tool is part of a toolchain that generates and provides test cases for a prompt template.
      - This tool helps an AI assistant to generate a prompt template based on one of the following:
        1. feature requirements defined by a user - in which case the tool will generate a new prompt template based on the feature requirements.
        2. a pre-existing prompt or prompt template that a user wants to test, evaluate, or modify - in which case the tool will convert it into a more structured and testable prompt template while leaving the original prompt language relatively unchanged.
      - This tool will return a structured prompt template (e.g. \`${templateKey}\`) along with a context schema (e.g. \`${contextSchemaKey}\`) that defines the expected input parameters for the prompt template.
      - In some cases, a user will want to add test coverage for ALL of the prompts in a given application. In these cases, the AI agent should use this tool to generate a prompt template for each prompt in the application, and should check the entire application for AI prompts that are not already covered by a prompt template in the \`${promptsOutputDirectory}\` directory.
    
      WHEN SHOULD THIS TOOL BE TRIGGERED?
      - This tool should be triggered whenever the user provides requirements for a new AI-enabled application or a new AI-enabled feature of an existing application (i.e. one that requires a prompt request to an LLM or any AI model).
      - This tool should also be triggered if the user provides a pre-existing prompt or prompt template from their codebase that they want to test, evaluate, or modify.
      - This tool should be triggered even if there are pre-existing files in the \`${promptsOutputDirectory}\` directory with the \`${fileNameTemplate}\` convention (e.g. \`${fileNameExample1}\`, \`${fileNameExample2}\`, \`${fileNameExample3}\`, etc.). Similar files should NEVER be generated directly by the AI agent. Instead, the AI agent should use this tool to first generate a new prompt template.
    
      PARAMETERS:
      - ${paramsKey}: object
        - ${promptKey}: string (the feature requirements or pre-existing prompt/prompt template that will be used to generate a prompt template. Can be a multi-line string.)
        - ${promptOriginKey}: "${PromptOrigin.codebase}" | "${PromptOrigin.requirements}" (indicates whether the prompt comes from an existing codebase or from new requirements)
        - ${modelKey}: string (the model that the prompt template will be tested against. Explicitly specify the model if it can be inferred from the codebase. Otherwise, defaults to \`${defaultModel}\`.)
        - ${temperatureKey}: number (the temperature of the prompt template. Explicitly specify the temperature if it can be inferred from the codebase. Otherwise, defaults to ${defaultTemperature}.)
    
      EXAMPLE USAGE (from new requirements):
      {
        "${paramsKey}": {
          "${promptKey}": "Create an app that takes any topic and an age (in years), then renders a 1-minute bedtime story for a person of that age.",
          "${promptOriginKey}": "${PromptOrigin.requirements}",
          "${modelKey}": "${defaultModel}",
          "${temperatureKey}": 1.0
        }
      }
    
      EXAMPLE USAGE (from pre-existing prompt/prompt template in codebase):
      {
        "${paramsKey}": {
          "${promptKey}": "The user wants a bedtime story about {{topic}} for a person of age {{age}} years old. Please craft a captivating tale that captivates their imagination and provides a delightful bedtime experience.",
          "${promptOriginKey}": "${PromptOrigin.codebase}",
          "${modelKey}": "claude-3-5-sonnet-latest",
          "${temperatureKey}": 0.7
        }
      }
    
      TOOL OUTPUT INSTRUCTIONS:
      - The tool will return...
        - a \`${templateKey}\` that reformulates the user's prompt into a more structured format.
        - a \`${contextSchemaKey}\` that defines the expected input parameters for the template.
        - a \`${promptOriginKey}\` that indicates whether the prompt comes from an existing prompt or prompt template in the user's codebase or from new requirements.
      - The tool output -- the \`${templateKey}\`, \`${contextSchemaKey}\`, and \`${promptOriginKey}\` -- will also be used as input to the \`${PromptWorkbenchToolName.recommend_prompt_template_tests}\` tool to generate a list of recommended tests that can be used to test the prompt template.
      `,
      inputSchema: createPromptTemplateInputSchema,
    };
  • Registers the createPromptTemplateTool in the main CCI_TOOLS array for the MCP server.
    export const CCI_TOOLS = [
      getBuildFailureLogsTool,
      getFlakyTestLogsTool,
      getLatestPipelineStatusTool,
      getJobTestResultsTool,
      configHelperTool,
      createPromptTemplateTool,
      recommendPromptTemplateTestsTool,
      runPipelineTool,
      listFollowedProjectsTool,
      runEvaluationTestsTool,
      rerunWorkflowTool,
      downloadUsageApiDataTool,
      findUnderusedResourceClassesTool,
      analyzeDiffTool,
      runRollbackPipelineTool,
      listComponentVersionsTool,
    ];
  • Underlying API client method called by the tool handler to create the prompt template via HTTP POST to '/workbench'.
    async createPromptTemplate(
      prompt: string,
      promptOrigin: PromptOrigin,
    ): Promise<PromptObject> {
      const result = await this.client.post<WorkbenchResponse>('/workbench', {
        prompt,
        promptOrigin,
      });
    
      const parsedResult = WorkbenchResponseSchema.safeParse(result);
    
      if (!parsedResult.success) {
        throw new Error(
          `Failed to parse workbench response. Error: ${parsedResult.error.message}`,
        );
      }
    
      return parsedResult.data.workbench;
    }
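The validate-or-throw pattern in `createPromptTemplate` can be mirrored with a hand-rolled type guard. This is a sketch of the pattern only: `isWorkbenchResponse` and `parseWorkbenchResponse` are illustrative stand-ins for the Zod `WorkbenchResponseSchema.safeParse` call shown above:

```typescript
// Hand-rolled stand-in for WorkbenchResponseSchema.safeParse: validate the
// raw HTTP result's shape before trusting it. Names here are illustrative.
interface PromptObject {
  template: string;
  contextSchema: Record<string, unknown>;
}

function isWorkbenchResponse(
  value: unknown,
): value is { workbench: PromptObject } {
  if (typeof value !== 'object' || value === null) return false;
  const workbench = (value as { workbench?: unknown }).workbench;
  return (
    typeof workbench === 'object' &&
    workbench !== null &&
    typeof (workbench as PromptObject).template === 'string' &&
    typeof (workbench as PromptObject).contextSchema === 'object'
  );
}

export function parseWorkbenchResponse(result: unknown): PromptObject {
  if (!isWorkbenchResponse(result)) {
    throw new Error('Failed to parse workbench response.');
  }
  return result.workbench;
}
```

Validating at the HTTP boundary this way means the tool handler can assume a well-formed `template` and `contextSchema` downstream, surfacing malformed responses as a single descriptive error instead of undefined-property failures.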
