retell_create_llm

Configure custom AI agents by setting model parameters, system prompts, and conversation tools for voice and chat interactions.

Instructions

Create a new Retell LLM configuration with custom prompts and settings.

Input Schema

| Name | Required | Description | Default |
|------|----------|-------------|---------|
| model | Yes | The base model (e.g., 'gpt-4o', 'claude-3.5-sonnet', 'gpt-4o-mini') | |
| general_prompt | Yes | The main system prompt defining the agent's behavior and personality | |
| begin_message | No | The greeting message the agent says when the call starts | |
| general_tools | No | Array of tool configurations for function calling | |
| inbound_dynamic_variables_webhook_url | No | Webhook URL to fetch dynamic variables for inbound calls | |
| knowledge_base_ids | No | Array of knowledge base IDs to use | |
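Only `model` and `general_prompt` are required, so a minimal call needs just those two fields. A sketch of such an arguments object (the model name and prompt text here are illustrative, not prescribed by the schema):

```typescript
// Minimal arguments for retell_create_llm: only the two required
// fields from the schema. Every other property is optional.
const args = {
  model: "gpt-4o",
  general_prompt:
    "You are a friendly appointment-scheduling assistant. " +
    "Keep answers short and confirm dates back to the caller.",
  // Optional fields could be added here, for example:
  // begin_message: "Hi! Thanks for calling.",
  // knowledge_base_ids: ["kb_example_id"],
};
```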

Implementation Reference

  • Handler logic for the 'retell_create_llm' tool in the executeTool switch statement; it forwards the arguments to Retell's /create-retell-llm API endpoint via the retellRequest helper.
    case "retell_create_llm":
      return retellRequest("/create-retell-llm", "POST", args);
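The `retellRequest` helper itself is defined elsewhere in `src/index.ts` and is not shown on this page. A plausible sketch of what it might look like, assuming Retell's public API base URL, Bearer-token auth, and a `RETELL_API_KEY` environment variable (all three are assumptions, not confirmed by this document):

```typescript
// Hypothetical sketch of the retellRequest helper; the real one lives
// elsewhere in src/index.ts. Base URL and auth scheme are assumptions.
const RETELL_API_BASE = "https://api.retellai.com"; // assumed base URL

// Build URL and fetch options separately so they can be inspected
// (and tested) without making a network call.
function buildRetellRequest(
  path: string,
  method: string,
  body?: unknown,
  apiKey = process.env.RETELL_API_KEY ?? ""
): { url: string; init: RequestInit } {
  return {
    url: `${RETELL_API_BASE}${path}`,
    init: {
      method,
      headers: {
        Authorization: `Bearer ${apiKey}`,
        "Content-Type": "application/json",
      },
      body: body === undefined ? undefined : JSON.stringify(body),
    },
  };
}

async function retellRequest(path: string, method: string, body?: unknown) {
  const { url, init } = buildRetellRequest(path, method, body);
  const res = await fetch(url, init);
  if (!res.ok) throw new Error(`Retell API error ${res.status}`);
  return res.json();
}
```

Splitting request construction from the `fetch` call keeps the network edge thin and the logic unit-testable.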
  • src/index.ts:702-739 (registration)
    Registration of the 'retell_create_llm' tool in the tools array, including name, description, and inputSchema for validation.
    {
      name: "retell_create_llm",
      description: "Create a new Retell LLM configuration with custom prompts and settings.",
      inputSchema: {
        type: "object",
        properties: {
          model: {
            type: "string",
            description: "The base model (e.g., 'gpt-4o', 'claude-3.5-sonnet', 'gpt-4o-mini')"
          },
          general_prompt: {
            type: "string",
            description: "The main system prompt defining the agent's behavior and personality"
          },
          begin_message: {
            type: "string",
            description: "Optional: The greeting message the agent says when the call starts"
          },
          general_tools: {
            type: "array",
            description: "Optional: Array of tool configurations for function calling",
            items: {
              type: "object"
            }
          },
          inbound_dynamic_variables_webhook_url: {
            type: "string",
            description: "Optional: Webhook URL to fetch dynamic variables for inbound calls"
          },
          knowledge_base_ids: {
            type: "array",
            items: { type: "string" },
            description: "Optional: Array of knowledge base IDs to use"
          }
        },
        required: ["model", "general_prompt"]
      }
    },
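The `required: ["model", "general_prompt"]` entry above means a well-behaved caller should reject arguments missing either field before hitting the API. A small sketch of that pre-flight check (`missingRequired` is a hypothetical helper, not part of this server's code):

```typescript
// Hypothetical pre-flight check mirroring the inputSchema's
// "required" list; not part of the actual server implementation.
const REQUIRED_FIELDS = ["model", "general_prompt"] as const;

function missingRequired(args: Record<string, unknown>): string[] {
  return REQUIRED_FIELDS.filter(
    (f) => args[f] === undefined || args[f] === null
  );
}
```

For example, `missingRequired({ model: "gpt-4o" })` returns `["general_prompt"]`, while an object with both required fields returns an empty array.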
