retell_create_llm

Configure a custom AI agent with specific prompts, model selection, and tools for voice or chat interactions.

Instructions

Create a new Retell LLM configuration with custom prompts and settings.

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| model | Yes | The base model (e.g., 'gpt-4o', 'claude-3.5-sonnet', 'gpt-4o-mini') | |
| general_prompt | Yes | The main system prompt defining the agent's behavior and personality | |
| begin_message | No | The greeting message the agent says when the call starts | |
| general_tools | No | Array of tool configurations for function calling | |
| inbound_dynamic_variables_webhook_url | No | Webhook URL to fetch dynamic variables for inbound calls | |
| knowledge_base_ids | No | Array of knowledge base IDs to use | |
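A minimal argument payload for this tool, following the schema above, might look like the sketch below. The field values are illustrative, not taken from a real deployment; the snippet also shows a simple client-side check of the schema's `required` list before sending.

```typescript
// Illustrative arguments for retell_create_llm; only model and
// general_prompt are required by the input schema. Values are examples.
const args: Record<string, unknown> = {
  model: "gpt-4o",
  general_prompt: "You are a polite receptionist for a dental clinic.",
  begin_message: "Hi, thanks for calling! How can I help?",
};

// Check the schema's required fields before sending the request.
const required = ["model", "general_prompt"];
const missing = required.filter((key) => !(key in args));
console.log(missing.length === 0 ? "payload ok" : `missing: ${missing.join(", ")}`);
```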

Implementation Reference

  • Handler case within the executeTool switch statement that invokes the retellRequest helper to POST the provided arguments to the Retell API /create-retell-llm endpoint.

    ```typescript
    case "retell_create_llm":
      return retellRequest("/create-retell-llm", "POST", args);
    ```
  • Tool schema definition including name, description, and detailed input schema for validating parameters when creating a new Retell LLM.

    ```typescript
    {
      name: "retell_create_llm",
      description: "Create a new Retell LLM configuration with custom prompts and settings.",
      inputSchema: {
        type: "object",
        properties: {
          model: { type: "string", description: "The base model (e.g., 'gpt-4o', 'claude-3.5-sonnet', 'gpt-4o-mini')" },
          general_prompt: { type: "string", description: "The main system prompt defining the agent's behavior and personality" },
          begin_message: { type: "string", description: "Optional: The greeting message the agent says when call starts" },
          general_tools: { type: "array", description: "Optional: Array of tool configurations for function calling", items: { type: "object" } },
          inbound_dynamic_variables_webhook_url: { type: "string", description: "Optional: Webhook URL to fetch dynamic variables for inbound calls" },
          knowledge_base_ids: { type: "array", items: { type: "string" }, description: "Optional: Array of knowledge base IDs to use" }
        },
        required: ["model", "general_prompt"]
      }
    },
    ```
  • src/index.ts:1283-1285 (registration)
    Registration of the list-tools handler, which returns the full tools array including retell_create_llm.

    ```typescript
    server.setRequestHandler(ListToolsRequestSchema, async () => {
      return { tools };
    });
    ```
  • Core helper function that handles all authenticated HTTP requests to the Retell AI API, used by the tool handler.

    ```typescript
    async function retellRequest(
      endpoint: string,
      method: string = "GET",
      body?: Record<string, unknown>
    ): Promise<unknown> {
      const apiKey = getApiKey();
      const headers: Record<string, string> = {
        "Authorization": `Bearer ${apiKey}`,
        "Content-Type": "application/json",
      };
      const options: RequestInit = { method, headers };
      if (body && method !== "GET") {
        options.body = JSON.stringify(body);
      }
      const response = await fetch(`${RETELL_API_BASE}${endpoint}`, options);
      if (!response.ok) {
        const errorText = await response.text();
        throw new Error(`Retell API error (${response.status}): ${errorText}`);
      }
      // Handle 204 No Content
      if (response.status === 204) {
        return { success: true };
      }
      return response.json();
    }
    ```
  • src/index.ts:1287-1291 (registration)
    MCP server request handler for tool execution (CallToolRequestSchema), which dispatches to executeTool by tool name.

    ```typescript
    // Register tool execution handler
    server.setRequestHandler(CallToolRequestSchema, async (request) => {
      const { name, arguments: args } = request.params;
      try {
    ```
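Taken together, the pieces above form a simple dispatch pipeline: the call handler extracts the tool name and arguments, and a switch statement routes them to the appropriate API request. The sketch below reproduces that pattern in isolation; retellRequest is stubbed to echo its inputs so the example runs without network access or an API key, and the argument values are illustrative.

```typescript
// Minimal offline sketch of the dispatch pattern. The real retellRequest
// POSTs to the Retell API; this stub just echoes the request it would make.
type Args = Record<string, unknown>;

async function retellRequest(
  endpoint: string,
  method: string = "GET",
  body?: Args
): Promise<unknown> {
  // Stub: return the request shape instead of calling the API.
  return { endpoint, method, body };
}

async function executeTool(name: string, args: Args): Promise<unknown> {
  switch (name) {
    case "retell_create_llm":
      return retellRequest("/create-retell-llm", "POST", args);
    default:
      throw new Error(`Unknown tool: ${name}`);
  }
}

executeTool("retell_create_llm", {
  model: "gpt-4o",
  general_prompt: "Be helpful.",
}).then((result) => console.log(result));
```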
