retell_create_llm
Configure a custom AI agent with specific prompts, model selection, and tools for voice or chat interactions.
Instructions
Create a new Retell LLM configuration with custom prompts and settings.
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| model | Yes | The base model (e.g., 'gpt-4o', 'claude-3.5-sonnet', 'gpt-4o-mini') | |
| general_prompt | Yes | The main system prompt defining the agent's behavior and personality | |
| begin_message | No | The greeting message the agent says when the call starts | |
| general_tools | No | Array of tool configurations for function calling | |
| inbound_dynamic_variables_webhook_url | No | Webhook URL to fetch dynamic variables for inbound calls | |
| knowledge_base_ids | No | Array of knowledge base IDs to use | |
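
For orientation, a call to this tool might pass arguments like the following. Only model and general_prompt are required; the model value mirrors one of the schema's examples, while the prompt, greeting, webhook URL, and knowledge base ID are purely illustrative placeholders, not values from the source.

```typescript
// Example retell_create_llm arguments; all string values below are placeholders.
const args = {
  model: "gpt-4o",
  general_prompt:
    "You are a friendly scheduling assistant for Acme Dental. Keep responses short " +
    "and confirm the date, time, and patient name before booking an appointment.",
  begin_message: "Hi, thanks for calling Acme Dental! How can I help you today?",
  inbound_dynamic_variables_webhook_url: "https://example.com/retell/dynamic-variables",
  knowledge_base_ids: ["kb_0000000000"],
};
```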
Implementation Reference
- src/index.ts:1201-1202 (handler): Handler case within the executeTool switch statement that invokes the retellRequest helper to POST the provided arguments to the Retell API /create-retell-llm endpoint (a sketch of the surrounding dispatcher follows this list).

  ```typescript
  case "retell_create_llm":
    return retellRequest("/create-retell-llm", "POST", args);
  ```
- src/index.ts:702-739 (schema): Tool schema definition including name, description, and detailed input schema for validating parameters when creating a new Retell LLM.

  ```typescript
  {
    name: "retell_create_llm",
    description: "Create a new Retell LLM configuration with custom prompts and settings.",
    inputSchema: {
      type: "object",
      properties: {
        model: {
          type: "string",
          description: "The base model (e.g., 'gpt-4o', 'claude-3.5-sonnet', 'gpt-4o-mini')",
        },
        general_prompt: {
          type: "string",
          description: "The main system prompt defining the agent's behavior and personality",
        },
        begin_message: {
          type: "string",
          description: "Optional: The greeting message the agent says when call starts",
        },
        general_tools: {
          type: "array",
          description: "Optional: Array of tool configurations for function calling",
          items: { type: "object" },
        },
        inbound_dynamic_variables_webhook_url: {
          type: "string",
          description: "Optional: Webhook URL to fetch dynamic variables for inbound calls",
        },
        knowledge_base_ids: {
          type: "array",
          items: { type: "string" },
          description: "Optional: Array of knowledge base IDs to use",
        },
      },
      required: ["model", "general_prompt"],
    },
  },
  ```
- src/index.ts:1283-1285 (registration): Registration of the list-tools handler, which returns the full tools array including retell_create_llm.

  ```typescript
  server.setRequestHandler(ListToolsRequestSchema, async () => {
    return { tools };
  });
  ```
- src/index.ts:23-57 (helper): Core helper function that handles all authenticated HTTP requests to the Retell AI API, used by the tool handler. It relies on getApiKey and RETELL_API_BASE, which are defined elsewhere in src/index.ts (a sketch follows this list).

  ```typescript
  async function retellRequest(
    endpoint: string,
    method: string = "GET",
    body?: Record<string, unknown>
  ): Promise<unknown> {
    const apiKey = getApiKey();

    const headers: Record<string, string> = {
      "Authorization": `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    };

    const options: RequestInit = {
      method,
      headers,
    };

    if (body && method !== "GET") {
      options.body = JSON.stringify(body);
    }

    const response = await fetch(`${RETELL_API_BASE}${endpoint}`, options);

    if (!response.ok) {
      const errorText = await response.text();
      throw new Error(`Retell API error (${response.status}): ${errorText}`);
    }

    // Handle 204 No Content
    if (response.status === 204) {
      return { success: true };
    }

    return response.json();
  }
  ```
- src/index.ts:1287-1291 (registration): MCP server request handler for tool execution (CallToolRequestSchema) that dispatches to executeTool based on the tool name. The excerpt ends inside the try block (a sketch of a typical completion follows this list).

  ```typescript
  // Register tool execution handler
  server.setRequestHandler(CallToolRequestSchema, async (request) => {
    const { name, arguments: args } = request.params;

    try {
  ```
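
The handler case at src/index.ts:1201-1202 sits inside an executeTool switch that is not shown in full above. The sketch below assumes a plain (name, args) signature and a throwing default branch; only the retell_create_llm case is taken from the source.

```typescript
// Assumed shape of the dispatcher; only the retell_create_llm case is from the source.
async function executeTool(name: string, args: Record<string, unknown>): Promise<unknown> {
  switch (name) {
    case "retell_create_llm":
      return retellRequest("/create-retell-llm", "POST", args);
    // ... cases for the server's other Retell tools
    default:
      throw new Error(`Unknown tool: ${name}`);
  }
}
```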
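
The retellRequest helper references getApiKey and RETELL_API_BASE, which are defined elsewhere in src/index.ts and not included in the excerpt. A minimal sketch of what they might look like, assuming the key is read from a RETELL_API_KEY environment variable and the base URL points at the public Retell API host (both assumptions, not verified against the source):

```typescript
// Assumed definitions; the env variable name and base URL are illustrative.
const RETELL_API_BASE = "https://api.retellai.com";

function getApiKey(): string {
  const apiKey = process.env.RETELL_API_KEY;
  if (!apiKey) {
    throw new Error("RETELL_API_KEY environment variable is not set");
  }
  return apiKey;
}
```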
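
The CallToolRequestSchema excerpt is cut off inside the try block. A plausible completion, assuming the common MCP pattern of returning the tool result as JSON text content and flagging failures with isError (a sketch, not the source's actual code):

```typescript
server.setRequestHandler(CallToolRequestSchema, async (request) => {
  const { name, arguments: args } = request.params;

  try {
    // Dispatch to the tool implementation and return its result as text content.
    const result = await executeTool(name, args ?? {});
    return {
      content: [{ type: "text", text: JSON.stringify(result, null, 2) }],
    };
  } catch (error) {
    // Surface Retell API and validation errors to the MCP client.
    const message = error instanceof Error ? error.message : String(error);
    return {
      content: [{ type: "text", text: `Error: ${message}` }],
      isError: true,
    };
  }
});
```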