retell_create_chat_agent

Create a text-based chat agent by configuring LLM engines and webhooks for conversational AI applications.

Instructions

Create a new chat agent for text-based conversations.

Input Schema

Name             Required  Description
response_engine  Yes       The LLM engine configuration
agent_name       No        Display name for the chat agent
webhook_url      No        URL for receiving chat event webhooks

Implementation Reference

  • Handler logic for retell_create_chat_agent: makes a POST request to Retell API /create-chat-agent endpoint using the generic retellRequest helper.
    case "retell_create_chat_agent":
      return retellRequest("/create-chat-agent", "POST", args);
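In context, this `case` presumably sits inside a tool-call handler that switches on the tool name. A minimal self-contained sketch of that dispatch pattern (the `handleToolCall` name and `RetellFn` type are hypothetical, and `retellRequest` is passed in so the sketch runs standalone):

```typescript
// Hypothetical sketch of the dispatch pattern around the case above.
// RetellFn mirrors the signature of the retellRequest helper shown later.
type RetellFn = (
  endpoint: string,
  method: string,
  body?: Record<string, unknown>
) => Promise<unknown>;

async function handleToolCall(
  name: string,
  args: Record<string, unknown>,
  retellRequest: RetellFn
): Promise<unknown> {
  switch (name) {
    case "retell_create_chat_agent":
      // Forward the validated arguments straight to the Retell endpoint.
      return retellRequest("/create-chat-agent", "POST", args);
    default:
      throw new Error(`Unknown tool: ${name}`);
  }
}
```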
  • Input schema defining parameters for creating a chat agent: response_engine (required with type and optional llm_id), agent_name, webhook_url.
    {
      name: "retell_create_chat_agent",
      description: "Create a new chat agent for text-based conversations.",
      inputSchema: {
        type: "object",
        properties: {
          response_engine: {
            type: "object",
            description: "The LLM engine configuration",
            properties: {
              type: {
                type: "string",
                enum: ["retell-llm", "custom-llm"],
                description: "The type of response engine"
              },
              llm_id: {
                type: "string",
                description: "The LLM ID to use"
              }
            },
            required: ["type"]
          },
          agent_name: {
            type: "string",
            description: "Display name for the chat agent"
          },
          webhook_url: {
            type: "string",
            description: "URL for receiving chat event webhooks"
          }
        },
        required: ["response_engine"]
      }
    },
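An arguments object that satisfies this schema might look like the following (the `llm_id` value and URLs are illustrative placeholders, not values from the source):

```json
{
  "response_engine": {
    "type": "retell-llm",
    "llm_id": "llm_abc123"
  },
  "agent_name": "Support Chat",
  "webhook_url": "https://example.com/retell/webhooks"
}
```

Only `response_engine` (with at least its `type`) is required; `agent_name` and `webhook_url` may be omitted.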
  • src/index.ts:1283-1285 (registration)
    Registers the listTools handler which returns the tools array containing retell_create_chat_agent.
    server.setRequestHandler(ListToolsRequestSchema, async () => {
      return { tools };
    });
  • Generic HTTP client for Retell API: handles authentication, request formatting, error handling, and JSON parsing. Used by all tool handlers.
    async function retellRequest(
      endpoint: string,
      method: string = "GET",
      body?: Record<string, unknown>
    ): Promise<unknown> {
      const apiKey = getApiKey();
    
      const headers: Record<string, string> = {
        "Authorization": `Bearer ${apiKey}`,
        "Content-Type": "application/json",
      };
    
      const options: RequestInit = {
        method,
        headers,
      };
    
      if (body && method !== "GET") {
        options.body = JSON.stringify(body);
      }
    
      const response = await fetch(`${RETELL_API_BASE}${endpoint}`, options);
    
      if (!response.ok) {
        const errorText = await response.text();
        throw new Error(`Retell API error (${response.status}): ${errorText}`);
      }
    
      // Handle 204 No Content
      if (response.status === 204) {
        return { success: true };
      }
    
      return response.json();
    }
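The request-assembly logic above can be isolated for illustration. The sketch below mirrors the header and body handling from `retellRequest` in a standalone function (`buildOptions` is a hypothetical name, with the `RequestInit` shape simplified so it runs without a fetch environment):

```typescript
// Mirrors how retellRequest assembles its fetch options:
// bearer-token auth, JSON content type, and no body on GET requests.
function buildOptions(
  method: string,
  apiKey: string,
  body?: Record<string, unknown>
): { method: string; headers: Record<string, string>; body?: string } {
  const options: {
    method: string;
    headers: Record<string, string>;
    body?: string;
  } = {
    method,
    headers: {
      "Authorization": `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
  };
  // GET requests never carry a body, matching the guard in retellRequest.
  if (body && method !== "GET") {
    options.body = JSON.stringify(body);
  }
  return options;
}
```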
  • Retrieves the RETELL_API_KEY from the environment and validates that it is set, throwing otherwise. Used by retellRequest.
    function getApiKey(): string {
      const apiKey = process.env.RETELL_API_KEY;
      if (!apiKey) {
        throw new Error("RETELL_API_KEY environment variable is required");
      }
      return apiKey;
    }
