# openai_chat
Sends messages to OpenAI's Chat Completions API using one of the supported models (gpt-4o, gpt-4o-mini, o1-preview, o1-mini) for AI-powered conversations and responses.
## Instructions
Use this tool when a user specifically requests to use one of OpenAI's models (gpt-4o, gpt-4o-mini, o1-preview, o1-mini). This tool sends messages to OpenAI's chat completion API using the specified model.
## Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| messages | Yes | Array of messages to send to the API | |
| model | No | Model to use for completion (gpt-4o, gpt-4o-mini, o1-preview, o1-mini) | gpt-4o |
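For reference, a minimal arguments object that satisfies this schema might look like the following. The field values are illustrative only and do not come from the project:

```typescript
// Illustrative arguments for the openai_chat tool; values are examples only.
const exampleArguments = {
  messages: [
    { role: "system", content: "You are a concise assistant." },
    { role: "user", content: "Summarize the MCP protocol in one sentence." }
  ],
  // Optional; the tool falls back to the default model (gpt-4o) when omitted.
  model: "gpt-4o-mini"
};
```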
## Implementation Reference
- **index.ts:95-137 (handler):** Handler for the `openai_chat` tool: parses the input, validates the model, calls `openai.chat.completions.create`, and returns the response or an error.

```typescript
case "openai_chat": {
  try {
    // Parse request arguments
    const { messages: rawMessages, model } = request.params.arguments as {
      messages: Array<{ role: string; content: string }>;
      model?: SupportedModel;
    };

    // Validate model
    if (!SUPPORTED_MODELS.includes(model!)) {
      throw new Error(`Unsupported model: ${model}. Must be one of: ${SUPPORTED_MODELS.join(", ")}`);
    }

    // Convert messages to OpenAI's expected format
    const messages: ChatCompletionMessageParam[] = rawMessages.map(msg => ({
      role: msg.role as "system" | "user" | "assistant",
      content: msg.content
    }));

    // Call OpenAI API with fixed temperature
    const completion = await openai.chat.completions.create({
      messages,
      model: model!
    });

    // Return the response
    return {
      content: [{
        type: "text",
        text: completion.choices[0]?.message?.content || "No response received"
      }]
    };
  } catch (error) {
    return {
      content: [{
        type: "text",
        text: `OpenAI API error: ${(error as Error).message}`
      }],
      isError: true
    };
  }
}

default:
```
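The handler relies on a few definitions not shown in this excerpt: the `openai` client, `SUPPORTED_MODELS`, `DEFAULT_MODEL`, and the `SupportedModel` type. A minimal sketch of how they might be declared, assuming the official `openai` npm package and an `OPENAI_API_KEY` environment variable (the real index.ts may differ), is:

```typescript
import OpenAI from "openai";
import type { ChatCompletionMessageParam } from "openai/resources/chat/completions";

// Assumed constants; the actual index.ts may declare these differently.
const SUPPORTED_MODELS = ["gpt-4o", "gpt-4o-mini", "o1-preview", "o1-mini"] as const;
const DEFAULT_MODEL = "gpt-4o";
type SupportedModel = (typeof SUPPORTED_MODELS)[number];

// OpenAI client configured from the environment.
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });
```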
- **index.ts:37-67 (schema):** Input schema for the `openai_chat` tool, defining the `messages` array (role/content) and the optional `model` drawn from the supported list.

```typescript
inputSchema: {
  type: "object",
  properties: {
    messages: {
      type: "array",
      description: "Array of messages to send to the API",
      items: {
        type: "object",
        properties: {
          role: {
            type: "string",
            enum: ["system", "user", "assistant"],
            description: "Role of the message sender"
          },
          content: {
            type: "string",
            description: "Content of the message"
          }
        },
        required: ["role", "content"]
      }
    },
    model: {
      type: "string",
      enum: SUPPORTED_MODELS,
      description: `Model to use for completion (${SUPPORTED_MODELS.join(", ")})`,
      default: DEFAULT_MODEL
    }
  },
  required: ["messages"]
}
```
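Expressed as a TypeScript type, the arguments this schema accepts look roughly like the interface below. This type is an illustration only and is not defined in index.ts:

```typescript
// Hypothetical type mirroring the inputSchema above; not part of the project.
interface OpenAiChatArguments {
  messages: Array<{
    role: "system" | "user" | "assistant";
    content: string;
  }>;
  // Optional; defaults to "gpt-4o" per the schema's `default`.
  model?: "gpt-4o" | "gpt-4o-mini" | "o1-preview" | "o1-mini";
}
```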
- **index.ts:33-69 (registration):** Tool registration in the `TOOLS` array used by the ListTools request handler, including the tool's name, description, and inputSchema.

```typescript
const TOOLS: Tool[] = [
  {
    name: "openai_chat",
    description: `Use this tool when a user specifically requests to use one of OpenAI's models (${SUPPORTED_MODELS.join(", ")}). This tool sends messages to OpenAI's chat completion API using the specified model.`,
    inputSchema: {
      type: "object",
      properties: {
        messages: {
          type: "array",
          description: "Array of messages to send to the API",
          items: {
            type: "object",
            properties: {
              role: {
                type: "string",
                enum: ["system", "user", "assistant"],
                description: "Role of the message sender"
              },
              content: {
                type: "string",
                description: "Content of the message"
              }
            },
            required: ["role", "content"]
          }
        },
        model: {
          type: "string",
          enum: SUPPORTED_MODELS,
          description: `Model to use for completion (${SUPPORTED_MODELS.join(", ")})`,
          default: DEFAULT_MODEL
        }
      },
      required: ["messages"]
    }
  }
];
```
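For context, a `TOOLS` array like this is typically exposed to clients through the MCP server's list-tools and call-tool handlers. The sketch below shows one way that wiring could look, assuming the `@modelcontextprotocol/sdk` server API; the server name, version, and structure are assumptions, not taken from index.ts:

```typescript
import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import {
  CallToolRequestSchema,
  ListToolsRequestSchema,
} from "@modelcontextprotocol/sdk/types.js";

// Hypothetical server setup; name and version are placeholders.
const server = new Server(
  { name: "openai-chat-server", version: "1.0.0" },
  { capabilities: { tools: {} } }
);

// Advertise the tools defined in the TOOLS array.
server.setRequestHandler(ListToolsRequestSchema, async () => ({ tools: TOOLS }));

// Route tool calls; the switch shown in the handler reference above lives here.
server.setRequestHandler(CallToolRequestSchema, async (request) => {
  switch (request.params.name) {
    // case "openai_chat": ... (see the handler excerpt above)
    default:
      throw new Error(`Unknown tool: ${request.params.name}`);
  }
});
```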