configureOpenAI
Configure OpenAI integration in Spline 3D scenes: generate AI responses, map response fields onto Spline variables, and optionally trigger a request automatically when the scene loads.
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| sceneId | Yes | Scene ID | |
| model | No | OpenAI model to use | gpt-3.5-turbo |
| apiKey | No | OpenAI API key (uses env var if not provided) | |
| prompt | Yes | System prompt/behavior for the AI | |
| requestOnStart | No | Whether to call OpenAI when the scene loads | false |
| variableMappings | No | Mappings from OpenAI response fields to Spline variables | |
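As an illustration, the arguments for a call to this tool might look like the following sketch. The scene ID, prompt, and mapping names are hypothetical values, not taken from a real scene:

```javascript
// Hypothetical arguments for the configureOpenAI tool.
// sceneId, the prompt, and the mapping names are made-up examples.
const args = {
  sceneId: 'scene_abc123',
  model: 'gpt-4o-mini',
  prompt: 'You are a friendly guide inside this 3D scene.',
  requestOnStart: true,
  variableMappings: [
    // Copy the `message` field of the OpenAI response into
    // the Spline variable named `aiMessage`.
    { responseField: 'message', variableName: 'aiMessage' },
  ],
};

console.log(JSON.stringify(args, null, 2));
```

Because `apiKey` is omitted here, the handler would fall back to the `OPENAI_API_KEY` environment variable.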
Implementation Reference
- src/tools/api-webhook-tools.js:283-331 (registration): direct registration of the 'configureOpenAI' MCP tool, including the inline Zod input schema and the async handler that configures OpenAI integration for a Spline scene via an API call.

```js
server.tool(
  'configureOpenAI',
  {
    sceneId: z.string().min(1).describe('Scene ID'),
    model: z.enum(['gpt-3.5-turbo', 'gpt-4-turbo', 'gpt-4o-mini', 'gpt-4o'])
      .default('gpt-3.5-turbo').describe('OpenAI model to use'),
    apiKey: z.string().optional().describe('OpenAI API key (uses env var if not provided)'),
    prompt: z.string().min(1).describe('System prompt/behavior for the AI'),
    requestOnStart: z.boolean().optional().default(false)
      .describe('Whether to call OpenAI when scene loads'),
    variableMappings: z.array(z.object({
      responseField: z.string().describe('Field from API response'),
      variableName: z.string().describe('Spline variable name'),
    })).optional().describe('Mappings from OpenAI response to Spline variables'),
  },
  async ({ sceneId, model, apiKey, prompt, requestOnStart, variableMappings }) => {
    try {
      const openaiConfig = {
        model,
        apiKey: apiKey || process.env.OPENAI_API_KEY,
        prompt,
        requestOnStart: requestOnStart || false,
        ...(variableMappings && { variableMappings }),
      };

      const result = await apiClient.request('POST', `/scenes/${sceneId}/openai`, openaiConfig);

      return {
        content: [
          { type: 'text', text: `OpenAI integration configured successfully with ID: ${result.id}` }
        ]
      };
    } catch (error) {
      return {
        content: [
          { type: 'text', text: `Error configuring OpenAI: ${error.message}` }
        ],
        isError: true
      };
    }
  }
);
```
- src/tools/api-webhook-tools.js:298-330 (handler): the core handler implementing the tool's logic. It constructs the configuration object from the parameters, calls the Spline API to configure the OpenAI integration, and returns a success or error response. Its body appears verbatim inside the registration code above.
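The handler's config construction can be sketched in isolation. `buildOpenAIConfig` is a hypothetical helper name, but the env-var fallback, the `false` default, and the conditional spread mirror the handler's logic:

```javascript
// Hypothetical helper replicating the handler's config construction.
function buildOpenAIConfig({ model, apiKey, prompt, requestOnStart, variableMappings }) {
  return {
    model,
    // Fall back to the OPENAI_API_KEY environment variable when no key is passed.
    apiKey: apiKey || process.env.OPENAI_API_KEY,
    prompt,
    // Coerces an undefined requestOnStart to false.
    requestOnStart: requestOnStart || false,
    // Only include variableMappings when the caller provided them.
    ...(variableMappings && { variableMappings }),
  };
}

const config = buildOpenAIConfig({
  model: 'gpt-3.5-turbo',
  prompt: 'Answer questions about the scene.',
});
console.log(config.requestOnStart);        // false
console.log('variableMappings' in config); // false
```

Note the conditional spread: an omitted `variableMappings` never appears in the payload sent to `/scenes/{sceneId}/openai`, rather than being sent as `undefined` or `null`.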
- Zod input schema defining the tool's parameters (sceneId, model, apiKey, prompt, requestOnStart, variableMappings); it is passed as the second argument in the registration code above.