# update-assistant
Modify an existing OpenAI assistant's configuration, including its name, instructions, model, temperature, file attachments, and file search settings.
## Instructions
Update an existing OpenAI assistant
## Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| assistant_id | Yes | The ID of the assistant to update | |
| name | No | The new name of the assistant | |
| instructions | No | The new instructions for the assistant | |
| model | No | The new model for the assistant | |
| temperature | No | The new sampling temperature | |
| file_ids | No | The new list of file IDs attached to the assistant | |
| enable_file_search | No | Enable or disable the file search tool | |
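Only `assistant_id` is required; any parameter left out is not sent, so the corresponding setting is left unchanged. A hypothetical arguments payload (the assistant and file IDs below are made-up examples) might look like:

```python
# Hypothetical arguments for an update-assistant tool call.
# Keys match the input schema above; omitted fields stay unchanged.
arguments = {
    "assistant_id": "asst_abc123",            # required; hypothetical ID
    "instructions": "Answer in formal English.",
    "temperature": 0.2,
    "file_ids": ["file-1", "file-2"],         # hypothetical file IDs
    "enable_file_search": True,
}
```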
## Implementation Reference
- **src/mcp_server_openai/llm.py:57-77** (handler) — core handler implementation in `LLMConnector` that updates the OpenAI assistant via the API, handling kwargs filtering and the special `file_ids` logic:

  ```python
  async def update_assistant(self, assistant_id: str, **kwargs):
      try:
          # Filter out None values to avoid overwriting existing settings with defaults
          update_data = {k: v for k, v in kwargs.items() if v is not None}

          # Handle file_ids separately to fit the tool_resources structure
          if 'file_ids' in update_data:
              file_ids = update_data.pop('file_ids')
              if file_ids:
                  update_data['tool_resources'] = {'file_search': {'vector_store_ids': file_ids}}
              else:
                  # To remove files
                  update_data['tool_resources'] = {}

          if not update_data:
              raise ValueError("No update data provided")

          response = await self.client.beta.assistants.update(assistant_id, **update_data)
          return response
      except Exception as e:
          logger.error(f"Failed to update assistant {assistant_id}: {str(e)}")
          raise
  ```
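  The None-filtering and `file_ids` mapping can be exercised in isolation. The sketch below reproduces just that transformation as a standalone function (`build_update_data` is our name for illustration, not part of the source):

  ```python
  def build_update_data(**kwargs):
      # Drop None values so unset arguments don't overwrite existing settings.
      update_data = {k: v for k, v in kwargs.items() if v is not None}
      # Map file_ids onto the tool_resources shape the Assistants API expects;
      # an explicit empty list clears the attached files.
      if "file_ids" in update_data:
          file_ids = update_data.pop("file_ids")
          update_data["tool_resources"] = (
          {"file_search": {"vector_store_ids": file_ids}} if file_ids else {}
          )
      return update_data

  print(build_update_data(name=None, temperature=0.5, file_ids=["file-1"]))
  # → {'temperature': 0.5, 'tool_resources': {'file_search': {'vector_store_ids': ['file-1']}}}
  ```

  Note the asymmetry this design creates: passing `file_ids=None` leaves attachments untouched (filtered out with the other unset arguments), while `file_ids=[]` actively clears them.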
- **src/mcp_server_openai/server.py:73-89** (registration) — tool registration in `list_tools()`, defining the name, description, and input schema:

  ```python
  types.Tool(
      name="update-assistant",
      description="Update an existing OpenAI assistant",
      inputSchema={
          "type": "object",
          "properties": {
              "assistant_id": {"type": "string", "description": "The ID of the assistant to update"},
              "name": {"type": "string", "description": "The new name of the assistant"},
              "instructions": {"type": "string", "description": "The new instructions for the assistant"},
              "model": {"type": "string", "description": "The new model for the assistant"},
              "temperature": {"type": "number", "description": "The new sampling temperature"},
              "file_ids": {"type": "array", "items": {"type": "string"}, "description": "The new list of file IDs attached to the assistant"},
              "enable_file_search": {"type": "boolean", "description": "Enable or disable the file search tool"}
          },
          "required": ["assistant_id"]
      }
  ),
  ```
- **src/mcp_server_openai/server.py:174-186** (handler) — MCP server `tool_call` dispatcher branch for `update-assistant`, preparing arguments and delegating to `LLMConnector`:

  ```python
  elif name == "update-assistant":
      assistant_id = arguments.pop("assistant_id")
      update_kwargs = arguments.copy()

      if "enable_file_search" in update_kwargs:
          tools = []
          if update_kwargs.pop("enable_file_search"):
              tools.append({"type": "file_search"})
          update_kwargs["tools"] = tools

      response = await connector.update_assistant(assistant_id, **update_kwargs)
      return [types.TextContent(type="text", text=f"Assistant updated:\n{response}")]
  ```
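The dispatcher's argument preparation, in particular translating the boolean `enable_file_search` flag into the `tools` list the connector expects, can be sketched as a pure function (`prepare_update_kwargs` is our name for illustration, not part of the source):

```python
def prepare_update_kwargs(arguments):
    # Mirrors the dispatcher branch: extract assistant_id, then translate
    # the enable_file_search flag into a "tools" list. When the flag is
    # False, an empty tools list is sent, which disables file search.
    args = dict(arguments)
    assistant_id = args.pop("assistant_id")
    if "enable_file_search" in args:
        args["tools"] = (
            [{"type": "file_search"}] if args.pop("enable_file_search") else []
        )
    return assistant_id, args

aid, kwargs = prepare_update_kwargs(
    {"assistant_id": "asst_abc123", "enable_file_search": True}
)
# aid == "asst_abc123"; kwargs == {"tools": [{"type": "file_search"}]}
```

If `enable_file_search` is absent, `tools` is not sent at all, so the assistant's current tool configuration is preserved.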