by snilld-ai

update-assistant

Modify an existing OpenAI assistant's configuration, including its name, instructions, model, temperature, file attachments, and file search settings.

Instructions

Update an existing OpenAI assistant

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| assistant_id | Yes | The ID of the assistant to update | |
| name | No | The new name of the assistant | |
| instructions | No | The new instructions for the assistant | |
| model | No | The new model for the assistant | |
| temperature | No | The new sampling temperature | |
| file_ids | No | The new list of file IDs attached to the assistant | |
| enable_file_search | No | Enable or disable the file search tool | |
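For example, a tool call that renames an assistant and replaces its attached files might send arguments like the following (the assistant and file IDs are placeholders, not real objects):

```python
# Hypothetical example arguments for an "update-assistant" tool call.
# Only "assistant_id" is required; omitted fields are left unchanged.
example_arguments = {
    "assistant_id": "asst_abc123",       # required: which assistant to update
    "name": "Support Bot v2",            # optional: rename the assistant
    "temperature": 0.2,                  # optional: new sampling temperature
    "file_ids": ["file_1", "file_2"],    # optional: replaces the attached file list
    "enable_file_search": True,          # optional: keep the file_search tool enabled
}
```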

Implementation Reference

  • Core handler implementation in LLMConnector that updates the OpenAI assistant via the API, filtering kwargs and applying the special file_ids logic.

    async def update_assistant(self, assistant_id: str, **kwargs):
        try:
            # Filter out None values to avoid overwriting existing settings with defaults
            update_data = {k: v for k, v in kwargs.items() if v is not None}
            # Handle file_ids separately to fit the tool_resources structure
            if 'file_ids' in update_data:
                file_ids = update_data.pop('file_ids')
                if file_ids:
                    update_data['tool_resources'] = {'file_search': {'vector_store_ids': file_ids}}
                else:
                    # To remove files
                    update_data['tool_resources'] = {}
            if not update_data:
                raise ValueError("No update data provided")
            response = await self.client.beta.assistants.update(assistant_id, **update_data)
            return response
        except Exception as e:
            logger.error(f"Failed to update assistant {assistant_id}: {str(e)}")
            raise
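The None-filtering and file_ids rewrapping can be isolated as a pure function. The sketch below is for illustration only; `build_update_payload` is not part of the actual codebase:

```python
def build_update_payload(**kwargs):
    """Hypothetical helper mirroring the handler's kwargs filtering."""
    # Drop None values so unset arguments don't overwrite existing settings
    update_data = {k: v for k, v in kwargs.items() if v is not None}
    # file_ids must be nested under tool_resources for the Assistants API;
    # an empty list clears the attached files
    if "file_ids" in update_data:
        file_ids = update_data.pop("file_ids")
        update_data["tool_resources"] = (
            {"file_search": {"vector_store_ids": file_ids}} if file_ids else {}
        )
    if not update_data:
        raise ValueError("No update data provided")
    return update_data

# temperature=None is dropped; file_ids is rewrapped under tool_resources
payload = build_update_payload(name="New name", temperature=None, file_ids=["file_1"])
```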
  • Tool registration in list_tools(), defining the name, description, and input schema.

    types.Tool(
        name="update-assistant",
        description="Update an existing OpenAI assistant",
        inputSchema={
            "type": "object",
            "properties": {
                "assistant_id": {"type": "string", "description": "The ID of the assistant to update"},
                "name": {"type": "string", "description": "The new name of the assistant"},
                "instructions": {"type": "string", "description": "The new instructions for the assistant"},
                "model": {"type": "string", "description": "The new model for the assistant"},
                "temperature": {"type": "number", "description": "The new sampling temperature"},
                "file_ids": {"type": "array", "items": {"type": "string"}, "description": "The new list of file IDs attached to the assistant"},
                "enable_file_search": {"type": "boolean", "description": "Enable or disable the file search tool"}
            },
            "required": ["assistant_id"]
        }
    ),
  • MCP server tool_call dispatcher branch for update-assistant, preparing arguments and delegating to LLMConnector.

    elif name == "update-assistant":
        assistant_id = arguments.pop("assistant_id")
        update_kwargs = arguments.copy()
        if "enable_file_search" in update_kwargs:
            tools = []
            if update_kwargs.pop("enable_file_search"):
                tools.append({"type": "file_search"})
            update_kwargs["tools"] = tools
        response = await connector.update_assistant(assistant_id, **update_kwargs)
        return [types.TextContent(type="text", text=f"Assistant updated:\n{response}")]
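The enable_file_search translation can likewise be sketched as a standalone function (`translate_tool_flags` is a hypothetical name, not part of the codebase):

```python
def translate_tool_flags(arguments):
    """Hypothetical helper mirroring the dispatcher's enable_file_search handling."""
    update_kwargs = dict(arguments)
    if "enable_file_search" in update_kwargs:
        tools = []
        if update_kwargs.pop("enable_file_search"):
            tools.append({"type": "file_search"})
        update_kwargs["tools"] = tools
    return update_kwargs

enabled = translate_tool_flags({"enable_file_search": True})
disabled = translate_tool_flags({"enable_file_search": False})
```

Note one consequence of this mapping: disabling file search sends `tools=[]`, which clears the assistant's entire tool list, not just the file_search entry.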

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/snilld-ai/openai-assistant-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.