# create-assistant
Build custom OpenAI assistants by configuring name, instructions, model, and file attachments for specialized AI tasks.
## Instructions
Create a new OpenAI assistant
## Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| name | Yes | The name of the assistant | |
| instructions | Yes | The assistant's instructions | |
| model | Yes | The model to use | gpt-4-turbo |
| temperature | No | The sampling temperature | |
| file_ids | No | A list of file IDs to attach to the assistant | |
| enable_file_search | No | Enable file search tool | true |
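For illustration, here is a hypothetical tool-call payload that satisfies the schema above, with a minimal check mirroring the schema's `required` list. The assistant name, instructions, and file ID are made-up placeholders, not values from the source.

```python
# Example arguments for a create-assistant tool call.
# The name, instructions, and file ID are illustrative placeholders.
arguments = {
    "name": "Data Analyst",
    "instructions": "Answer questions about the attached reports.",
    "model": "gpt-4-turbo",
    "temperature": 0.2,           # optional; the handler falls back to 0.7
    "file_ids": ["vs_abc123"],    # optional
    "enable_file_search": True,   # optional; defaults to True
}

# Minimal validation mirroring the schema's "required" list.
required = ["name", "instructions", "model"]
missing = [key for key in required if key not in arguments]
assert not missing, f"missing required fields: {missing}"
```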
## Implementation Reference
- **src/mcp_server_openai/server.py:159-172** (handler) — Dispatch handler for the `create-assistant` tool call; prepares arguments and invokes `LLMConnector.create_assistant`.

  ```python
  elif name == "create-assistant":
      tools = []
      if arguments.get("enable_file_search", True):
          tools.append({"type": "file_search"})
      response = await connector.create_assistant(
          name=arguments["name"],
          instructions=arguments["instructions"],
          model=arguments["model"],
          temperature=arguments.get("temperature", 0.7),
          file_ids=arguments.get("file_ids"),
          tools=tools,
      )
      return [types.TextContent(type="text", text=f"Assistant created:\n{response}")]
  ```
- Input schema definition for the `create-assistant` tool, registered in `list_tools()`.

  ```python
  types.Tool(
      name="create-assistant",
      description="Create a new OpenAI assistant",
      inputSchema={
          "type": "object",
          "properties": {
              "name": {"type": "string", "description": "The name of the assistant"},
              "instructions": {"type": "string", "description": "The assistant's instructions"},
              "model": {"type": "string", "default": "gpt-4-turbo", "description": "The model to use"},
              "temperature": {"type": "number", "default": 0.7, "description": "The sampling temperature"},
              "file_ids": {"type": "array", "items": {"type": "string"}, "description": "A list of file IDs to attach to the assistant"},
              "enable_file_search": {"type": "boolean", "default": True, "description": "Enable file search tool"},
          },
          "required": ["name", "instructions", "model"],
      },
  ),
  ```
- **src/mcp_server_openai/llm.py:42-55** (helper) — Core implementation of assistant creation using OpenAI's `AsyncOpenAI` client. Note that `file_ids` is forwarded as `vector_store_ids` in `tool_resources`, so the values supplied must be vector store IDs for the file-search tool rather than raw file IDs.

  ```python
  async def create_assistant(self, name: str, instructions: str, model: str,
                             tools: list = None, file_ids: list = None,
                             temperature: float = 0.7):
      try:
          assistant = await self.client.beta.assistants.create(
              name=name,
              instructions=instructions,
              model=model,
              tools=tools or [{"type": "code_interpreter"}],  # default tool if none given
              tool_resources={'file_search': {'vector_store_ids': file_ids}} if file_ids else None,
              temperature=temperature,
          )
          return assistant
      except Exception as e:
          logger.error(f"Failed to create assistant: {str(e)}")
          raise
  ```
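The handler's tool-selection behavior can be sketched as a pure function (the function name is mine, not from the source): `file_search` is included unless the caller explicitly disables it, and if the resulting list is empty, `create_assistant` falls back to `code_interpreter`.

```python
def build_tools(arguments: dict) -> list:
    """Mirror the handler's tool selection: file_search is added
    unless the caller explicitly sets enable_file_search to False."""
    tools = []
    if arguments.get("enable_file_search", True):
        tools.append({"type": "file_search"})
    return tools

# file_search is on by default...
print(build_tools({}))                             # [{'type': 'file_search'}]
# ...and disabling it yields an empty list, which create_assistant
# replaces with the code_interpreter default.
print(build_tools({"enable_file_search": False}))  # []
```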