
OpenAI Assistant MCP Server

by snilld-ai

create-assistant

Create a new OpenAI assistant with custom instructions, model selection, and file attachments to automate tasks and provide AI-powered responses.

Instructions

Create a new OpenAI assistant

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| name | Yes | The name of the assistant | |
| instructions | Yes | The assistant's instructions | |
| model | Yes | The model to use | gpt-4-turbo |
| temperature | No | The sampling temperature | 0.7 |
| file_ids | No | A list of file IDs to attach to the assistant | |
| enable_file_search | No | Enable file search tool | true |
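As a sketch, a tool call's arguments would follow the schema above. The values below are illustrative only (the assistant name, instructions, and file ID are hypothetical):

```python
# Hypothetical arguments for a "create-assistant" tool call,
# shaped per the input schema above.
arguments = {
    "name": "Support Bot",                # required
    "instructions": "Answer questions about the product docs.",  # required
    "model": "gpt-4-turbo",               # required; schema default is gpt-4-turbo
    "temperature": 0.7,                   # optional; defaults to 0.7
    "file_ids": ["vs_abc123"],            # optional; hypothetical ID
    "enable_file_search": True,           # optional; defaults to true
}

# Minimal check that the schema's required fields are present.
required = {"name", "instructions", "model"}
missing = required - arguments.keys()
assert not missing, f"Missing required arguments: {missing}"
```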

Implementation Reference

  • MCP tool handler dispatch for 'create-assistant': prepares the arguments and calls LLMConnector.create_assistant
    elif name == "create-assistant":
        tools = []
        if arguments.get("enable_file_search", True):
            tools.append({"type": "file_search"})
    
        response = await connector.create_assistant(
            name=arguments["name"],
            instructions=arguments["instructions"],
            model=arguments["model"],
            temperature=arguments.get("temperature", 0.7),
            file_ids=arguments.get("file_ids"),
            tools=tools
        )
        return [types.TextContent(type="text", text=f"Assistant created:\n{response}")]
  • Core implementation of assistant creation using OpenAI Assistants API
    async def create_assistant(self, name: str, instructions: str, model: str, tools: list = None, file_ids: list = None, temperature: float = 0.7):
        try:
            assistant = await self.client.beta.assistants.create(
                name=name,
                instructions=instructions,
                model=model,
                tools=tools or [{"type": "code_interpreter"}],  # an empty or None tools list falls back to the code interpreter
                tool_resources={'file_search': {'vector_store_ids': file_ids}} if file_ids else None,  # file_ids are passed as vector store IDs for file search
                temperature=temperature
            )
            return assistant
        except Exception as e:
            logger.error(f"Failed to create assistant: {str(e)}")
            raise
  • Input schema and description for the create-assistant tool, registered in list_tools()
    types.Tool(
        name="create-assistant",
        description="Create a new OpenAI assistant",
        inputSchema={
            "type": "object",
            "properties": {
                "name": {"type": "string", "description": "The name of the assistant"},
                "instructions": {"type": "string", "description": "The assistant's instructions"},
                "model": {"type": "string", "default": "gpt-4-turbo", "description": "The model to use"},
                "temperature": {"type": "number", "default": 0.7, "description": "The sampling temperature"},
                "file_ids": {"type": "array", "items": {"type": "string"}, "description": "A list of file IDs to attach to the assistant"},
                "enable_file_search": {"type": "boolean", "default": True, "description": "Enable file search tool"}
            },
            "required": ["name", "instructions", "model"]
        }
    ),
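The handler and the connector interact in one subtle way: when the caller sets enable_file_search to false, the handler passes an empty tools list, and the connector's `tools or [{"type": "code_interpreter"}]` fallback then substitutes the code interpreter, because an empty list is falsy in Python. A minimal sketch of that logic, with the two pieces extracted as standalone functions for illustration (the function names are not from the source):

```python
def build_tools(arguments):
    """Mirror of the handler above: file search is on unless the caller opts out."""
    tools = []
    if arguments.get("enable_file_search", True):
        tools.append({"type": "file_search"})
    return tools

def resolve_tools(tools):
    """Mirror of the connector's fallback: an empty (or None) tools list
    is falsy, so it falls back to the code interpreter tool."""
    return tools or [{"type": "code_interpreter"}]

print(resolve_tools(build_tools({})))                             # [{'type': 'file_search'}]
print(resolve_tools(build_tools({"enable_file_search": False})))  # [{'type': 'code_interpreter'}]
```

The net effect is that every assistant created through this tool has at least one tool attached; there is no way to create one with no tools at all.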

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/snilld-ai/openai-assistant-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.