MCP Simple OpenAI Assistant

by andybrandt

create_new_assistant_thread

Create a persistent conversation thread with a custom name and description for ongoing interactions with OpenAI assistants, enabling easy identification and reuse across sessions.

Instructions

Creates a new, persistent conversation thread with a user-defined name and description for easy identification and reuse. These threads are stored on OpenAI's servers and are not deleted unless the user deletes them, so you can reuse them in future conversations. The thread name and description are also stored in the local database, so you can list and update them later.

Consider how threads can serve your particular use case.

Input Schema

Name        | Required | Description                                | Default
----------- | -------- | ------------------------------------------ | -------
description | No       | A user-defined description for the thread  | None
name        | Yes      | A user-defined name for the thread         |

Implementation Reference

  • FastMCP tool handler decorated with @app.tool(). It receives the input parameters, delegates to AssistantManager, wraps any failure in a ToolError, and returns a formatted response.
    @app.tool(
        annotations={"title": "Create New Assistant Thread", "readOnlyHint": False}
    )
    async def create_new_assistant_thread(
        name: str, description: Optional[str] = None
    ) -> str:
        """Creates a new, persistent conversation thread with a user-defined
        name and description for easy identification and reuse.

        These threads are stored in OpenAI's servers and are not deleted
        unless the user deletes them, which means you can re-use them for
        future conversations. Additionally, the thread name and description
        are stored in the local database, which means you can list them and
        update them later.

        Think how you can utilize threads in your particular use case.
        """
        if not manager:
            raise ToolError("AssistantManager not initialized.")
        try:
            thread = await manager.create_new_assistant_thread(name, description)
            return f"Created new thread '{name}' with ID: {thread.id}"
        except Exception as e:
            raise ToolError(f"Failed to create thread: {e}")
  • AssistantManager method implementing the core logic: it creates the OpenAI thread with metadata, persists the record to the local ThreadStore, and returns the Thread object.
    async def create_new_assistant_thread(
        self, name: str, description: Optional[str] = None
    ) -> Thread:
        """Creates a new, persistent conversation thread."""
        metadata = {"name": name, "description": description or ""}
        thread = self.client.beta.threads.create(metadata=metadata)
        self.thread_store.add_thread(thread.id, name, description)
        return thread
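The `description or ""` fallback matters because OpenAI thread metadata values must be strings, so a missing description is stored as an empty string rather than None. A standalone sketch of that normalization (the helper name `build_thread_metadata` is an assumption for illustration, not a function in the project):

```python
from typing import Optional

def build_thread_metadata(name: str, description: Optional[str] = None) -> dict:
    # Metadata values must be strings, so coerce a missing
    # description to "" instead of passing None through.
    return {"name": name, "description": description or ""}

print(build_thread_metadata("research-notes"))
```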
  • ThreadStore.add_thread persists the new thread's metadata (ID, name, description) to the local SQLite database.
    def add_thread(self, thread_id: str, name: str, description: str | None) -> int:
        """Adds a new thread record to the database.

        Args:
            thread_id: The unique identifier for the thread from OpenAI.
            name: A user-defined name for the thread.
            description: A user-defined description for the thread.

        Returns:
            The row ID of the newly inserted thread.
        """
        conn = self._get_connection()
        cursor = conn.cursor()
        cursor.execute("""
            INSERT INTO threads (thread_id, name, description, last_used_at)
            VALUES (?, ?, ?, ?)
        """, (thread_id, name, description, datetime.now(timezone.utc)))
        conn.commit()
        return cursor.lastrowid
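The INSERT above assumes a `threads` table already exists. The project's actual column definitions are not shown on this page, so the schema below is a minimal sketch inferred from the INSERT statement, demonstrated end to end against an in-memory SQLite database:

```python
import sqlite3
from datetime import datetime, timezone

# Hypothetical schema inferred from the INSERT in add_thread;
# the real DDL lives elsewhere in the project.
SCHEMA = """
CREATE TABLE IF NOT EXISTS threads (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    thread_id TEXT NOT NULL UNIQUE,
    name TEXT NOT NULL,
    description TEXT,
    last_used_at TIMESTAMP
)
"""

conn = sqlite3.connect(":memory:")
conn.execute(SCHEMA)
cur = conn.cursor()
cur.execute(
    "INSERT INTO threads (thread_id, name, description, last_used_at) "
    "VALUES (?, ?, ?, ?)",
    ("thread_abc123", "research-notes", None, datetime.now(timezone.utc)),
)
conn.commit()
print(cur.lastrowid)  # first insert into a fresh table → 1
```

`cursor.lastrowid` is what `add_thread` returns to identify the new record locally.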

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/andybrandt/mcp-simple-openai-assistant'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.