MCP Simple OpenAI Assistant

by andybrandt

ask_assistant_in_thread

Send a message to an OpenAI assistant within an existing conversation thread and receive its streamed response, allowing a dialogue to continue across calls.

Instructions

Sends a message to an assistant within a specific thread and streams the response. This provides progress updates and the final message in a single call.

Use this to continue a conversation with an assistant in a specific thread. The thread ID can be retrieved from the list_threads tool. The assistant ID can be retrieved from the list_assistants tool. Threads are not inherently linked to a particular assistant, so you can use this tool to talk to any assistant in any thread.

Input Schema

Name          Required   Description   Default
thread_id     Yes
assistant_id  Yes
message       Yes
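
All three parameters are required strings. A call's arguments might look like the following sketch (the thread and assistant IDs are hypothetical placeholders, not real identifiers):

```json
{
  "thread_id": "thread_abc123",
  "assistant_id": "asst_xyz789",
  "message": "Please summarize our discussion so far."
}
```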

Implementation Reference

  • Main handler function that executes the tool logic: sends message to thread, streams assistant response events, accumulates final message, and reports progress via ctx.
    async def ask_assistant_in_thread(thread_id: str, assistant_id: str, message: str, ctx: Context) -> str:
        """
        Sends a message to an assistant within a specific thread and streams the response.
        This provides progress updates and the final message in a single call.
    
        Use this to continue a conversation with an assistant in a specific thread.
        The thread ID can be retrieved from the list_threads tool.
        The assistant ID can be retrieved from the list_assistants tool.
        Threads are not inherently linked to a particular assistant, so you can use this tool to talk to any assistant in any thread.
        """
        if not manager:
            raise ToolError("AssistantManager not initialized.")
    
        final_message = ""
        try:
            await ctx.report_progress(progress=0, message="Starting assistant run...")
            async for event in manager.run_thread(thread_id, assistant_id, message):
                if event.event == 'thread.message.delta':
                    text_delta = event.data.delta.content[0].text
                    final_message += text_delta.value
                    await ctx.report_progress(progress=50, message=f"Assistant writing: {final_message}")
                elif event.event == 'thread.run.step.created':
                    await ctx.report_progress(progress=25, message="Assistant is performing a step...")
            
            await ctx.report_progress(progress=100, message="Run complete.")
            return final_message
    
        except Exception as e:
            raise ToolError(f"An error occurred during the run: {e}")
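The accumulation loop in the handler can be illustrated in isolation. The sketch below uses hypothetical stand-in classes (not the real OpenAI SDK types) that mimic the event shapes the handler reads, to show how successive `thread.message.delta` events build up the final message:

```python
from dataclasses import dataclass

# Minimal stand-ins for the streaming event shapes the handler accesses
# (event.data.delta.content[0].text.value). These are illustrative only.
@dataclass
class Text:
    value: str

@dataclass
class ContentPart:
    text: Text

@dataclass
class Delta:
    content: list

@dataclass
class Data:
    delta: Delta

@dataclass
class Event:
    event: str
    data: Data = None

def accumulate(events):
    """Mirror of the handler's loop: concatenate message deltas into one string."""
    final_message = ""
    for event in events:
        if event.event == "thread.message.delta":
            final_message += event.data.delta.content[0].text.value
    return final_message

events = [
    Event("thread.run.step.created"),
    Event("thread.message.delta", Data(Delta([ContentPart(Text("Hello"))]))),
    Event("thread.message.delta", Data(Delta([ContentPart(Text(", world"))]))),
]
print(accumulate(events))  # -> Hello, world
```

Non-delta events (such as `thread.run.step.created`) are skipped by the accumulator; in the real handler they only trigger progress reports.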
  • FastMCP decorator that registers the ask_assistant_in_thread function as a tool with title and readOnlyHint metadata.
    @app.tool(
        annotations={
            "title": "Ask Assistant in Thread and Stream Response",
            "readOnlyHint": False
        }
    )
  • Supporting async generator that updates thread usage, adds user message, creates and streams the assistant run events, used by the tool handler.
    async def run_thread(
        self,
        thread_id: str,
        assistant_id: str,
        message: str
    ):
        """
        Sends a message to a thread and streams the assistant's response.
        This is an async generator that yields events from the run.
        """
        # Update the last used timestamp
        self.thread_store.update_thread_last_used(thread_id)
    
        # Add the user's message to the thread
        self.client.beta.threads.messages.create(
            thread_id=thread_id,
            role="user",
            content=message
        )
    
        # Stream the assistant's response
        stream = self.client.beta.threads.runs.create(
            thread_id=thread_id,
            assistant_id=assistant_id,
            stream=True
        )
        for event in stream:
            yield event 
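Callers consume `run_thread` with `async for`, exactly as the tool handler does. The same async-generator pattern can be sketched in a self-contained form, with a stubbed event stream standing in for the OpenAI client (the dict events here are hypothetical, not SDK objects):

```python
import asyncio

async def run_thread_stub(messages):
    # Stand-in for run_thread: yield events from an underlying (synchronous)
    # stream, just as the real generator yields events from the run stream.
    for m in messages:
        yield {"event": "thread.message.delta", "value": m}

async def main():
    collected = []
    async for event in run_thread_stub(["a", "b", "c"]):
        collected.append(event["value"])
    return "".join(collected)

result = asyncio.run(main())
print(result)  # -> abc
```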

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/andybrandt/mcp-simple-openai-assistant'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.