
MCP Simple OpenAI Assistant

by andybrandt

ask_assistant_in_thread

Send a message to an OpenAI assistant within an existing conversation thread and receive streaming responses with progress updates.

Instructions

Sends a message to an assistant within a specific thread and streams the response. This provides progress updates and the final message in a single call.

Use this to continue a conversation with an assistant in a specific thread. The thread ID can be retrieved from the list_threads tool. The assistant ID can be retrieved from the list_assistants tool. Threads are not inherently linked to a particular assistant, so you can use this tool to talk to any assistant in any thread.

Input Schema

Name          Required  Description                                                Default
assistant_id  Yes       ID of the assistant to run; retrieve with list_assistants  —
message       Yes       The message to send to the assistant                       —
thread_id     Yes       ID of the conversation thread; retrieve with list_threads  —
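Assembled into a request, the arguments might look like the following Python dict. Both IDs are invented placeholders for illustration; real values come from the list_threads and list_assistants tools:

```python
# Hypothetical argument payload for ask_assistant_in_thread.
# The IDs below are placeholders, not real OpenAI identifiers.
arguments = {
    "thread_id": "thread_abc123",        # from list_threads
    "assistant_id": "asst_xyz789",       # from list_assistants
    "message": "Summarize our discussion so far.",
}

# All three fields are required by the input schema; none has a default.
required = {"thread_id", "assistant_id", "message"}
assert required <= arguments.keys()
```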

Implementation Reference

  • The handler function that executes the tool logic. It reports progress during the assistant run, accumulates the response message, and delegates the OpenAI API streaming to AssistantManager.run_thread.
    async def ask_assistant_in_thread(thread_id: str, assistant_id: str, message: str, ctx: Context) -> str:
        """
        Sends a message to an assistant within a specific thread and streams the response.
        This provides progress updates and the final message in a single call.

        Use this to continue a conversation with an assistant in a specific thread.
        The thread ID can be retrieved from the list_threads tool. The assistant ID
        can be retrieved from the list_assistants tool. Threads are not inherently
        linked to a particular assistant, so you can use this tool to talk to any
        assistant in any thread.
        """
        if not manager:
            raise ToolError("AssistantManager not initialized.")
        final_message = ""
        try:
            await ctx.report_progress(progress=0, message="Starting assistant run...")
            async for event in manager.run_thread(thread_id, assistant_id, message):
                if event.event == 'thread.message.delta':
                    text_delta = event.data.delta.content[0].text
                    final_message += text_delta.value
                    await ctx.report_progress(progress=50, message=f"Assistant writing: {final_message}")
                elif event.event == 'thread.run.step.created':
                    await ctx.report_progress(progress=25, message="Assistant is performing a step...")
            await ctx.report_progress(progress=100, message="Run complete.")
            return final_message
        except Exception as e:
            raise ToolError(f"An error occurred during the run: {e}")
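The delta-accumulation step can be replayed in isolation. A minimal sketch, assuming the event and delta objects mirror the attribute shapes the handler reads (faked here with SimpleNamespace):

```python
from types import SimpleNamespace

def accumulate_response(events):
    """Collect text deltas into the final assistant message, the same
    way the handler does for 'thread.message.delta' events."""
    final_message = ""
    for event in events:
        if event.event == "thread.message.delta":
            text_delta = event.data.delta.content[0].text
            final_message += text_delta.value
    return final_message

# Fake events standing in for the OpenAI stream (shapes assumed).
fake = [
    SimpleNamespace(
        event="thread.message.delta",
        data=SimpleNamespace(delta=SimpleNamespace(
            content=[SimpleNamespace(text=SimpleNamespace(value=chunk))])),
    )
    for chunk in ("Hello", ", ", "world")
]

print(accumulate_response(fake))  # Hello, world
```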
  • Registers the 'ask_assistant_in_thread' tool with the FastMCP app using the @app.tool decorator, including metadata like title and readOnlyHint.
    @app.tool(
        annotations={
            "title": "Ask Assistant in Thread and Stream Response",
            "readOnlyHint": False
        }
    )
  • Supporting utility method in AssistantManager that handles the core OpenAI API interactions: updates thread timestamp, adds user message, creates and streams the run events.
    async def run_thread(self, thread_id: str, assistant_id: str, message: str):
        """
        Sends a message to a thread and streams the assistant's response.
        This is an async generator that yields events from the run.
        """
        # Update the last used timestamp
        self.thread_store.update_thread_last_used(thread_id)

        # Add the user's message to the thread
        self.client.beta.threads.messages.create(
            thread_id=thread_id,
            role="user",
            content=message
        )

        # Stream the assistant's response
        stream = self.client.beta.threads.runs.create(
            thread_id=thread_id,
            assistant_id=assistant_id,
            stream=True
        )
        for event in stream:
            yield event
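Because run_thread is an async generator, callers consume it with async for. A minimal sketch with a stubbed generator standing in for the real OpenAI stream (event names chosen to match those the handler checks):

```python
import asyncio
from types import SimpleNamespace

async def fake_run_thread():
    """Stub async generator playing the role of run_thread."""
    for name in ("thread.run.step.created", "thread.run.completed"):
        yield SimpleNamespace(event=name)

async def collect_event_names():
    # Consume the generator the same way the tool handler does.
    names = []
    async for event in fake_run_thread():
        names.append(event.event)
    return names

print(asyncio.run(collect_event_names()))
# ['thread.run.step.created', 'thread.run.completed']
```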

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/andybrandt/mcp-simple-openai-assistant'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.