start_streaming_chat

Initiate a real-time chat session with an agent, enabling continuous message exchange. Specify the agent name and initial message to begin streaming interactions.

Instructions

Start a streaming chat session

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| agent_name | Yes | Name of the agent to chat with | |
| message | Yes | Initial message | |
| streaming | No | Enable real-time streaming | |
| progress_token | No | Token for progress notifications | |
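As a quick illustration of these parameters, the sketch below builds an example argument object and checks it against the tool's required fields. The agent name and message values are hypothetical, and `validateChatArgs` is an illustrative helper, not part of the server.

```typescript
// Required parameters, per the start_streaming_chat input schema.
const REQUIRED = ['agent_name', 'message'] as const;

// Returns the names of any missing required parameters.
function validateChatArgs(args: Record<string, unknown>): string[] {
  return REQUIRED.filter((key) => args[key] === undefined);
}

// Hypothetical example arguments for a start_streaming_chat call.
const callArgs = {
  agent_name: 'assistant', // required
  message: 'Hello!',       // required
  streaming: true,         // optional
};
```

Passing `callArgs` to `validateChatArgs` yields an empty array; omitting `agent_name` would yield `['agent_name']`.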

Implementation Reference

  • Defines the input schema and description for the `start_streaming_chat` MCP tool:

```typescript
{
  name: 'start_streaming_chat',
  description: 'Start a streaming chat session with real-time updates',
  inputSchema: {
    type: 'object',
    properties: {
      agent_name: { type: 'string', description: 'Name of the agent to chat with' },
      message: { type: 'string', description: 'Initial message' },
      streaming: { type: 'boolean', description: 'Enable real-time streaming' },
      progress_token: { type: 'string', description: 'Token for progress notifications' },
    },
    required: ['agent_name', 'message'],
  },
},
```
  • Registers the `start_streaming_chat` tool in the MCP server's ListToolsRequestSchema handler by including it in the static list of available tools:

```typescript
this.server.setRequestHandler(ListToolsRequestSchema, async () => ({
  tools: [
    {
      name: 'create_streaming_workflow',
      description: 'Create a workflow with real-time streaming and progress updates',
      inputSchema: {
        type: 'object',
        properties: {
          workflow_name: { type: 'string', description: 'Name for the workflow' },
          workflow_type: { type: 'string', description: 'Type of workflow' },
          agents: { type: 'array', description: 'List of agent configurations' },
          streaming: { type: 'boolean', description: 'Enable streaming' },
          progress_token: { type: 'string', description: 'Progress token' },
        },
        required: ['workflow_name', 'workflow_type', 'agents'],
      },
    },
    {
      name: 'start_streaming_chat',
      description: 'Start a streaming chat session with real-time updates',
      inputSchema: {
        type: 'object',
        properties: {
          agent_name: { type: 'string', description: 'Name of the agent to chat with' },
          message: { type: 'string', description: 'Initial message' },
          streaming: { type: 'boolean', description: 'Enable real-time streaming' },
          progress_token: { type: 'string', description: 'Token for progress notifications' },
        },
        required: ['agent_name', 'message'],
      },
    },
    {
      name: 'create_agent',
      description: 'Create a new AutoGen agent with enhanced capabilities',
      inputSchema: {
        type: 'object',
        properties: {
          name: { type: 'string', description: 'Unique name for the agent' },
          type: { type: 'string', description: 'Agent type' },
          system_message: { type: 'string', description: 'System message' },
          llm_config: { type: 'object', description: 'LLM configuration' },
        },
        required: ['name', 'type'],
      },
    },
    {
      name: 'execute_workflow',
      description: 'Execute a workflow with streaming support',
      inputSchema: {
        type: 'object',
        properties: {
          workflow_name: { type: 'string', description: 'Workflow name' },
          input_data: { type: 'object', description: 'Input data' },
          streaming: { type: 'boolean', description: 'Enable streaming' },
        },
        required: ['workflow_name', 'input_data'],
      },
    },
  ],
}));
```
  • Dispatch logic in the CallToolRequestSchema handler identifies `start_streaming_chat` as a streaming tool and routes it to `handleStreamingTool`:

```typescript
if (toolName === 'create_streaming_workflow' || toolName === 'start_streaming_chat') {
  return await this.handleStreamingTool(toolName, args, progressToken);
}
```
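This routing step can be expressed on its own as a small predicate; the standalone function below is an assumption for the sketch, with the tool names taken from the branch above.

```typescript
// Sketch: classify which tool names take the streaming path,
// mirroring the branch in the CallToolRequestSchema handler.
const STREAMING_TOOLS = new Set(['create_streaming_workflow', 'start_streaming_chat']);

function isStreamingTool(toolName: string): boolean {
  return STREAMING_TOOLS.has(toolName);
}
```

A set lookup keeps the dispatch O(1) and makes adding future streaming tools a one-line change.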
  • Executes the `start_streaming_chat` tool logic: initializes streaming progress notifications, calls the Python backend handler, sends SSE updates if streaming is enabled, and completes with a final notification. (The closing brace of the method, truncated in the directory listing, is restored here.)

```typescript
private async handleStreamingTool(toolName: string, args: any, progressToken?: string): Promise<any> {
  if (progressToken) {
    await this.sendProgressNotification(progressToken, 25, 'Initializing streaming...');
  }

  const result = await this.callPythonHandler(toolName, args);

  if (args.streaming && this.sseTransports.size > 0) {
    for (const transport of this.sseTransports.values()) {
      try {
        await transport.send({
          jsonrpc: '2.0',
          method: 'notifications/progress',
          params: {
            progressToken: progressToken || 'streaming',
            progress: 75,
            message: 'Streaming updates...',
            data: result,
          },
        });
      } catch (error) {
        console.error('Error sending streaming update:', error);
      }
    }
  }

  if (progressToken) {
    await this.sendProgressNotification(progressToken, 100, 'Streaming completed');
  }

  return result;
}
```
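The progress sequence above (25% on initialization, 75% while streaming, 100% on completion) can be sketched in isolation. Here `notify` is a hypothetical stand-in for both `sendProgressNotification` and the SSE `transport.send` calls, and `execute` stands in for `callPythonHandler`.

```typescript
// Sketch, assuming injected execute/notify callbacks: the three-stage
// progress sequence that handleStreamingTool emits around the backend call.
type ProgressEvent = { progress: number; message: string };

async function runStreamingTool(
  execute: () => Promise<unknown>,
  notify: (e: ProgressEvent) => Promise<void>,
): Promise<unknown> {
  await notify({ progress: 25, message: 'Initializing streaming...' });
  const result = await execute(); // backend does the actual work
  await notify({ progress: 75, message: 'Streaming updates...' });
  await notify({ progress: 100, message: 'Streaming completed' });
  return result;
}
```

A client observing the notifications would see the progress values 25, 75, and 100 in order, followed by the tool result.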


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/DynamicEndpoints/Autogen_MCP'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.