# WebSocket Real-Time Chat Implementation

## Overview

Implemented real-time streaming of chat responses and tool execution via WebSocket. This eliminates the "(Processing tool calls...)" placeholder issue and provides live feedback.

## Backend Changes

Non-authoritative sketches of these backend pieces appear under "Implementation Sketches" at the end of this document.

### 1. New WebSocket Handler (`app/backend/app/api/websocket.py`)

- Accepts WebSocket connections at `/ws/chat`
- Handles incoming chat messages and streams responses back
- Event types: `thinking`, `text`, `tool_call_start`, `tool_call_end`, `error`, `complete`

### 2. Chat Service Streaming (`app/backend/app/services/chat_service.py`)

- Added `process_message_stream()` async generator method
- Yields real-time events as the model processes requests:
  - **thinking**: processing started
  - **text**: the model generated a text chunk
  - **tool_call_start**: tool execution starting
  - **tool_call_end**: tool execution completed, with its result
  - **complete**: full response finished
  - **error**: an error occurred

### 3. Main App Update (`app/backend/app/main.py`)

- Added WebSocket route: `@app.websocket("/ws/chat")`
- Integrates with lifespan management for service injection

## Frontend Changes

### 1. WebSocket Hook (`src/hooks/useChatWebSocket.js`)

- Custom React hook for WebSocket lifecycle management
- Methods: `connect()`, `send(data)`, `disconnect()`
- Auto-reconnection handling
- Proper protocol selection (ws/wss)

### 2. Chat Page (`src/pages/ChatPage.jsx`)

- Replaced HTTP fetch with WebSocket streaming
- Handles real-time events from the server
- Streams text content as it arrives
- Updates tool call history immediately upon completion
- No more "(Processing tool calls...)" placeholders

### 3. Zustand Store (`src/store/chatStore.js`)

- Enhanced `addMessage()` to support streaming
- Concatenates streamed text chunks onto the last assistant message
- A `stream: true` flag marks messages as part of a stream

## Message Flow

A minimal test client that exercises this flow is sketched at the end of this document.

### User → Server

```javascript
{
  "message": "list all pods",
  "conversation_id": "uuid"
}
```

### Server → Frontend (Real-Time Events)

```javascript
// 1. Thinking
{ "type": "thinking", "data": { "message": "Processing..." } }

// 2. Text streaming
{ "type": "text", "data": { "text": "Here are the pods...", "conversation_id": "uuid" } }

// 3. Tool execution start
{ "type": "tool_call_start", "data": { "tool_name": "list_pods", "arguments": { "all_namespaces": true }, "conversation_id": "uuid" } }

// 4. Tool execution end
{ "type": "tool_call_end", "data": { "tool_name": "list_pods", "arguments": { "all_namespaces": true }, "result": "pod1, pod2, ...", "status": "completed", "conversation_id": "uuid" } }

// 5. Complete
{ "type": "complete", "data": { "message": "Full response text", "tool_calls": [{ ... }], "conversation_id": "uuid" } }
```

## Benefits

✅ Real-time feedback - responses appear as they are generated
✅ No placeholder text - actual content is displayed immediately
✅ Visible tool execution status - you know when tools are running
✅ Better UX - a smoother, more interactive experience
✅ Scalable - supports multiple concurrent connections

## Usage

The frontend now automatically:

1. Connects to the WebSocket on page load
2. Sends messages via WebSocket instead of HTTP
3. Streams responses in real time
4. Updates the UI immediately (no waiting)
5. Disconnects on page unload

No changes to the existing API are needed - the HTTP endpoint remains available as a fallback.
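
## Implementation Sketches

The sections above describe the changes without reproducing the code. The sketches below illustrate one way the backend pieces could be written, assuming FastAPI; they are not the project's actual files, and helper names such as `handle_chat_socket` and `app.state.chat_service` are assumptions made for illustration.

### WebSocket Handler Sketch (`app/backend/app/api/websocket.py`)

A minimal handler that accepts the connection, reads JSON messages, and relays every event produced by the chat service:

```python
# Illustrative sketch only - not the actual project file.
from fastapi import WebSocket, WebSocketDisconnect


async def handle_chat_socket(websocket: WebSocket, chat_service) -> None:
    """Accept the connection, then relay streamed chat events for each incoming message."""
    await websocket.accept()
    try:
        while True:
            # Expected payload: {"message": "...", "conversation_id": "uuid"}
            payload = await websocket.receive_json()
            async for event in chat_service.process_message_stream(
                message=payload["message"],
                conversation_id=payload.get("conversation_id"),
            ):
                # Each event is already shaped as {"type": ..., "data": {...}}.
                await websocket.send_json(event)
    except WebSocketDisconnect:
        # The client closed the socket; the loop simply ends.
        pass
```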
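
### Chat Service Streaming Sketch (`app/backend/app/services/chat_service.py`)

This document only defines the event contract, so the generator below shows the shape of `process_message_stream()` rather than the real model or tool integration; `_model_stream()` and `_execute_tool()` are stub placeholders, not project code.

```python
# Shape of the streaming generator only; the private helpers are stubs.
from typing import Any, AsyncIterator, Dict, Optional


class ChatService:
    async def process_message_stream(
        self, message: str, conversation_id: Optional[str] = None
    ) -> AsyncIterator[Dict[str, Any]]:
        """Yield WebSocket events while one user message is processed."""
        yield {"type": "thinking", "data": {"message": "Processing..."}}
        full_text, tool_calls = "", []
        try:
            async for chunk in self._model_stream(message, conversation_id):
                if chunk["kind"] == "text":
                    full_text += chunk["text"]
                    yield {"type": "text",
                           "data": {"text": chunk["text"], "conversation_id": conversation_id}}
                elif chunk["kind"] == "tool_call":
                    yield {"type": "tool_call_start",
                           "data": {"tool_name": chunk["name"], "arguments": chunk["args"],
                                    "conversation_id": conversation_id}}
                    result = await self._execute_tool(chunk["name"], chunk["args"])
                    call = {"tool_name": chunk["name"], "arguments": chunk["args"],
                            "result": result, "status": "completed",
                            "conversation_id": conversation_id}
                    tool_calls.append(call)
                    yield {"type": "tool_call_end", "data": call}
            yield {"type": "complete",
                   "data": {"message": full_text, "tool_calls": tool_calls,
                            "conversation_id": conversation_id}}
        except Exception as exc:
            yield {"type": "error",
                   "data": {"message": str(exc), "conversation_id": conversation_id}}

    async def _model_stream(self, message, conversation_id):
        # Stub: a real implementation streams model output and tool-call requests.
        yield {"kind": "text", "text": f"Echo: {message}"}

    async def _execute_tool(self, name, args):
        # Stub: a real implementation would call the MCP tool and return its result.
        return f"{name} executed with {args}"
```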
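
### Main App Wiring Sketch (`app/backend/app/main.py`)

Registering the `@app.websocket("/ws/chat")` route and injecting the lifespan-managed service could look like the following; the `ChatService()` construction and import paths are assumptions.

```python
# Illustrative wiring only - service construction details are assumed.
from contextlib import asynccontextmanager

from fastapi import FastAPI, WebSocket

from app.api.websocket import handle_chat_socket     # assumed helper from the sketch above
from app.services.chat_service import ChatService    # assumed import path


@asynccontextmanager
async def lifespan(app: FastAPI):
    # Create the chat service once at startup and share it via app.state.
    app.state.chat_service = ChatService()
    yield
    # Teardown (e.g. closing MCP connections) would go here.


app = FastAPI(lifespan=lifespan)


@app.websocket("/ws/chat")
async def ws_chat(websocket: WebSocket):
    # Delegate to the handler, passing in the lifespan-managed service.
    await handle_chat_socket(websocket, app.state.chat_service)
```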
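
### Manual Test Client Sketch

To exercise the message flow end to end, any WebSocket client will do. The snippet below uses the third-party `websockets` package and assumes the backend is reachable at `ws://localhost:8000/ws/chat`; both choices are placeholders.

```python
# Minimal manual test client; host, port, and package choice are assumptions.
import asyncio
import json

import websockets  # pip install websockets


async def main() -> None:
    async with websockets.connect("ws://localhost:8000/ws/chat") as ws:
        # Send one chat message, then print every streamed event until the turn ends.
        await ws.send(json.dumps({"message": "list all pods", "conversation_id": None}))
        async for raw in ws:
            event = json.loads(raw)
            print(event["type"], event["data"])
            if event["type"] in ("complete", "error"):
                break


if __name__ == "__main__":
    asyncio.run(main())
```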
