
Deep Code Reasoning MCP Server

by haasonsaas

continue_conversation

Extends an ongoing code analysis conversation given a session ID and a follow-up message, optionally including code snippets in the response. The server integrates Claude Code with Google's Gemini AI for comprehensive debugging and long-trace analysis.

Instructions

Continue an ongoing analysis conversation

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| include_code_snippets | No | Whether to include code snippets in response | |
| message | Yes | Claude's response or follow-up question | |
| session_id | Yes | ID of the conversation session | |
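A call to this tool passes the parameters above as the `arguments` of an MCP `tools/call` request. The session ID and message below are illustrative:

```json
{
  "name": "continue_conversation",
  "arguments": {
    "session_id": "abc123",
    "message": "The stack trace points at the retry loop; can you check for an off-by-one?",
    "include_code_snippets": true
  }
}
```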

Implementation Reference

  • Top-level MCP tool handler for 'continue_conversation' that validates input and delegates to DeepCodeReasonerV2.continueConversation.
    ```typescript
    case 'continue_conversation': {
      const parsed = ContinueConversationSchema.parse(args);
      const result = await deepReasoner.continueConversation(
        parsed.session_id,
        parsed.message,
        parsed.include_code_snippets,
      );
      return {
        content: [
          {
            type: 'text',
            text: JSON.stringify(result, null, 2),
          },
        ],
      };
    }
    ```
  • Zod input schema validation for the continue_conversation tool parameters.
    ```typescript
    const ContinueConversationSchema = z.object({
      session_id: z.string(),
      message: z.string(),
      include_code_snippets: z.boolean().optional(),
    });
    ```
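As a sketch of how this validation behaves, the following mirrors the Zod rules above without the `zod` dependency (the `parseArgs` helper is purely illustrative, standing in for `ContinueConversationSchema.parse`):

```typescript
// Minimal stand-in for ContinueConversationSchema.parse: session_id and
// message are required strings; include_code_snippets is an optional boolean.
interface ContinueConversationArgs {
  session_id: string;
  message: string;
  include_code_snippets?: boolean;
}

function parseArgs(args: unknown): ContinueConversationArgs {
  const a = args as Record<string, unknown>;
  if (typeof a?.session_id !== 'string') throw new Error('session_id: expected string');
  if (typeof a?.message !== 'string') throw new Error('message: expected string');
  if (a.include_code_snippets !== undefined && typeof a.include_code_snippets !== 'boolean') {
    throw new Error('include_code_snippets: expected boolean');
  }
  return a as unknown as ContinueConversationArgs;
}

// Valid input passes through; include_code_snippets may be omitted.
const ok = parseArgs({ session_id: 's1', message: 'hi' });

// A missing required field throws, as Zod's .parse would.
let rejected = false;
try {
  parseArgs({ message: 'no session' });
} catch {
  rejected = true;
}
```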
  • src/index.ts:362-383 (registration)
    Registration of the continue_conversation tool in the MCP server's tools list with description and input schema.
    ```typescript
    {
      name: 'continue_conversation',
      description: 'Continue an ongoing analysis conversation',
      inputSchema: {
        type: 'object',
        properties: {
          session_id: {
            type: 'string',
            description: 'ID of the conversation session',
          },
          message: {
            type: 'string',
            description: 'Claude\'s response or follow-up question',
          },
          include_code_snippets: {
            type: 'boolean',
            description: 'Whether to include code snippets in response',
          },
        },
        required: ['session_id', 'message'],
      },
    },
    ```
  • Handler in DeepCodeReasonerV2 managing session locking, history tracking, and delegating to ConversationalGeminiService.continueConversation.
    ```typescript
    async continueConversation(
      sessionId: string,
      message: string,
      includeCodeSnippets?: boolean,
    ): Promise<{
      response: string;
      analysisProgress: number;
      canFinalize: boolean;
      status: string;
    }> {
      // Acquire lock before processing
      const lockAcquired = this.conversationManager.acquireLock(sessionId);
      if (!lockAcquired) {
        throw new ConversationLockedError(sessionId);
      }
      try {
        // Validate session
        const session = this.conversationManager.getSession(sessionId);
        if (!session) {
          throw new SessionNotFoundError(sessionId);
        }

        // Add Claude's message to conversation history
        this.conversationManager.addTurn(sessionId, 'claude', message);

        // Continue with Gemini
        const { response, analysisProgress, canFinalize } =
          await this.conversationalGemini.continueConversation(
            sessionId,
            message,
            includeCodeSnippets,
          );

        // Track Gemini's response
        this.conversationManager.addTurn(sessionId, 'gemini', response);

        // Update progress
        this.conversationManager.updateProgress(sessionId, {
          confidenceLevel: analysisProgress,
        });

        return {
          response,
          analysisProgress,
          canFinalize,
          status: session.status,
        };
      } catch (error) {
        console.error('Failed to continue conversation:', error);
        throw error;
      } finally {
        // Always release lock
        this.conversationManager.releaseLock(sessionId);
      }
    }
    ```
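The acquire-in-front, release-in-`finally` pattern above prevents two concurrent `continue_conversation` calls from interleaving turns in the same session. Its locking behavior can be sketched with a minimal in-memory manager (the `SessionLocks` class below is hypothetical; the real `ConversationManager` also tracks turns and progress):

```typescript
// Minimal sketch of per-session locking using an in-memory Set of
// locked session IDs. acquireLock is non-blocking: a second caller
// gets `false` and the server surfaces a ConversationLockedError.
class SessionLocks {
  private locked = new Set<string>();

  // Returns false if the session is already being processed.
  acquireLock(sessionId: string): boolean {
    if (this.locked.has(sessionId)) return false;
    this.locked.add(sessionId);
    return true;
  }

  releaseLock(sessionId: string): void {
    this.locked.delete(sessionId);
  }
}

const locks = new SessionLocks();
const first = locks.acquireLock('s1');  // lock acquired
const second = locks.acquireLock('s1'); // concurrent call rejected
locks.releaseLock('s1');
const third = locks.acquireLock('s1');  // available again after release
```

Because a single-process Node.js server runs handlers on one event loop, a plain `Set` suffices here; a multi-process deployment would need an external lock store.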
  • Core handler implementing the Gemini API call for continuing the conversation, including prompt sanitization and response processing.
    ```typescript
    async continueConversation(
      sessionId: string,
      message: string,
      includeCodeSnippets?: boolean,
    ): Promise<{ response: string; analysisProgress: number; canFinalize: boolean }> {
      const chat = this.activeSessions.get(sessionId);
      const context = this.sessionContexts.get(sessionId);
      if (!chat || !context) {
        throw new SessionNotFoundError(sessionId);
      }

      // Sanitize the incoming message
      const sanitizedMessage = PromptSanitizer.sanitizeString(message);

      // Check for potential injection attempts
      if (PromptSanitizer.containsInjectionAttempt(message)) {
        console.warn(
          `Potential injection attempt in session ${sessionId}:`,
          message.substring(0, 100),
        );
      }

      // Process Claude's message with safety wrapper
      let processedMessage = `REMINDER: The following is a message from Claude in our ongoing analysis conversation. Focus on the technical analysis task.

    <CLAUDE_MESSAGE>
    ${sanitizedMessage}
    </CLAUDE_MESSAGE>`;

      if (includeCodeSnippets && this.hasCodeReference(message)) {
        const enrichedContent = this.enrichMessageWithCode(sanitizedMessage, context.codeFiles);
        processedMessage += `\n\n${enrichedContent}`;
      }

      // Send message to Gemini
      const result = await chat.sendMessage(processedMessage);
      const response = result.response.text();

      // Calculate analysis progress
      const progress = this.calculateProgress(chat, context);
      const canFinalize = progress >= 0.8;

      return {
        response,
        analysisProgress: progress,
        canFinalize,
      };
    }
    ```
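The `canFinalize` flag is a simple threshold over the progress score. The excerpt does not show `calculateProgress`, so the turn-count heuristic below is purely illustrative; only the `>= 0.8` cutoff comes from the code above:

```typescript
// Illustrative only: score analysis progress from conversation length,
// capped at 1, then derive canFinalize using the 0.8 threshold above.
function progressFromTurns(turnCount: number, expectedTurns = 10): number {
  return Math.min(turnCount / expectedTurns, 1);
}

function canFinalize(progress: number): boolean {
  return progress >= 0.8;
}

const early = canFinalize(progressFromTurns(5)); // 0.5 -> not ready
const late = canFinalize(progressFromTurns(9));  // 0.9 -> ready to finalize
```

A client can poll `analysisProgress` across turns and call the session's finalize tool once `canFinalize` flips to true.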
