
Deepseek-Thinking-Claude-3.5-Sonnet-CLINE-MCP

by niko91i

generate_response

Generates AI-powered responses by combining DeepSeek's structured reasoning with Claude 3.5 Sonnet's response generation, improving accuracy and contextual understanding of user prompts.

Instructions

Generate a response using DeepSeek's reasoning and Claude's response generation through OpenRouter.

Input Schema

| Name | Required | Description | Default |
|---|---|---|---|
| clearContext | No | Clear conversation history before this request | false |
| includeHistory | No | Include Cline conversation history for context | true |
| prompt | Yes | The user's input prompt | — |
| showReasoning | No | Whether to include reasoning in response | false |
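For illustration, a caller only has to supply the required `prompt`; the other fields fall back to the schema defaults above. The sketch below shows that merge explicitly (the `args` variable and the merge itself are illustrative, not part of the server's code):

```typescript
// Argument shape for generate_response; only `prompt` is required.
interface GenerateResponseArgs {
  prompt: string;
  showReasoning?: boolean;
  clearContext?: boolean;
  includeHistory?: boolean;
}

// A hypothetical call that overrides one optional field.
const args: GenerateResponseArgs = {
  prompt: "Summarize the last build failure",
  showReasoning: true, // include DeepSeek's reasoning in the result
};

// Defaults as declared in the tool's input schema, with caller
// overrides applied on top.
const withDefaults = {
  showReasoning: false,
  clearContext: false,
  includeHistory: true,
  ...args,
};
```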

Implementation Reference

  • src/index.ts:307-336 (registration)
    Registration of the 'generate_response' tool in the ListToolsRequestSchema handler, including name, description, and JSON input schema.
```typescript
{
  name: "generate_response",
  description:
    "Generate a response using DeepSeek's reasoning and Claude's response generation through OpenRouter.",
  inputSchema: {
    type: "object",
    properties: {
      prompt: {
        type: "string",
        description: "The user's input prompt",
      },
      showReasoning: {
        type: "boolean",
        description: "Whether to include reasoning in response",
        default: false,
      },
      clearContext: {
        type: "boolean",
        description: "Clear conversation history before this request",
        default: false,
      },
      includeHistory: {
        type: "boolean",
        description: "Include Cline conversation history for context",
        default: true,
      },
    },
    required: ["prompt"],
  },
},
```
  • TypeScript interface defining the input arguments for generate_response.
```typescript
interface GenerateResponseArgs {
  prompt: string;
  showReasoning?: boolean;
  clearContext?: boolean;
  includeHistory?: boolean;
}
```
  • Type guard/validator function for GenerateResponseArgs used in the handler.
```typescript
const isValidGenerateResponseArgs = (args: any): args is GenerateResponseArgs =>
  typeof args === "object" &&
  args !== null &&
  typeof args.prompt === "string" &&
  (args.showReasoning === undefined || typeof args.showReasoning === "boolean") &&
  (args.clearContext === undefined || typeof args.clearContext === "boolean") &&
  (args.includeHistory === undefined || typeof args.includeHistory === "boolean");
```
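As a quick sanity check, the guard accepts a well-formed argument object and rejects malformed ones. The sketch below repeats the guard so it runs on its own:

```typescript
interface GenerateResponseArgs {
  prompt: string;
  showReasoning?: boolean;
  clearContext?: boolean;
  includeHistory?: boolean;
}

// Same guard as above, repeated so this sketch is self-contained.
const isValidGenerateResponseArgs = (args: any): args is GenerateResponseArgs =>
  typeof args === "object" &&
  args !== null &&
  typeof args.prompt === "string" &&
  (args.showReasoning === undefined || typeof args.showReasoning === "boolean") &&
  (args.clearContext === undefined || typeof args.clearContext === "boolean") &&
  (args.includeHistory === undefined || typeof args.includeHistory === "boolean");

const ok = isValidGenerateResponseArgs({ prompt: "hello", showReasoning: true });
const missingPrompt = isValidGenerateResponseArgs({ showReasoning: true });
const wrongType = isValidGenerateResponseArgs({ prompt: 42 });
```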
  • Entry point handler for generate_response tool call: validates arguments, initializes asynchronous task status, starts background processing via processTask, and returns taskId immediately for polling.
```typescript
if (request.params.name === "generate_response") {
  if (!isValidGenerateResponseArgs(request.params.arguments)) {
    throw new McpError(
      ErrorCode.InvalidParams,
      "Invalid generate_response arguments"
    );
  }

  const taskId = uuidv4();
  const { prompt, showReasoning, clearContext, includeHistory } =
    request.params.arguments;

  // Initialize task status with the tracking properties used for polling
  this.activeTasks.set(taskId, {
    status: "pending",
    prompt,
    showReasoning,
    timestamp: Date.now(),
    lastChecked: Date.now(),
    nextCheckDelay: INITIAL_STATUS_CHECK_DELAY_MS,
    checkAttempts: 0,
  });

  // Start processing in background
  this.processTask(taskId, clearContext, includeHistory).catch((error) => {
    log("Error processing task:", error);
    this.activeTasks.set(taskId, {
      ...this.activeTasks.get(taskId)!,
      status: "error",
      error: error.message,
    });
  });

  // Return task ID immediately
  return {
    content: [
      {
        type: "text",
        text: JSON.stringify({
          taskId,
          // Suggested wait time in seconds
          suggestedWaitTime: Math.round(INITIAL_STATUS_CHECK_DELAY_MS / 1000),
        }),
      },
    ],
  };
}
```
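The `nextCheckDelay` and `checkAttempts` fields suggest the server expects a growing polling interval on the client side. One possible schedule is sketched below; note that the doubling policy, the cap, and the `INITIAL_STATUS_CHECK_DELAY_MS` value here are assumptions for illustration, not taken from the source:

```typescript
// Hypothetical initial delay; the real constant lives in src/index.ts.
const INITIAL_STATUS_CHECK_DELAY_MS = 2000;

// One possible backoff policy: double the delay after each failed
// status check, capped at a maximum wait.
function nextCheckDelay(attempt: number, capMs = 30000): number {
  return Math.min(INITIAL_STATUS_CHECK_DELAY_MS * 2 ** attempt, capMs);
}

// Delays (ms) for the first six checks under this policy.
const schedule = [0, 1, 2, 3, 4, 5].map((attempt) => nextCheckDelay(attempt));
```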
  • Core asynchronous processor for generate_response: fetches conversation history, generates reasoning with DeepSeek, generates the final response, updates task status throughout, and adds the exchange to the conversation context.
```typescript
private async processTask(
  taskId: string,
  clearContext?: boolean,
  includeHistory?: boolean
): Promise<void> {
  const task = this.activeTasks.get(taskId);
  if (!task) {
    throw new Error(`No task found with ID: ${taskId}`);
  }

  try {
    if (clearContext) {
      this.context.entries = [];
    }

    // Update status to reasoning
    this.activeTasks.set(taskId, {
      ...task,
      status: "reasoning",
    });

    // Get Cline conversation history if requested
    let history: ClaudeMessage[] | null = null;
    if (includeHistory !== false) {
      history = await findActiveConversation();
    }

    // Get DeepSeek reasoning with limited history
    const reasoningHistory = history ? formatHistoryForModel(history, true) : "";
    const reasoningPrompt = reasoningHistory
      ? `${reasoningHistory}\n\nNew question: ${task.prompt}`
      : task.prompt;
    const reasoning = await this.getDeepseekReasoning(reasoningPrompt);

    // Update status with reasoning
    this.activeTasks.set(taskId, {
      ...task,
      status: "responding",
      reasoning,
    });

    // Get final response with full history
    const responseHistory = history ? formatHistoryForModel(history, false) : "";
    const fullPrompt = responseHistory
      ? `${responseHistory}\n\nCurrent task: ${task.prompt}`
      : task.prompt;
    const response = await this.getFinalResponse(fullPrompt, reasoning);

    // Add to context after successful response
    this.addToContext({
      timestamp: Date.now(),
      prompt: task.prompt,
      reasoning,
      response,
      model: DEEPSEEK_MODEL, // Use DEEPSEEK_MODEL instead of CLAUDE_MODEL
    });

    // Update status to complete
    this.activeTasks.set(taskId, {
      ...task,
      status: "complete",
      reasoning,
      response,
      timestamp: Date.now(),
    });
  } catch (error) {
    // Update status to error
    this.activeTasks.set(taskId, {
      ...task,
      status: "error",
      error: error instanceof Error ? error.message : "Unknown error",
      timestamp: Date.now(),
    });
    throw error;
  }
}
```
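Taken together, `processTask` moves a task through pending → reasoning → responding → complete, with error reachable from any in-flight state. A minimal sketch of that lifecycle as a type plus a transition check (the `TaskStatus` union and the transition table are inferred from the snippets above, not copied from the source):

```typescript
type TaskStatus = "pending" | "reasoning" | "responding" | "complete" | "error";

// Allowed forward transitions, inferred from processTask above.
const transitions: Record<TaskStatus, TaskStatus[]> = {
  pending: ["reasoning", "error"],
  reasoning: ["responding", "error"],
  responding: ["complete", "error"],
  complete: [],
  error: [],
};

// True when `to` is a legal next state after `from`.
function canTransition(from: TaskStatus, to: TaskStatus): boolean {
  return transitions[from].includes(to);
}
```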

