
gemini-chat

Engage in AI-powered conversations, ask questions, and receive assistance tailored by contextual inputs, using Gemini AI via the Gemini MCP Server for Claude Desktop.

Instructions

Chat with Gemini AI for conversations, questions, and general assistance (with learned user preferences)

Input Schema

Name     Required  Description
message  Yes       Your message or question to chat with Gemini AI
context  No        Optional additional context for the conversation (e.g., "aurora", "debugging", "code")
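Given this schema, a client-side MCP `tools/call` request for `gemini-chat` might look like the following. This is a sketch: the `method`/`params` envelope is the standard MCP request shape, and only the `arguments` object comes from the schema above.

```javascript
// Sketch of an MCP tools/call request body for the gemini-chat tool.
// Only `message` is required; `context` is optional per the schema above.
const chatRequest = {
  method: 'tools/call',
  params: {
    name: 'gemini-chat',
    arguments: {
      message: 'How do I debounce a function in JavaScript?',
      context: 'debugging', // optional hint; omit for a general chat
    },
  },
};

console.log(JSON.stringify(chatRequest, null, 2));
```

Omitting `context` is valid, since `required` lists only `message`.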

Implementation Reference

  • Core handler function for the 'gemini-chat' tool: validates input, enhances the prompt with Tool Intelligence when available, generates the Gemini response, learns from the interaction, and formats the output as an MCP content block.
    async execute(args) {
      const message = validateNonEmptyString(args.message, 'message');
      const context = args.context ? validateString(args.context, 'context') : null;

      log(`Processing chat message: "${message}" with context: ${context || 'general'}`, this.name);

      try {
        let enhancedMessage = message;
        if (this.intelligenceSystem.initialized) {
          try {
            enhancedMessage = await this.intelligenceSystem.enhancePrompt(message, context);
            log('Applied Tool Intelligence enhancement', this.name);
          } catch (err) {
            log(`Tool Intelligence enhancement failed: ${err.message}`, this.name);
          }
        }

        const responseText = await this.geminiService.generateText('CHAT', enhancedMessage);

        if (responseText) {
          if (this.intelligenceSystem.initialized) {
            try {
              await this.intelligenceSystem.learnFromInteraction(message, enhancedMessage, responseText, context, this.name);
              log('Tool Intelligence learned from interaction', this.name);
            } catch (err) {
              log(`Tool Intelligence learning failed: ${err.message}`, this.name);
            }
          }

          log('Chat response completed successfully', this.name);

          let finalResponse = responseText;
          if (context && this.intelligenceSystem.initialized) {
            finalResponse += `\n\n---\n_Enhancement applied based on context: ${context}_`; // eslint-disable-line max-len
          }

          return {
            content: [
              {
                type: 'text',
                text: finalResponse,
              },
            ],
          };
        }

        log('No response text generated', this.name);
        return {
          content: [
            {
              type: 'text',
              text: `I couldn't generate a response to: "${message}". Please try rephrasing your message.`,
            },
          ],
        };
      } catch (error) {
        log(`Error processing chat: ${error.message}`, this.name);
        throw new Error(`Error processing chat: ${error.message}`);
      }
    }
  • Input schema for 'gemini-chat' tool defining parameters: required 'message' string and optional 'context' string.
    properties: {
      message: {
        type: 'string',
        description: 'Your message or question to chat with Gemini AI',
      },
      context: {
        type: 'string',
        description: 'Optional additional context for the conversation (e.g., "aurora", "debugging", "code")',
      },
    },
    required: ['message'],
  • Registers the ChatTool instance (named 'gemini-chat') with the central tool registry using shared intelligenceSystem and geminiService.
    registerTool(new ChatTool(intelligenceSystem, geminiService));
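Two pieces referenced in the handler above are not shown on this page: the string validators and the registry behind `registerTool`. The sketch below is illustrative only, assuming the validators throw on bad input and the registry is a Map keyed by tool name; the server's actual implementations may differ.

```javascript
// Hypothetical validators matching the calls in execute(); the real
// implementations live elsewhere in gemini-mcp-server and may differ.
function validateString(value, fieldName) {
  if (typeof value !== 'string') {
    throw new Error(`${fieldName} must be a string`);
  }
  return value;
}

function validateNonEmptyString(value, fieldName) {
  const str = validateString(value, fieldName);
  if (str.trim().length === 0) {
    throw new Error(`${fieldName} must not be empty`);
  }
  return str;
}

// Illustrative registry keyed by tool name, as one way registerTool()
// could store tools for later dispatch by the server.
const toolRegistry = new Map();

function registerTool(tool) {
  toolRegistry.set(tool.name, tool);
}

async function dispatch(name, args) {
  const tool = toolRegistry.get(name);
  if (!tool) {
    throw new Error(`Unknown tool: ${name}`);
  }
  return tool.execute(args);
}
```

With a stub tool registered under 'gemini-chat', `dispatch('gemini-chat', { message: 'hi' })` would route the call to that tool's `execute()`.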

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/Garblesnarff/gemini-mcp-server'
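The same endpoint can be reached programmatically. The helper below is a sketch that only assembles the URL from the path pattern in the curl example above; the response shape is not documented here.

```javascript
// Build the MCP directory API URL for a server, following the path
// pattern shown in the curl example above.
function serverInfoUrl(owner, repo) {
  return `https://glama.ai/api/mcp/v1/servers/${owner}/${repo}`;
}

const url = serverInfoUrl('Garblesnarff', 'gemini-mcp-server');
// In Node 18+, fetch(url).then((res) => res.json()) would retrieve the record.
```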

If you have feedback or need assistance with the MCP directory API, please join our Discord server.