
gemini-chat

Chat with Gemini AI for conversations, questions, and general assistance with learned user preferences. Use this tool to ask questions or get help through Claude Desktop.

Instructions

Chat with Gemini AI for conversations, questions, and general assistance (with learned user preferences)

Input Schema

Name     Required  Description                                                                              Default
message  Yes       Your message or question to chat with Gemini AI
context  No        Optional additional context for the conversation (e.g., "aurora", "debugging", "code")
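For illustration, a client might pass arguments shaped like the following (the transport envelope depends on the MCP client; only the argument object below follows the schema above, and the message text is made up):

```javascript
// Hypothetical arguments for a gemini-chat invocation:
// "message" is required, "context" is optional.
const args = {
  message: 'How do I debounce a function in JavaScript?',
  context: 'code', // omit for a general conversation
};

console.log(JSON.stringify(args, null, 2));
```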

Implementation Reference

  • The main execution handler for the 'gemini-chat' tool. It validates input, optionally enhances the prompt using the intelligence system, generates a response via the Gemini service, learns from the interaction, and returns a formatted text response.
    async execute(args) {
      const message = validateNonEmptyString(args.message, 'message');
      const context = args.context ? validateString(args.context, 'context') : null;

      log(`Processing chat message: "${message}" with context: ${context || 'general'}`, this.name);

      try {
        let enhancedMessage = message;
        if (this.intelligenceSystem.initialized) {
          try {
            enhancedMessage = await this.intelligenceSystem.enhancePrompt(message, context);
            log('Applied Tool Intelligence enhancement', this.name);
          } catch (err) {
            log(`Tool Intelligence enhancement failed: ${err.message}`, this.name);
          }
        }

        const responseText = await this.geminiService.generateText('CHAT', enhancedMessage);

        if (responseText) {
          if (this.intelligenceSystem.initialized) {
            try {
              await this.intelligenceSystem.learnFromInteraction(message, enhancedMessage, responseText, context, this.name);
              log('Tool Intelligence learned from interaction', this.name);
            } catch (err) {
              log(`Tool Intelligence learning failed: ${err.message}`, this.name);
            }
          }

          log('Chat response completed successfully', this.name);

          let finalResponse = responseText;
          if (context && this.intelligenceSystem.initialized) {
            finalResponse += `\n\n---\n_Enhancement applied based on context: ${context}_`;
          }

          return {
            content: [{ type: 'text', text: finalResponse }],
          };
        }

        log('No response text generated', this.name);
        return {
          content: [
            {
              type: 'text',
              text: `I couldn't generate a response to: "${message}". Please try rephrasing your message.`,
            },
          ],
        };
      } catch (error) {
        log(`Error processing chat: ${error.message}`, this.name);
        throw new Error(`Error processing chat: ${error.message}`);
      }
    }
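The handler calls `validateNonEmptyString` and `validateString` helpers whose definitions are not shown here. A minimal sketch of what they might look like, inferred from how the handler uses them (the bodies are assumptions, not the project's actual code):

```javascript
// Hypothetical validation helpers matching the apparent contract in execute():
// both return the validated string or throw with the parameter name.
function validateString(value, name) {
  if (typeof value !== 'string') {
    throw new Error(`${name} must be a string`);
  }
  return value;
}

function validateNonEmptyString(value, name) {
  const str = validateString(value, name);
  if (str.trim().length === 0) {
    throw new Error(`${name} must be a non-empty string`);
  }
  return str;
}

console.log(validateNonEmptyString('hello', 'message')); // 'hello'
```

Throwing early like this lets the tool surface a parameter-specific error message instead of a generic failure deeper in the call.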
  • The JSON schema defining the input parameters for the 'gemini-chat' tool, including required 'message' and optional 'context'.
    {
      type: 'object',
      properties: {
        message: {
          type: 'string',
          description: 'Your message or question to chat with Gemini AI',
        },
        context: {
          type: 'string',
          description: 'Optional additional context for the conversation (e.g., "aurora", "debugging", "code")',
        },
      },
      required: ['message'],
    }
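Since this is plain JSON Schema, a client can pre-check arguments before invoking the tool. A rough hand-rolled sketch (a real client would more likely use a JSON Schema validator library; `checkArgs` is a hypothetical helper):

```javascript
// Hypothetical client-side check mirroring the gemini-chat input schema.
const inputSchema = {
  type: 'object',
  properties: {
    message: { type: 'string' },
    context: { type: 'string' },
  },
  required: ['message'],
};

// Returns null when args are valid, or a human-readable problem description.
function checkArgs(schema, args) {
  for (const key of schema.required) {
    if (!(key in args)) return `missing required property: ${key}`;
  }
  for (const [key, value] of Object.entries(args)) {
    const prop = schema.properties[key];
    if (!prop) return `unexpected property: ${key}`;
    if (typeof value !== prop.type) return `${key} must be a ${prop.type}`;
  }
  return null;
}

console.log(checkArgs(inputSchema, { message: 'Hi Gemini' })); // null
console.log(checkArgs(inputSchema, { context: 'code' }));      // 'missing required property: message'
```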
  • Registers the 'gemini-chat' tool instance (ChatTool) with the central tool registry by calling registerTool.
    registerTool(new ChatTool(intelligenceSystem, geminiService));
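Neither `registerTool` nor the registry itself is shown in this reference. A minimal name-keyed registry consistent with that call might look like this (an assumed sketch, including a stand-in `ChatTool`, not the project's implementation):

```javascript
// Hypothetical central tool registry keyed by tool name.
const tools = new Map();

function registerTool(tool) {
  if (tools.has(tool.name)) {
    throw new Error(`tool already registered: ${tool.name}`);
  }
  tools.set(tool.name, tool);
}

// Stand-in for ChatTool: just enough shape to register.
class ChatTool {
  constructor(intelligenceSystem, geminiService) {
    this.name = 'gemini-chat';
    this.intelligenceSystem = intelligenceSystem;
    this.geminiService = geminiService;
  }
}

registerTool(new ChatTool({ initialized: false }, null));
console.log(tools.has('gemini-chat')); // true
```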

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/Garblesnarff/gemini-mcp-server'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.