
Gemini MCP Server

by dakrin

getModelInfo

Retrieve details about the Gemini AI model configuration and capabilities to understand its parameters and functionality.

Instructions

Get information about the Gemini model being used

Input Schema


No arguments
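Because the input schema is empty, an MCP client invokes this tool with an empty arguments object. A minimal sketch of the JSON-RPC tools/call request a client would send (the request shape follows the MCP specification; the buildGetModelInfoCall helper is illustrative only, not part of this server):

```typescript
// Shape of an MCP tools/call request, per the Model Context Protocol spec.
interface ToolCallRequest {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: Record<string, never> };
}

// Illustrative helper: build the request to invoke getModelInfo.
// The tool takes no arguments, so `arguments` is an empty object.
function buildGetModelInfoCall(id: number): ToolCallRequest {
  return {
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name: "getModelInfo", arguments: {} },
  };
}
```

The server replies with a single text content item describing the configured model.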

Implementation Reference

  • src/index.ts:310-322 (registration)
    Registers the 'getModelInfo' tool using server.tool, providing the name, description, an empty input schema, and an inline handler function:

    server.tool(
      "getModelInfo",
      "Get information about the Gemini model being used",
      {},
      async () => {
        return {
          content: [{
            type: "text",
            text: `Using Gemini 2.5 Pro Experimental (${betaModelName})\n\nThis is Google's latest experimental model from the beta API, with:\n- 1,048,576 token input limit\n- 65,536 token output limit\n- Enhanced reasoning capabilities\n- Improved instruction following`
          }]
        };
      }
    );

  • The handler function (shown inline above) returns static information about the Gemini model being used.

MCP directory API

We provide all the information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/dakrin/mcp-gemini-server'
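The same lookup can be sketched in TypeScript. The endpoint mirrors the curl example above; the shape of the JSON response is an assumption (it is not documented on this page), so the result is left untyped:

```typescript
// Hypothetical helper around the Glama MCP directory API endpoint shown above.
const BASE = "https://glama.ai/api/mcp/v1/servers";

// Build the lookup URL for a server slug such as "dakrin/mcp-gemini-server".
function serverInfoUrl(slug: string): string {
  return `${BASE}/${slug}`;
}

// Fetch server metadata; the payload's fields are undocumented here,
// so the result is typed as `unknown`.
async function fetchServerInfo(slug: string): Promise<unknown> {
  const res = await fetch(serverInfoUrl(slug));
  if (!res.ok) throw new Error(`Directory API returned HTTP ${res.status}`);
  return res.json();
}
```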

If you have feedback or need assistance with the MCP directory API, please join our Discord server.