Gemini MCP Server

by dakrin

getModelInfo

Returns details about the Gemini model this server is configured to use, including its name, token limits, and capabilities, for verification and integration purposes.

Instructions

Get information about the Gemini model being used

Input Schema

This tool takes no arguments.

Implementation Reference

  • src/index.ts:310-322 (registration)
    Registration of the getModelInfo tool, including the inline handler that returns static text about the Gemini model in use, referencing the betaModelName constant.
    server.tool(
      "getModelInfo",
      "Get information about the Gemini model being used",
      {},
      async () => {
        return {
          content: [{
            type: "text",
            text: `Using Gemini 2.5 Pro Experimental (${betaModelName})\n\nThis is Google's latest experimental model from the beta API, with:\n- 1,048,576 token input limit\n- 65,536 token output limit\n- Enhanced reasoning capabilities\n- Improved instruction following`
          }]
        };
      }
    );
  • Constant defining the model name used in the getModelInfo response.
    const betaModelName = 'models/gemini-2.5-pro-exp-03-25';
  • Empty schema (no parameters) for the getModelInfo tool.
    {},
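
For context, here is a minimal client-side sketch of calling this tool with the MCP TypeScript SDK. The client name and the command/script path used to spawn the server are assumptions; point them at however you actually run this server.

    import { Client } from "@modelcontextprotocol/sdk/client/index.js";
    import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

    // Spawn the server over stdio; adjust command/args to your local build (hypothetical path).
    const transport = new StdioClientTransport({
      command: "node",
      args: ["dist/index.js"],
    });

    const client = new Client({ name: "example-client", version: "1.0.0" });
    await client.connect(transport);

    // getModelInfo takes no arguments, so pass an empty arguments object.
    const result = await client.callTool({ name: "getModelInfo", arguments: {} });
    console.log(result.content); // [{ type: "text", text: "Using Gemini 2.5 Pro Experimental (...)" }]

    await client.close();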

MCP directory API

We provide all the information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/dakrin/mcp-gemini-server'
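
The same endpoint can be queried from code; here is a minimal sketch using the standard fetch API. The response shape is not documented here, so inspect the returned JSON before relying on specific fields.

    // Query the Glama MCP directory API for this server's metadata.
    const res = await fetch(
      "https://glama.ai/api/mcp/v1/servers/dakrin/mcp-gemini-server"
    );
    if (!res.ok) {
      throw new Error(`Directory API request failed: ${res.status}`);
    }
    const server = await res.json();
    console.log(server);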

If you have feedback or need assistance with the MCP directory API, please join our Discord server.