
MCP AI Bridge

ask_gemini

Query Google Gemini AI models to generate responses using customizable parameters like model type and temperature via the MCP AI Bridge server.

Instructions

Ask Google Gemini AI a question

Input Schema

Name         Required  Description                                 Default
model        No        The model to use                            gemini-1.5-flash-latest
prompt       Yes       The prompt to send to Gemini                —
temperature  No        Temperature for response generation (0-1)   0.7

Input Schema (JSON Schema)

{
  "type": "object",
  "required": ["prompt"],
  "properties": {
    "model": {
      "type": "string",
      "default": "gemini-1.5-flash-latest",
      "description": "The model to use (default: gemini-1.5-flash-latest)",
      "enum": [
        "gemini-1.5-pro-latest",
        "gemini-1.5-pro-002",
        "gemini-1.5-pro",
        "gemini-1.5-flash-latest",
        "gemini-1.5-flash",
        "gemini-1.5-flash-002",
        "gemini-1.5-flash-8b",
        "gemini-1.0-pro-vision-latest",
        "gemini-pro-vision"
      ]
    },
    "prompt": {
      "type": "string",
      "description": "The prompt to send to Gemini"
    },
    "temperature": {
      "type": "number",
      "default": 0.7,
      "description": "Temperature for response generation (0-1)",
      "minimum": 0,
      "maximum": 1
    }
  }
}
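As an illustrative sketch (not part of the MCP AI Bridge codebase), a client could check arguments against this schema before issuing a standard MCP `tools/call` request. The `build_tool_call` helper below is hypothetical; only the schema fields and the JSON-RPC `tools/call` shape come from the MCP convention.

```python
import json

# Input schema for ask_gemini, reproduced from the listing above.
ASK_GEMINI_SCHEMA = {
    "type": "object",
    "required": ["prompt"],
    "properties": {
        "model": {"type": "string", "default": "gemini-1.5-flash-latest"},
        "prompt": {"type": "string"},
        "temperature": {"type": "number", "minimum": 0, "maximum": 1,
                        "default": 0.7},
    },
}

def build_tool_call(arguments, request_id=1):
    """Validate arguments against the ask_gemini schema, then wrap
    them in an MCP tools/call request (JSON-RPC 2.0 envelope).
    Hypothetical helper for illustration only."""
    # Required fields (only "prompt" per the schema).
    for key in ASK_GEMINI_SCHEMA["required"]:
        if key not in arguments:
            raise ValueError(f"missing required argument: {key}")
    # Range check for temperature, falling back to the schema default.
    temp = arguments.get(
        "temperature",
        ASK_GEMINI_SCHEMA["properties"]["temperature"]["default"])
    if not 0 <= temp <= 1:
        raise ValueError("temperature must be between 0 and 1")
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": "ask_gemini", "arguments": arguments},
    }

request = build_tool_call({
    "prompt": "Summarize the MCP protocol in one sentence.",
    "temperature": 0.2,
})
print(json.dumps(request, indent=2))
```

The resulting payload would be sent over whatever transport the server is configured with (stdio or HTTP, depending on the installation).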

    MCP directory API

    We provide information about every listed MCP server via our MCP directory API.

    curl -X GET 'https://glama.ai/api/mcp/v1/servers/fakoli/mcp-ai-bridge'

    If you have feedback or need assistance with the MCP directory API, please join our Discord server.