
MCP AI Bridge

by fakoli

ask_gemini

Query Google Gemini AI models (gemini-pro, gemini-1.5-pro, gemini-1.5-flash) through the MCP AI Bridge server, supplying a prompt and an adjustable temperature setting to control the response.

Instructions

Ask Google Gemini AI a question

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| model | No | The model to use (default: gemini-pro) | gemini-pro |
| prompt | Yes | The prompt to send to Gemini | |
| temperature | No | Temperature for response generation (0-1) | 0.7 |

Input Schema (JSON Schema)

{
  "properties": {
    "model": {
      "default": "gemini-pro",
      "description": "The model to use (default: gemini-pro)",
      "enum": ["gemini-pro", "gemini-1.5-pro", "gemini-1.5-flash"],
      "type": "string"
    },
    "prompt": {
      "description": "The prompt to send to Gemini",
      "type": "string"
    },
    "temperature": {
      "default": 0.7,
      "description": "Temperature for response generation (0-1)",
      "maximum": 1,
      "minimum": 0,
      "type": "number"
    }
  },
  "required": ["prompt"],
  "type": "object"
}
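The schema above can be exercised with a standard MCP `tools/call` request. The following is a minimal sketch, assuming the usual JSON-RPC shape of an MCP tool invocation; `build_ask_gemini_call` is a hypothetical helper, not part of the server's API, and the argument checks simply mirror the published input schema.

```python
import json

# Allowed models and defaults, taken from the ask_gemini input schema above.
ALLOWED_MODELS = {"gemini-pro", "gemini-1.5-pro", "gemini-1.5-flash"}


def build_ask_gemini_call(prompt, model="gemini-pro", temperature=0.7):
    """Build an MCP tools/call JSON-RPC request for ask_gemini.

    Validates arguments against the tool's input schema before sending.
    """
    if model not in ALLOWED_MODELS:
        raise ValueError(f"unknown model: {model}")
    if not 0 <= temperature <= 1:
        raise ValueError("temperature must be between 0 and 1")
    return {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "tools/call",
        "params": {
            "name": "ask_gemini",
            "arguments": {
                "model": model,
                "prompt": prompt,
                "temperature": temperature,
            },
        },
    }


request = build_ask_gemini_call(
    "Summarize the MCP protocol in one sentence.",
    model="gemini-1.5-flash",
    temperature=0.2,
)
print(json.dumps(request, indent=2))
```

Because `prompt` is the only required property, calling `build_ask_gemini_call("...")` with no other arguments falls back to the schema defaults (`gemini-pro`, temperature 0.7).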

You must be authenticated to use this tool.


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/fakoli/mcp-ai-bridge'
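The same endpoint can be queried from Python using only the standard library. This is a minimal sketch; `server_url` and `fetch_server_info` are hypothetical helper names, and the shape of the JSON response is not specified here.

```python
import json
import urllib.request


def server_url(author: str, server: str) -> str:
    # Build the Glama MCP directory endpoint for a given server entry.
    return f"https://glama.ai/api/mcp/v1/servers/{author}/{server}"


def fetch_server_info(author: str, server: str) -> dict:
    # GET the server record; the response body is a JSON document.
    with urllib.request.urlopen(server_url(author, server)) as resp:
        return json.loads(resp.read().decode("utf-8"))


# Same request as the curl example above:
print(server_url("fakoli", "mcp-ai-bridge"))
```

Running `fetch_server_info("fakoli", "mcp-ai-bridge")` performs the same GET as the curl command above.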

If you have feedback or need assistance with the MCP directory API, please join our Discord server.