ask_gemini
Query Google Gemini AI models (gemini-pro, gemini-1.5-pro, gemini-1.5-flash) through the MCP AI Bridge server, sending a prompt with an adjustable temperature setting to receive contextual responses.
Instructions
Ask Google Gemini AI a question
Input Schema
Name | Required | Description | Default |
---|---|---|---|
model | No | The model to use (default: gemini-pro) | gemini-pro |
prompt | Yes | The prompt to send to Gemini | |
temperature | No | Temperature for response generation (0-1) | |
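The schema above maps onto a simple arguments object. As a minimal sketch, the hypothetical helper below builds and validates such an object on the client side before the tool call; the function name and validation logic are illustrative assumptions, not part of the MCP AI Bridge API.

```python
def build_ask_gemini_args(prompt, model="gemini-pro", temperature=None):
    """Build an arguments dict matching the ask_gemini input schema.

    Hypothetical client-side helper: 'prompt' is required, 'model'
    defaults to gemini-pro, and 'temperature' (if given) must lie in [0, 1].
    """
    if not prompt:
        raise ValueError("prompt is required")
    args = {"prompt": prompt, "model": model}
    if temperature is not None:
        if not 0 <= temperature <= 1:
            raise ValueError("temperature must be between 0 and 1")
        args["temperature"] = temperature
    return args

# Example: ask with the default model and a low temperature
print(build_ask_gemini_args("Summarize MCP in one sentence.", temperature=0.2))
```

A server rejecting out-of-range values would surface the error only after the round trip, so validating the 0-1 temperature bound locally gives faster feedback.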