ask_gemini
Query Google Gemini models through the MCP AI Bridge server, with configurable parameters such as model and temperature.
Instructions
Ask Google Gemini AI a question
Input Schema
Name | Required | Description | Default
---|---|---|---
model | No | The model to use | gemini-1.5-flash-latest
prompt | Yes | The prompt to send to Gemini | |
temperature | No | Temperature for response generation (0–1) | 0.7
Input Schema (JSON Schema)
```json
{
  "type": "object",
  "properties": {
    "model": {
      "type": "string",
      "default": "gemini-1.5-flash-latest",
      "description": "The model to use (default: gemini-1.5-flash-latest)",
      "enum": [
        "gemini-1.5-pro-latest",
        "gemini-1.5-pro-002",
        "gemini-1.5-pro",
        "gemini-1.5-flash-latest",
        "gemini-1.5-flash",
        "gemini-1.5-flash-002",
        "gemini-1.5-flash-8b",
        "gemini-1.0-pro-vision-latest",
        "gemini-pro-vision"
      ]
    },
    "prompt": {
      "type": "string",
      "description": "The prompt to send to Gemini"
    },
    "temperature": {
      "type": "number",
      "default": 0.7,
      "description": "Temperature for response generation (0-1)",
      "minimum": 0,
      "maximum": 1
    }
  },
  "required": [
    "prompt"
  ]
}
```
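As an illustration, an MCP client would invoke this tool with a standard `tools/call` request carrying arguments that satisfy the schema above. The prompt text and parameter values below are hypothetical; only `prompt` is required, and omitted fields fall back to the schema defaults:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "ask_gemini",
    "arguments": {
      "prompt": "Summarize the difference between supervised and unsupervised learning.",
      "model": "gemini-1.5-flash-latest",
      "temperature": 0.2
    }
  }
}
```

A lower `temperature` (closer to 0) makes responses more deterministic; values closer to 1 make them more varied.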