
Gemini MCP Server

gemini_chat

Send messages to Google Gemini and receive responses. Supports multi-turn conversations, file analysis (text, code, and images), and customizable parameters such as temperature and output format.

Instructions

Send a message to Google Gemini and get a response.

Args:

params (GeminiChatInput): Chat parameters including:
- prompt (str): The prompt to send
- file (Optional[list[str]]): Files to include (text, code, images)
- session_id (Optional[str]): Session ID for multi-turn chat; use 'last' for the most recent session
- model (Optional[str]): Override model selection
- system_prompt (Optional[str]): System context
- temperature (Optional[float]): Creativity (0.0-2.0)
- max_tokens (Optional[int]): Max response length
- response_format: Output format, 'markdown' or 'json'

Returns:

str: Response, including a SESSION_ID for continuation.

Examples:
- Simple: prompt="What is AI?"
- With file: prompt="Review", file=["main.py"]
- With image: prompt="Describe", file=["photo.jpg"]
- Continue: prompt="Tell me more", session_id="last"
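As a client, the parameters above are passed as a single `params` argument object inside an MCP `tools/call` request. The JSON-RPC envelope below follows the MCP specification; the helper function and the exact nesting of the argument object are an illustrative sketch, not taken from this server's source.

```python
import json

def build_gemini_chat_request(prompt, file=None, session_id=None,
                              temperature=None, request_id=1):
    """Sketch of a JSON-RPC 2.0 request invoking the gemini_chat tool.

    The "params" key inside "arguments" mirrors the GeminiChatInput
    schema described above; optional fields are included only when set.
    """
    arguments = {"params": {"prompt": prompt}}
    if file is not None:
        arguments["params"]["file"] = file
    if session_id is not None:
        arguments["params"]["session_id"] = session_id
    if temperature is not None:
        arguments["params"]["temperature"] = temperature
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": "gemini_chat", "arguments": arguments},
    }

# Example: the "With file" call from the docstring above.
request = build_gemini_chat_request("Review", file=["main.py"])
print(json.dumps(request, indent=2))
```

In practice an MCP client library (or the host application) constructs and sends this envelope for you; the sketch only shows what reaches the server on the wire.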

Input Schema

Name    Required    Description    Default
params  Yes

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/xumingjun5208/aistudio-gemini-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.