gemini_startChat

Start a stateful chat session with a Gemini model, returning a unique sessionId for continued interaction. Customize with initial history, generation settings, and safety configurations.

Instructions

Initiates a new stateful chat session with a specified Gemini model. Returns a unique sessionId to be used in subsequent chat messages. Optionally accepts initial conversation history and session-wide generation/safety configurations.

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| generationConfig | No | Optional. Session-wide generation configuration settings. | |
| history | No | Optional. An array of initial conversation turns to seed the chat session. Must alternate between 'user' and 'model' roles, starting with 'user'. | |
| modelName | No | Optional. The name of the Gemini model to use for this chat session (e.g., 'gemini-1.5-flash'). If omitted, the server's default model (from the GOOGLE_GEMINI_MODEL environment variable) is used. | |
| safetySettings | No | Optional. Session-wide safety settings to apply. | |
| tools | No | Optional. A list of tools the model may use during the chat session (currently only function declarations are supported). | |
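
The sketch below shows one way to invoke gemini_startChat from an MCP client using the TypeScript SDK. It is illustrative only: the server launch command, the shape of the history items (assumed here to follow the Gemini Content format), and where the sessionId appears in the tool result are assumptions not specified on this page.

```typescript
// Minimal sketch: start a Gemini chat session via the gemini_startChat tool.
// Assumptions: the server runs over stdio via `node dist/server.js` (hypothetical
// entry point), and history turns use { role, parts: [{ text }] } objects.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  const transport = new StdioClientTransport({
    command: "node",
    args: ["dist/server.js"], // adjust to the actual mcp-gemini-server entry point
  });

  const client = new Client(
    { name: "example-client", version: "1.0.0" },
    { capabilities: {} }
  );
  await client.connect(transport);

  // Start a session seeded with one user/model exchange and session-wide settings.
  const result = await client.callTool({
    name: "gemini_startChat",
    arguments: {
      modelName: "gemini-1.5-flash",
      history: [
        { role: "user", parts: [{ text: "You are a concise assistant." }] },
        { role: "model", parts: [{ text: "Understood." }] },
      ],
      generationConfig: { temperature: 0.7, maxOutputTokens: 1024 },
    },
  });

  // The sessionId is expected in the tool result's content; the exact shape
  // depends on the server's response format.
  console.log(JSON.stringify(result.content, null, 2));

  await client.close();
}

main().catch(console.error);
```

The returned sessionId would then be passed on each subsequent turn to the server's chat-message tool, as described above.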

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/bsmi021/mcp-gemini-server'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.