
Gemini MCP Server

An MCP (Model Context Protocol) server that provides access to Google Gemini AI models.

Quick Start

  1. Install dependencies:

npm install

  2. Create a .env file with your Gemini API key:

GEMINI_API_KEY=your_api_key_here

  3. Start the server:

npm run dev

The server will run at http://localhost:3333/mcp
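
To confirm the server is reachable, you can connect with the MCP SDK client and list the tools it exposes. The snippet below is a minimal sketch, assuming the @modelcontextprotocol/sdk package is installed and the server is running on the default port; gemini.generateText should appear in the output.

import { Client } from '@modelcontextprotocol/sdk/client/index.js';
import { StreamableHTTPClientTransport } from '@modelcontextprotocol/sdk/client/streamableHttp.js';

// Connect to the locally running server (default port 3333).
const transport = new StreamableHTTPClientTransport(new URL('http://localhost:3333/mcp'));
const client = new Client({ name: 'smoke-test', version: '1.0.0' }, { capabilities: {} });
await client.connect(transport);

// List the tools exposed by the server.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));

await client.close();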

Available Tool

gemini.generateText

Generate text using Google Gemini models.

Parameters:

  • prompt (string, required): The text prompt

  • model (string, optional): Gemini model to use (default: gemini-2.5-pro)

  • temperature (number, optional): Temperature for generation, 0-2 (default: 1)

Returns:

  • text: Generated text response

  • model: Model used

  • temperature: Temperature setting used
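
Based on the fields above, a successful call yields the generated text together with the model and temperature that were applied. The object below only illustrates that shape (example values, not real output):

// Illustrative shape of the documented return fields; values are made up.
const exampleResult = {
  text: 'AI is software that learns patterns from data and uses them to make predictions.',
  model: 'gemini-2.5-pro',
  temperature: 0.7
};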

Usage Example

import { Client as McpClient } from '@modelcontextprotocol/sdk/client/index.js';
import { StreamableHTTPClientTransport } from '@modelcontextprotocol/sdk/client/streamableHttp.js';

const transport = new StreamableHTTPClientTransport(
  new URL('http://localhost:3333/mcp')
);

const client = new McpClient(
  { name: 'my-client', version: '1.0.0' },
  { capabilities: {} }
);

await client.connect(transport);

const result = await client.callTool({
  name: 'gemini.generateText',
  arguments: {
    prompt: 'Explain AI in simple terms',
    model: 'gemini-2.5-pro',
    temperature: 0.7
  }
});

console.log(result);

await client.close();
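
The result arrives through the standard MCP tool-call response. The following is a hedged sketch of reading the text output; it assumes the server returns it as a text content item, and how the text/model/temperature fields are encoded inside that item is not specified here.

// MCP tool results carry a content array; text output typically arrives as
// { type: 'text', text: string } items. Whether this server nests the
// text/model/temperature fields inside that item is an assumption.
const content = (result as { content?: Array<{ type: string; text?: string }> }).content ?? [];
for (const item of content) {
  if (item.type === 'text') {
    console.log(item.text);
  }
}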

Testing

Run the included test client (the server must be running first):

npm test

Configuration

Environment variables:

  • GEMINI_API_KEY (required): Your Google Gemini API key

  • PORT (optional): Server port (default: 3333)
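
For example, a complete .env for a non-default setup might look like this (placeholder values):

GEMINI_API_KEY=your_api_key_here
PORT=3333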

License

ISC
