
MCP with Gemini Integration

by ImDPS


This project implements a Model Context Protocol (MCP) server with Google Gemini LLM integration, providing a flexible framework for building AI-powered applications.

Project Structure

.
├── .venv/                      # Virtual environment (gitignored)
├── client-server/              # MCP client and server implementation
│   ├── client-sse.py           # SSE client
│   ├── client-stdio.py         # stdio client
│   └── server.py               # MCP server
├── gemini-llm-integration/     # Gemini LLM integration
│   ├── client-simple.py        # Simple Gemini client
│   ├── server.py               # Gemini server implementation
│   └── data/                   # Knowledge base and data files
├── .env                        # Environment variables
├── .env.example                # Example environment variables
├── requirements.txt            # Project dependencies
└── test_gemini.py              # Test script for Gemini API

Prerequisites

  • Python 3.8+
  • uv package manager (pip install uv)
  • Google Gemini API key (for Gemini integration)

Setup

  1. Clone the repository and navigate to the project directory.
  2. Create and activate a virtual environment:
    uv venv .venv
    source .venv/bin/activate  # On Windows: .venv\Scripts\activate
  3. Install dependencies:
    uv pip install -r requirements.txt
  4. Copy .env.example to .env and update with your API keys:
    cp .env.example .env
    # Edit .env with your API keys
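The servers read their API keys from the environment at startup. A minimal sketch of validating a key before launch (the variable name GEMINI_API_KEY is an assumption here; check .env.example for the names this project actually uses):

```python
import os

def require_env(name: str) -> str:
    """Return an environment variable's value, or fail with a clear message."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"{name} is not set; copy .env.example to .env and fill it in")
    return value

# Example (hypothetical variable name):
# api_key = require_env("GEMINI_API_KEY")
```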

Running the Project

MCP Server

  1. Start the MCP server:
    cd client-server
    python server.py
  2. In a separate terminal, run a client:
    # For SSE client
    python client-sse.py
    # For stdio client
    python client-stdio.py
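Under the hood, the stdio client exchanges newline-delimited JSON-RPC messages with the server over the child process's standard streams (the MCP SDK handles the real framing and handshake). A stripped-down illustration of that transport, not the project's actual protocol code:

```python
import json
import subprocess
import sys

# Toy "server": echoes a JSON-RPC response for each request line on stdin.
CHILD = r'''
import json, sys
for line in sys.stdin:
    req = json.loads(line)
    print(json.dumps({"jsonrpc": "2.0", "id": req["id"], "result": "pong"}), flush=True)
'''

def ping_once() -> str:
    """Send one JSON-RPC request over stdio and return the result field."""
    proc = subprocess.Popen(
        [sys.executable, "-c", CHILD],
        stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True,
    )
    request = {"jsonrpc": "2.0", "id": 1, "method": "ping"}
    proc.stdin.write(json.dumps(request) + "\n")
    proc.stdin.flush()
    response = json.loads(proc.stdout.readline())
    proc.terminate()
    return response["result"]
```

The SSE client works the same way conceptually, but the transport is an HTTP connection with server-sent events instead of pipes.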

Gemini Integration

  1. Start the Gemini server:
    cd gemini-llm-integration
    python server.py
  2. Run the Gemini client:
    python client-simple.py
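client-simple.py presumably wraps a call to the Gemini API. As a hedged sketch of what such a call can look like against Google's public REST endpoint (the model name, endpoint version, and response shape below follow the v1beta generateContent API and are not taken from this repo):

```python
import json
import os
import urllib.request

# Illustrative endpoint and model; check client-simple.py for what the
# project actually uses.
API_URL = "https://generativelanguage.googleapis.com/v1beta/models/{model}:generateContent"

def build_request(prompt: str, model: str = "gemini-1.5-flash"):
    """Build the URL and JSON body for a generateContent call."""
    url = API_URL.format(model=model) + "?key=" + os.environ.get("GEMINI_API_KEY", "")
    body = {"contents": [{"parts": [{"text": prompt}]}]}
    return url, json.dumps(body).encode()

def generate(prompt: str) -> str:
    """POST the request and extract the first candidate's text."""
    url, data = build_request(prompt)
    req = urllib.request.Request(url, data=data,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        payload = json.load(resp)
    return payload["candidates"][0]["content"]["parts"][0]["text"]
```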

Development

  • Format code:
    black .
    isort .
  • Run tests:
    pytest
  • Type checking:
    mypy .

License

[Specify your license here]

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Commit your changes
  4. Push to the branch
  5. Create a new Pull Request