
Gemini Agent MCP Server

by LeeSinLiang
gemini.py (1.22 kB)
"""This module contains the Gemini-related tools for the MCP server.""" import logging import subprocess logger = logging.getLogger(__name__) def call_gemini(prompt: str, context: str = '') -> dict: """Calls the Gemini CLI with a given prompt and context.""" logger.info("Executing call_gemini with prompt: '%s...'") full_prompt = f"{context}\n\n{prompt}" try: result = subprocess.run( ['gemini', full_prompt], capture_output=True, text=True, check=True, encoding='utf-8') logger.info("call_gemini executed successfully.") return {"response": result.stdout} except FileNotFoundError as exc: logger.error("'gemini' command not found.") raise ValueError( "'gemini' command not found. Make sure the Gemini CLI is installed and in your PATH." ) from exc except subprocess.CalledProcessError as exc: logger.error("Error calling Gemini CLI: %s", exc.stderr) raise RuntimeError(f"Error calling Gemini CLI: {exc.stderr}") from exc except Exception as exc: logger.error("An unexpected error occurred in call_gemini: %s", exc) raise RuntimeError(f"An unexpected error occurred: {exc}") from exc

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/LeeSinLiang/GeminiAgentMCP'
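
For scripted access, the same endpoint can be queried from Python. This is a minimal sketch assuming the standard requests library and that the endpoint returns JSON metadata; variable names are illustrative:

# Minimal sketch of querying the MCP directory API from Python.
import requests

resp = requests.get(
    "https://glama.ai/api/mcp/v1/servers/LeeSinLiang/GeminiAgentMCP",
    timeout=10,
)
resp.raise_for_status()
server_info = resp.json()  # assumes a JSON response describing the server
print(server_info)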

If you have feedback or need assistance with the MCP directory API, please join our Discord server.