
Gemini MCP Server

__init__.py • 1.01 kB
""" System prompts for Gemini tools """ from .analyze_prompt import ANALYZE_PROMPT from .chat_prompt import CHAT_PROMPT from .codereview_prompt import CODEREVIEW_PROMPT from .consensus_prompt import CONSENSUS_PROMPT from .debug_prompt import DEBUG_ISSUE_PROMPT from .docgen_prompt import DOCGEN_PROMPT from .generate_code_prompt import GENERATE_CODE_PROMPT from .planner_prompt import PLANNER_PROMPT from .precommit_prompt import PRECOMMIT_PROMPT from .refactor_prompt import REFACTOR_PROMPT from .secaudit_prompt import SECAUDIT_PROMPT from .testgen_prompt import TESTGEN_PROMPT from .thinkdeep_prompt import THINKDEEP_PROMPT from .tracer_prompt import TRACER_PROMPT __all__ = [ "THINKDEEP_PROMPT", "CODEREVIEW_PROMPT", "DEBUG_ISSUE_PROMPT", "DOCGEN_PROMPT", "GENERATE_CODE_PROMPT", "ANALYZE_PROMPT", "CHAT_PROMPT", "CONSENSUS_PROMPT", "PLANNER_PROMPT", "PRECOMMIT_PROMPT", "REFACTOR_PROMPT", "SECAUDIT_PROMPT", "TESTGEN_PROMPT", "TRACER_PROMPT", ]

MCP directory API

All information about MCP servers in the directory is available through our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/BeehiveInnovations/gemini-mcp-server'
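The same endpoint can be queried programmatically. The sketch below uses only the Python standard library and assumes the endpoint returns a JSON document; the response fields are not documented in this listing, so it simply prints whatever is returned.

import json
import urllib.request

# Fetch this server's metadata from the Glama MCP directory API.
URL = "https://glama.ai/api/mcp/v1/servers/BeehiveInnovations/gemini-mcp-server"

with urllib.request.urlopen(URL) as response:
    data = json.loads(response.read().decode("utf-8"))

print(json.dumps(data, indent=2))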

If you have feedback or need assistance with the MCP directory API, please join our Discord server.