
Gemini MCP Server

by InfolabAI

run_gemini

Execute prompts with Gemini AI by specifying a prompt; a file, directory, or URL path; and a working directory. This lets Gemini analyze large files efficiently while optimizing token usage.

Instructions

Executes a prompt using Gemini.

Args:
  prompt: Prompt to pass to Gemini
  file_dir_url_path: File, directory, or URL path to analyze
  working_directory: Working directory (required)

Returns:
  dict: Execution result or error message

Input Schema

Name               Required  Description                              Default
file_dir_url_path  Yes       File, directory, or URL path to analyze  -
prompt             Yes       Prompt to pass to Gemini                 -
working_directory  Yes       Working directory                        -

Input Schema (JSON Schema)

{
  "properties": {
    "file_dir_url_path": {
      "title": "File Dir Url Path",
      "type": "string"
    },
    "prompt": {
      "title": "Prompt",
      "type": "string"
    },
    "working_directory": {
      "title": "Working Directory",
      "type": "string"
    }
  },
  "required": ["prompt", "file_dir_url_path", "working_directory"],
  "title": "run_geminiArguments",
  "type": "object"
}
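As a minimal sketch, a client can check a `run_gemini` argument payload against the schema's `required` list before invoking the tool. The `validate_args` helper below is hypothetical, not part of the server; it only illustrates what the schema demands of a payload.

```python
# Required fields taken from the published run_geminiArguments schema.
RUN_GEMINI_REQUIRED = ["prompt", "file_dir_url_path", "working_directory"]

def validate_args(args: dict) -> list:
    """Return the required fields missing from `args` (hypothetical helper)."""
    return [field for field in RUN_GEMINI_REQUIRED if field not in args]

# A complete payload: all three fields are required by the schema.
payload = {
    "prompt": "Summarize this file",
    "file_dir_url_path": "./README.md",
    "working_directory": ".",
}
print(validate_args(payload))           # complete payload, nothing missing
print(validate_args({"prompt": "hi"}))  # incomplete payload
```

A server implementing this schema would reject the second payload, since `file_dir_url_path` and `working_directory` are both required.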


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/InfolabAI/gemini-cli-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.