Processes natural language queries using the Vertex AI model's internal knowledge and saves the responses to specified files for structured data retention and future reference.
Retrieves answers to natural language queries using Google Search, processes them with Vertex AI models, and saves the results to a specified file path for easy access and storage.
Generates and saves detailed explanations for software-related queries, grounded in official documentation and powered by Vertex AI Gemini models. Takes a topic, a query, and an output path for the results.
Saves precise code snippets or concise answers from official documentation to a specified file, using a Vertex AI model with Google Search for technical queries. Requires topic, query, and output path.
Answers natural language queries by combining Vertex AI's Gemini model with real-time Google Search results, delivering accurate, up-to-date information on demand.
Provides answers to natural language queries using Vertex AI's internal knowledge, without web search. Input a query string to extract precise information directly from the model.
An implementation of a Model Context Protocol (MCP) server that provides tools for accessing Google Cloud's Vertex AI Gemini models, supporting features such as web-search grounding and direct knowledge answering for coding assistance and general queries.
A server that enables document searching using Vertex AI with Gemini grounding, improving search quality by grounding responses in private data stored in a Vertex AI Datastore.
Enables AI-powered image and video analysis using Google Gemini and Vertex AI models. Supports analyzing single or multiple images, detecting objects with bounding boxes, and analyzing video content through natural language prompts.
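Several of the tools above share the same shape: run a query through a Vertex AI model and persist the answer to a caller-supplied path. A minimal sketch of that pattern is shown below; the function and parameter names (`answer_and_save`, `ask_model`) are illustrative, not the servers' actual API, and the model call is injected as a callable so the sketch can be exercised without cloud credentials.

```python
from pathlib import Path
from typing import Callable

def answer_and_save(query: str, output_path: str,
                    ask_model: Callable[[str], str]) -> str:
    """Run `query` through `ask_model` and write the answer to `output_path`.

    `ask_model` stands in for the Vertex AI Gemini call (with or without
    Google Search grounding); injecting it keeps the file-handling logic
    independent of any cloud SDK.
    """
    answer = ask_model(query)
    path = Path(output_path)
    # Create intermediate directories so an arbitrary output path works.
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(answer, encoding="utf-8")
    return answer
```

In a real server the injected callable would wrap the Vertex AI client; in tests it can be a plain function returning canned text.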