Retrieves answers to natural language queries by processing Google Search results with Vertex AI models and saves the output to a specified file path for later reference.
Answers natural language queries by combining Vertex AI's Gemini model with real-time Google Search results, delivering accurate and up-to-date information on demand.
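The grounding flow behind these query tools can be sketched with the google-genai Python SDK; the project, location, model, and query below are placeholder values, not the actual configuration of any of these servers.

```python
# Minimal sketch (assumed google-genai SDK): answer a query with Gemini
# grounded in live Google Search results via Vertex AI.
from google import genai
from google.genai import types

# Placeholder project and location; real servers read these from configuration.
client = genai.Client(vertexai=True, project="my-project", location="us-central1")

response = client.models.generate_content(
    model="gemini-2.0-flash",  # placeholder model name
    contents="What changed in the latest stable release of FastAPI?",
    config=types.GenerateContentConfig(
        # Attaching the Google Search tool grounds the answer in web results.
        tools=[types.Tool(google_search=types.GoogleSearch())],
    ),
)
print(response.text)
```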
Saves precise code snippets or concise answers from official documentation to a specified file, using a Vertex AI model with Google Search for technical queries. Requires topic, query, and output path.
Processes natural language queries using the Vertex AI model's internal knowledge (without web search) and saves responses to specified files for retention and future reference.
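A save-to-file variant of the same pattern might look like the sketch below; no search tool is attached, so the answer comes from the model's own knowledge, and the helper name, model, and paths are illustrative rather than any server's real interface.

```python
# Minimal sketch: answer from model knowledge alone and persist the result.
from pathlib import Path

from google import genai

client = genai.Client(vertexai=True, project="my-project", location="us-central1")


def answer_query_direct_and_save(query: str, output_path: str) -> str:
    """Hypothetical helper: query the model directly and write the answer to a file."""
    response = client.models.generate_content(
        model="gemini-2.0-flash",  # placeholder model name
        contents=query,
    )
    answer = response.text or ""
    path = Path(output_path)
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(answer, encoding="utf-8")
    return answer


answer_query_direct_and_save(
    "Summarize asyncio task group semantics in Python 3.11+.",
    "answers/asyncio_task_groups.md",
)
```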
Provides AI-powered search and documentation tools using Google Vertex AI or the Gemini API with real-time web search grounding, enabling technical queries, code analysis, documentation retrieval, and architecture recommendations that overcome LLM knowledge gaps.
An implementation of a Model Context Protocol (MCP) server that provides tools for accessing Google Cloud's Vertex AI Gemini models, supporting web search grounding and direct knowledge answering for coding assistance and general queries.
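Exposed over MCP, such a tool could be registered roughly as follows using the Python MCP SDK's FastMCP helper; the server and tool names are illustrative, not those of the servers described here.

```python
# Minimal sketch of an MCP server exposing one web-grounded query tool.
from google import genai
from google.genai import types
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("vertex-ai-query")  # illustrative server name
client = genai.Client(vertexai=True, project="my-project", location="us-central1")


@mcp.tool()
def answer_query_websearch(query: str) -> str:
    """Answer a natural language query with Gemini grounded in Google Search."""
    response = client.models.generate_content(
        model="gemini-2.0-flash",  # placeholder model name
        contents=query,
        config=types.GenerateContentConfig(
            tools=[types.Tool(google_search=types.GoogleSearch())],
        ),
    )
    return response.text or ""


if __name__ == "__main__":
    mcp.run()  # stdio transport by default
```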
A server that enables document search using Vertex AI with Gemini grounding, improving results by anchoring responses in private data stored in a Vertex AI Datastore.
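Grounding in a private data store rather than the public web follows the same request shape; the sketch below assumes the google-genai SDK and uses a placeholder Vertex AI Search data store resource name.

```python
# Minimal sketch: ground Gemini answers in a private Vertex AI Search data store.
from google import genai
from google.genai import types

client = genai.Client(vertexai=True, project="my-project", location="us-central1")

# Placeholder data store resource name.
DATASTORE = (
    "projects/my-project/locations/global/"
    "collections/default_collection/dataStores/my-datastore"
)

response = client.models.generate_content(
    model="gemini-2.0-flash",  # placeholder model name
    contents="What does our internal style guide say about structured logging?",
    config=types.GenerateContentConfig(
        tools=[
            types.Tool(
                retrieval=types.Retrieval(
                    vertex_ai_search=types.VertexAISearch(datastore=DATASTORE)
                )
            )
        ],
    ),
)
print(response.text)
```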