Vertex AI MCP Server

save_answer_query_direct

Answers natural language queries using the internal knowledge of the configured Vertex AI model and saves each response to a specified file for structured data retention and future reference.

Instructions

Answers a natural language query using only the internal knowledge of the configured Vertex AI model (gemini-2.5-pro-exp-03-25); it does not use web search. The answer is saved to a file. Requires the 'query' and 'output_path' arguments.

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| output_path | Yes | The relative path where the generated answer should be saved. | |
| query | Yes | The natural language question to answer using only the model's internal knowledge. | |

Input Schema (JSON Schema)

```json
{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "additionalProperties": false,
  "properties": {
    "output_path": {
      "description": "The relative path where the generated answer should be saved.",
      "type": "string"
    },
    "query": {
      "description": "The natural language question to answer using only the model's internal knowledge.",
      "type": "string"
    }
  },
  "required": ["query", "output_path"],
  "type": "object"
}
```
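As a sketch of how a client would invoke this tool, the MCP protocol wraps tool calls in a JSON-RPC 2.0 `tools/call` request, with the schema fields above passed under `arguments`. The `query` and `output_path` values below are illustrative placeholders, not values from this server's documentation:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "save_answer_query_direct",
    "arguments": {
      "query": "What is the difference between a mutex and a semaphore?",
      "output_path": "answers/mutex-vs-semaphore.md"
    }
  }
}
```

Because `additionalProperties` is `false` and both fields are listed in `required`, a request omitting either field, or including extra ones, should fail schema validation before the model is queried.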

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/shariqriazz/vertex-ai-mcp-server'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.