get_drive_revisions

List revision history for a Drive file to find revision IDs and audit who changed what, including timestamps and user info.

Instructions

List the revision history for a Drive file, newest first.

Returns each revision's ID, modification timestamp, last-modifying user (display name + email), size in bytes (when available), MIME type, and whether it's pinned via keepForever. Use this to discover revision IDs before calling restore_drive_revision, or to audit who changed what.
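The per-revision fields listed above map directly onto the Drive v3 revision resource. As a minimal sketch (assuming the field names of a raw `revisions.list` response; this helper is illustrative, not part of the tool), the summary could be built like this:

```python
def summarize_revisions(revisions):
    """Return revision summaries sorted newest first.

    `revisions` is the `revisions` array from a Drive v3
    `revisions.list` response; field names follow that API.
    """
    out = []
    for rev in revisions:
        user = rev.get("lastModifyingUser", {})
        out.append({
            "id": rev["id"],
            "modified": rev.get("modifiedTime", ""),
            "user": "{} <{}>".format(
                user.get("displayName", "?"), user.get("emailAddress", "?")
            ),
            # size is absent for Google-native files (Docs, Sheets, Slides)
            "size_bytes": int(rev["size"]) if "size" in rev else None,
            "mime_type": rev.get("mimeType"),
            "pinned": rev.get("keepForever", False),
        })
    # RFC 3339 timestamps in a uniform format sort correctly as strings
    out.sort(key=lambda r: r["modified"], reverse=True)
    return out
```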

Requires OAuth scope: https://www.googleapis.com/auth/drive.readonly (or broader). Read-only.

Limitation: Google-native files (Docs, Sheets, Slides) expose revisions in the API list but their binary content is not retrievable — only non-native files (PDF, DOCX, images, etc.) support content restore. By default, Drive retains up to 100 revisions or 30 days, whichever comes first, unless a revision is pinned (keepForever: true).
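The Google-native limitation can be checked up front, since native files use MIME types under the `application/vnd.google-apps.` prefix. A hedged sketch (the prefix check is a common heuristic for distinguishing native files, not something this tool exposes):

```python
GOOGLE_NATIVE_PREFIX = "application/vnd.google-apps."

def is_content_restorable(mime_type):
    """True if a revision's binary content can be downloaded or restored.

    Google-native files (Docs, Sheets, Slides) expose revision metadata
    but not revision content; non-native files (PDF, DOCX, images, ...)
    support content restore.
    """
    return not mime_type.startswith(GOOGLE_NATIVE_PREFIX)
```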

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| user_google_email | Yes | | |
| file_id | Yes | Drive file ID (from a file URL like `drive.google.com/file/d/<file_id>/view`, or from `search_drive_files`, or from `get_drive_file_metadata`). | |
| page_size | No | Maximum number of revisions to return. Clamped to `[1, 1000]`. No pagination token support in this tool — if the file has more than `page_size` revisions, only the most recent are returned. | `25` |
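The documented `page_size` rules (default 25, clamped to [1, 1000]) can be mirrored client-side with a one-liner; a minimal sketch of that behavior:

```python
def effective_page_size(page_size=None):
    """Apply the documented page_size rules: default 25, clamp to [1, 1000]."""
    if page_size is None:
        return 25
    return max(1, min(1000, page_size))
```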

Output Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| result | Yes | | |
Behavior: 4/5

Does the description disclose side effects, auth requirements, rate limits, or destructive behavior?

No annotations provided, so description carries full burden. Discloses read-only nature via OAuth scope, limitations on content retrieval, retention policy, and default page_size behavior with no pagination token support. Adds significant transparency beyond structured fields.

Agents need to know what a tool does to the world before calling it. Descriptions should go beyond structured annotations to explain consequences.

Conciseness: 4/5

Is the description appropriately sized, front-loaded, and free of redundancy?

Description is well-structured and informative at about 150 words. Every sentence adds value, no fluff. Could be slightly tighter but remains clear and front-loaded.

Shorter descriptions cost fewer tokens and are easier for agents to parse. Every sentence should earn its place.

Completeness: 5/5

Given the tool's complexity, does the description cover enough for an agent to succeed on first attempt?

For a list tool with an output schema (not shown but referenced), the description provides complete context: purpose, parameters, behavioral traits, limitations, sibling reference, and security/scope info. No gaps identified.

Complex tools with many parameters or behaviors need more documentation. Simple tools need less. This dimension scales expectations accordingly.

Parameters: 3/5

Does the description clarify parameter syntax, constraints, interactions, or defaults beyond what the schema provides?

Schema coverage is 67% (2 of 3 params described). Description adds no additional parameter details beyond schema; however, schema descriptions for file_id and page_size are detailed. The missing user_google_email parameter is not addressed. Baseline 3 is appropriate since schema does most work but leaves a gap.

Input schemas describe structure but not intent. Descriptions should explain non-obvious parameter relationships and valid value ranges.

Purpose: 5/5

Does the description clearly state what the tool does and how it differs from similar tools?

Clearly states it lists revision history for a Drive file, newest first, and lists specific return fields (ID, timestamp, user info, size, MIME type, pinned status). Distinguishes itself from sibling tool restore_drive_revision by stating its purpose is to discover revision IDs before calling that tool.

Agents choose between tools based on descriptions. A clear purpose with a specific verb and resource helps agents select the right tool.

Usage Guidelines: 4/5

Does the description explain when to use this tool, when not to, or what alternatives exist?

Explicitly states when to use it (discovering revision IDs before restore, auditing changes) and provides OAuth scope. Mentions limitations (Google-native files not restorable, retention policy). Missing explicit 'when not to use' but context makes it clear.

Agents often have multiple tools that could apply. Explicit usage guidance like "use X instead of Y when Z" prevents misuse.
