
Gemini CLI MCP Server

by orzcls

timeout-test

Test timeout prevention mechanisms by running processes for specified durations to verify system stability and response handling.

Instructions

Test timeout prevention by running for a specified duration

Input Schema

Name      Required  Description                               Default
duration  Yes       Duration in milliseconds (minimum 10ms)   —

Implementation Reference

  • The handler implementation for the 'timeout-test' tool. It destructures the duration argument, logs the call, and returns a Promise that resolves after the specified duration using setTimeout, simulating a timeout test.
    case "timeout-test": {
        // Block braces scope the const declaration to this case,
        // avoiding lexical-declaration conflicts with other cases.
        const { duration } = args;
        console.error('[GMCPT] timeout-test tool called with duration: ' + duration + 'ms');

        // Resolve after the requested duration, clamped to the schema's 10ms minimum
        return new Promise((resolve) => {
            setTimeout(() => {
                resolve({
                    content: [{
                        type: "text",
                        text: `Timeout test completed after ${duration}ms`
                    }]
                });
            }, Math.max(10, duration));
        });
    }
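Stripped of the surrounding switch, the handler logic can be exercised as a standalone function. `timeoutTest` below is a hypothetical name used only for illustration; the body mirrors the snippet above.

```javascript
// Standalone sketch of the timeout-test handler logic.
// "timeoutTest" is a hypothetical name, not part of the server code.
function timeoutTest(duration) {
  return new Promise((resolve) => {
    setTimeout(() => {
      resolve({
        content: [{
          type: "text",
          text: `Timeout test completed after ${duration}ms`
        }]
      });
    }, Math.max(10, duration)); // clamp to the schema's 10ms minimum
  });
}

timeoutTest(25).then((result) => {
  console.log(result.content[0].text); // "Timeout test completed after 25ms"
});
```

Note that the reported message always echoes the requested duration, even when the actual wait was clamped up to 10ms.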
  • The schema definition for the 'timeout-test' tool, specifying the name, description, and input schema requiring a 'duration' number parameter (minimum 10ms). This schema is returned by the ListTools endpoint.
    {
        name: "timeout-test",
        description: "Test timeout prevention by running for a specified duration",
        inputSchema: {
            type: "object",
            properties: {
                duration: {
                    type: "number",
                    minimum: 10,
                    description: "Duration in milliseconds (minimum 10ms)"
                }
            },
            required: ["duration"]
        }
    }
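A caller could mirror the schema's constraints before issuing the request. `validateDuration` below is a hypothetical client-side helper, not part of the server code; it only restates the `type: "number"` and `minimum: 10` rules from the schema above.

```javascript
// Hypothetical client-side check mirroring the inputSchema above;
// not part of the server implementation.
function validateDuration(args) {
  if (typeof args.duration !== "number") {
    throw new Error("duration is required and must be a number");
  }
  if (args.duration < 10) {
    throw new Error("duration must be at least 10ms");
  }
  return true;
}

console.log(validateDuration({ duration: 500 })); // true
```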
  • Registration of the ListTools request handler, which exposes the 'timeout-test' tool schema (included in the 'tools' array) to MCP clients.
    server.setRequestHandler(ListToolsRequestSchema, async () => {
        return { tools };
    });
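The handler registration can be sketched without the MCP SDK by reducing the tools array to the single schema shown above; `handleListTools` is a hypothetical stand-in for the registered async handler.

```javascript
// Standalone sketch of the ListTools response shape, with the tools
// array reduced to the one schema shown above.
const tools = [{
  name: "timeout-test",
  description: "Test timeout prevention by running for a specified duration",
  inputSchema: {
    type: "object",
    properties: {
      duration: {
        type: "number",
        minimum: 10,
        description: "Duration in milliseconds (minimum 10ms)"
      }
    },
    required: ["duration"]
  }
}];

// Hypothetical stand-in for the async handler registered above.
async function handleListTools() {
  return { tools };
}

handleListTools().then((response) => {
  console.log(response.tools.map((t) => t.name)); // [ 'timeout-test' ]
});
```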
Behavior: 2/5

Does the description disclose side effects, auth requirements, rate limits, or destructive behavior?

No annotations are provided, so the description carries full burden. It states the tool 'runs for a specified duration' to test timeout prevention, but doesn't disclose behavioral traits such as whether it's safe (non-destructive), what happens after the duration (e.g., returns a result or error), or any side effects. This is inadequate for a tool with potential timing implications.

Agents need to know what a tool does to the world before calling it. Descriptions should go beyond structured annotations to explain consequences.

Conciseness: 5/5

Is the description appropriately sized, front-loaded, and free of redundancy?

The description is a single, efficient sentence: 'Test timeout prevention by running for a specified duration.' It's front-loaded with the core purpose, has zero waste, and is appropriately sized for a simple tool.

Shorter descriptions cost fewer tokens and are easier for agents to parse. Every sentence should earn its place.

Completeness: 2/5

Given the tool's complexity, does the description cover enough for an agent to succeed on first attempt?

Given no annotations and no output schema, the description is incomplete. It doesn't explain what the tool returns (e.g., success/failure, timing data), potential errors, or behavioral details like whether it's idempotent. For a tool that interacts with system timeouts, more context is needed for safe and effective use.

Complex tools with many parameters or behaviors need more documentation. Simple tools need less. This dimension scales expectations accordingly.

Parameters: 3/5

Does the description clarify parameter syntax, constraints, interactions, or defaults beyond what the schema provides?

Schema description coverage is 100%, with the parameter 'duration' fully documented in the schema as 'Duration in milliseconds (minimum 10ms).' The description adds no additional meaning beyond implying the duration controls runtime, so it meets the baseline of 3 where the schema does the heavy lifting.

Input schemas describe structure but not intent. Descriptions should explain non-obvious parameter relationships and valid value ranges.

Purpose: 4/5

Does the description clearly state what the tool does and how it differs from similar tools?

The description clearly states the tool's purpose: 'Test timeout prevention by running for a specified duration.' It includes a specific verb ('test') and resource ('timeout prevention'), but doesn't differentiate from siblings (e.g., 'ping' which might also test system responsiveness). The purpose is unambiguous but lacks sibling comparison.

Agents choose between tools based on descriptions. A clear purpose with a specific verb and resource helps agents select the right tool.

Usage Guidelines: 2/5

Does the description explain when to use this tool, when not to, or what alternatives exist?

No guidance is provided on when to use this tool versus alternatives like 'ping' or other siblings. The description implies usage for testing timeouts but doesn't specify scenarios, prerequisites, or exclusions. This leaves the agent without context for tool selection.

Agents often have multiple tools that could apply. Explicit usage guidance like "use X instead of Y when Z" prevents misuse.

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/orzcls/gemini-mcp-tool-windows-fixed'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.