mobile_click_on_screen_at_coordinates

Simulate screen taps at specific coordinates for mobile automation testing and interaction. Use it together with element-detection tools to find the precise position to tap.

Instructions

Click on the screen at given x,y coordinates. If clicking on an element, use the list_elements_on_screen tool to find the coordinates.

Input Schema

Name | Required | Description | Default
device | Yes | The device identifier to use. Use mobile_list_available_devices to find which devices are available to you. | none
x | Yes | The x coordinate to click on the screen, in pixels | none
y | Yes | The y coordinate to click on the screen, in pixels | none
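
As a usage illustration, the call below exercises this schema from an MCP client. This is a minimal sketch using the MCP TypeScript SDK; the server launch command, package name, and device identifier are placeholders, not values taken from this page.

    import { Client } from "@modelcontextprotocol/sdk/client/index.js";
    import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

    async function main() {
      // Placeholder launch command for the mobile MCP server (adjust to your setup).
      const transport = new StdioClientTransport({
        command: "npx",
        args: ["-y", "mobile-mcp"], // hypothetical package name
      });

      const client = new Client({ name: "example-client", version: "1.0.0" });
      await client.connect(transport);

      // Tap a specific device's screen at x=540, y=960 (pixels).
      const result = await client.callTool({
        name: "mobile_click_on_screen_at_coordinates",
        arguments: {
          device: "emulator-5554", // obtained from mobile_list_available_devices
          x: 540,
          y: 960,
        },
      });
      console.log(result);

      await client.close();
    }

    main().catch(console.error);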

Implementation Reference

  • src/server.ts:263-275 (registration)
    Registers the MCP tool 'mobile_click_on_screen_at_coordinates' with Zod input schema for x and y coordinates (numbers in pixels). The handler requires a selected robot/device and delegates the tap action to the platform-specific Robot implementation, returning a confirmation message.
    tool( "mobile_click_on_screen_at_coordinates", "Click on the screen at given x,y coordinates. If clicking on an element, use the list_elements_on_screen tool to find the coordinates.", { x: z.number().describe("The x coordinate to click on the screen, in pixels"), y: z.number().describe("The y coordinate to click on the screen, in pixels"), }, async ({ x, y }) => { requireRobot(); await robot!.tap(x, y); return `Clicked on screen at coordinates: ${x}, ${y}`; } );
  • Definition of the tap method on the Robot abstraction, which all platform implementations (Android, iOS, Simulator) must provide. This is the core abstraction used by the tool handler.

      tap(x: number, y: number): Promise<void>;
  • Android-specific handler implementation: uses the ADB shell 'input tap' command with the provided x,y coordinates. A hedged sketch of the underlying adb invocation appears after this list.

      public async tap(x: number, y: number): Promise<void> {
        this.adb("shell", "input", "tap", `${x}`, `${y}`);
      }
  • iOS physical device handler: delegates tap to WebDriverAgent after ensuring tunnel and WDA are running.
      public async tap(x: number, y: number): Promise<void> {
        const wda = await this.wda();
        await wda.tap(x, y);
      }
  • Core tap implementation for iOS/Simulator via WebDriverAgent: sends pointer actions (move to x,y, down, pause 100ms, up) to WDA /actions endpoint within a session.
      public async tap(x: number, y: number) {
        await this.withinSession(async sessionUrl => {
          const url = `${sessionUrl}/actions`;
          await fetch(url, {
            method: "POST",
            headers: {
              "Content-Type": "application/json",
            },
            body: JSON.stringify({
              actions: [
                {
                  type: "pointer",
                  id: "finger1",
                  parameters: { pointerType: "touch" },
                  actions: [
                    { type: "pointerMove", duration: 0, x, y },
                    { type: "pointerDown", button: 0 },
                    { type: "pause", duration: 100 },
                    { type: "pointerUp", button: 0 }
                  ]
                }
              ]
            }),
          });
        });
      }
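
The bullets above show the layering: the tool handler calls Robot.tap, and each platform supplies its own implementation. As a rough illustration of how the Android path bottoms out in ADB, here is a hedged sketch of a minimal tap implementation using Node's child_process; the AdbRobot name, the serial field, and the direct execFile call are illustrative assumptions, not code taken from the repository.

    import { execFile } from "node:child_process";
    import { promisify } from "node:util";

    const execFileAsync = promisify(execFile);

    // Hypothetical minimal implementation of the tap contract shown above.
    // Assumes `adb` is on PATH and `serial` identifies the target device.
    class AdbRobot {
      constructor(private serial: string) {}

      // Equivalent shell command: adb -s <serial> shell input tap <x> <y>
      async tap(x: number, y: number): Promise<void> {
        await execFileAsync("adb", [
          "-s", this.serial,
          "shell", "input", "tap",
          `${Math.round(x)}`, `${Math.round(y)}`,
        ]);
      }
    }

    // Example: await new AdbRobot("emulator-5554").tap(540, 960);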

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/EmpathySlainLovers/MCP'
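
The same lookup can be done from TypeScript; a minimal sketch, assuming the endpoint requires no authentication beyond what the curl example shows:

    // Fetch this server's MCP directory entry (same endpoint as the curl example above).
    const response = await fetch("https://glama.ai/api/mcp/v1/servers/EmpathySlainLovers/MCP");
    const entry = await response.json();
    console.log(entry);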

If you have feedback or need assistance with the MCP directory API, please join our Discord server.