Uber Eats MCP Server

remote-capable server

The server can be hosted and run remotely because it primarily relies on remote services or has no dependency on the local environment.

Integrations

  • Enables interaction with Uber Eats, allowing for food ordering and delivery services through the platform


This is a proof of concept (POC) of how you can build an MCP server on top of Uber Eats.

https://github.com/user-attachments/assets/05efbf51-1b95-4bd2-a327-55f1fe2f958b

What is MCP?

The Model Context Protocol (MCP) is an open protocol that enables seamless integration between LLM applications and external tools.
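
Concretely, an MCP server is a process that advertises tools an LLM client can call. As a rough sketch using the official MCP Python SDK's FastMCP helper (the tool name find_menu_options and its body are illustrative, not this repo's actual code):

    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("uber-eats")

    @mcp.tool()
    async def find_menu_options(search_term: str) -> str:
        """Search Uber Eats for restaurants or dishes matching search_term."""
        # In this project the real work would be delegated to a browser
        # automation agent driving Uber Eats; this stub just echoes the query.
        return f"Results for {search_term!r} would come from the browser agent."

    if __name__ == "__main__":
        mcp.run(transport="stdio")  # stdio transport, as this server uses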

Prerequisites

  • Python 3.12 or higher
  • Anthropic API key (or an API key for another supported LLM provider)

Setup

  1. Ensure you have a virtual environment activated:
    uv venv
    source .venv/bin/activate  # On Unix/Mac
  2. Install required packages:
    uv pip install -r requirements.txt
    playwright install
  3. Update the .env file with your API key (the sketch after this list shows one way the server might read it):
    ANTHROPIC_API_KEY=your_anthropic_api_key_here
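
How the key gets from .env into the process is not shown in this README; a common pattern, assuming python-dotenv is among the installed requirements, looks like this:

    # Hypothetical snippet: load the key from .env at startup.
    import os

    from dotenv import load_dotenv

    load_dotenv()  # reads .env from the current working directory
    api_key = os.environ["ANTHROPIC_API_KEY"]  # KeyError here means the key is missing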

Note

Since we're using stdio as the MCP transport, we have to disable all output from browser-use: anything printed to stdout would corrupt the protocol messages the client reads.
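
One way to do that, sketched below, is to raise log levels and keep any remaining output on stderr; the logger names for browser-use and Playwright are assumptions.

    import logging
    import sys

    # stdout carries the JSON-RPC messages for the stdio transport, so all
    # logging must go to stderr and noisy libraries should be silenced.
    logging.basicConfig(level=logging.CRITICAL, stream=sys.stderr)
    for name in ("browser_use", "playwright"):  # logger names are assumptions
        logging.getLogger(name).setLevel(logging.CRITICAL)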

Debugging

You can run the MCP Inspector tool with this command:

uv run mcp dev server.py
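
You can also sanity-check the server from a small stdio client using the MCP Python SDK; the launch command below is an assumption about how server.py is normally started.

    import asyncio

    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    async def main() -> None:
        # Assumed launch command; adjust to however you start server.py.
        params = StdioServerParameters(command="uv", args=["run", "server.py"])
        async with stdio_client(params) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()
                tools = await session.list_tools()
                print("Tools exposed by the server:", [t.name for t in tools.tools])

    asyncio.run(main())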

A proof-of-concept Model Context Protocol server that enables LLM applications to interact with Uber Eats, allowing AI agents to browse and order food through natural language.
