
Todo MCP

A persistent todo list accessible from any AI assistant that supports MCP. One server, every client.

Works with Claude (chat, Code, Cowork), Codex, and any MCP-compatible tool. ChatGPT support is possible too, with one current auth caveat noted below.


Quick start (local)

git clone https://github.com/YOUR_USER/todo-mcp.git
cd todo-mcp
cp .env.example .env
docker compose up -d

MCP endpoint: http://localhost:8000/mcp
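Before wiring up any client, you can confirm the container is up via the public /health endpoint (the same one the Docker healthcheck uses):

```shell
# Should return a success response once the container reports healthy
curl -s http://localhost:8000/health
```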


Deploy on a remote server

# 1. Copy the folder to your server
scp -r ~/Downloads/todo-mcp user@YOUR_SERVER:~/todo-mcp

# 2. SSH in and configure
ssh user@YOUR_SERVER
cd ~/todo-mcp
cp .env.example .env

# 3. Generate an auth token and add it to .env
python3 -c "import secrets; print(secrets.token_urlsafe(32))"
# → paste the output as AUTH_TOKEN=... in .env

# 4. Start
docker compose up -d

HTTP endpoint: http://YOUR_SERVER:8000/mcp


HTTPS with Let's Encrypt

Self-signed certs are blocked by many corporate firewalls (e.g. FortiGuard). Use Let's Encrypt for a free, trusted certificate.

1. Point a DNS A record at your server:

   Type   Name   Value
   A      todo   YOUR_SERVER_IP

2. Get a certificate (requires port 80 free temporarily):

certbot certonly --standalone -d todo.yourdomain.com

If port 80 is occupied, stop the service that is using it, run certbot, then start that service again.
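For example, on a host where a system nginx already holds port 80 (an assumption — substitute whatever lsof actually reports on your machine):

```shell
# See what is listening on port 80
sudo lsof -i :80 -sTCP:LISTEN

# Free the port, issue the cert, then bring the service back
sudo systemctl stop nginx
sudo certbot certonly --standalone -d todo.yourdomain.com
sudo systemctl start nginx
```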

3. Edit nginx.conf — replace yourdomain.com with your actual domain in:

  • server_name

  • ssl_certificate

  • ssl_certificate_key

4. Start with HTTPS:

docker compose --profile https up -d

If port 443 is already taken, set HTTPS_PORT=8443 in .env. Your endpoint becomes https://todo.yourdomain.com:8443/mcp.

5. Open the firewall if needed:

ufw allow 443/tcp   # or 8443/tcp if using a custom port

Note: The nginx container mounts all of /etc/letsencrypt (not just live/), because Let's Encrypt certs are symlinks into archive/. Mounting only live/ breaks symlink resolution inside the container.

HTTPS endpoint: https://todo.yourdomain.com/mcp
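After step 3, the relevant part of nginx.conf looks roughly like this. This is a sketch of the shape only — the upstream container name (`todo-mcp` here) and exact proxy settings come from the repo's actual config, which may differ:

```nginx
server {
    listen 443 ssl;
    server_name todo.yourdomain.com;

    # Let's Encrypt paths; these are symlinks into /etc/letsencrypt/archive/
    ssl_certificate     /etc/letsencrypt/live/todo.yourdomain.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/todo.yourdomain.com/privkey.pem;

    location /mcp {
        proxy_pass http://todo-mcp:8000/mcp;
        proxy_set_header Host $host;
        # streamable HTTP responses may be server-sent events; don't buffer them
        proxy_buffering off;
    }
}
```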


Connect your AI clients

Use the full MCP endpoint URL here: https://todo.yourdomain.com/mcp

For the companion slash commands / skill in skills/, use the base server URL without /mcp, for example: https://todo.yourdomain.com

Claude (claude.ai)

Settings → Integrations → Add custom integration

  • URL: https://todo.yourdomain.com/mcp

  • If auth is enabled, add header: Authorization: Bearer YOUR_TOKEN

Claude Code

Add to ~/.claude.json:

{
  "mcpServers": {
    "todo": {
      "type": "url",
      "url": "https://todo.yourdomain.com/mcp",
      "headers": {
        "Authorization": "Bearer YOUR_TOKEN"
      }
    }
  }
}

Codex (CLI, app, IDE extension)

Codex uses ~/.codex/config.toml (or project-local .codex/config.toml). The Codex CLI and app share this config.

No auth:

[mcp_servers.todo]
url = "https://todo.yourdomain.com/mcp"

Bearer token auth:

[mcp_servers.todo]
url = "https://todo.yourdomain.com/mcp"
bearer_token_env_var = "TODO_MCP_TOKEN"

Then export the token before starting Codex:

export TODO_MCP_TOKEN="YOUR_TOKEN"

You can also use static headers with http_headers or environment-backed headers with env_http_headers.
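A sketch of what those two options look like — the key names come from Codex's config docs, while `X-Custom-Header`, `some-value`, and `TODO_MCP_TOKEN` are placeholder names:

```toml
[mcp_servers.todo]
url = "https://todo.yourdomain.com/mcp"

# Static headers sent verbatim on every request
http_headers = { "X-Custom-Header" = "some-value" }

# Headers whose values are read from environment variables at startup
env_http_headers = { "Authorization" = "TODO_MCP_TOKEN" }
```

As I read the docs, `bearer_token_env_var` prepends `Bearer ` for you, while `env_http_headers` sends the variable's value verbatim — so with the latter the environment variable would need to contain the full `Bearer YOUR_TOKEN` string.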

ChatGPT

As of March 26, 2026, OpenAI's ChatGPT developer mode supports remote MCP servers over streamable HTTP, but the documented auth modes are OAuth, no authentication, and mixed auth. This server currently supports no auth or bearer token auth.

That means:

  • AUTH_TOKEN empty: easiest path for ChatGPT testing

  • AUTH_TOKEN set: works in Claude / Codex, but ChatGPT may not connect unless you add OAuth support

This is an inference from OpenAI's current docs: bearer-only remote MCP servers do not appear to be the smooth path in ChatGPT today.

If you want ChatGPT support right now, leave AUTH_TOKEN empty during testing or place the server behind another trusted access layer. If you want secure internet-facing ChatGPT access, the next step is adding OAuth.

Any other MCP client

Streamable HTTP transport, URL https://todo.yourdomain.com/mcp. Include header Authorization: Bearer YOUR_TOKEN if auth is enabled.
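If you are wiring up a client by hand, the session opens with a standard JSON-RPC `initialize` request POSTed to the endpoint. A minimal sketch of that first message — the `protocolVersion` string is the MCP revision the client targets, and the URL and token are the placeholders used throughout this README:

```python
import json

ENDPOINT = "https://todo.yourdomain.com/mcp"

# Headers the streamable HTTP transport expects; Authorization only if auth is on
HEADERS = {
    "Content-Type": "application/json",
    "Accept": "application/json, text/event-stream",
    "Authorization": "Bearer YOUR_TOKEN",
}

def initialize_request(client_name: str, client_version: str) -> dict:
    """Build the JSON-RPC 2.0 'initialize' request that opens an MCP session."""
    return {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "initialize",
        "params": {
            "protocolVersion": "2025-03-26",
            "capabilities": {},
            "clientInfo": {"name": client_name, "version": client_version},
        },
    }

print(json.dumps(initialize_request("my-client", "0.1"), indent=2))
```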


Slash commands and skill triggers

Install the companion skill from the skills/ folder:

Command               Also triggers on
/setup-todo <url>     "setup todo", "configure todo", "set todo url"
/todo <text>          "add todo", "remind me to", "note to self", "don't let me forget"
/next-todo            "what's next", "what should I work on", "most urgent", "what's overdue"
/list-todo [filter]   "show my todos", "what's on my plate", "todo stats", "show ASF todos"

cd skills && chmod +x install.sh && ./install.sh
/setup-todo https://todo.yourdomain.com

For Codex, see:

  • codex-config.toml.example

  • skills/AGENTS.md.example


MCP tools

Tool            What it does
todo_list       List todos with short #1, #2, #3 references for follow-up actions
todo_add        Add a todo with optional tag, priority, and due date
todo_update     Edit text, toggle done, change priority / tag / due using a short reference or internal ID
todo_complete   Mark a todo done using a short reference or internal ID
todo_delete     Permanently delete a todo using a short reference or internal ID
todo_stats      Summary counts by status, tag, priority, and overdue

Example flow:

Found 3 todos — 3 open, 0 done, 0 overdue
Use the `#` number to refer to an item in follow-up commands.

#1 ○ Buy groceries
  priority:high  due:2026-03-27

#2 ○ Review pages and letter of HubX

#3 ○ Read MCP spec (streamable HTTP section)
  priority:med  tag:dev

That lets the assistant say things like "complete #1" or "delete #2" without surfacing the long internal storage IDs in the normal presentation.
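A hypothetical sketch of that resolution step, assuming todos are returned in display order and each carries an internal `id` field (the ID strings below are made up for illustration):

```python
# Hypothetical snapshot of a todo_list result, in display order (#1, #2, #3)
todos = [
    {"id": "id-buy-groceries", "text": "Buy groceries"},
    {"id": "id-review-hubx", "text": "Review pages and letter of HubX"},
    {"id": "id-read-mcp-spec", "text": "Read MCP spec (streamable HTTP section)"},
]

def resolve_ref(ref: str, todos: list) -> str:
    """Turn '#2', '2', or a raw internal ID into the internal storage ID."""
    n = ref.lstrip("#")
    if n.isdigit() and 1 <= int(n) <= len(todos):
        return todos[int(n) - 1]["id"]
    return ref  # already an internal ID; pass it through

print(resolve_ref("#2", todos))  # → id-review-hubx
```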


Authentication

Set AUTH_TOKEN in .env to require a bearer token on all requests. Leave it empty to disable auth (fine for local, not recommended for remote).
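The check amounts to comparing the incoming Authorization header against the configured token. A minimal sketch of that logic — not the server's actual middleware code:

```python
import hmac
import os

# Token comes from .env; an empty string disables auth entirely
AUTH_TOKEN = os.environ.get("AUTH_TOKEN", "")

def is_authorized(authorization_header: str, auth_token: str) -> bool:
    """Allow the request if auth is disabled or the bearer token matches."""
    if not auth_token:
        return True  # empty AUTH_TOKEN means no auth required
    expected = f"Bearer {auth_token}"
    # constant-time comparison avoids leaking the token via timing
    return hmac.compare_digest(authorization_header, expected)
```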

Compatibility summary:

  • Claude / Claude Code: works with bearer auth

  • Codex: works with bearer auth

  • ChatGPT: easiest with no auth today; OAuth would be needed for the cleanest authenticated setup

# Generate a token
python3 -c "import secrets; print(secrets.token_urlsafe(32))"

The /health endpoint is always public (used by Docker healthchecks).


Architecture

  • Protocol: MCP over streamable HTTP

  • Runtime: Python 3.12 + FastMCP

  • Storage: JSON file in a Docker named volume (/data/todos.json)

  • Auth: Optional bearer token middleware

  • TLS: Optional nginx reverse proxy (--profile https)


Notes for OpenAI clients

I verified the current OpenAI docs before writing this section:

  • Codex supports streamable HTTP MCP servers, bearer tokens, http_headers, and env_http_headers

  • ChatGPT developer mode supports streamable HTTP MCP servers, but its docs currently describe OAuth / no auth / mixed auth rather than arbitrary bearer-header entry in the UI



Backup & restore

# Backup
docker cp $(docker compose ps -q todo-mcp):/data/todos.json ./backup.json

# Restore
docker cp ./backup.json $(docker compose ps -q todo-mcp):/data/todos.json
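To automate the backup, a cron entry along these lines works — a sketch only: the compose project directory and backup destination paths are assumptions to adjust for your host:

```shell
# crontab -e — daily backup at 03:00; note that % must be escaped in crontab
0 3 * * * cd /home/user/todo-mcp && docker cp $(docker compose ps -q todo-mcp):/data/todos.json /home/user/backups/todos-$(date +\%F).json
```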

License

MIT
