MCP Todo Server

A distributed task management server built on the Model Context Protocol (MCP). It is written in TypeScript and uses Redis for shared state and OpenAI for task analysis and prioritization.

Features

  • MCP protocol implementation with 6 custom tools

  • Multi-node setup with load balancing via Caddy

  • Redis for distributed state (works across nodes; see the sketch after this list)

  • AI task prioritization using OpenAI

  • Docker Compose for easy deployment

  • Health monitoring and graceful shutdown
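
Because every node reads and writes the same Redis instance, any node behind the load balancer can serve any request. Below is a minimal sketch of that shared-state idea, assuming the node-redis client and a single Redis hash keyed by todo id (the key name and record shape are illustrative, not necessarily what src/redis-client.ts uses):

import { createClient } from "redis";

// All nodes connect to the same Redis instance via REDIS_URL.
const redis = createClient({ url: process.env.REDIS_URL });
await redis.connect();

// A write on one node...
await redis.hSet("todos", "42", JSON.stringify({ text: "ship it", status: "pending" }));

// ...is immediately visible to reads on any other node.
const raw = await redis.hGet("todos", "42");
console.log(JSON.parse(raw!));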

Available Tools

  • todo_add - Add new tasks with priority levels

  • todo_list - List todos with status filtering

  • todo_remove - Remove specific tasks

  • todo_clear - Clear all tasks

  • todo_mark_done - Mark tasks as completed

  • todo_analyze - Get AI-powered task prioritization
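
The six tools are implemented in src/mcp-tools.ts. As a hedged sketch of how a tool such as todo_add could be registered with the official TypeScript SDK (the zod schema and the addTodo helper are illustrative, not the repo's actual code):

import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { z } from "zod";

const server = new McpServer({ name: "mcp-todo-server", version: "1.0.0" });

// todo_add: validate input, persist via the Redis-backed store, report back.
server.tool(
  "todo_add",
  {
    text: z.string().describe("Task description"),
    priority: z.enum(["low", "medium", "high"]).default("medium"),
  },
  async ({ text, priority }) => {
    const todo = await addTodo(text, priority); // hypothetical Redis-backed helper
    return { content: [{ type: "text", text: `Added todo ${todo.id}` }] };
  }
);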

Architecture

           ┌─────────────┐
           │    Caddy    │  Load Balancer
           │  Port 3000  │
           └──────┬──────┘
                  │
     ┌────────────┼────────────┐
     ▼            ▼            ▼
┌──────────┐ ┌──────────┐ ┌──────────┐
│  Node 1  │ │  Node 2  │ │  Node N  │
│ Port 3001│ │ Port 3002│ │ Port 300N│
└────┬─────┘ └────┬─────┘ └────┬─────┘
     │            │            │
     └────────────┴────────────┘
                  │
           ┌──────▼─────┐
           │   Redis    │  Shared State
           │ Port 6379  │
           └────────────┘
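
The repository ships its own Caddy configuration; the load-balancing layer amounts to a Caddyfile along these lines (the service names assume Docker Compose DNS, and health_uri assumes the /health endpoint described below):

:3000 {
    reverse_proxy mcp-node-1:3001 mcp-node-2:3002 {
        lb_policy round_robin
        health_uri /health
    }
}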

Getting Started

Prerequisites

  • Docker and Docker Compose

  • Node.js 18+ (for local dev)

  • OpenAI API key

Installation

  1. Clone the repository

    git clone https://github.com/yourusername/mcp-todo-server.git
    cd mcp-todo-server
  2. Set up environment variables

    cp .env.example .env
    # Edit .env and add your OPENAI_API_KEY
  3. Start with Docker Compose

    docker compose up --build

    This starts the following services (a sketch of the compose file follows these steps):

    • Redis on port 6379

    • MCP Server Node 1 on port 3001

    • MCP Server Node 2 on port 3002

    • Caddy load balancer on port 3000

  4. Verify the deployment

    curl http://localhost:3000/health
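
The four services above are defined in the repository's docker-compose.yml. As a rough sketch of its shape (service and variable names here are illustrative; consult the actual file):

services:
  redis:
    image: redis:7-alpine
    ports: ["6379:6379"]
  mcp-node-1:
    build: .
    environment:
      REDIS_URL: redis://redis:6379
      OPENAI_API_KEY: ${OPENAI_API_KEY}
      NODE_ID: node-1
    ports: ["3001:3001"]
  mcp-node-2:
    build: .
    environment:
      REDIS_URL: redis://redis:6379
      OPENAI_API_KEY: ${OPENAI_API_KEY}
      NODE_ID: node-2
    ports: ["3002:3002"]
  caddy:
    image: caddy:2-alpine
    volumes:
      - ./Caddyfile:/etc/caddy/Caddyfile
    ports: ["3000:3000"]
    depends_on: [mcp-node-1, mcp-node-2]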

API Endpoints

Health Check

GET http://localhost:3000/health

MCP Endpoint

GET/POST http://localhost:3000/mcp

Uses Server-Sent Events (SSE) for communication.
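
A client can connect over SSE using the official TypeScript SDK. A minimal sketch (the client name and the todo_add arguments are illustrative):

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { SSEClientTransport } from "@modelcontextprotocol/sdk/client/sse.js";

const client = new Client({ name: "todo-demo-client", version: "1.0.0" });
await client.connect(new SSEClientTransport(new URL("http://localhost:3000/mcp")));

// Tool calls go through the load balancer like any other request.
const result = await client.callTool({
  name: "todo_add",
  arguments: { text: "write docs", priority: "high" },
});
console.log(result.content);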

Development

Running Locally (without Docker)

# Install dependencies
npm install

# Start Redis
docker run -d -p 6379:6379 redis:7-alpine

# Set environment variables
export REDIS_URL=redis://localhost:6379
export OPENAI_API_KEY=your_key_here
export NODE_ID=dev-node

# Run development server
npm run dev

Build for Production

npm run build
npm start

Testing

Run the test suite:

./test.sh

Or try the example client:

npm run test:client

Using with MCP Clients

VS Code / Cursor

Add to your MCP config:

{ "mcpServers": { "todo": { "url": "http://localhost:3000/mcp", "transport": "sse" } } }

Claude Desktop

Add to ~/Library/Application Support/Claude/claude_desktop_config.json:

{ "mcpServers": { "todo": { "command": "node", "args": ["/path/to/mcp-todo-server/dist/main.js"], "env": { "REDIS_URL": "redis://localhost:6379", "OPENAI_API_KEY": "your-key-here" } } } }

Project Structure

.
├── src/
│   ├── main.ts           # Express server & MCP transport
│   ├── mcp-tools.ts      # MCP tool implementations
│   ├── redis-client.ts   # State management
│   └── ai-service.ts     # OpenAI integration
├── docker-compose.yml    # Multi-node orchestration
├── Dockerfile            # Container definition
├── package.json          # Dependencies
├── tsconfig.json         # TypeScript config
├── test.sh               # Automated testing
└── example-client.js     # MCP client example

Security Notes

For production, add the following (a middleware sketch follows the list):

  • Authentication

  • HTTPS

  • Redis password

  • Input validation

  • Rate limiting
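
As one hedged example covering the last two points, Express middleware for rate limiting plus a shared-secret check (express-rate-limit is an assumed extra dependency, and MCP_API_TOKEN is a hypothetical variable, not something the server reads today):

import express from "express";
import rateLimit from "express-rate-limit";

const app = express();

// Cap each client IP at 100 requests per minute.
app.use(rateLimit({ windowMs: 60_000, max: 100 }));

// Reject requests that don't carry the expected bearer token.
app.use((req, res, next) => {
  const token = req.headers.authorization?.replace("Bearer ", "");
  if (token !== process.env.MCP_API_TOKEN) {
    return res.status(401).json({ error: "unauthorized" });
  }
  next();
});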

License

MIT
