
πŸš‚ Letta MCP Server Railway Edition


Cloud-optimized HTTP transport edition of Letta MCP Server - Deploy to Railway in 30 seconds.

Universal MCP server connecting any AI client to Letta.ai's powerful stateful agents via streamable HTTP for seamless cloud deployment.


πŸš€ Quick Deploy to Railway

Deploy on Railway

Prerequisites

  • Letta API key from api.letta.com (free tier available)

  • Railway account (free tier includes 500 hours/month)

One-Click Deployment

  1. Click the deploy button above

  2. Connect your GitHub account to Railway

  3. Add environment variable: LETTA_API_KEY=your_letta_api_key_here

  4. Deploy - your MCP server will be live in under 2 minutes!

Your MCP Server URL

https://your-app-name.up.railway.app/mcp


⚑ Integration with AI Clients

Claude Desktop (ADE Integration)

Add to your Claude Desktop configuration file:

  • macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`

  • Windows: `%APPDATA%\Claude\claude_desktop_config.json`

{
  "mcpServers": {
    "letta-railway": {
      "url": "https://your-app.up.railway.app/mcp",
      "transport": "streamable_http",
      "timeout": 300,
      "headers": {
        "User-Agent": "Claude-Desktop-MCP/1.0"
      }
    }
  }
}

MCP Inspector Testing

Test your deployment with the MCP Inspector:

npx @modelcontextprotocol/inspector https://your-app.up.railway.app/mcp

GitHub Copilot & VS Code

{
  "mcp.servers": {
    "letta-railway": {
      "transport": "streamable_http",
      "url": "https://your-app.up.railway.app/mcp"
    }
  }
}

Other MCP Clients

  • Cursor: Add server to MCP configuration

  • Replit: Use MCP-compatible endpoint configuration

  • Sourcegraph Cody: Configure via OpenCtx bridge

  • Any MCP Client: Use streamable HTTP transport


πŸ”§ Configuration

Environment Variables

| Variable | Required | Default | Description |
|----------|----------|---------|-------------|
| `LETTA_API_KEY` | ✅ Yes | - | Your Letta API key from api.letta.com |
| `LETTA_BASE_URL` | No | `https://api.letta.com` | Letta API endpoint (for self-hosted) |
| `PORT` | No | `8000` | Railway auto-assigns this |
| `LETTA_TIMEOUT` | No | `60` | Request timeout in seconds |
| `LETTA_MAX_RETRIES` | No | `3` | Max retry attempts for failed requests |
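
A server reading these variables might load them as follows (an illustrative sketch, not the project's actual startup code; names and defaults are taken from the table above):

```python
import os

def load_config() -> dict:
    """Load server settings from the environment.

    Variable names and defaults mirror the table above; the function
    itself is illustrative, not the project's actual startup code.
    """
    api_key = os.environ.get("LETTA_API_KEY")
    if not api_key:
        raise RuntimeError("LETTA_API_KEY is required - set it in the Railway dashboard")
    return {
        "api_key": api_key,
        "base_url": os.environ.get("LETTA_BASE_URL", "https://api.letta.com"),
        "port": int(os.environ.get("PORT", "8000")),
        "timeout": int(os.environ.get("LETTA_TIMEOUT", "60")),
        "max_retries": int(os.environ.get("LETTA_MAX_RETRIES", "3")),
    }
```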

Letta Cloud Setup

  1. Sign up: Create account at letta.com

  2. Get API key: Visit api.letta.com β†’ Settings β†’ API Keys

  3. Create agent: Use the web interface to create your first agent

  4. Test connection: Use letta_health_check tool to verify


πŸ› οΈ Available Tools (20+ Letta Functions)

πŸ€– Agent Management

  • letta_list_agents - List all agents with pagination and filtering

  • letta_create_agent - Create new agents with memory blocks and tools

  • letta_get_agent - Get detailed agent information

  • letta_update_agent - Update agent configuration (name, description, model)

  • letta_delete_agent - Safely delete agents with confirmation

πŸ’¬ Conversations

  • letta_send_message - Send messages to agents with streaming support

  • letta_get_conversation_history - Retrieve chat history with pagination

  • letta_export_conversation - Export conversations (markdown, JSON, text)

🧠 Memory Management

  • letta_get_memory - View all memory blocks for an agent

  • letta_update_memory - Update memory blocks (human, persona, custom)

  • letta_create_memory_block - Create custom memory blocks

  • letta_search_memory - Search through agent conversation memory

πŸ”§ Tool Management

  • letta_list_tools - List all available tools with filtering

  • letta_get_agent_tools - View tools attached to specific agents

  • letta_attach_tool - Add tools to agents

  • letta_detach_tool - Remove tools from agents

πŸ“Š Monitoring & Health

  • letta_health_check - Verify API connection and service status

  • letta_get_usage_stats - Get usage statistics and analytics
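
All of these tools are invoked through MCP's standard `tools/call` method. A sketch of the request envelope follows; the argument names shown for `letta_send_message` are assumptions, so check each tool's actual schema via `tools/list`:

```python
import json

# JSON-RPC 2.0 envelope for invoking one tool over the MCP endpoint.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "letta_send_message",
        # Argument names are illustrative; the real schema comes from tools/list.
        "arguments": {"agent_id": "agent-123", "message": "Hello!"},
    },
}
print(json.dumps(request, indent=2))
```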


πŸ—οΈ Technical Architecture

Railway-Optimized Features

  • Streamable HTTP Transport: Optimized for cloud deployment, unlike the local-only stdio transport

  • Connection Pooling: Maintains persistent connections for performance

  • Auto-scaling: Railway automatically scales based on demand

  • Zero-downtime Deploys: Hot reloading without connection loss

  • Built-in Monitoring: Railway dashboard shows metrics and logs

Performance Optimizations

# Optimized for Railway cloud environment
- HTTP keep-alive connections
- Request/response compression  
- Intelligent retry logic with backoff
- Memory-efficient JSON streaming
- Automatic connection pool management
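
The "intelligent retry logic with backoff" above can be sketched as exponential backoff with jitter (illustrative only; the server's actual retry code may differ):

```python
import random
import time

def with_retries(fn, max_retries: int = 3, base_delay: float = 0.5):
    """Call fn, retrying on failure with exponential backoff plus jitter.

    Sketch of the retry pattern described above, not the server's exact code.
    """
    for attempt in range(max_retries + 1):
        try:
            return fn()
        except Exception:
            if attempt == max_retries:
                raise
            # Delays grow as base_delay * 2^attempt, with jitter to avoid
            # synchronized retries from many clients.
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, 0.1))
```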

Transport Comparison

| Feature | stdio (local) | streamable-http (Railway) |
|---------|---------------|---------------------------|
| Cloud deployment | ❌ No | ✅ Yes |
| Load balancing | ❌ No | ✅ Auto |
| Horizontal scaling | ❌ No | ✅ Yes |
| Health monitoring | ❌ Limited | ✅ Full |
| Zero-downtime deploys | ❌ No | ✅ Yes |


πŸ’» Local Development

Quick Local Setup

# Clone the repository
git clone https://github.com/SNYCFIRE-CORE/letta-mcp-server-railway.git
cd letta-mcp-server-railway

# Install dependencies  
pip install -e .

# Set environment variables
export LETTA_API_KEY=your_api_key_here

# Run locally
python -m letta_mcp_server_railway.server

Local Testing

# Test with MCP Inspector
npx @modelcontextprotocol/inspector http://localhost:8000/mcp

# Or use curl
curl -X POST http://localhost:8000/mcp \
  -H "Content-Type: application/json" \
  -d '{"jsonrpc": "2.0", "id": 1, "method": "tools/list"}'

Development Commands

# Run tests
pytest tests/

# Format code
black src/ tests/

# Type checking  
mypy src/

# Lint
ruff check src/

πŸ” Troubleshooting

Common Issues

1. "Connection refused" error

# Check if your Railway app is running
curl https://your-app.up.railway.app/health

# Verify environment variables in Railway dashboard
# Ensure LETTA_API_KEY is set correctly

2. "Invalid API key" error

# Test your Letta API key directly
curl -H "Authorization: Bearer your_api_key" https://api.letta.com/v1/agents

3. "Timeout" errors

Increase the timeout in your MCP client configuration, e.g. to 300 seconds (5 minutes). Note that JSON does not allow comments, so keep the file comment-free:

{
  "mcpServers": {
    "letta-railway": {
      "url": "https://your-app.up.railway.app/mcp",
      "transport": "streamable_http",
      "timeout": 300
    }
  }
}

4. Claude Desktop not connecting

  • Restart Claude Desktop after configuration changes

  • Check configuration file syntax with a JSON validator

  • Verify the URL is accessible from your browser

Getting Help

  1. Check Railway logs: View deployment logs in Railway dashboard

  2. Test health endpoint: Visit https://your-app.up.railway.app/health

  3. Verify MCP endpoint: Test with MCP Inspector

  4. Community support: Join Letta Discord

  5. Report issues: GitHub Issues


πŸš€ Production Deployment

Railway Deployment Best Practices

Environment Management

# Production environment variables
LETTA_API_KEY=your_production_api_key
LETTA_BASE_URL=https://api.letta.com
PORT=8000  # Railway manages this automatically
LETTA_TIMEOUT=300
LETTA_MAX_RETRIES=5

Health Monitoring

Railway provides built-in monitoring, but you can also:

  • Set up custom health checks

  • Monitor response times and error rates

  • Configure alerts for downtime

Scaling Configuration

# railway.toml - Production settings
[build]
builder = "DOCKERFILE"

[deploy] 
restartPolicyType = "ON_FAILURE"

[[deploy.environmentVariables]]
name = "PORT"
value = "8000"

πŸ“– Resources

Documentation

Community & Support

Examples & Tutorials


🀝 Contributing

We welcome contributions to make Letta MCP Server Railway Edition even better!

Quick Contribution Guide

  1. Fork the repository

  2. Create a feature branch: git checkout -b feature/amazing-feature

  3. Make your changes and add tests

  4. Test locally: pytest tests/

  5. Commit with clear messages: git commit -m "Add amazing feature"

  6. Push to your fork: git push origin feature/amazing-feature

  7. Submit a Pull Request

Development Setup

# Fork and clone your fork
git clone https://github.com/YOUR_USERNAME/letta-mcp-server-railway.git
cd letta-mcp-server-railway

# Install development dependencies
pip install -e ".[dev]"

# Install pre-commit hooks
pre-commit install

# Run tests
pytest tests/ -v

Areas We Need Help

  • πŸ“– Documentation improvements

  • πŸ§ͺ Additional test coverage

  • πŸ”§ Railway deployment optimizations

  • 🌐 Multi-language client examples

  • πŸ› Bug fixes and performance improvements


πŸ“œ License

MIT License - see LICENSE for details.


πŸ™ Acknowledgments

Built with ❀️ by the community for seamless AI agent deployment.

Special Thanks:

  • Letta.ai for revolutionary stateful agents

  • Railway for exceptional deployment platform

  • Anthropic for MCP specification leadership

  • FastMCP for HTTP transport framework

  • All contributors making this project possible

