MCP Executor Server

A secure, Docker-based code execution server that provides HTTP API endpoints for running Python code in isolated containers. Perfect for integration with automation tools like n8n, web applications, or any HTTP client.

πŸš€ Features

  • πŸ”’ Secure Execution - Code runs in isolated Docker containers

  • ⚑ Fast Performance - Optimized for quick code execution

  • πŸ“‘ Real-time Streaming - Server-Sent Events (SSE) for live output

  • 🌐 HTTP API - RESTful endpoints for easy integration

  • πŸ“Š Health Monitoring - Built-in health check endpoint

  • 🐍 Python Support - Execute Python code with full library access

  • πŸ”§ Docker Integration - Uses my-llm-sandbox container for execution

πŸ“‹ Prerequisites

  • Docker Desktop - Must be running and in Linux containers mode

  • Node.js - Version 18+ recommended

  • Docker Image - my-llm-sandbox image must be available locally

πŸ› οΈ Installation & Setup

1. Clone and Navigate

cd src

2. Install Dependencies

npm install

3. Start the Server

Development Mode (Recommended):

npm run dev

Production Mode:

npm run build
npm start

The server will start on http://localhost:3000

πŸ“‘ API Endpoints

1. Execute Code

POST /execute

Execute Python code in a secure Docker container.

Request Body:

{
  "code": "print('Hello, World!')",
  "language": "python",
  "timeout": 30,
  "libraries": []
}

Response:

{
  "content": [
    {
      "type": "text",
      "text": "Hello, World!\n"
    }
  ],
  "exitCode": 0
}
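A minimal Python client sketch for this endpoint. The helper names (`build_execute_payload`, `extract_text`) are illustrative, not part of the server; the actual HTTP call is shown commented out so the snippet also runs without a live server:

```python
import json

def build_execute_payload(code, language="python", timeout=30, libraries=None):
    """Build the JSON body expected by POST /execute."""
    return {
        "code": code,
        "language": language,
        "timeout": timeout,
        "libraries": libraries or [],
    }

def extract_text(response):
    """Concatenate the text parts of an /execute response."""
    return "".join(
        part["text"]
        for part in response.get("content", [])
        if part.get("type") == "text"
    )

payload = build_execute_payload("print('Hello, World!')")
print(json.dumps(payload))

# Against a running server, POST the payload like this:
# import urllib.request
# req = urllib.request.Request(
#     "http://localhost:3000/execute",
#     data=json.dumps(payload).encode(),
#     headers={"Content-Type": "application/json"},
# )
# result = json.loads(urllib.request.urlopen(req).read())
# print(extract_text(result))

# Parsing a sample response of the shape documented above:
sample = {"content": [{"type": "text", "text": "Hello, World!\n"}], "exitCode": 0}
print(extract_text(sample))
```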

2. Real-time Streaming

GET /sse

Connect to Server-Sent Events for real-time output streaming.

Usage:

  • Open in browser: http://localhost:3000/sse

  • Or use curl: curl http://localhost:3000/sse

Output Format:

data: {"type":"stdout","data":"Hello, World!\n"}
data: {"type":"stderr","data":"Error message\n"}
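Each SSE frame is a plain `data:` line carrying a JSON event. A small Python sketch of a line parser (the function name is illustrative, not part of the server):

```python
import json

def parse_sse_line(line):
    """Parse one 'data: {...}' SSE line into (stream, text), or None for other lines."""
    line = line.strip()
    if not line.startswith("data:"):
        return None  # blank keep-alives, comments, etc.
    event = json.loads(line[len("data:"):].strip())
    return event["type"], event["data"]

print(parse_sse_line('data: {"type":"stdout","data":"Hello, World!\\n"}'))
print(parse_sse_line('data: {"type":"stderr","data":"Error message\\n"}'))
```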

3. Health Check

GET /health

Check if the server is running properly.

Response:

{
  "status": "ok",
  "message": "MCP Executor server is healthy",
  "timestamp": "2025-07-10T18:12:11.711Z"
}

πŸ§ͺ Testing

Using Postman

  1. Test Health Check:

    • Method: GET

    • URL: http://localhost:3000/health

  2. Test Code Execution:

    • Method: POST

    • URL: http://localhost:3000/execute

    • Headers: Content-Type: application/json

    • Body:

    { "code": "print('Hello from Postman!')", "language": "python" }
  3. Test SSE Streaming:

    • Method: GET

    • URL: http://localhost:3000/sse

    • Keep connection open while testing /execute

Using curl

# Health check
curl http://localhost:3000/health

# Execute code
curl -X POST http://localhost:3000/execute \
  -H "Content-Type: application/json" \
  -d '{"code": "print(\"Hello from curl!\")", "language": "python"}'

# SSE streaming
curl http://localhost:3000/sse

πŸ”§ n8n Integration

Basic Setup

  1. Add HTTP Request Node

  2. Configure:

    • Method: POST

    • URL: http://localhost:3000/execute

    • Headers: Content-Type: application/json

    • Body (JSON):

    { "code": "print('Hello from n8n!')", "language": "python" }

Advanced Examples

Dynamic Code Execution:

{ "code": "{{ $json.code }}", "language": "python" }

Data Processing:

{ "code": "import json; data = {{ $json.data }}; print('Processed:', len(data))", "language": "python" }

File Operations:

{ "code": "with open('output.txt', 'w') as f: f.write('{{ $json.content }}'); print('File written')", "language": "python" }
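One caveat with the templated examples above: splicing a raw value such as `{{ $json.content }}` directly into Python source breaks as soon as the value contains a quote or newline. Serializing the value first (in n8n, wrapping the expression in `JSON.stringify(...)`) yields a valid Python string literal, since JSON string escapes are a subset of Python's. The idea, sketched in Python with an illustrative value:

```python
import json

user_content = 'He said "hi"\nand left'

# Naive splicing produces broken Python source (unbalanced quotes, raw newline):
broken = "f.write('" + user_content + "')"

# Serializing the value first produces a valid Python string literal:
safe = "f.write(" + json.dumps(user_content) + ")"
print(safe)
```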

πŸ“ Project Structure

src/
β”œβ”€β”€ index.ts          # Main Express server
β”œβ”€β”€ server.ts         # Docker execution logic
β”œβ”€β”€ utils/
β”‚   └── sse.ts        # Server-Sent Events handling
β”œβ”€β”€ types/            # TypeScript type definitions
β”œβ”€β”€ tools/            # Tool definitions
β”œβ”€β”€ docs/             # Documentation
β”œβ”€β”€ package.json      # Dependencies and scripts
β”œβ”€β”€ tsconfig.json     # TypeScript configuration
β”œβ”€β”€ Dockerfile        # Docker configuration
└── README.md         # This file

πŸ” Troubleshooting

Common Issues

  1. Docker not running:

    • Ensure Docker Desktop is running

    • Check that it's in Linux containers mode

  2. Container not found:

    • Verify my-llm-sandbox image exists: docker images

    • Rebuild if needed: docker build -t my-llm-sandbox .

  3. Permission denied:

    • Ensure Docker has proper permissions

    • Check Docker socket access

  4. Slow responses:

    • First request may take 2-3 seconds (container startup)

    • Subsequent requests should be fast (1-2 seconds)

Debug Mode

The server logs request processing to its console; look for:

  • [STDOUT] - Docker container output

  • [STDERR] - Docker container errors

  • Executing code in Docker container... - Request processing

πŸš€ Deployment

Docker Deployment

  1. Build the image:

    docker build -t mcp-executor .
  2. Run the container:

    docker run -p 3000:3000 -v /var/run/docker.sock:/var/run/docker.sock mcp-executor
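Equivalently, the two steps above can be captured in a `docker-compose.yml` sketch (the service name and restart policy here are illustrative choices, not part of the project):

```yaml
services:
  mcp-executor:
    build: .
    ports:
      - "3000:3000"
    volumes:
      # The server talks to the host Docker daemon to launch sandbox containers
      - /var/run/docker.sock:/var/run/docker.sock
    environment:
      - PORT=3000
      - NODE_ENV=production
    restart: unless-stopped
```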

Environment Variables

  • PORT - Server port (default: 3000)

  • NODE_ENV - Environment (development/production)

πŸ“ Examples

Simple Python Code

{ "code": "print('Hello, World!')", "language": "python" }

Mathematical Operations

{ "code": "import math; print(f'Pi: {math.pi}'); print(f'2^10: {2**10}')", "language": "python" }

Data Processing

{ "code": "data = [1, 2, 3, 4, 5]; print(f'Sum: {sum(data)}'); print(f'Average: {sum(data)/len(data)}')", "language": "python" }

File Operations

{ "code": "with open('test.txt', 'w') as f: f.write('Hello from Docker!'); print('File created')", "language": "python" }

🀝 Contributing

  1. Fork the repository

  2. Create a feature branch

  3. Make your changes

  4. Test thoroughly

  5. Submit a pull request

πŸ“„ License

This project is licensed under the ISC License.

πŸ†˜ Support

For issues and questions:

  1. Check the troubleshooting section

  2. Review the console logs

  3. Test with simple code examples

  4. Verify Docker setup


Happy coding! πŸŽ‰
