MCP Executor Server

A secure, Docker-based code execution server that provides HTTP API endpoints for running Python code in isolated containers. Perfect for integration with automation tools like n8n, web applications, or any HTTP client.

🚀 Features

  • 🔒 Secure Execution - Code runs in isolated Docker containers

  • ⚡ Fast Performance - Optimized for quick code execution

  • 📡 Real-time Streaming - Server-Sent Events (SSE) for live output

  • 🌐 HTTP API - RESTful endpoints for easy integration

  • 📊 Health Monitoring - Built-in health check endpoint

  • 🐍 Python Support - Execute Python code with full library access

  • 🔧 Docker Integration - Uses the my-llm-sandbox container for execution

📋 Prerequisites

  • Docker Desktop - Must be running and in Linux containers mode

  • Node.js - Version 18+ recommended

  • Docker Image - my-llm-sandbox image must be available locally

๐Ÿ› ๏ธ Installation & Setup

1. Clone and Navigate

cd src

2. Install Dependencies

npm install

3. Start the Server

Development Mode (Recommended):

npm run dev

Production Mode:

npm run build
npm start

The server will start on http://localhost:3000

📡 API Endpoints

1. Execute Code

POST /execute

Execute Python code in a secure Docker container.

Request Body:

{
  "code": "print('Hello, World!')",
  "language": "python",
  "timeout": 30,
  "libraries": []
}

Response:

{
  "content": [
    {
      "type": "text",
      "text": "Hello, World!\n"
    }
  ],
  "exitCode": 0
}
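As a sketch, the /execute endpoint can be called from Python using only the standard library. The helper names below are illustrative, not part of this project:

```python
import json
from urllib import request

def build_execute_payload(code, timeout=30, libraries=None):
    """Assemble the JSON body expected by POST /execute."""
    return {
        "code": code,
        "language": "python",
        "timeout": timeout,
        "libraries": libraries or [],
    }

def execute(code, base_url="http://localhost:3000"):
    """POST the code to /execute and return the parsed JSON response."""
    body = json.dumps(build_execute_payload(code)).encode("utf-8")
    req = request.Request(
        f"{base_url}/execute",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))

# Example (requires the server running locally):
#   result = execute("print('Hello, World!')")
#   print(result["content"][0]["text"])
```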

2. Real-time Streaming

GET /sse

Connect to Server-Sent Events for real-time output streaming.

Usage:

  • Open in browser: http://localhost:3000/sse

  • Or use curl: curl http://localhost:3000/sse

Output Format:

data: {"type":"stdout","data":"Hello, World!\n"}
data: {"type":"stderr","data":"Error message\n"}
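Each SSE message is a single `data:` line carrying a JSON object. A minimal parser for that format (a hypothetical helper, assuming the frame shape shown above):

```python
import json

def parse_sse_line(line):
    """Parse one 'data: {...}' line from /sse into a (type, data) tuple.

    Returns None for lines that are not data frames
    (comments, keep-alives, blank separators).
    """
    if not line.startswith("data:"):
        return None
    event = json.loads(line[len("data:"):].strip())
    return event["type"], event["data"]
```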

3. Health Check

GET /health

Check if the server is running properly.

Response:

{
  "status": "ok",
  "message": "MCP Executor server is healthy",
  "timestamp": "2025-07-10T18:12:11.711Z"
}

🧪 Testing

Using Postman

  1. Test Health Check:

    • Method: GET

    • URL: http://localhost:3000/health

  2. Test Code Execution:

    • Method: POST

    • URL: http://localhost:3000/execute

    • Headers: Content-Type: application/json

    • Body:

    { "code": "print('Hello from Postman!')", "language": "python" }
  3. Test SSE Streaming:

    • Method: GET

    • URL: http://localhost:3000/sse

    • Keep connection open while testing /execute

Using curl

# Health check
curl http://localhost:3000/health

# Execute code
curl -X POST http://localhost:3000/execute \
  -H "Content-Type: application/json" \
  -d '{"code": "print(\"Hello from curl!\")", "language": "python"}'

# SSE streaming
curl http://localhost:3000/sse

🔧 n8n Integration

Basic Setup

  1. Add HTTP Request Node

  2. Configure:

    • Method: POST

    • URL: http://localhost:3000/execute

    • Headers: Content-Type: application/json

    • Body (JSON):

    { "code": "print('Hello from n8n!')", "language": "python" }

Advanced Examples

Dynamic Code Execution:

{ "code": "{{ $json.code }}", "language": "python" }

Data Processing:

{ "code": "data = {{ $json.data }}; print('Processed:', len(data))", "language": "python" }

File Operations:

{ "code": "with open('output.txt', 'w') as f: f.write('{{ $json.content }}'); print('File written')", "language": "python" }
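Interpolating raw values like {{ $json.content }} directly into Python source can break the generated code when the value contains quotes or newlines. One defensive approach (a sketch, not part of n8n or this server) is to build the snippet with json.dumps, which emits a valid Python string literal:

```python
import json

def build_write_snippet(content):
    """Generate a one-line Python snippet that writes `content` to output.txt.

    json.dumps escapes quotes and newlines, so arbitrary content cannot
    break out of the generated string literal.
    """
    literal = json.dumps(content)  # e.g. '"line1\\nline2"'
    return f"with open('output.txt', 'w') as f: f.write({literal}); print('File written')"
```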

๐Ÿ“ Project Structure

src/
├── index.ts          # Main Express server
├── server.ts         # Docker execution logic
├── utils/
│   └── sse.ts        # Server-Sent Events handling
├── types/            # TypeScript type definitions
├── tools/            # Tool definitions
├── docs/             # Documentation
├── package.json      # Dependencies and scripts
├── tsconfig.json     # TypeScript configuration
├── Dockerfile        # Docker configuration
└── README.md         # This file

๐Ÿ” Troubleshooting

Common Issues

  1. Docker not running:

    • Ensure Docker Desktop is running

    • Check that it's in Linux containers mode

  2. Container not found:

    • Verify my-llm-sandbox image exists: docker images

    • Rebuild if needed: docker build -t my-llm-sandbox .

  3. Permission denied:

    • Ensure Docker has proper permissions

    • Check Docker socket access

  4. Slow responses:

    • First request may take 2-3 seconds (container startup)

    • Subsequent requests should be faster (1-2 seconds)

Debug Mode

The server logs execution details to its console output. Look for:

  • [STDOUT] - Docker container output

  • [STDERR] - Docker container errors

  • Executing code in Docker container... - Request processing

🚀 Deployment

Docker Deployment

  1. Build the image:

    docker build -t mcp-executor .
  2. Run the container:

    docker run -p 3000:3000 -v /var/run/docker.sock:/var/run/docker.sock mcp-executor
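The same run configuration can be captured in a docker-compose.yml (a sketch; the service name and environment values are illustrative):

```yaml
services:
  mcp-executor:
    build: .
    ports:
      - "3000:3000"
    volumes:
      # The server talks to the host Docker daemon to spawn sandbox containers
      - /var/run/docker.sock:/var/run/docker.sock
    environment:
      - PORT=3000
      - NODE_ENV=production
```

Start it with `docker compose up -d`.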

Environment Variables

  • PORT - Server port (default: 3000)

  • NODE_ENV - Environment (development/production)

๐Ÿ“ Examples

Simple Python Code

{ "code": "print('Hello, World!')", "language": "python" }

Mathematical Operations

{ "code": "import math; print(f'Pi: {math.pi}'); print(f'2^10: {2**10}')", "language": "python" }

Data Processing

{ "code": "data = [1, 2, 3, 4, 5]; print(f'Sum: {sum(data)}'); print(f'Average: {sum(data)/len(data)}')", "language": "python" }

File Operations

{ "code": "with open('test.txt', 'w') as f: f.write('Hello from Docker!'); print('File created')", "language": "python" }

๐Ÿค Contributing

  1. Fork the repository

  2. Create a feature branch

  3. Make your changes

  4. Test thoroughly

  5. Submit a pull request

📄 License

This project is licensed under the ISC License.

🆘 Support

For issues and questions:

  1. Check the troubleshooting section

  2. Review the console logs

  3. Test with simple code examples

  4. Verify Docker setup


Happy coding! 🎉
