# MCP Executor Server

A secure, Docker-based code execution server that provides HTTP API endpoints for running Python code in isolated containers. Perfect for integration with automation tools like n8n, web applications, or any HTTP client.

## 🚀 Features

- **🔒 Secure Execution** - Code runs in isolated Docker containers
- **⚡ Fast Performance** - Optimized for quick code execution
- **📡 Real-time Streaming** - Server-Sent Events (SSE) for live output
- **🌐 HTTP API** - RESTful endpoints for easy integration
- **📊 Health Monitoring** - Built-in health check endpoint
- **🐍 Python Support** - Execute Python code with full library access
- **🔧 Docker Integration** - Uses the `my-llm-sandbox` container for execution

## 📋 Prerequisites

- **Docker Desktop** - Must be running and in Linux containers mode
- **Node.js** - Version 18+ recommended
- **Docker Image** - The `my-llm-sandbox` image must be available locally

## 🛠️ Installation & Setup

### 1. Clone and Navigate

```bash
cd src
```

### 2. Install Dependencies

```bash
npm install
```

### 3. Start the Server

**Development Mode (Recommended):**

```bash
npm run dev
```

**Production Mode:**

```bash
npm run build
npm start
```

The server starts on `http://localhost:3000`.

## 📡 API Endpoints

### 1. Execute Code

**POST** `/execute`

Executes Python code in a secure Docker container.

**Request Body:**

```json
{
  "code": "print('Hello, World!')",
  "language": "python",
  "timeout": 30,
  "libraries": []
}
```

**Response:**

```json
{
  "content": [
    {
      "type": "text",
      "text": "Hello, World!\n"
    }
  ],
  "exitCode": 0
}
```

### 2. Real-time Streaming

**GET** `/sse`

Connect to Server-Sent Events for real-time output streaming.

**Usage:**

- Open in a browser: `http://localhost:3000/sse`
- Or use curl: `curl http://localhost:3000/sse`

**Output Format:**

```
data: {"type":"stdout","data":"Hello, World!\n"}
data: {"type":"stderr","data":"Error message\n"}
```

### 3. Health Check

**GET** `/health`

Checks whether the server is running properly.
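Each SSE event above is a newline-delimited `data:` line carrying a JSON payload with a `type` (`stdout` or `stderr`) and the raw output text. A minimal client-side parser for this format might look like the following sketch (the `parse_sse_line` helper name is illustrative, not part of the server):

```python
import json

def parse_sse_line(line: str):
    """Decode one SSE line; return the JSON event for 'data:' lines, else None."""
    if not line.startswith("data:"):
        return None  # skip comments, keep-alive lines, and other SSE fields
    return json.loads(line[len("data:"):].strip())

event = parse_sse_line('data: {"type":"stdout","data":"Hello, World!\\n"}')
# event["type"] distinguishes stdout from stderr; event["data"] is the raw output
```

A real client would read the `/sse` response line by line and feed each line through a parser like this, dispatching on `type`.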
**Response:**

```json
{
  "status": "ok",
  "message": "MCP Executor server is healthy",
  "timestamp": "2025-07-10T18:12:11.711Z"
}
```

## 🧪 Testing

### Using Postman

1. **Test Health Check:**
   - Method: `GET`
   - URL: `http://localhost:3000/health`
2. **Test Code Execution:**
   - Method: `POST`
   - URL: `http://localhost:3000/execute`
   - Headers: `Content-Type: application/json`
   - Body:
     ```json
     {
       "code": "print('Hello from Postman!')",
       "language": "python"
     }
     ```
3. **Test SSE Streaming:**
   - Method: `GET`
   - URL: `http://localhost:3000/sse`
   - Keep the connection open while testing `/execute`

### Using curl

```bash
# Health check
curl http://localhost:3000/health

# Execute code
curl -X POST http://localhost:3000/execute \
  -H "Content-Type: application/json" \
  -d '{"code": "print(\"Hello from curl!\")", "language": "python"}'

# SSE streaming
curl http://localhost:3000/sse
```

## 🔧 n8n Integration

### Basic Setup

1. **Add an HTTP Request node**
2. **Configure:**
   - Method: `POST`
   - URL: `http://localhost:3000/execute`
   - Headers: `Content-Type: application/json`
   - Body (JSON):
     ```json
     {
       "code": "print('Hello from n8n!')",
       "language": "python"
     }
     ```

### Advanced Examples

**Dynamic Code Execution:**

```json
{
  "code": "{{ $json.code }}",
  "language": "python"
}
```

**Data Processing:**

```json
{
  "code": "import json; data = {{ $json.data }}; print('Processed:', len(data))",
  "language": "python"
}
```

**File Operations:**

```json
{
  "code": "with open('output.txt', 'w') as f: f.write('{{ $json.content }}'); print('File written')",
  "language": "python"
}
```

## 📁 Project Structure

```
src/
├── index.ts          # Main Express server
├── server.ts         # Docker execution logic
├── utils/
│   └── sse.ts        # Server-Sent Events handling
├── types/            # TypeScript type definitions
├── tools/            # Tool definitions
├── docs/             # Documentation
├── package.json      # Dependencies and scripts
├── tsconfig.json     # TypeScript configuration
├── Dockerfile        # Docker configuration
└── README.md         # This file
```

## 🔍 Troubleshooting

### Common Issues

1. **Docker not running:**
   - Ensure Docker Desktop is running
   - Check that it is in Linux containers mode
2. **Container not found:**
   - Verify that the `my-llm-sandbox` image exists: `docker images`
   - Rebuild if needed: `docker build -t my-llm-sandbox .`
3. **Permission denied:**
   - Ensure Docker has the necessary permissions
   - Check Docker socket access
4. **Slow responses:**
   - The first request may take 2-3 seconds (container startup)
   - Subsequent requests should be fast (1-2 seconds)

### Debug Mode

Enable detailed logging by checking the server console output for:

- `[STDOUT]` - Docker container output
- `[STDERR]` - Docker container errors
- `Executing code in Docker container...` - Request processing

## 🚀 Deployment

### Docker Deployment

1. **Build the image:**
   ```bash
   docker build -t mcp-executor .
   ```
2. **Run the container:**
   ```bash
   docker run -p 3000:3000 -v /var/run/docker.sock:/var/run/docker.sock mcp-executor
   ```

### Environment Variables

- `PORT` - Server port (default: 3000)
- `NODE_ENV` - Environment (development/production)

## 📝 Examples

### Simple Python Code

```json
{
  "code": "print('Hello, World!')",
  "language": "python"
}
```

### Mathematical Operations

```json
{
  "code": "import math; print(f'Pi: {math.pi}'); print(f'2^10: {2**10}')",
  "language": "python"
}
```

### Data Processing

```json
{
  "code": "data = [1, 2, 3, 4, 5]; print(f'Sum: {sum(data)}'); print(f'Average: {sum(data)/len(data)}')",
  "language": "python"
}
```

### File Operations

```json
{
  "code": "with open('test.txt', 'w') as f: f.write('Hello from Docker!'); print('File created')",
  "language": "python"
}
```

## 🤝 Contributing

1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Test thoroughly
5. Submit a pull request

## 📄 License

This project is licensed under the ISC License.

## 🆘 Support

For issues and questions:

1. Check the troubleshooting section
2. Review the console logs
3. Test with simple code examples
4. Verify the Docker setup

---

**Happy coding! 🎉**
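Programmatic clients can assemble the `/execute` request body with the documented defaults (`timeout: 30`, empty `libraries`) before POSTing it. A sketch of such a helper, assuming the request schema shown earlier (the `build_execute_request` function is illustrative, not part of the server):

```python
import json

def build_execute_request(code: str, language: str = "python",
                          timeout: int = 30, libraries=None) -> str:
    """Serialize an /execute request body matching the documented schema."""
    body = {
        "code": code,
        "language": language,
        "timeout": timeout,
        "libraries": libraries or [],
    }
    return json.dumps(body)

payload = build_execute_request("print('Hello, World!')")
# POST `payload` to http://localhost:3000/execute
# with the header Content-Type: application/json
```

Centralizing the defaults in one place keeps n8n expressions and ad-hoc scripts from drifting out of sync with the schema.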
