Code Execution Server

This repository provides a basic implementation of a code execution server, designed primarily for X-Master (paper, code) and BrowseMaster (paper, code). The full implementation is used in SciMaster.

Due to the proprietary nature of the full code, this repository only includes an open-source framework and the basic components required for code execution. It also includes a simple network search tool implementation.

⚠️ Warning: This is a basic code execution server without virtualization or safety protections. For added security, consider running it within Docker or Apptainer containers as necessary.


🛠️ Setup

Environment

Clone this repository, navigate to the project directory, and install the required dependencies:

cd mcp_sandbox/
pip install -r requirements.txt

Tools

  • Set up the Serper API key in configs/web_agent.json
  • Set up the model API keys in configs/llm_call.json
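As a rough illustration, the two config files can be loaded with a small helper before the tools start. The key names below (serper_api_key, api_key, base_url) are assumptions for illustration only; check the repository's sample configs for the actual schema.

```python
import json
from pathlib import Path

# Hypothetical config shapes -- the real keys in configs/web_agent.json and
# configs/llm_call.json may differ from these placeholders.
SAMPLE_WEB_AGENT_CFG = {"serper_api_key": "<your-serper-key>"}
SAMPLE_LLM_CALL_CFG = {"api_key": "<your-model-api-key>", "base_url": "<model-api-base-url>"}

def load_config(path: str) -> dict:
    """Read a JSON config file, failing loudly if it is missing."""
    p = Path(path)
    if not p.exists():
        raise FileNotFoundError(f"missing config: {path}")
    return json.loads(p.read_text())
```

Failing fast on a missing config file is preferable to letting a tool crash later with an opaque error when it first needs a key.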

🚀 Deploy the Code Execution Server

Step 1: Start the API Server

We will first start the API server used by the tools. This API server proxies all search-related services, including:

  • Serper's Google Search Service
  • A series of Model APIs
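Conceptually, the proxy maps each local endpoint to the remote service it fronts. The sketch below is a hypothetical illustration of that routing idea; the paths and upstream URLs (other than Serper's documented search endpoint) are placeholders, not the actual routes in api_server.py.

```python
# Hypothetical routing table: local proxy path -> upstream service.
# "/llm" and its upstream are placeholders; only the Serper URL is real.
UPSTREAMS = {
    "/search": "https://google.serper.dev/search",  # Serper's Google Search service
    "/llm": "<model-api-base-url>",                 # placeholder for the model APIs
}

def route_upstream(path: str) -> str:
    """Return the upstream URL a proxied request path should be forwarded to."""
    try:
        return UPSTREAMS[path]
    except KeyError:
        raise ValueError(f"no upstream configured for {path}")
```

Centralizing the upstreams in one table keeps API keys on the proxy host, so the execution server and its tools never handle them directly.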

Navigate to the api_proxy directory and start the API server:

cd api_proxy
python api_server.py

Step 2: Deploy the Server

Deploy the server by running the following script in the MCP directory:

cd MCP
bash deploy_server.sh

📝 Usage

Sending a Request

To send a request to the server, use the following curl command:

curl -X POST "http://<your-server-url>/execute" \
  -H "Content-Type: application/json" \
  -d '{"code": "<your code here>"}'
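The same request can be sent from Python with only the standard library. This sketch mirrors the curl command above (POST to /execute with a JSON body containing a "code" field); the response format is whatever the server returns and is not specified here.

```python
import json
import urllib.request

def build_execute_request(server_url: str, code: str) -> urllib.request.Request:
    """Build the POST request the /execute endpoint expects, per the curl example."""
    return urllib.request.Request(
        url=f"{server_url}/execute",
        data=json.dumps({"code": code}).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def execute(server_url: str, code: str) -> str:
    """Send code to the execution server and return the raw response body."""
    req = build_execute_request(server_url, code)
    with urllib.request.urlopen(req, timeout=30) as resp:
        return resp.read().decode("utf-8")
```

With a server running locally, a call might look like `execute("http://127.0.0.1:30008", "print(1 + 1)")`.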

⚡ Benchmarking

For benchmarking, run the following command to test the server's performance (the arguments are the thread count, connection count, test duration in seconds, wrk Lua script, and target URL, matching the parameters echoed in the output below):

bash benchmarking/pressure.sh 100 100 10 benchmarking/script.lua http://127.0.0.1:30008

Example output:

Running 10s test @ http://127.0.0.1:30008/execute
  100 threads and 100 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency    50.21ms   47.15ms  296.96ms   53.20%
    Req/Sec    24.13     13.58    130.00     54.99%
  23185 requests in 10.10s, 4.27MB read
Requests/sec:   2295.61
Transfer/sec:    432.74KB
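The headline numbers are internally consistent, which is a quick sanity check worth doing on any benchmark run. The arithmetic below uses only figures from the output above; wrk divides by the exact elapsed time, so the recomputed throughput differs slightly from the reported 2295.61.

```python
# Figures taken from the wrk output above.
requests_total = 23185
duration_s = 10.10
avg_latency_s = 0.05021  # 50.21 ms

# Total requests / wall time should roughly reproduce wrk's Requests/sec.
throughput = requests_total / duration_s
print(f"throughput ~ {throughput:.1f} req/s")

# Little's law: average in-flight requests ~ throughput * average latency.
# This lands near the 100 open connections, as expected under saturation.
inflight = throughput * avg_latency_s
print(f"in-flight ~ {inflight:.0f} requests")
```

If the recomputed throughput or the in-flight estimate were far from the reported values, it would suggest a transcription error or a benchmark that never saturated its connections.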
