Code Sandbox MCP Server

by philschmid
MIT License

The Code Sandbox MCP Server is a lightweight, STDIO-based Model Context Protocol (MCP) server that allows AI assistants and LLM applications to safely execute code snippets in containerized environments. It uses the llm-sandbox package to execute the code snippets.


How It Works:

  1. Starts a container session (podman, docker, etc.) and ensures the session is open.
  2. Writes the code to a temporary file on the host.
  3. Copies this temporary file into the container at the configured workdir.
  4. Executes the language-specific command to run the code, e.g. `python3 -u code.py` for Python or `node code.js` for JavaScript.
  5. Captures the output and error streams from the container.
  6. Returns the output and error streams to the client.
  7. Stops and removes the container.
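The steps above can be sketched in Python using the Docker CLI. This is a hypothetical illustration of the flow, not the server's actual implementation (which uses llm-sandbox); the `/sandbox` workdir and the `sleep infinity` keep-alive are assumptions.

```python
import subprocess
import tempfile
from pathlib import Path

# Language-specific run commands, mirroring step 4 above.
RUN_COMMANDS = {
    "python": ["python3", "-u"],
    "javascript": ["node"],
}

def build_exec_command(container_id, language, filename, workdir="/sandbox"):
    """Step 4: the language-specific command executed inside the container."""
    return ["docker", "exec", container_id, *RUN_COMMANDS[language], f"{workdir}/{filename}"]

def run_in_sandbox(code, language="python", image="philschmi/code-sandbox-python:latest"):
    # Step 1: start a container session and keep it alive.
    container_id = subprocess.run(
        ["docker", "run", "-d", image, "sleep", "infinity"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    try:
        # Step 2: write the code to a temporary file on the host.
        suffix = ".py" if language == "python" else ".js"
        with tempfile.NamedTemporaryFile("w", suffix=suffix, delete=False) as f:
            f.write(code)
            host_path = f.name
        # Step 3: copy the file into the container's workdir.
        filename = Path(host_path).name
        subprocess.run(
            ["docker", "cp", host_path, f"{container_id}:/sandbox/{filename}"],
            check=True,
        )
        # Steps 4-5: run the code and capture the output and error streams.
        result = subprocess.run(
            build_exec_command(container_id, language, filename),
            capture_output=True, text=True,
        )
        # Step 6: return both streams to the caller.
        return result.stdout, result.stderr
    finally:
        # Step 7: stop and remove the container.
        subprocess.run(["docker", "rm", "-f", container_id], capture_output=True)
```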

Available Tools:

  • run_python_code - Executes a snippet of Python code in a secure, isolated sandbox.
    • code (string, required): The Python code to execute.
  • run_js_code - Executes a snippet of JavaScript (Node.js) code in a secure, isolated sandbox.
    • code (string, required): The JavaScript code to execute.
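Under the hood, an MCP client invokes these tools with a standard `tools/call` request. A minimal example (the request `id` and the code snippet are illustrative):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "run_python_code",
    "arguments": { "code": "print(1 + 1)" }
  }
}
```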

Installation

pip install git+https://github.com/philschmid/code-sandbox-mcp.git

Getting Started: Usage with an MCP Client


To use the Code Sandbox MCP server, you need to add it to your MCP client's configuration file (e.g., in your AI assistant's settings). The server is designed to be launched on-demand by the client.

Add the following to your mcpServers configuration:

```json
{
  "mcpServers": {
    "code-sandbox": {
      "command": "code-sandbox-mcp"
    }
  }
}
```

Provide Secrets and pass through environment variables

You can pass environment variables through to the sandbox by setting the --pass-through-env flag when starting the MCP server and providing the corresponding values in the env block of the server configuration:

```json
{
  "mcpServers": {
    "code-sandbox": {
      "command": "code-sandbox-mcp",
      "args": ["--pass-through-env", "API_KEY,SECRET_TOKEN"],
      "env": {
        "API_KEY": "1234567890",
        "SECRET_TOKEN": "1234567890"
      }
    }
  }
}
```

Provide a custom container image

You can provide a custom container image by setting the CONTAINER_IMAGE and CONTAINER_LANGUAGE environment variables when starting the MCP server. Both variables are required: CONTAINER_LANGUAGE determines which commands are run inside the container, and CONTAINER_IMAGE determines which image to use.

Note: When providing a custom container image both tools will use the same container image.

```json
{
  "mcpServers": {
    "code-sandbox": {
      "command": "code-sandbox-mcp",
      "env": {
        "CONTAINER_IMAGE": "your-own-image",
        "CONTAINER_LANGUAGE": "python"
      }
    }
  }
}
```

Set CONTAINER_LANGUAGE to either "python" or "javascript".

Use with Gemini SDK

The code-sandbox-mcp server can be used with the Gemini SDK by passing the tools parameter to the generate_content method.

```python
from fastmcp import Client
from google import genai
import asyncio

mcp_client = Client(
    {
        "local_server": {
            "transport": "stdio",
            "command": "code-sandbox-mcp",
        }
    }
)
gemini_client = genai.Client()

async def main():
    async with mcp_client:
        response = await gemini_client.aio.models.generate_content(
            model="gemini-2.5-flash",
            contents="Use Python to ping the google.com website and return the response time.",
            config=genai.types.GenerateContentConfig(
                temperature=0,
                tools=[mcp_client.session],  # Pass the FastMCP client session
            ),
        )
        print(response.text)

if __name__ == "__main__":
    asyncio.run(main())
```

Use with Gemini CLI

The code-sandbox-mcp server can be used with the Gemini CLI. You can configure MCP servers globally in the ~/.gemini/settings.json file, or per project by creating or opening the .gemini/settings.json file in your project's root directory. In either file, add the mcpServers configuration block.

See settings.json for an example, and read more in the Gemini CLI documentation.

```json
{
  "mcpServers": {
    "code-sandbox": {
      "command": "code-sandbox-mcp"
    }
  }
}
```

Customize/Build new Container Images

The repository comes with two container images, which are published on Docker Hub:

  • philschmi/code-sandbox-python:latest
  • philschmi/code-sandbox-js:latest

You can rebuild them from the included Dockerfiles:

```shell
docker build -t philschmi/code-sandbox-python:latest -f containers/Dockerfile.python .
docker build -t philschmi/code-sandbox-js:latest -f containers/Dockerfile.nodejs .
```

The build commands tag the images under the account in the image name. To change which images the server uses, pass the --python-image or --js-image flag when starting the MCP server, or update the const.py file.

To push the images to Docker Hub, retag them under your own account and push:

```shell
docker tag philschmi/code-sandbox-python:latest <your-account>/code-sandbox-python:latest
docker push <your-account>/code-sandbox-python:latest
```

To customize the images or install additional dependencies, add them to the Dockerfile and rebuild the image.
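For example, a derived image that preinstalls extra Python packages might look like this (the package list is illustrative; pick the dependencies your snippets need):

```dockerfile
FROM philschmi/code-sandbox-python:latest

# Illustrative extras, not part of the published image.
RUN pip install --no-cache-dir numpy pandas
```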

Testing

With MCP Inspector

Start the server with the streamable-http transport and test it using the MCP Inspector. Alternatively, start the Inspector and run the server over stdio:

npx @modelcontextprotocol/inspector

To run the test suite for code-sandbox-mcp and its components, clone the repository and run:

```shell
# You may need to install development dependencies first
pip install -e ".[dev]"

# Run the tests
pytest tests/
```

License

Code Sandbox MCP Server is open source software licensed under the MIT License.


