
PRIMS – Python Runtime Interpreter MCP Server

PRIMS is a tiny open-source Model Context Protocol (MCP) server that lets LLM agents run arbitrary Python code in a secure, throw-away sandbox.

One job, done well. Its core tool – run_code – executes user-supplied Python and streams back stdout / stderr; a handful of helper tools (list_dir, preview_file, mount_file, persist_artifact) round out the workflow.

Isolated & reproducible. Each call spins up a fresh virtual-env, installs any requested pip packages, mounts optional read-only files, then nukes the workspace.

Zero config. Works over MCP/stdio or drop it in Docker.


Quick-start

1. Local development environment

```bash
chmod +x scripts/setup_env.sh   # once, to make the script executable
./scripts/setup_env.sh          # creates .venv & installs deps

# activate the venv in each new shell
source .venv/bin/activate
```

2. Launch the server

```bash
python -m server.main   # binds http://0.0.0.0:9000/mcp
```
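If you'd rather connect from your own script than use the bundled examples, a minimal client sketch might look like the following. It assumes the official `mcp` Python SDK (`pip install mcp`) and its streamable-HTTP client helper; neither is pinned by this repo, so treat the import paths as assumptions to verify against the SDK version you install.

```python
async def list_server_tools(url: str = "http://localhost:9000/mcp"):
    """Connect to the running PRIMS server and return its tool listing.

    Sketch only: assumes the official `mcp` Python SDK is installed.
    Imports live inside the function so this module loads without it.
    """
    from mcp import ClientSession
    from mcp.client.streamable_http import streamablehttp_client

    async with streamablehttp_client(url) as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()          # MCP handshake
            return await session.list_tools()   # tool names + schemas
```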

3. Docker

```bash
# Quick one-liner (build + run)
chmod +x scripts/docker_run.sh
./scripts/docker_run.sh   # prints the MCP URL when ready
```

Examples

List available tools

You can use the provided script to list all tools exposed by the server:

```bash
python examples/list_tools.py
```

Expected output (tool names and descriptions may vary):

```text
Available tools:
- run_code: Execute Python code in a secure sandbox with optional dependencies & file mounts.
- list_dir: List files/directories in your session workspace.
- preview_file: Preview up to 8 KB of a text file from your session workspace.
- persist_artifact: Upload an output/ file to a presigned URL for permanent storage.
- mount_file: Download a remote file once per session to `mounts/<path>`.
```

Run code via the MCP server

```bash
python examples/run_code.py
```
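Under the hood this boils down to a single tool call. A hedged sketch, where `client` stands for any connected MCP client exposing `call_tool`, and the `"code"` argument name is an assumption to verify against the schema that list_tools reports for run_code:

```python
async def run_in_sandbox(client):
    # 'client' is any connected MCP client exposing call_tool; the
    # "code" argument name is an assumption -- check run_code's schema.
    result = await client.call_tool("run_code", {
        "code": "import sys; print(sys.version)",
    })
    return result  # stdout/stderr streamed back from the sandboxed run
```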

Mount a dataset once & reuse it

```bash
python examples/mount_and_run.py
```

This mounts a CSV with mount_file and then reads it inside run_code without re-supplying the URL.
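Sketched as client calls, the flow might look like this (the `"url"` and `"path"` argument names for mount_file are illustrative assumptions, not confirmed by the server's schema):

```python
async def mount_and_count_rows(client):
    # Download the CSV once for this session; per the docs it lands
    # under mounts/<path>. Argument names here are assumptions.
    await client.call_tool("mount_file", {
        "url": "https://example.com/data.csv",
        "path": "data.csv",
    })
    # Later calls read the mounted copy without re-supplying the URL.
    return await client.call_tool("run_code", {
        "code": "print(sum(1 for _ in open('mounts/data.csv')))",
    })
```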

Inspect your session workspace

```bash
python examples/inspect_workspace.py
```

This shows how to use the list_dir and preview_file tools to browse files your code created.
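A hedged sketch of the two calls (again, `client` is any connected MCP client, and the `"path"` argument names are assumptions to check against each tool's schema):

```python
async def browse_workspace(client):
    # List what the sandboxed code wrote into the session workspace.
    listing = await client.call_tool("list_dir", {"path": "."})
    # Preview up to 8 KB of a text file of interest (filename is
    # purely illustrative).
    head = await client.call_tool("preview_file", {"path": "output/results.txt"})
    return listing, head
```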

Persist an artifact to permanent storage

The persist_artifact tool uploads a file from your output/ directory to a presigned URL.

Example (Python):

```python
await client.call_tool("persist_artifact", {
    "relative_path": "plots/plot.png",
    "presigned_url": "https://bucket.s3.amazonaws.com/...signature...",
})
```

Download an artifact

Small artifacts can be fetched directly:

```bash
curl -H "mcp-session-id: <your-session-id>" \
     http://localhost:9000/artifacts/plots/plot.png -o plot.png
```
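The same request in plain Python, using only the standard library (the endpoint shape and header name are taken from the curl example above):

```python
import urllib.request

def fetch_artifact(session_id: str, relative_path: str, out_path: str,
                   base_url: str = "http://localhost:9000") -> None:
    """Download one artifact, sending the same mcp-session-id header
    the curl example uses."""
    req = urllib.request.Request(
        f"{base_url}/artifacts/{relative_path}",
        headers={"mcp-session-id": session_id},
    )
    with urllib.request.urlopen(req) as resp, open(out_path, "wb") as out:
        out.write(resp.read())
```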

Available tools

| Tool | Purpose |
| --- | --- |
| run_code | Execute Python in an isolated sandbox with optional pip deps. |
| list_dir | List files/directories inside your session workspace. |
| preview_file | Return up to 8 KB of a text file for quick inspection. |
| persist_artifact | Upload an output/ file to a client-provided presigned URL. |
| mount_file | Download a remote file once per session to mounts/<path>. |

See the examples/ directory for end-to-end demos.

Contributing

Contributions are welcome! Feel free to open issues, suggest features, or submit pull requests to help improve PRIMS.

If you find this project useful, please consider leaving a ⭐ to show your support.

