PRIMS – Python Runtime Interpreter MCP Server
PRIMS is a tiny open-source Model Context Protocol (MCP) server that lets LLM agents run arbitrary Python code in a secure, throw-away sandbox.
• One tool, one job. Exposes a single MCP tool – run_code – that executes user-supplied Python and streams back stdout / stderr.
• Isolated & reproducible. Each call spins up a fresh virtual-env, installs any requested pip packages, mounts optional read-only files, then nukes the workspace.
• Zero config. Works over MCP/stdio or drop it in Docker.
Quick-start
1. Local development environment
2. Launch the server
3. Docker
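Once the server is running, a client can connect over stdio. As a minimal sketch using the official `mcp` Python SDK — note that the launch command below (`python -m server`) is a placeholder; use the entry point from the quick-start steps above:

```python
async def connect_demo():
    """Spawn PRIMS over stdio and perform the MCP handshake.

    Requires the `mcp` Python SDK. The command/args below are
    placeholders, not the project's actual entry point.
    """
    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    params = StdioServerParameters(command="python", args=["-m", "server"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()  # MCP handshake
            print("connected")

# Run with: import asyncio; asyncio.run(connect_demo())
```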
Examples
List available tools
You can use the provided script to list all tools exposed by the server; tool names and descriptions may vary between versions.
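The provided script isn't reproduced here; a minimal equivalent, assuming the `mcp` Python SDK and a placeholder launch command, might look like:

```python
async def list_tools_demo():
    """Print the name and description of every tool PRIMS exposes."""
    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    # Placeholder launch command -- substitute your actual server invocation.
    params = StdioServerParameters(command="python", args=["-m", "server"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            for tool in tools.tools:
                print(tool.name, "-", tool.description)

# Run with: import asyncio; asyncio.run(list_tools_demo())
```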
Run code via the MCP server
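A sketch of calling run_code through the MCP Python SDK — the tool's argument names (here, `code`) are an assumption, not taken from the server's actual schema:

```python
async def run_code_demo():
    """Execute a Python snippet in the sandbox and print the result."""
    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    params = StdioServerParameters(command="python", args=["-m", "server"])  # placeholder
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool(
                "run_code",
                arguments={"code": "print(6 * 7)"},  # argument name is an assumption
            )
            print(result.content)  # stdout / stderr streamed back by the sandbox

# Run with: import asyncio; asyncio.run(run_code_demo())
```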
Mount a dataset once & reuse it
This mounts a CSV with mount_file and then reads it inside run_code without re-supplying the URL.
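A sketch of that flow, with hypothetical URL and argument names (the server's real schema may differ) — the mounted file lands under mounts/<path>, so run_code reads it from there:

```python
async def mount_and_run_demo():
    """Mount a remote CSV once, then read it inside run_code."""
    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    params = StdioServerParameters(command="python", args=["-m", "server"])  # placeholder
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Hypothetical URL and argument names, for illustration only.
            await session.call_tool(
                "mount_file",
                arguments={"url": "https://example.com/data.csv", "path": "data.csv"},
            )
            result = await session.call_tool(
                "run_code",
                arguments={"code": "print(open('mounts/data.csv').readline())"},
            )
            print(result.content)

# Run with: import asyncio; asyncio.run(mount_and_run_demo())
```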
Inspect your session workspace
This shows how to use the list_dir and preview_file tools to browse files your code created.
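A sketch of browsing the workspace — the file path and argument names below are assumptions for illustration:

```python
async def inspect_workspace_demo():
    """List workspace contents, then preview a text file the code wrote."""
    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    params = StdioServerParameters(command="python", args=["-m", "server"])  # placeholder
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            listing = await session.call_tool("list_dir", arguments={"path": "."})
            print(listing.content)
            # preview_file returns up to 8 KB of a text file; path is hypothetical.
            preview = await session.call_tool(
                "preview_file", arguments={"path": "output/summary.txt"}
            )
            print(preview.content)

# Run with: import asyncio; asyncio.run(inspect_workspace_demo())
```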
Persist an artifact to permanent storage
The persist_artifact tool uploads a file from your output/ directory to a presigned URL.
Example (Python):
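A minimal sketch, assuming the `mcp` Python SDK; the presigned URL and the tool's argument names are placeholders, not the server's documented schema:

```python
async def persist_artifact_demo():
    """Upload a file from output/ to a client-provided presigned URL."""
    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    params = StdioServerParameters(command="python", args=["-m", "server"])  # placeholder
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Both argument names and the URL below are hypothetical.
            result = await session.call_tool(
                "persist_artifact",
                arguments={
                    "path": "output/results.csv",
                    "presigned_url": "https://example-bucket.s3.amazonaws.com/results.csv",
                },
            )
            print(result.content)

# Run with: import asyncio; asyncio.run(persist_artifact_demo())
```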
Download an artifact
Small artifacts can be fetched directly:
Available tools
| Tool | Purpose |
|---|---|
| run_code | Execute Python in an isolated sandbox with optional pip deps. |
| list_dir | List files/directories inside your session workspace. |
| preview_file | Return up to 8 KB of a text file for quick inspection. |
| persist_artifact | Upload an output/ file to a client-provided presigned URL. |
| mount_file | Download a remote file once per session to mounts/<path>. |
See the examples/ directory for end-to-end demos.
Contributing
Contributions are welcome! Feel free to open issues, suggest features, or submit pull requests to help improve PRIMS.
If you find this project useful, please consider leaving a ⭐ to show your support.