Colab MCP
Local-first MCP server for controlling Google Colab as a development, shell, file, and training runtime.
Chinese documentation: README.zh-CN.md
What This Provides
This fork extends the upstream static Colab MCP baseline with:
- notebook editing, batch execution, status polling, and output reading
- controlled Edge browser startup and Colab frontend MCP repair
- explicit Python runtime connection with `connect_runtime`
- MCP-native runtime accelerator switching, including T4 GPU selection
- Colab Terminal-backed shell commands and background jobs
- local-path file upload/download tools
- runtime file list/stat/delete/directory tools
- GPU checks, snapshots, and monitor jobs
- environment variable helpers and runtime restart/shutdown tools
Project Layout
- `src/colab_mcp/`: Python MCP server, Colab session proxy, runtime/browser tools.
- `tests/`: unit tests for proxy, tool behavior, and websocket server behavior.
- `docs/`: usage, tool inventory, examples, troubleshooting, and API conventions.
- `TODO.md`: implementation checklist and follow-up notes.
Setup
```
uv sync --dev
uv run pytest
```

Run the MCP server:

```
uv run colab-mcp
```

MCP client configuration:
```json
{
  "mcpServers": {
    "colab-mcp": {
      "command": "uv",
      "args": ["run", "--directory", "<path-to-colab_mcp>", "colab-mcp"],
      "startup_timeout_sec": 120,
      "env": {
        "COLAB_MCP_EDGE_CDP_PORT": "9333",
        "COLAB_MCP_EDGE_URL_CONTAINS": "colab.research.google.com"
      }
    }
  }
}
```

Replace `<path-to-colab_mcp>` with the absolute path to your local checkout.
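If you manage several clients, the entry above can be generated rather than hand-edited. A minimal sketch; the function name and example path below are illustrative, not part of this project:

```python
import json

def make_colab_mcp_config(checkout_dir: str, cdp_port: int = 9333) -> dict:
    """Build the mcpServers entry shown above for a given checkout path."""
    return {
        "mcpServers": {
            "colab-mcp": {
                "command": "uv",
                "args": ["run", "--directory", checkout_dir, "colab-mcp"],
                "startup_timeout_sec": 120,
                "env": {
                    "COLAB_MCP_EDGE_CDP_PORT": str(cdp_port),
                    "COLAB_MCP_EDGE_URL_CONTAINS": "colab.research.google.com",
                },
            }
        }
    }

# Example: serialize a config for a hypothetical checkout location.
config_text = json.dumps(make_colab_mcp_config("/tmp/colab_mcp"), indent=2)
```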
Runtime Readiness
Do not assume the Python runtime is ready just because the browser MCP is connected. Shell, file, GPU, and training tools require a real Colab runtime and terminal socket.
Result Contract
Tools use structured status fields:
- `ok=true`, `status="ok"`: operation completed without known warnings.
- `ok=false`, `status="warning"`: prerequisite missing, partial/no-op result, or recoverable state. Read `warnings` and `recommendedNextActions`.
- `ok=false`, `status="error"`: execution failed, timed out, returned a non-zero exit code, or hit an explicit tool error. Inspect `error`, `stdout`, `stderr`, and `recommendedNextActions` before retrying.
Do not continue a workflow after warning or error without handling the returned instructions.
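A client-side dispatcher over these fields might look like the sketch below. The `handle_result` helper is hypothetical; the field names follow the contract above:

```python
def handle_result(result: dict) -> str:
    """Decide the next step from the structured status fields above."""
    if result.get("ok") and result.get("status") == "ok":
        return "continue"
    if result.get("status") == "warning":
        # Recoverable: surface warnings and follow the suggested actions.
        for hint in result.get("recommendedNextActions", []):
            print("next:", hint)
        return "handle-warning"
    # status == "error": inspect error/stdout/stderr before retrying.
    print("error:", result.get("error"), result.get("stderr"))
    return "handle-error"

handle_result({"ok": True, "status": "ok"})  # -> "continue"
```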
Browser And Runtime Flow
There are two separate connection layers:

1. Browser/frontend MCP connection.
2. Colab Python runtime connection.
For normal runtime work:

```
open_colab_browser_connection()
connect_runtime(waitSeconds=180)
```

For GPU work:

```
open_colab_browser_connection()
set_runtime_accelerator(accelerator="T4 GPU", apply=true)
open_colab_browser_connection()
connect_runtime(waitSeconds=180)
check_gpu()
```

Changing the accelerator can restart or disconnect the runtime. Always reconnect the browser MCP and then call `connect_runtime` before using shell, file, GPU, or training tools.
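The GPU flow is a fixed call order, so a client can replay it as data. A sketch, assuming a generic `call(name, args)` MCP client function, which is not part of this project:

```python
# Tool names and arguments come from the GPU flow above.
GPU_SETUP_SEQUENCE = [
    ("open_colab_browser_connection", {}),
    ("set_runtime_accelerator", {"accelerator": "T4 GPU", "apply": True}),
    # Accelerator changes can restart the runtime, so reconnect both layers.
    ("open_colab_browser_connection", {}),
    ("connect_runtime", {"waitSeconds": 180}),
    ("check_gpu", {}),
]

def replay(call, sequence=GPU_SETUP_SEQUENCE):
    """Replay a tool sequence through any callable MCP client."""
    return [call(name, args) for name, args in sequence]
```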
Common Training Flow
All core training steps can run through MCP tools:
```
open_colab_browser_connection()
set_runtime_accelerator(accelerator="T4 GPU", apply=true)
open_colab_browser_connection()
connect_runtime(waitSeconds=180)
check_gpu()
upload_local_file(localPath="<path-to-train.py>", path="/content/train.py", overwrite=true)
start_background_command(
  name="train",
  command="python /content/train.py",
  logPath="/content/train.log",
  cwd="/content"
)
watch_background_command(name="train", lines=100)
download_file_to_local(path="/content/model.pt", localPath="<path-to-model.pt>", overwrite=true)
shutdown_runtime(reason="training finished")
```

Use `start_background_command` for training and other long jobs. Do not use `run_shell_command` for training; it is for short, bounded commands.
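In practice a client polls `watch_background_command` until the job stops. A sketch with a stubbed client; the `running` field and the `call(name, args)` interface are assumptions, not the server's documented schema:

```python
import time

def wait_for_job(call, name: str, poll_seconds: float = 30.0, max_polls: int = 120):
    """Poll a background command until it stops running (assumed schema)."""
    for _ in range(max_polls):
        status = call("watch_background_command", {"name": name, "lines": 100})
        if not status.get("running", True):  # "running" is an assumed field
            return status
        time.sleep(poll_seconds)
    raise TimeoutError(f"background command {name!r} still running")
```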
Release Or Close The Runtime Instance
Always release the Colab runtime when work is finished, cancelled, or no longer needs CPU/GPU resources:
```
shutdown_runtime(reason="training finished")
```

Recommended completion flow:

```
stat_file(path="/content/model.pt")
download_file_to_local(path="/content/model.pt", localPath="<path-to-model.pt>", overwrite=true)
shutdown_runtime(reason="training finished")
```

Recommended cancellation flow:

```
check_background_command(name="train")
stop_background_command(name="train")
shutdown_runtime(reason="training cancelled")
```

Important notes:

- Download weights, logs, and other artifacts before shutdown. Files under `/content` can be lost after the runtime is released.
- `shutdown_runtime` releases/disconnects the active Colab CPU/GPU runtime instance. It does not uninstall this MCP server and does not close the browser tab.
- After shutdown, future runtime work must reconnect with `open_colab_browser_connection()` and `connect_runtime(waitSeconds=180)`.
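One way to make the release step unconditional is a try/finally guard. A sketch; `call(name, args)` stands in for a real MCP client and is not part of this project:

```python
def run_with_runtime(call, work):
    """Ensure shutdown_runtime runs even if the workload fails."""
    call("open_colab_browser_connection", {})
    call("connect_runtime", {"waitSeconds": 180})
    try:
        return work(call)
    finally:
        # Download artifacts inside `work`; /content may be lost after this.
        call("shutdown_runtime", {"reason": "work finished or aborted"})
```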
Local File Transfer
Prefer local-path tools when moving files between this machine and Colab:
```
upload_local_file(localPath="<local-file>", path="/content/input.txt", overwrite=true)
download_file_to_local(path="/content/output.txt", localPath="<local-output>", overwrite=true)
```

Base64 tools remain available for clients that already have content in memory:

```
upload_file(path="/content/input.txt", contentBase64="...")
download_file(path="/content/output.txt")
upload_file_chunk(...)
download_file_chunk(...)
```
```
complete_upload(...)
```

Tool Groups
Connection:
get_connection_info
Browser, runtime, and accelerator:
`open_colab_browser_connection`, `connect_runtime`, `check_runtime`, `restart_runtime`, `shutdown_runtime`, `set_runtime_accelerator`
Shell and long jobs:
`run_shell_command`, `start_background_command`, `check_background_command`, `watch_background_command`, `list_background_commands`, `stop_background_command`, `tail_file`
Runtime files:
`upload_local_file`, `download_file_to_local`, `upload_file`, `download_file`, `upload_file_chunk`, `complete_upload`, `download_file_chunk`, `stat_file`, `list_files`, `make_directory`, `delete_file`
GPU and resource monitoring:
`check_gpu`, `resource_usage_snapshot`, `sample_gpu_usage`, `start_gpu_monitor`, `read_gpu_monitor`, `stop_gpu_monitor`
Notebook cells:
`get_cells`, `get_cell`, `add_code_cell`, `add_text_cell`, `update_cell`, `patch_cell`, `delete_cell`, `move_cell`, `find_cells`, `replace_cells`, `run_code_cell`, `run_code_cells`, `run_cell_range`, `run_all_cells`, `cancel_queued_cells`, `get_cell_status`, `wait_for_cells`, `read_cell_outputs`, `watch_cell_outputs`
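A typical edit-run-read cycle with these cell tools, sketched with a stubbed client; the `cellId`/`cellIds` argument and result names are assumptions, not the server's documented schema:

```python
def run_snippet(call, source: str):
    """Add a code cell, run it, wait for it, and read its outputs."""
    cell = call("add_code_cell", {"source": source})
    cell_id = cell.get("cellId")  # assumed result field
    call("run_code_cell", {"cellId": cell_id})
    call("wait_for_cells", {"cellIds": [cell_id]})
    return call("read_cell_outputs", {"cellIds": [cell_id]})
```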
Environment:
`set_env_vars`, `get_env_vars`, `unset_env_vars`, `load_env_file`, `rerun_env_setup_cells`
Notebook import/export:
`import_notebook`, `export_notebook`, `upload_notebook` (alias for `import_notebook`), `download_notebook` (alias for `export_notebook`)
Browser Control
`open_colab_browser_connection` can start a dedicated Microsoft Edge instance, open Colab, and connect the Colab frontend to the local MCP server.
Defaults:
- CDP port: `9333`, override with `COLAB_MCP_EDGE_CDP_PORT`
- Edge profile: `~/.codex/edge-colab-mcp-profile`, override with `COLAB_MCP_EDGE_PROFILE`
- Edge executable: auto-detected, override with `COLAB_MCP_EDGE_PATH`
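The override-then-default lookup can be mimicked in a few lines. A sketch of the documented precedence, not the server's actual code:

```python
import os

def edge_settings(env=os.environ) -> dict:
    """Resolve Edge browser settings: env override first, then default."""
    return {
        "cdp_port": int(env.get("COLAB_MCP_EDGE_CDP_PORT", "9333")),
        "profile": env.get(
            "COLAB_MCP_EDGE_PROFILE",
            os.path.expanduser("~/.codex/edge-colab-mcp-profile"),
        ),
        "edge_path": env.get("COLAB_MCP_EDGE_PATH"),  # None -> auto-detect
    }
```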
`colabctl` remains available for diagnostics and manual repair:

```
uv run colabctl status
uv run colabctl connect
uv run colabctl smoke-mcp
uv run colabctl set-accelerator --accelerator GPU
```

Normal automation should prefer MCP tools over `colabctl`.
Verification
Run local tests:
```
uv run pytest
uv run python -m compileall src
```

The current flow has been validated with a full MCP-only ResCNN smoke test:

- selected T4 through MCP
- connected the Python runtime through MCP
- confirmed Tesla T4 with `check_gpu`
- uploaded a local training script
- trained with `start_background_command`
- watched logs with `watch_background_command`
- downloaded weights with `download_file_to_local`
- shut down the runtime with `shutdown_runtime`
Documentation
- `docs/USAGE.md`: usage notes
- `docs/TOOLS.md`: tool inventory
- `docs/EXAMPLES.md`: common examples
- `docs/TROUBLESHOOTING.md`: recovery guidance
- `docs/API_CONVENTIONS.md`: schema and result conventions
Thanks to the linux.do laoyou for their support. "Laoyou" is the community's own friendly term for respected peers.
Privacy Notes
The README uses placeholders such as <path-to-colab_mcp> and does not include local usernames, personal checkout paths, tokens, or runtime artifacts. Generated training artifacts should stay under artifacts/, which is ignored by git.