Provides a secure workspace for executing Python scripts and managing files, enabling AI agents to perform data processing, code execution, and automation in a sandboxed environment.
Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type @ followed by the MCP server name and your instructions, e.g., "@AI Workspace MCP Server create a Python script to calculate Fibonacci numbers and run it"
That's it! The server will respond to your query, and you can continue using it as needed.
AI Workspace MCP Server
A Model Context Protocol (MCP) server that provides AI with a secure workspace for file management and Python script execution. Designed to run on Vercel as a serverless function.
Features
File Management Tools
create_file - Create new files with content
read_file - Read file contents
update_file - Update existing files
delete_file - Delete files
list_files - List files and directories
create_directory - Create new directories
Code Execution
execute_python - Execute Python scripts with arguments (30-second timeout)
Setup on Vercel
1. Install Vercel CLI (Optional)
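If you prefer working from the command line, the Vercel CLI can be installed globally via npm:

```shell
npm install -g vercel
```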
2. Project Structure
Your project should look like this:
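A minimal layout for a Vercel Python function might be (filenames are illustrative):

```
your-project/
├── api/
│   └── index.py       # MCP server entry point
├── requirements.txt   # Python dependencies
└── vercel.json        # Vercel routing/runtime config
```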
3. Deploy to Vercel
Option A: Deploy via Vercel Dashboard
Go to vercel.com
Click "Add New" → "Project"
Import your Git repository (or upload files)
Vercel will auto-detect Python and deploy
Option B: Deploy via CLI
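From the project root, a typical CLI deployment looks like:

```shell
vercel          # create a preview deployment
vercel --prod   # promote to production
```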
4. Get Your Deployment URL
After deployment, Vercel will give you a URL like:
https://your-project-name.vercel.app
API Endpoints
Once deployed, your server will have these endpoints:
GET /
Returns server information and status
GET /health
Health check endpoint
GET /tools
List all available tools
POST /execute
Execute a tool
Using with AI Clients
Claude Desktop Configuration
Add this to your Claude Desktop config:
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%/Claude/claude_desktop_config.json
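Claude Desktop launches MCP servers over stdio, so a remote HTTP deployment is typically bridged through a helper such as the `mcp-remote` npm package. A sketch of the config (server name and URL are placeholders):

```json
{
  "mcpServers": {
    "ai-workspace": {
      "command": "npx",
      "args": ["mcp-remote", "https://your-project-name.vercel.app"]
    }
  }
}
```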
Using the API Directly
You can integrate this with any AI that supports HTTP tool calling:
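As a sketch, assuming the `/execute` endpoint accepts a JSON body with `tool` and `arguments` fields, a minimal stdlib-only caller might look like:

```python
import json
import urllib.request

def call_tool(base_url: str, tool: str, arguments: dict) -> dict:
    """POST a tool invocation to the server's /execute endpoint.

    The {"tool": ..., "arguments": ...} payload shape is assumed here;
    check your deployed server's /tools endpoint for the actual schema.
    """
    payload = json.dumps({"tool": tool, "arguments": arguments}).encode()
    req = urllib.request.Request(
        base_url.rstrip("/") + "/execute",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.loads(resp.read())

# Example (requires a deployed server):
# result = call_tool("https://your-project-name.vercel.app",
#                    "create_file",
#                    {"path": "hello.py", "content": "print('hello')"})
```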
Security Features
Sandboxed Workspace: All file operations are restricted to /tmp/workspace
Path Validation: Prevents directory traversal attacks
Execution Timeout: Python scripts are limited to 30 seconds
CORS Enabled: Allows cross-origin requests
Serverless Isolation: Each request runs in an isolated environment
Tool Examples
Create and Execute a Python Script
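Assuming the `{"tool", "arguments"}` request shape described above (field names are illustrative), creating a script and then running it takes two calls:

```json
{
  "tool": "create_file",
  "arguments": {
    "path": "fib.py",
    "content": "a, b = 0, 1\nfor _ in range(10):\n    a, b = b, a + b\nprint(a)"
  }
}
```

```json
{
  "tool": "execute_python",
  "arguments": {"path": "fib.py", "args": []}
}
```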
List Files
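A hypothetical request body (argument names are illustrative):

```json
{
  "tool": "list_files",
  "arguments": {"path": "."}
}
```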
Create Directory Structure
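Nested paths can be created in one call, for example (argument names are illustrative):

```json
{
  "tool": "create_directory",
  "arguments": {"path": "data/raw"}
}
```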
Response Format
All tool executions return JSON:
Success Response:
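An illustrative success shape (field names are assumptions, not the server's exact schema):

```json
{
  "success": true,
  "result": "File created: hello.py"
}
```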
Error Response:
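An illustrative error shape (field names are assumptions):

```json
{
  "success": false,
  "error": "File not found: missing.txt"
}
```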
Execute Python Response:
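Script execution plausibly returns captured output streams and an exit code, along the lines of (field names are assumptions):

```json
{
  "success": true,
  "stdout": "55\n",
  "stderr": "",
  "exit_code": 0
}
```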
Important Notes
Vercel Limitations
Temporary Storage: Files in /tmp are ephemeral and cleared between invocations
10-Second Timeout: Vercel functions time out after 10 seconds on the free tier (25 seconds on Pro)
Cold Starts: First request may be slower due to cold start
No Persistent State: Each function invocation starts fresh
For Persistent Storage
If you need persistent file storage, consider:
Using Vercel KV, Postgres, or Blob storage
Integrating with AWS S3, Google Cloud Storage, etc.
Using a database to store file contents
Environment Variables (Optional)
You can set environment variables in Vercel Dashboard:
WORKSPACE_PATH - Custom workspace path (default: /tmp/workspace)
EXECUTION_TIMEOUT - Python execution timeout in seconds (default: 30)
Local Development
Test locally before deploying:
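Assuming the entry point lives at `api/index.py` (as in the illustrative layout above), a local run might be:

```shell
pip install -r requirements.txt
python api/index.py
```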
Then test with:
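For example, hitting the health and execute endpoints with curl (the port depends on how you run the server locally):

```shell
curl http://localhost:3000/health
curl -X POST http://localhost:3000/execute \
  -H "Content-Type: application/json" \
  -d '{"tool": "list_files", "arguments": {"path": "."}}'
```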
Troubleshooting
"Module not found" errors
Ensure requirements.txt is in the project root and contains all dependencies.
Timeout errors
Reduce Python script complexity
Upgrade to Vercel Pro for longer timeouts
Use async operations where possible
File not persisting
Remember: /tmp storage is ephemeral on Vercel. Files won't persist between invocations.
Advanced Usage
Custom MCP Client
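A minimal client can wrap the HTTP endpoints directly. The sketch below uses only the standard library and assumes the `/tools` and `/execute` endpoints described earlier (the request/response shapes are assumptions):

```python
import json
import urllib.request

class WorkspaceClient:
    """Minimal client for the workspace server's HTTP API (shapes assumed)."""

    def __init__(self, base_url: str):
        self.base_url = base_url.rstrip("/")

    def _get(self, path: str) -> dict:
        with urllib.request.urlopen(self.base_url + path, timeout=30) as resp:
            return json.loads(resp.read())

    def list_tools(self) -> dict:
        # GET /tools — enumerate available tools
        return self._get("/tools")

    def execute(self, tool: str, arguments: dict) -> dict:
        # POST /execute — invoke a tool with a JSON payload
        payload = json.dumps({"tool": tool, "arguments": arguments}).encode()
        req = urllib.request.Request(
            self.base_url + "/execute",
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req, timeout=30) as resp:
            return json.loads(resp.read())

client = WorkspaceClient("https://your-project-name.vercel.app")
# tools = client.list_tools()
# result = client.execute("read_file", {"path": "hello.py"})
```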
Contributing
Feel free to extend this server with additional tools:
Add the tool definition to get_tools()
Implement the handler in execute_tool()
Update the documentation
License
MIT License - modify and use as needed.
Support
Vercel Docs: vercel.com/docs
MCP Protocol: modelcontextprotocol.io
Issues: Open an issue or modify the code as needed