# ChatGPT Setup Guide
**Important:** ChatGPT requires a publicly accessible server (not just `localhost`). You'll need to use a tunneling service like ngrok or localtunnel to expose your local MCP server.
## Prerequisites
- Node.js (v18 or higher)
- npm
- ChatGPT Plus subscription with developer mode enabled
- A tunneling service (ngrok or localtunnel)
## Quick Start
### Step 1: Install Dependencies
```bash
npm install
```
### Step 2: Start the MCP Server
**For localhost development:**
```bash
node server.js
```
**For Netlify deployment:**
```bash
node server.js --browser-url https://your-app.netlify.app
```
Or using the short form:
```bash
node server.js -u https://your-app.netlify.app
```
⚠️ **Important:** When hosting the frontend on Netlify, you **must** use the `-u` (or `--browser-url`) parameter with your Netlify URL. This ensures that connection links generated by the MCP server point to your Netlify deployment instead of defaulting to localhost.
Server starts on:
- `http://localhost:3000/mcp` (MCP HTTP endpoint)
- `ws://localhost:3001` (WebSocket server)
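The `--browser-url` value determines the links the server later hands to ChatGPT. The sketch below illustrates the idea; the variable and function names are hypothetical and the actual logic lives in `server.js`:
```javascript
// Hypothetical sketch of how a connection link could be built from --browser-url.
// (server.js also accepts the -u short form; this simplified parser does not.)
const args = process.argv;
const browserUrl = args.includes('--browser-url')
  ? args[args.indexOf('--browser-url') + 1]
  : 'http://localhost:5173'; // assumed default: the local Vite dev server

function buildConnectionLink(sessionId) {
  const url = new URL(browserUrl);
  url.searchParams.set('sessionId', sessionId);
  return url.toString(); // e.g. https://your-app.netlify.app/?sessionId=abc-123
}
```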
### Step 3: Create a Tunnel
You need to expose your MCP server (port 3000) so ChatGPT can access it. Choose one of the options below.
## Configuration Options
### ✅ Configuration 1: MCP Server Tunnelled, WebSocket Local, App Local
**Best for:** Local development and testing
**Setup:**
- ✅ MCP server running locally on port 3000
- ✅ MCP server exposed via tunnel (ngrok/localtunnel) for ChatGPT access
- ✅ WebSocket server running locally on port 3001 (no tunnel needed)
- ✅ App running locally (`npm run dev`)
**Steps:**
1. **Start MCP server:**
```bash
node server.js
```
2. **Create tunnel for MCP HTTP endpoint (port 3000):**
**Option A: Using ngrok**
```bash
ngrok http 3000
```
Copy the HTTPS URL (e.g., `https://abc123.ngrok-free.app`)
**Option B: Using localtunnel**
```bash
lt --port 3000 --subdomain hello3dllm-mcpserver
```
Creates URL: `https://hello3dllm-mcpserver.loca.lt`
3. **Start local app:**
```bash
npm run dev
```
Open `http://localhost:5173` in your browser
4. **Configure ChatGPT:**
- Open ChatGPT → Settings → Personalization → Model Context Protocol
- Add server:
- **Name**: `3d-model-server`
- **URL**:
- If using ngrok: `https://your-ngrok-url.ngrok-free.app/mcp`
- If using localtunnel: `https://hello3dllm-mcpserver.loca.lt/mcp`
- ⚠️ **Include `/mcp` at the end!**
- **Transport**: HTTP or Streamable HTTP
5. **Connect to the 3D app:**
- Ask ChatGPT: "How do I connect to the 3D app?" or "Get browser URL"
- ChatGPT will provide a connection URL with your session ID
- Copy and paste the URL into your browser
- The browser will connect to your ChatGPT session
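To illustrate what happens in step 5: the page reads the session ID from the URL and opens a WebSocket to the local server on port 3001. A rough client-side sketch (message shape and names are illustrative, not the actual app code):
```javascript
// Illustrative browser-side sketch for the local setup (Configuration 1).
const sessionId = new URLSearchParams(window.location.search).get('sessionId');

// The WebSocket server runs locally on port 3001, so no tunnel is needed here.
const ws = new WebSocket('ws://localhost:3001');

ws.addEventListener('open', () => {
  // Associate this browser tab with the ChatGPT session (illustrative message shape).
  ws.send(JSON.stringify({ type: 'register', sessionId }));
});

ws.addEventListener('message', (event) => {
  console.log('Command from MCP server:', JSON.parse(event.data));
});
```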
**Pros:**
- ✅ Simple setup (only one tunnel needed)
- ✅ Fast local WebSocket connection
- ✅ Works with ChatGPT
**Cons:**
- ❌ App must run locally (not accessible to others)
---
### ✅ Configuration 2: MCP Server Tunnelled, WebSocket Tunnelled, App on Netlify
**Best for:** Sharing your app with others or production use
**Setup:**
- ✅ MCP server running locally on port 3000
- ✅ MCP server exposed via tunnel for ChatGPT access
- ✅ WebSocket server running locally on port 3001
- ✅ WebSocket exposed via tunnel for Netlify app
- ✅ App deployed to Netlify
**Steps:**
1. **Start MCP server with Netlify URL:**
```bash
node server.js --browser-url https://your-app.netlify.app
```
Or using the short form:
```bash
node server.js -u https://your-app.netlify.app
```
⚠️ **Important:** The `-u` parameter (or `--browser-url`) is required so that the MCP server generates correct connection URLs pointing to your Netlify deployment.
2. **Create tunnel for MCP HTTP endpoint (port 3000):**
**Option A: Using ngrok**
```bash
ngrok http 3000
```
Copy the HTTPS URL (e.g., `https://abc123.ngrok-free.app`)
**Option B: Using localtunnel**
```bash
lt --port 3000 --subdomain hello3dllm-mcpserver
```
Creates URL: `https://hello3dllm-mcpserver.loca.lt`
3. **Create tunnel for WebSocket (port 3001):**
**Option A: Using ngrok**
```bash
ngrok http 3001
```
Copy the HTTPS URL (e.g., `https://xyz789.ngrok-free.app`)
**Option B: Using localtunnel**
```bash
lt --port 3001 --subdomain hello3dllm-websocket
```
Creates URL: `https://hello3dllm-websocket.loca.lt`
⚠️ **Important:** Use `wss://` protocol for WebSocket connections (e.g., `wss://hello3dllm-websocket.loca.lt`)
4. **Configure Netlify:**
- Go to your Netlify site settings
- Add environment variable: `VITE_WS_URL`
- Set value to your tunneled WebSocket URL:
- If using ngrok: `wss://your-websocket-ngrok-url.ngrok-free.app`
- If using localtunnel: `wss://hello3dllm-websocket.loca.lt`
- ⚠️ **Use `wss://` (not `ws://`) and the HTTPS tunnel URL**
- Redeploy your site
5. **Configure ChatGPT:**
- Open ChatGPT → Settings → Personalization → Model Context Protocol
- Add server:
- **Name**: `3d-model-server`
- **URL**:
- If using ngrok: `https://your-mcp-ngrok-url.ngrok-free.app/mcp`
- If using localtunnel: `https://hello3dllm-mcpserver.loca.lt/mcp`
- ⚠️ **Include `/mcp` at the end!**
- **Transport**: HTTP or Streamable HTTP
6. **Connect to the 3D app:**
- Ask ChatGPT: "How do I connect to the 3D app?" or "Get browser URL"
- ChatGPT will provide a Netlify URL with your session ID (e.g., `https://your-app.netlify.app?sessionId=abc-123...`)
- Open that URL in your browser
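Putting steps 4 and 6 together: the deployed frontend opens its WebSocket using the build-time `VITE_WS_URL` variable and registers with the session ID from the Netlify URL. A short sketch (the localhost fallback and message shape are assumptions, not necessarily what this project does):
```javascript
// Sketch: how a Vite frontend typically consumes VITE_WS_URL (Configuration 2).
const wsUrl = import.meta.env.VITE_WS_URL || 'ws://localhost:3001'; // assumed fallback
const sessionId = new URLSearchParams(window.location.search).get('sessionId');

const ws = new WebSocket(wsUrl); // wss://hello3dllm-websocket.loca.lt when tunnelled
ws.addEventListener('open', () => {
  ws.send(JSON.stringify({ type: 'register', sessionId })); // illustrative message
});
```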
**Pros:**
- ✅ App accessible to anyone via Netlify
- ✅ Works with ChatGPT
- ✅ No backend hosting costs
**Cons:**
- ❌ Requires two tunnels (both must stay active)
- ❌ Local machine must be running 24/7
- ❌ Tunnel URLs may change (especially with ngrok free tier)
---
## Tunneling Services
### Option A: Using ngrok (Recommended for Testing)
1. **Install ngrok:**
- Download it from https://ngrok.com, or install it with Homebrew (`brew install ngrok`)
- Sign up for a free account (optional but recommended)
2. **Start ngrok:**
```bash
ngrok http 3000
```
For a custom domain (requires free ngrok account):
```bash
ngrok http 3000 --domain=your-name.ngrok-free.app
```
3. **Copy the HTTPS URL** from ngrok (e.g., `https://abc123.ngrok-free.app`)
**Note:** ngrok free tier URLs change each time you restart ngrok. For a more stable URL, consider:
- Using ngrok's paid plan with a custom domain
- Using localtunnel with a custom subdomain (see below)
### Option B: Using localtunnel (Alternative to ngrok)
1. **Install localtunnel:**
```bash
npm install -g localtunnel
```
2. **Start localtunnel:**
```bash
lt --port 3000 --subdomain hello3dllm-mcpserver
```
Creates URL: `https://hello3dllm-mcpserver.loca.lt`
**Benefits of localtunnel:**
- ✅ Custom subdomains, so your URL stays the same across restarts (as long as the subdomain remains available)
- ✅ No account required for basic usage
- ✅ Simple command-line interface
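If you prefer starting the tunnel from a script rather than the `lt` CLI, localtunnel also provides a Node API. A minimal sketch using this guide's example subdomain:
```javascript
// Start a localtunnel for the MCP server programmatically instead of via the CLI.
const localtunnel = require('localtunnel');

(async () => {
  const tunnel = await localtunnel({ port: 3000, subdomain: 'hello3dllm-mcpserver' });
  console.log('MCP server exposed at:', tunnel.url);

  tunnel.on('close', () => {
    console.log('Tunnel closed');
  });
})();
```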
---
## Using the MCP Tools
Once connected, ask ChatGPT to manipulate the model using natural language:
- **Change color**: "Change the model to red" or "Make it blue"
- **Change size**: "Make the model bigger" or "Set size to 2.5"
- **Scale**: "Stretch horizontally" or "Make it tall and thin"
- **Background**: "Change background to black"
- **Combined**: "Make a red model that's tall and thin"
ChatGPT will automatically call the appropriate MCP tools, and changes appear in real-time in your browser.
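Under the hood, each request resolves to an MCP tool call, and the server pushes the resulting command to the connected browser over the WebSocket. A simplified relay sketch using the `ws` package (tool and message names are illustrative; see `server.js` for the real implementation):
```javascript
// Simplified relay sketch: forwarding an MCP tool result to the right browser tab.
const { WebSocketServer } = require('ws');

const wss = new WebSocketServer({ port: 3001 });
const browsersBySession = new Map();

wss.on('connection', (ws) => {
  ws.on('message', (data) => {
    const msg = JSON.parse(data);
    if (msg.type === 'register') browsersBySession.set(msg.sessionId, ws);
  });
});

// Called when ChatGPT invokes e.g. a "change color" tool for a given session.
function sendToBrowser(sessionId, command) {
  const ws = browsersBySession.get(sessionId);
  if (ws) ws.send(JSON.stringify(command)); // e.g. { type: 'set_color', color: 'red' }
}
```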
---
## Troubleshooting
### 404 Not Found
- Make sure the URL includes `/mcp` at the end
- Verify tunnel is running and accessible
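A quick way to confirm the tunneled endpoint is reachable is to hit it from Node (Node 18+ ships a global `fetch`). Any status other than 404, such as 400 or 406, suggests the `/mcp` path is correct even if the request body isn't a valid MCP message:
```javascript
// Reachability check for the tunneled MCP endpoint; replace the URL with your own.
const url = 'https://hello3dllm-mcpserver.loca.lt/mcp';

fetch(url, { method: 'POST', headers: { 'Content-Type': 'application/json' }, body: '{}' })
  .then((res) => console.log('Status:', res.status)) // 404 means the path is wrong
  .catch((err) => console.error('Request failed:', err.message));
```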
### Connection Refused
- Verify MCP server is running (`node server.js`)
- Check that port 3000 is not blocked by firewall
- Ensure tunnel is pointing to the correct port
### Tools Not Available
- Refresh the ChatGPT page after adding the server
- Verify tunnel URL is correct and accessible
- Check that MCP server is running
### Changes Not Visible
- Ensure web app (`npm run dev`) is running (for local setup)
- Verify browser is connected with the correct session ID
- Check browser console for WebSocket connection errors
- For Netlify setup, verify `VITE_WS_URL` is set correctly
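To test the WebSocket endpoint directly, paste a few lines into the browser's developer console, using your `VITE_WS_URL` value or `ws://localhost:3001` for the local setup:
```javascript
// Paste into the browser console to verify the WebSocket endpoint is reachable.
const testWs = new WebSocket('wss://hello3dllm-websocket.loca.lt'); // or ws://localhost:3001
testWs.onopen = () => console.log('WebSocket reachable');
testWs.onerror = (e) => console.error('WebSocket failed', e);
testWs.onclose = (e) => console.log('Closed with code', e.code);
```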
### Tunnel URL Changed
- **ngrok**: URLs change each time you restart ngrok on the free tier
- **localtunnel**: Custom subdomains remain consistent as long as available
- Update ChatGPT configuration and Netlify environment variables when URLs change
---
## Security Note
The server currently allows all origins (`origin: '*'`). For production, restrict CORS:
```javascript
// Assuming an Express app using the `cors` middleware package
app.use(cors({
  origin: ['https://chat.openai.com', 'https://chatgpt.com']
}));
```
---
## Important Notes
- **Keep tunnels active**: Both tunnels (MCP and WebSocket) must remain running while using the application
- **Tunnel URL stability**:
- **ngrok**: URLs change each time you restart ngrok on the free tier (unless you use a paid plan with a custom domain)
- **localtunnel**: Custom subdomains remain consistent as long as the subdomain is available
- **Redeploy Netlify after URL changes**: When your tunnel WebSocket URL changes, you must update `VITE_WS_URL` in Netlify and trigger a new deployment
- **Network dependency**: Your local machine must be connected to the internet and running 24/7 for production use
---
## Next Steps
- See the main [README.md](../README.md) for more information about available MCP tools
- Check [NETLIFY_SETUP.md](./NETLIFY_SETUP.md) for Netlify deployment details
- Review the project structure and architecture in the main README