get_viewport_screenshot

Capture a screenshot of the current Blender 3D viewport to document progress, share visual results, or create reference images for modeling workflows.

Instructions

Capture a screenshot of the current Blender 3D viewport.

Parameters:
- max_size: Maximum size in pixels for the largest dimension (default: 800)

Returns the screenshot as an Image.
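
For orientation, here is a minimal client-side sketch that invokes this tool through the official `mcp` Python SDK. The launch command (`uvx blender-mcp`), the output filename, and the handling of the returned image content are illustrative assumptions; adapt them to however you actually run and consume the server.

    import asyncio
    import base64

    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client


    async def capture_viewport() -> None:
        # Assumed launch command; substitute your own way of starting the server.
        server = StdioServerParameters(command="uvx", args=["blender-mcp"])
        async with stdio_client(server) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()
                result = await session.call_tool("get_viewport_screenshot", {"max_size": 800})
                # Image results arrive as base64-encoded image content items.
                for item in result.content:
                    if getattr(item, "type", "") == "image":
                        with open("viewport.png", "wb") as f:  # illustrative output path
                            f.write(base64.b64decode(item.data))


    asyncio.run(capture_viewport())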

Input Schema

Name        Required    Description                                          Default
max_size    No          Maximum size in pixels for the largest dimension     800
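
As a rough sketch of what the JSON Schema view conveys, FastMCP derives the tool's input schema from the handler signature shown in the Implementation Reference; the exact generated output may differ by SDK version, so treat this as illustrative only.

    # Illustrative only: the schema FastMCP would derive from
    # get_viewport_screenshot(ctx, max_size: int = 800).
    input_schema = {
        "type": "object",
        "properties": {
            "max_size": {"type": "integer", "default": 800},
        },
        "required": [],
    }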

Implementation Reference

  • The primary handler function for the 'get_viewport_screenshot' MCP tool, decorated with @mcp.tool() for automatic registration. It executes the tool by connecting to Blender, sending the screenshot command, letting Blender save the capture to a temporary file, reading the bytes back, deleting the file, and returning them as an Image artifact. The input schema is taken from the function signature and docstring.
    @mcp.tool()
    def get_viewport_screenshot(ctx: Context, max_size: int = 800) -> Image:
        """
        Capture a screenshot of the current Blender 3D viewport.

        Parameters:
        - max_size: Maximum size in pixels for the largest dimension (default: 800)

        Returns the screenshot as an Image.
        """
        try:
            blender = get_blender_connection()

            # Create temp file path
            temp_dir = tempfile.gettempdir()
            temp_path = os.path.join(temp_dir, f"blender_screenshot_{os.getpid()}.png")

            result = blender.send_command("get_viewport_screenshot", {
                "max_size": max_size,
                "filepath": temp_path,
                "format": "png"
            })

            if "error" in result:
                raise Exception(result["error"])

            if not os.path.exists(temp_path):
                raise Exception("Screenshot file was not created")

            # Read the file
            with open(temp_path, 'rb') as f:
                image_bytes = f.read()

            # Delete the temp file
            os.remove(temp_path)

            return Image(data=image_bytes, format="png")
        except Exception as e:
            logger.error(f"Error capturing screenshot: {str(e)}")
            raise Exception(f"Screenshot failed: {str(e)}")
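
    To make the handler's round trip concrete, here is a hedged test sketch that fakes the Blender connection and checks that the handler reads the screenshot bytes and removes the temporary file. The import path blender_mcp.server is a hypothetical placeholder, and it assumes that @mcp.tool() leaves the function directly callable and that Image exposes the raw bytes as .data; neither is guaranteed across SDK versions.

    import os
    from unittest import mock

    from blender_mcp import server  # hypothetical import path for the module above


    def test_get_viewport_screenshot_reads_and_cleans_temp_file():
        captured = {}

        def fake_send_command(command_type, params):
            # Stand in for the Blender addon: write fake PNG bytes to the requested path.
            assert command_type == "get_viewport_screenshot"
            with open(params["filepath"], "wb") as f:
                f.write(b"\x89PNG-fake-bytes")
            captured["path"] = params["filepath"]
            return {}  # any dict without an "error" key counts as success for the handler

        fake_connection = mock.Mock()
        fake_connection.send_command.side_effect = fake_send_command

        with mock.patch.object(server, "get_blender_connection", return_value=fake_connection):
            image = server.get_viewport_screenshot(ctx=mock.Mock(), max_size=800)

        assert image.data.startswith(b"\x89PNG")  # assumes Image keeps raw bytes in .data
        assert not os.path.exists(captured["path"])  # handler deletes the temp file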
  • Helper function used by the tool handler to obtain a persistent socket connection to the Blender addon server.
    def get_blender_connection():
        """Get or create a persistent Blender connection"""
        global _blender_connection, _polyhaven_enabled  # Add _polyhaven_enabled to globals

        # If we have an existing connection, check if it's still valid
        if _blender_connection is not None:
            try:
                # First check if PolyHaven is enabled by sending a ping command
                result = _blender_connection.send_command("get_polyhaven_status")
                # Store the PolyHaven status globally
                _polyhaven_enabled = result.get("enabled", False)
                return _blender_connection
            except Exception as e:
                # Connection is dead, close it and create a new one
                logger.warning(f"Existing connection is no longer valid: {str(e)}")
                try:
                    _blender_connection.disconnect()
                except:
                    pass
                _blender_connection = None

        # Create a new connection if needed
        if _blender_connection is None:
            host = os.getenv("BLENDER_HOST", DEFAULT_HOST)
            port = int(os.getenv("BLENDER_PORT", DEFAULT_PORT))
            _blender_connection = BlenderConnection(host=host, port=port)
            if not _blender_connection.connect():
                logger.error("Failed to connect to Blender")
                _blender_connection = None
                raise Exception("Could not connect to Blender. Make sure the Blender addon is running.")
            logger.info("Created new persistent connection to Blender")

        return _blender_connection
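
    The host and port come from the BLENDER_HOST and BLENDER_PORT environment variables, falling back to DEFAULT_HOST and DEFAULT_PORT. A minimal sketch of overriding them before the server starts (the values shown are examples, not the project's actual defaults):

    import os

    # Point the MCP server at wherever the Blender addon's socket server is listening.
    os.environ["BLENDER_HOST"] = "127.0.0.1"   # example host
    os.environ["BLENDER_PORT"] = "9876"        # example port; use the port configured in the addon

    # ...then launch the server (e.g. via the package's main() entry point).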
  • Key method on the BlenderConnection class, used by the handler to send the 'get_viewport_screenshot' command to the Blender addon and to receive and parse the response.
    def send_command(self, command_type: str, params: Dict[str, Any] = None) -> Dict[str, Any]:
        """Send a command to Blender and return the response"""
        if not self.sock and not self.connect():
            raise ConnectionError("Not connected to Blender")

        command = {
            "type": command_type,
            "params": params or {}
        }

        try:
            # Log the command being sent
            logger.info(f"Sending command: {command_type} with params: {params}")

            # Send the command
            self.sock.sendall(json.dumps(command).encode('utf-8'))
            logger.info(f"Command sent, waiting for response...")

            # Set a timeout for receiving - use the same timeout as in receive_full_response
            self.sock.settimeout(15.0)  # Match the addon's timeout

            # Receive the response using the improved receive_full_response method
            response_data = self.receive_full_response(self.sock)
            logger.info(f"Received {len(response_data)} bytes of data")

            response = json.loads(response_data.decode('utf-8'))
            logger.info(f"Response parsed, status: {response.get('status', 'unknown')}")

            if response.get("status") == "error":
                logger.error(f"Blender error: {response.get('message')}")
                raise Exception(response.get("message", "Unknown error from Blender"))

            return response.get("result", {})
        except socket.timeout:
            logger.error("Socket timeout while waiting for response from Blender")
            # Don't try to reconnect here - let the get_blender_connection handle reconnection
            # Just invalidate the current socket so it will be recreated next time
            self.sock = None
            raise Exception("Timeout waiting for Blender response - try simplifying your request")
        except (ConnectionError, BrokenPipeError, ConnectionResetError) as e:
            logger.error(f"Socket connection error: {str(e)}")
            self.sock = None
            raise Exception(f"Connection to Blender lost: {str(e)}")
        except json.JSONDecodeError as e:
            logger.error(f"Invalid JSON response from Blender: {str(e)}")
            # Try to log what was received
            if 'response_data' in locals() and response_data:
                logger.error(f"Raw response (first 200 bytes): {response_data[:200]}")
            raise Exception(f"Invalid response from Blender: {str(e)}")
        except Exception as e:
            logger.error(f"Error communicating with Blender: {str(e)}")
            # Don't try to reconnect here - let the get_blender_connection handle reconnection
            self.sock = None
            raise Exception(f"Communication error with Blender: {str(e)}")
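
    The method implies a simple request/response protocol over the TCP socket: one UTF-8 JSON object per command, one per reply. A sketch of the shapes involved, with field names taken from the code above and the concrete payloads invented for illustration:

    # Request written by send_command:
    request = {
        "type": "get_viewport_screenshot",
        "params": {"max_size": 800, "filepath": "/tmp/blender_screenshot_1234.png", "format": "png"},
    }

    # Success reply: send_command returns the "result" dict to the caller.
    success_reply = {"status": "success", "result": {}}  # any non-"error" status is treated as success

    # Error reply: "status" == "error" makes send_command raise with "message".
    error_reply = {"status": "error", "message": "No active 3D viewport found"}  # example message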
  • The main function that runs the FastMCP server instance, making all @mcp.tool()-decorated functions (including get_viewport_screenshot) available as MCP tools.
    def main():
        """Run the MCP server"""
        mcp.run()


    if __name__ == "__main__":
        main()

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/drrodingo-del/blender-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.