blender-open-mcp
Open Models MCP for Blender 3D using Ollama
Control Blender 3D with natural language prompts via local AI models. Built on the Model Context Protocol (MCP), connecting Claude, Cursor, or any MCP client to Blender through a local Ollama LLM.
Architecture
MCP Client (Claude/Cursor/CLI)
        │ HTTP / stdio
        ▼
┌─────────────────────┐
│   FastMCP Server    │  ← server.py (port 8000)
│  blender-open-mcp   │
└─────────────────────┘
   │ TCP socket     │ HTTP
   ▼                ▼
┌──────────────┐  ┌─────────────┐
│   Blender    │  │   Ollama    │  (port 11434)
│   Add-on     │  │  llama3.2   │
│   addon.py   │  │  gemma3...  │
│ (port 9876)  │  └─────────────┘
└──────────────┘
   │ bpy
   ▼
Blender Python API

Three independent processes:

- FastMCP Server (server.py): exposes MCP tools over HTTP or stdio
- Blender Add-on (addon.py): TCP socket server running inside Blender
- Ollama: local LLM serving natural language queries
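To make the middle hop concrete, here is a minimal sketch of how a client process can talk to the add-on's TCP socket directly. The JSON message shape (`type`/`params` keys) is an assumption for illustration, not the add-on's documented wire format:

```python
import json
import socket

def send_command(command: dict, host: str = "localhost", port: int = 9876,
                 timeout: float = 5.0) -> dict:
    """Send one JSON command to the add-on's TCP socket and return the
    decoded JSON reply. Assumes request and reply each fit in one packet."""
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall(json.dumps(command).encode("utf-8"))
        data = sock.recv(65536)
    return json.loads(data.decode("utf-8"))

# Example (requires Blender running with the add-on's server started):
# send_command({"type": "get_scene_info", "params": {}})
```

In practice the FastMCP server owns this connection; the sketch is only meant to show that the add-on side is a plain socket server, which is also why it must be started from inside Blender before anything else works.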
Installation
Prerequisites
| Dependency | Version | Install |
|------------|---------|---------|
| Blender | 3.0+ | |
| Python | 3.10+ | System or python.org |
| Ollama | Latest | |
| uv | Latest | |
1. Clone and set up
git clone https://github.com/dhakalnirajan/blender-open-mcp.git
cd blender-open-mcp
# Create virtual environment and install
uv venv
source .venv/bin/activate # Linux / macOS
# .venv\Scripts\activate # Windows
uv pip install -e .

2. Install the Blender Add-on
Open Blender
Go to Edit → Preferences → Add-ons → Install...
Select addon.py from the repository root
Enable "Blender MCP"
Open the 3D Viewport, press N, find the Blender MCP panel
Click "Start MCP Server" (default port: 9876)
3. Pull an Ollama model
ollama pull llama3.2

(Other models, such as gemma3, can be used as well.)
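To confirm a model finished pulling, you can ask the local Ollama server what it has installed via its REST API (`GET /api/tags` returns the list of local models). A small sketch:

```python
import json
import urllib.request

def list_local_models(base_url: str = "http://localhost:11434") -> list[str]:
    """Return the names of models installed in the local Ollama instance,
    or an empty list if the server is unreachable."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=5) as resp:
            payload = json.load(resp)
    except OSError:
        return []
    return [model["name"] for model in payload.get("models", [])]

# e.g. ["llama3.2:latest", "gemma3:latest"] once the pulls have finished
```

This is the same endpoint the MCP server's model-listing tool relies on, so an empty result here usually means Ollama is not running yet.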
Setup
Start the Ollama Server: Ensure Ollama is running in the background.
Start the MCP Server:
blender-mcp

Custom options:
blender-mcp \
--host 127.0.0.1 \
--port 8000 \
--blender-host localhost \
--blender-port 9876 \
--ollama-url http://localhost:11434 \
  --ollama-model llama3.2

For stdio transport (Claude Desktop, Cursor):
blender-mcp --transport stdio

Usage
MCP Client CLI
# Interactive shell
blender-mcp-client interactive
# One-shot scene info
blender-mcp-client scene
# Call a specific tool
blender-mcp-client tool blender_get_scene_info
blender-mcp-client tool blender_create_object '{"primitive_type": "SPHERE", "name": "MySphere"}'
# Natural language prompt
blender-mcp-client prompt "Create a metallic sphere at position 0, 0, 2"
# List all available tools
blender-mcp-client tools

Python API
import asyncio
from client.client import BlenderMCPClient

async def demo():
    async with BlenderMCPClient("http://localhost:8000") as client:
        # Scene inspection
        print(await client.get_scene_info())

        # Create objects
        await client.create_object("CUBE", name="MyCube", location=(0, 0, 0))
        await client.create_object("SPHERE", name="MySphere", location=(3, 0, 0))

        # Apply materials
        await client.set_material("MyCube", "GoldMat", color=[1.0, 0.84, 0.0, 1.0])

        # Move objects
        await client.modify_object("MySphere", location=(3, 0, 2), scale=(1.5, 1.5, 1.5))

        # PolyHaven assets
        categories = await client.get_polyhaven_categories("textures")
        await client.download_polyhaven_asset("brick_wall_001", resolution="2k")
        await client.set_texture("MyCube", "brick_wall_001")

        # Render
        await client.render_image("/tmp/my_render.png")

        # AI assistance
        response = await client.ai_prompt(
            "Write bpy code to add a sun light pointing down"
        )
        print(response)

        # Execute the generated code
        await client.execute_code(response)

asyncio.run(demo())

Claude Desktop / Cursor Integration
Add to your mcp.json (or ~/.cursor/mcp.json):
{
"mcpServers": {
"blender-open-mcp": {
"command": "blender-mcp",
"args": ["--transport", "stdio"]
}
}
}

Available Tools
| Tool | Description | Modifies Blender |
|------|-------------|------------------|
| `blender_get_scene_info` | Full scene summary: objects, camera, render settings | No |
| | Detailed object info: transforms, materials, mesh stats | No |
| `blender_create_object` | Add a primitive mesh (CUBE, SPHERE, CYLINDER, ...) | Yes |
| | Change location, rotation, scale, visibility | Yes |
| | Remove an object from the scene | Yes ⚠️ |
| | Create and assign a Principled BSDF material | Yes |
| | Render current scene to a file | Yes |
| | Run arbitrary Python/bpy code in Blender | Yes ⚠️ |
| | List PolyHaven asset categories | No |
| | Search PolyHaven library with pagination | No |
| | Download & import a PolyHaven asset | Yes |
| | Apply a downloaded PolyHaven texture to an object | Yes |
| | Send a natural language prompt to Ollama | No |
| | List available local Ollama models | No |
| | Switch the active Ollama model | No |
| | Update the Ollama server URL | No |
Default Ports
| Service | Port |
|---------|------|
| FastMCP Server | 8000 |
| Blender Add-on (TCP) | 9876 |
| Ollama | 11434 |
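Since all three processes must be up before any tool call succeeds, a quick reachability check against these defaults can save debugging time. A minimal sketch (host and port values are the defaults from the table above):

```python
import socket

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(timeout)
        return sock.connect_ex((host, port)) == 0

SERVICES = {
    "FastMCP Server": ("localhost", 8000),
    "Blender Add-on": ("localhost", 9876),
    "Ollama": ("localhost", 11434),
}

if __name__ == "__main__":
    for name, (host, port) in SERVICES.items():
        status = "up" if port_open(host, port) else "DOWN"
        print(f"{name:16s} {host}:{port}  {status}")
```

A "DOWN" Blender Add-on almost always means the server was not started from the N-panel inside Blender; a "DOWN" Ollama means `ollama serve` is not running.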
Development
# Install dev dependencies
uv pip install -e ".[dev]"
# Run tests
pytest tests/ -v
# Type checking
mypy src/
# Linting
ruff check src/ client/

Troubleshooting
| Problem | Solution |
|---------|----------|
| | Open Blender → N-sidebar → Blender MCP → Start MCP Server |
| | Run |
| | Check exact object name via |
| | Ensure the output directory exists and is writable |
| | Check internet connection; try a lower resolution |
License
MIT License. See LICENSE for details.
This project is not affiliated with the Blender Foundation.