FUSION360-MCP PROJECT STRUCTURE
================================
fusion360-mcp/
│
├── Core Documentation
│   ├── README.md                  # Complete user guide (500+ lines)
│   ├── QUICKSTART.md              # 5-minute setup guide
│   ├── SETUP_OLLAMA.md            # Detailed Ollama setup (NEW!)
│   ├── ARCHITECTURE.md            # Technical documentation
│   ├── PROJECT_SUMMARY.md         # Project completion summary
│   └── LICENSE                    # MIT License
│
├── Configuration Files
│   ├── config.json                # Main config (create from example)
│   ├── .env.example               # Environment variables template
│   ├── requirements.txt           # Python dependencies
│   ├── setup.py                   # Package installation
│   └── .gitignore                 # Git exclusions
│
├── Quick Start Scripts
│   ├── start_server.sh            # Start server (macOS/Linux)
│   ├── start_server.bat           # Start server (Windows)
│   └── run_tests.sh               # Run test suite
│
├── MCP Server (Core Backend)
│   ├── mcp_server/
│   │   ├── __init__.py
│   │   ├── server.py              # FastAPI application
│   │   ├── router.py              # Request routing logic
│   │   │
│   │   ├── schema/                # Pydantic Data Models
│   │   │   ├── __init__.py
│   │   │   ├── mcp_command.py     # Input command schemas
│   │   │   ├── fusion_action.py   # CAD action schemas
│   │   │   └── llm_response.py    # LLM response schemas
│   │   │
│   │   ├── llm_clients/           # LLM Implementations
│   │   │   ├── __init__.py
│   │   │   ├── ollama_client.py   # Local Ollama (REST/CLI)
│   │   │   ├── openai_client.py   # OpenAI GPT-4o
│   │   │   ├── gemini_client.py   # Google Gemini
│   │   │   └── claude_client.py   # Anthropic Claude
│   │   │
│   │   └── utils/                 # Utilities
│   │       ├── __init__.py
│   │       ├── logger.py          # Loguru logging
│   │       ├── config_loader.py   # Config management
│   │       └── context_cache.py   # Conversation cache
│   │
│   └── API Endpoints:
│       POST /mcp/command          # Execute MCP command
│       POST /mcp/execute          # Log action execution
│       GET  /health               # Health check
│       GET  /models               # List available models
│       GET  /history              # Conversation history
│
├── Fusion 360 Add-in
│   └── fusion_addin/
│       ├── FusionMCP.manifest     # Add-in metadata
│       ├── main.py                # Entry point
│       ├── ui_dialog.py           # User interface
│       ├── fusion_actions.py      # Action executor
│       └── utils/
│           └── network.py         # MCP client
│
│   Supported Actions (see the sketch after this tree):
│       • create_box               # Rectangular boxes
│       • create_cylinder          # Cylinders
│       • create_sphere            # Spheres (revolve)
│       • create_hole              # Holes (cut operation)
│       • extrude                  # Profile extrusion
│       • fillet                   # Edge rounding
│       • apply_material           # Material assignment
│
├── System Prompt
│   └── prompts/
│       └── system_prompt.md       # FusionMCP personality
│
│   Defines:
│       • Core principles (JSON output, safety)
│       • Action schemas and templates
│       • Clarification protocols
│       • Example interactions
│       • Multi-model orchestration
│
├── Examples
│   └── examples/
│       ├── example_config.json          # Full configuration
│       ├── example_command.json         # Sample MCP command
│       ├── example_design_context.json  # Design state example
│       └── test_conversation.json       # Test scenarios
│
└── Test Suite
    └── tests/
        ├── __init__.py
        ├── pytest.ini             # Test configuration
        ├── test_mcp_server.py     # Server endpoint tests
        ├── test_ollama_client.py  # Ollama client tests
        ├── test_schemas.py        # Schema validation tests
        ├── test_config_loader.py  # Config loader tests
        └── test_context_cache.py  # Cache operation tests
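
For orientation, here is a minimal sketch of what one entry in the server's
actions_to_execute list might look like for the create_box action above. Only the
action name and the actions_to_execute field are taken from this document; the
parameter names are illustrative assumptions — the authoritative shapes are the
Pydantic models in mcp_server/schema/fusion_action.py and the sample in
examples/example_command.json.

    # Hypothetical single action as a Python dict (field names other than
    # "create_box" are assumptions; see mcp_server/schema/fusion_action.py).
    example_action = {
        "action": "create_box",
        "parameters": {            # assumed parameter names for a 20 mm cube
            "width_mm": 20,
            "height_mm": 20,
            "depth_mm": 20,
        },
    }
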
INSTALLATION LOCATIONS
======================
Python Server:
~/Desktop/fusion360-mcp/ # Project directory
~/Desktop/fusion360-mcp/venv/ # Virtual environment
~/Desktop/fusion360-mcp/logs/ # Server logs
~/Desktop/fusion360-mcp/context_cache.json # Conversation history
Fusion 360 Add-in:
macOS: ~/Library/Application Support/Autodesk/Autodesk Fusion 360/API/AddIns/FusionMCP/
Windows: %APPDATA%\Autodesk\Autodesk Fusion 360\API\AddIns\FusionMCP\

DATA FLOW
=========

1. User Input (Fusion 360)
   └─> Fusion Add-in (ui_dialog.py)
   └─> Network Client (utils/network.py)
   └─> HTTP POST to http://localhost:9000/mcp/command

2. MCP Server Processing
   └─> FastAPI Server (server.py)
   └─> Router (router.py)
   └─> LLM Client Selection (ollama/openai/gemini/claude)
   └─> LLM API Call with System Prompt
   └─> Response Parsing & Validation

3. Response Generation
   └─> Structured JSON Action (FusionAction schema)
   └─> Safety Validation (dimensions, units, feasibility)
   └─> MCPResponse (with actions_to_execute)
   └─> HTTP Response to Fusion Add-in

4. CAD Execution
   └─> Fusion Add-in receives response
   └─> Action Executor (fusion_actions.py)
   └─> Fusion 360 API calls
   └─> Geometry Creation
   └─> User Feedback
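
A minimal Python sketch of steps 1-2 driven from outside Fusion 360, which is handy
for testing the server on its own. The URL and port come from the flow above; the
payload key names are assumptions — check mcp_server/schema/mcp_command.py and
examples/example_command.json for the real field names.

    import requests  # any HTTP client works; the add-in itself uses utils/network.py

    # Hypothetical request body; the real fields are defined by the schemas
    # in mcp_server/schema/mcp_command.py.
    payload = {"prompt": "Create a 20mm cube"}

    response = requests.post(
        "http://localhost:9000/mcp/command",  # endpoint from the data flow above
        json=payload,
        timeout=60,
    )
    response.raise_for_status()
    print(response.json())  # expected to include the actions_to_execute list
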

SUPPORTED LLM PROVIDERS
=======================

1. Ollama (Local)
   ✅ Free and offline
   ✅ Privacy-first (data stays local)
   ✅ Models: llama3, mistral, codellama, phi
   ✅ No API key required
   ⚠️ Requires: ollama serve

2. OpenAI
   ✅ GPT-4o, GPT-4o-mini, GPT-4-turbo
   ✅ Native JSON mode
   ✅ Best for creative designs
   ⚠️ Requires: API key ($$)

3. Google Gemini
   ✅ Gemini 1.5 Pro, Gemini 1.5 Flash
   ✅ Excellent geometric reasoning
   ✅ Cost-effective
   ⚠️ Requires: API key ($)

4. Anthropic Claude
   ✅ Claude 3.5 Sonnet, Claude 3 Opus
   ✅ Superior reasoning and safety
   ✅ Long context window
   ⚠️ Requires: API key ($$$)
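
Provider selection is driven by config.json. Only the Ollama value is confirmed by
the quick start below; treating the other backends the same way (a provider:model
string plus an API key loaded from .env) is an assumption — examples/example_config.json
and .env.example hold the authoritative keys.

    # config.json (full set of keys in examples/example_config.json):
    #   "default_model": "ollama:llama3"   <- local, free, no API key
    #   "default_model": "openai:gpt-4o"   <- assumed provider:model format for a
    #                                         cloud backend; API key goes in .env
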

QUICK START COMMANDS
====================

1. Install Ollama:
   macOS:   brew install ollama
   Linux:   curl -fsSL https://ollama.com/install.sh | sh
   Windows: Download from https://ollama.com

2. Pull a model:
   ollama pull llama3

3. Start Ollama:
   ollama serve

4. Set up Python:
   cd ~/Desktop/fusion360-mcp
   python3 -m venv venv
   source venv/bin/activate
   pip install -r requirements.txt

5. Configure:
   cp examples/example_config.json config.json
   # Edit config.json - set "default_model": "ollama:llama3"

6. Start the MCP server:
   ./start_server.sh
   # or: python -m mcp_server.server

7. Install the Fusion 360 add-in:
   cp -r fusion_addin ~/Library/Application\ Support/Autodesk/Autodesk\ Fusion\ 360/API/AddIns/FusionMCP
   # macOS path shown; see INSTALLATION LOCATIONS above for the Windows location

8. Use it in Fusion 360:
   - Open Fusion 360
   - Tools → Add-Ins → FusionMCP → Run
   - Click "MCP Assistant"
   - Try: "Create a 20mm cube"
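
9. Verify the server is running (optional; the /health endpoint is listed above,
   though the exact response body may vary):
   curl http://localhost:9000/health
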
FILE STATISTICS
===============
Total Files: 46+
Python Files: 22
Documentation: 6
Configuration: 6
Examples: 4
Tests: 6
Scripts: 3
Lines of Code: ~3,000+ (Python)
Lines of Docs: ~2,500+ (Markdown)
Total Size: ~250KB (code only)

KEY FEATURES
============
✅ Multi-Model Support (4 providers)
✅ Intelligent Fallback Chain
✅ Type-Safe (Pydantic schemas)
✅ Async Architecture (FastAPI)
✅ Context Caching (JSON/SQLite)
✅ Structured Logging (Loguru)
✅ Safety Validation (dimensions, units)
✅ Natural Language Interface
✅ Parametric Design Support
✅ Real-time Execution
✅ Comprehensive Testing
✅ Full Documentation
RUNTIME REQUIREMENTS
====================
Python: 3.11+
Fusion 360: 2025 (recommended)
Ollama: Latest (for local models)
RAM: 4GB minimum (8GB recommended)
Disk: ~5GB (with Ollama models)
Internet: Optional (only for cloud models)

NEXT STEPS
==========
1. ✅ Follow SETUP_OLLAMA.md for detailed setup
2. ✅ Read QUICKSTART.md for the 5-minute guide
3. ✅ Check README.md for the complete documentation
4. ✅ Try example commands in Fusion 360
5. ✅ Explore the examples/ folder for workflows
6. ✅ Run the tests: ./run_tests.sh
7. ✅ Star on GitHub and contribute!
SUPPORT
=======
Documentation: README.md, QUICKSTART.md, SETUP_OLLAMA.md
Architecture: ARCHITECTURE.md
Issues: GitHub Issues
Discussions: GitHub Discussions

PROJECT STATUS: ✅ COMPLETE & READY TO USE
Version: 1.0.0
License: MIT
Built: January 2025