Fusion360MCP

by jaskirat1616
config.json • 1.38 kB
{
  "server": {
    "host": "0.0.0.0",
    "port": 8888,
    "debug": false
  },
  "ai": {
    "default_backend": "ollama",
    "default_models": {
      "ollama": "llama3",
      "openai": "gpt-4",
      "gemini": "gemini-1.5-pro-latest"
    },
    "max_retries": 3,
    "temperature": 0.3,
    "max_tokens": 2000
  },
  "context": {
    "max_short_term": 10,
    "enable_long_term": true,
    "long_term_file": "~/mcp_memory.json"
  },
  "database": {
    "path": "~/mcp_conversations.db",
    "backup_enabled": true,
    "backup_interval_hours": 24
  },
  "validation": {
    "forbidden_keywords": [
      "delete", "remove", "destroy", "clear", "erase",
      "os.", "sys.", "subprocess.", "eval(", "exec(", "__import__",
      "open(", "file(", "input(", "shutil.", "rmtree", "unlink"
    ],
    "require_adsk_import": true,
    "syntax_check": true
  },
  "logging": {
    "level": "INFO",
    "server_log": "~/mcp_server.log",
    "core_log": "~/mcp_core.log",
    "max_log_size_mb": 10,
    "backup_count": 5
  },
  "ui": {
    "theme": "light",
    "enable_code_preview": true,
    "enable_execution_confirmation": true,
    "show_execution_time": true
  },
  "security": {
    "enable_api_key_encryption": false,
    "allowed_origins": ["*"],
    "rate_limit_per_minute": 60
  }
}
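The `validation` block above describes a keyword denylist, a required `adsk` import, and a syntax check applied to generated scripts before execution. A minimal Python sketch of how such a validator might work (the function name and exact behavior are assumptions, not the server's actual implementation):

```python
import ast

# Denylist copied from the "validation" section of config.json.
FORBIDDEN_KEYWORDS = [
    "delete", "remove", "destroy", "clear", "erase",
    "os.", "sys.", "subprocess.", "eval(", "exec(", "__import__",
    "open(", "file(", "input(", "shutil.", "rmtree", "unlink",
]

def validate_script(code: str,
                    require_adsk_import: bool = True,
                    syntax_check: bool = True) -> list:
    """Return a list of validation errors; an empty list means the script passes."""
    errors = []
    lowered = code.lower()
    # Simple substring matching against the denylist, as the config suggests.
    for keyword in FORBIDDEN_KEYWORDS:
        if keyword in lowered:
            errors.append(f"forbidden keyword: {keyword!r}")
    # Mirrors "require_adsk_import": generated scripts must use the Fusion 360 API.
    if require_adsk_import and "import adsk" not in code:
        errors.append("script must import the adsk Fusion 360 API module")
    # Mirrors "syntax_check": parse the script without executing it.
    if syntax_check:
        try:
            ast.parse(code)
        except SyntaxError as exc:
            errors.append(f"syntax error: {exc.msg} (line {exc.lineno})")
    return errors
```

For example, `validate_script("import adsk.core\napp = adsk.core.Application.get()")` yields no errors, while a script calling `os.remove(...)` is rejected. Substring matching is crude (it would also flag a variable named `cleared`), which may be why the real server pairs it with a syntax check and an execution-confirmation UI.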

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/jaskirat1616/Fusion360MCP'
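The same request can be made from Python with the standard library; only the endpoint URL comes from the listing above, and the shape of the JSON response is not specified here:

```python
import json
import urllib.request

# Directory endpoint for this server, as shown in the curl example above.
API_URL = "https://glama.ai/api/mcp/v1/servers/jaskirat1616/Fusion360MCP"

def fetch_server_info(url: str = API_URL) -> dict:
    """GET the MCP directory entry for a server and decode its JSON body."""
    with urllib.request.urlopen(url) as resp:
        return json.loads(resp.read().decode("utf-8"))
```

Calling `fetch_server_info()` requires network access and returns the directory's metadata for the server as a dict.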

If you have feedback or need assistance with the MCP directory API, please join our Discord server.