
Blender MCP Server

by llm-use
blender_polymcp.py
#!/usr/bin/env python3
"""Blender MCP Chat - like the filesystem MCP"""
import asyncio
import sys
from pathlib import Path

from polymcp.polyagent import UnifiedPolyAgent, OllamaProvider


async def main():
    llm = OllamaProvider(model="gpt-oss:20b", temperature=0.1)

    # Instead of stdio, use an HTTP endpoint
    mcp_servers = ["http://localhost:8000/mcp"]

    agent = UnifiedPolyAgent(
        llm_provider=llm,
        mcp_servers=mcp_servers,
        verbose=True
    )

    async with agent:
        print("\n✅ Blender MCP Server connected!\n")

        # Chat loop
        while True:
            user_input = input("\n🎨 You: ")
            if user_input.lower() in ['exit', 'quit']:
                print("Goodbye!")
                break
            result = await agent.run_async(user_input, max_steps=5)
            print(f"\n🤖 Blender: {result}")


if __name__ == "__main__":
    asyncio.run(main())
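
Usage note: assuming a Blender MCP server is already running and exposing an HTTP endpoint at http://localhost:8000/mcp, and that a local Ollama instance is serving the gpt-oss:20b model, running python blender_polymcp.py starts an interactive chat loop. Each prompt is handed to the agent, which may call Blender tools over MCP for up to five steps (max_steps=5) before printing its reply; typing exit or quit ends the session.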

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/llm-use/Blender-MCP-Server'
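
For scripted access, the same lookup can be done from Python. Below is a minimal sketch using the requests library; it assumes the endpoint returns JSON, since the response schema is not shown here.

import requests

# Fetch the directory entry for the Blender MCP Server from the Glama MCP API.
# Assumption: the endpoint returns a JSON body describing the server.
url = "https://glama.ai/api/mcp/v1/servers/llm-use/Blender-MCP-Server"
response = requests.get(url, timeout=10)
response.raise_for_status()
print(response.json())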

If you have feedback or need assistance with the MCP directory API, please join our Discord server.