Control Blender 3D with natural language prompts via local AI models. Built on the Model Context Protocol (MCP), connecting Claude, Cursor, or any MCP client to Blender through a local Ollama LLM.
Abstract: Large Language Models (LLMs) that generate executable code are opening a new path to text-to-3D by converting natural language prompts into scripts for modeling software like Blender.
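To make the prompt-to-script flow concrete, here is a minimal sketch of the two ends of such a pipeline: building a request to a local Ollama model and pulling the generated Blender script out of the reply. The endpoint and payload fields follow Ollama's standard `/api/generate` API; the system instruction and the `llama3` model name are illustrative assumptions, not this project's actual configuration.

```python
import re

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_request(prompt: str, model: str = "llama3") -> dict:
    """Build the JSON payload for Ollama's /api/generate endpoint.
    The instruction asking for a bpy script is an assumed prompt,
    not the project's actual system prompt."""
    return {
        "model": model,
        "prompt": (
            "Write a Blender Python (bpy) script for this request, "
            "inside a ```python fence:\n" + prompt
        ),
        "stream": False,  # return one complete reply instead of a token stream
    }

def extract_script(reply: str) -> str:
    """Pull the first fenced Python block out of the model's reply,
    since LLM output usually wraps code in markdown fences."""
    m = re.search(r"```(?:python)?\n(.*?)```", reply, re.DOTALL)
    return m.group(1).strip() if m else reply.strip()
```

The extracted script is what would then be handed to Blender for execution; keeping extraction separate from generation makes it easy to reject replies that contain no runnable code.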
Fusion Studio powered 125 historical VFX shots for A World Divided, delivered by a two-person team under episodic deadlines.
Claude ←→ MCP Server (Node.js) ←→ WebSocket ←→ CEP Panel ←→ ExtendScript ←→ Premiere Pro ...
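Each hop in a chain like this relays commands as serialized messages, so every request needs an id to match it to its asynchronous reply. A minimal sketch of such an envelope, assuming a simple JSON wire format (the `id`/`fn`/`args` field names are illustrative, not the project's actual protocol):

```python
import json

def make_command(fn: str, args: dict, req_id: int) -> str:
    """Serialize one command for a WebSocket hop between an MCP
    server and a host-application panel. The envelope shape here
    is an assumption for illustration."""
    return json.dumps({"id": req_id, "fn": fn, "args": args})

def parse_reply(raw: str) -> tuple[int, object]:
    """Decode a reply and recover the request id it answers,
    since replies over a WebSocket arrive asynchronously."""
    msg = json.loads(raw)
    return msg["id"], msg.get("result")
```

In practice the server would keep a map from pending ids to callbacks, resolving each one as its reply comes back over the socket.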