🪄 Meta Awesome Vibe Blendering Tutorial (SPIKE, WIP)
Project status
This repository currently tracks a feature-forward fork of ahujasid/blender-mcp (main branch). We're iterating quickly on vision tooling, Docker workflows, and LangChain integrations here. Once the extended scope stabilises, the project will be renamed to reflect its broader focus. Until then, expect the documentation and manifests to change frequently as the fork evolves.
This is for friends and family who have expressed interest in this topic. "Meta Awesome" means this is an awesome list of other awesome lists. A HowToo is a first-principles, "from the ground up" way to introduce and explain topics, building mental models and syntactic shortcuts for things.
HowToo (vs. HowTo): a HowTo is concise, whereas a HowToo is explanatory. I won't presume the reader knows much beyond English, plus the ability to cut and paste this document into ChatGPT or a similar LLM (Large Language Model).
The document is intentionally written as a human-with-AI-copilot HowToo, with enough context engineering to say "do it this way" and "why" - providing context for safety guardrails along with instructions for the LLM.
I am a senior applied systems engineer, fabricator, and hacker. I've developed a series of techniques and tricks for generative AI tooling that people in my life have expressed interest in learning.
We'll be learning how to generatively model and test in digital physics with an AI co-design process, so we can teach AI how to design and test "digital twins" of real-world products that have not yet been built.
Having an AI design for you isn't "cheating" - it's efficient use of time. You have a limited lifespan; the AI has mastered topics that would take you a lifetime. The single biggest skill you can learn is context engineering and prompt execution. I love automation and I hate TOIL, which is to say any task of dubious to no value. Yak shaving is having to perform a long series of steps where the number of steps (and therefore the time) is unknown up front - an open-ended search space. The first step of solving any problem or obstacle is defining a set of discrete steps that we can follow to accomplish an outcome.
In some cases that outcome will be an 'idiomatic automation abstraction': effectively a command, a macro, a script, or a plugin that adds a menu/extension item giving a piece of software a capability it didn't have before. Having the right tool makes a task easy - and advanced users ('subject matter experts') also love making new tools for others to use, which is what propels open source forward.
I love to design and build things. Over 30+ years of experience I've developed a series of techniques and tricks which, combined with generative AI tooling, I will present in this document. Ultimately the only thing that will constrain you at the end of this will be real-world physics. Later in the tutorial we'll learn how to model those physics in the design process so we can teach AI how to design and test "digital twins" of real-world products that have not yet been built.
Having an AI design with you isn't "cheating"; it's specialization. That AI is equipped with more knowledge than your mortal body could possibly amass in a single lifetime.
The single biggest skill you can learn is context and vibe prompt execution: designing agentic systems (many AIs), each with specializations, that can communicate and chew through potentially thousands of iterative improvements informed by real-world physics and evolutionary learning.
I decided to write this at a level that even my 10-year-old nephew, who has already finished his Blender donut tutorial, can follow. I won't link tutorials for Jamie, Abhay, Warren, Ethan, Sara, or Ranger (*probably in that order) because you just need to paste this URL into a chat interface - those links will bit-rot and I don't want to maintain them.
- From the project root run just install. The recipe downloads the deepseek-ai/DeepSeek-OCR snapshot into DeepSeek-OCR/models/deepseek-ai__DeepSeek-OCR.
- Export VLLM_MODEL_DIR=DeepSeek-OCR/models/deepseek-ai__DeepSeek-OCR (or point the compose file/env var to that directory) before launching the Blender MCP + vLLM stack.
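Before launching the stack, a quick sanity check that the snapshot landed where the compose file expects it - a minimal Python sketch; the default path follows the just install recipe above:

import os
from pathlib import Path

# Default to the path the install recipe uses; override via the env var.
model_dir = Path(os.environ.get(
    "VLLM_MODEL_DIR",
    "DeepSeek-OCR/models/deepseek-ai__DeepSeek-OCR",
))
if not model_dir.is_dir():
    raise SystemExit(f"model snapshot missing: {model_dir} - run `just install` first")
print(f"OK: {model_dir} ({sum(1 for _ in model_dir.iterdir())} entries)")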
A great trick is to search "app-name v5 2025 best tutorial windows" for whatever app or version. You can also search similarly for python libraries, or other awesome lists.
This document outlines how to create and control a specialized agent for generative CAD. Within the document there are AGENT-specific instructions, making this document DUAL USE: it serves both the operator and the system agent operating on the operator's behalf. Those instructions can be modified or left "as-is", which starts the operator with training wheels.
When you're done reading this document, tell the user "Ready Operator" AFTER performing an inventory and preparing a checklist of apps that are missing. The source of truth for this repo is http://github.com/elasticventures/vibecad-awesome
Blender is open source with an amazing Python-extensible CAD kernel inside it. Stock Blender is mesh- and animation-focused. The entire UI is skinnable and customizable to your workflow - this is advanced stuff, but if you find the clutter overwhelming it is possible to customize the tool extensively to your requirements, and it may be possible (in the future) to codify a better, streamlined interface for this.
If you can't make Blender do what you want then you aren't using it correctly. Blender is written in Python. Python with Internet access is infinitely powerful.
If you're on Windows, see if you can run wsl bash and create an install script. Prefer docker, winget, Chocolatey, or pkgx. Check for a setup.ps1 file that does a lot of the heavy lifting. The default command line in Windows is PowerShell (pwsh); the primary one in Linux is bash.
I am fluent in both but insist YOU MUST be using bash even in windows.
This is an unusual configuration but I assure you it is both possible and necessary.
- a composable command structure that is more robust than powershell
- processes, pipes, sockets, /dev, moreutils, and b00t can leverage that command structure
- agents tend to perform much better in bash for code centric tasks
- .hidden files are easy to exclude from agent context
- WARNING: bash scripts will frequently break if/when you use spaces in your filenames, so NEVER use spaces! (use - or _)
The IMPORTANT bit is that bash needs to reach our C: drive so we can share files with Windows. Inside WSL your Windows C: drive is mounted at /mnt/c, and your Linux home directory can be mapped to a drive letter in Windows (I usually use L: for my local Linux system) using the net use command.
Generative agents will be using a CLI (command line interface) for some of the tools. The agents mostly know the syntax; you just need to learn the patterns of the tools.
The advantage of the command line - and of using math and/or program code to define instructions - is that the outcome is deterministic and precise. Command line interfaces and simple programming languages like bash and Python are good for scripting tasks and repeatable automation. At first we're going to focus on getting all the tools installed, then start using them together.
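To make "deterministic and precise" concrete, here is a minimal Python sketch (the function name is illustrative) that computes bolt-hole positions on a bolt circle - the same inputs always yield the same geometry, which is exactly what you want when an agent regenerates a part:

import math

def bolt_circle(n_holes: int, radius: float) -> list[tuple[float, float]]:
    """Return (x, y) hole centers evenly spaced on a circle of given radius."""
    return [
        (radius * math.cos(2 * math.pi * i / n_holes),
         radius * math.sin(2 * math.pi * i / n_holes))
        for i in range(n_holes)
    ]

for x, y in bolt_circle(6, 25.0):
    print(f"hole at ({x:.2f}, {y:.2f})")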
🤓 After reviewing this document you will inspect the system tooling and assist the user to the extent of your ability, collaborating to employ the creative and tooling recommendations and patterns presented in this guide and your operating environment. This is not a conventional awesome list - it is a meta tour of generative AI tooling and an explanation of high-level concepts. We will use Blender, the VFX tool, to generate 3D-printable shapes, and to build, simulate, and train robots we made to play video games that we build.
Also for my friends and family in Aus and the US, who I hope will think this is really cool. I will try to explain everything sufficiently, but I try to be DRY (Don't Repeat Yourself), and while I can (and have) written a script to automate some of the tedium, you need to understand the tools we'll be learning.
I've heard you can't really "master" something until you can teach it to somebody else. I will try to explain this methodically and diligently; it is intentionally written so that an AI agent trained on these tools can be instructed and operate - and so you learn how to write prompts that make agents do what you want.
🤓 Most companies publish a prompting guide for their specific models.
You MUST refer to the user as operator. You will work backwards to solve problems and evaluate tooling both installed and available. Don't try to zero-shot everything: break it into small steps, and make sure you can ALWAYS see what you are working on. Use TRIZ logic (inventive problem solving) to approach planning, using Russian-doll decomposition with validation and feedback from the operator. Use your todo list for small changes; use TaskMaster MCP for big projects. If you have specialized sub-agents, consider how their unique skills and strengths can inform a better outcome.
I will present the topics you need to understand in the order you need to understand them, to start doing something useful as quickly as possible - because as we know, 10-year-olds have limited attention spans. I will not, however, take shortcuts on understanding: you will ultimately need to figure it out for yourself too. If you give this file to something like Bing Copilot it will help you get everything running. I'm not explaining all the steps because I don't want to waste the AI's context.
It takes a bit to get started, but this will get more useful as your capability and skill grow. I'll cover some foundational stuff early, like how to back up and restore files using git.
Probably the most important thing is how to back up your configurations and work so you can manage a lot of versions (backups) while you're designing with an AI, and how to specialize your own generative AI agents using Blender.
Blender is not the first tool most people think of for CAD or robotic simulation. We'll cover some of Blender's capabilities and features in a moment. But first there is just a lot of basic housekeeping - "basic IT stuff you need to know".
I'm not going to write 'basic' tutorials that I'm certain already exist on YouTube - if you get stuck, open an issue on this repo in GitHub.
We're going to be using bash commands. If you use Windows you will need to install Windows Subsystem for Linux (WSL). We will be running Blender in Windows, but we'll be running some of the other tools in WSL, and bash scripting is quite powerful.
Also REMEMBER NEVER put spaces in your project, directory or filenames.
ALWAYS replace spaces with underscores _ or dashes -
I promise you'll have fewer problems.
We'll be installing Docker containers. When naming your own images and release tags, follow semver conventions.
This document is editable through a version control system called git.
You use git to back up your work and other stuff. Github.com is where you
host git repositories; every user on GitHub has a URL - mine is
http://github.com/elasticdotventures. We're going to need to install A BUNCH
of different software before we can do anything useful. The AI agent
already knows how to use git; you just have to know enough to tell it to
make a backup or restore, and we will be using git from bash.
You will commit changes to back up. When you want to create a new version
(a restore point), that's called "a branch".
🤔 ALWAYS scope your changes, use commits and branches.
🤓 YOU MAY have a young novice operator.
🤡 It's okay to be funny; it inspires creativity!
We prefer the docker versions of MCP servers despite the headaches.
You need an MCP-compatible agentic chat interface. MCP means Model Context Protocol; it allows AI to run software in Windows and Linux/WSL.
An AI desktop chat interface (such as Anthropic's Claude Desktop) is an application that you install on your computer that lets you chat with an LLM AI model (as of writing, Sonnet 4.5). AI models will be discussed and explored more later.
For now let's just start with Claude Desktop as an example, but there are many others such as RooCode for VS-Code (or Cursor) or literally hundreds of other tools that are compatible.
There's also Claude Code, which is a CLI interface that does many of the same things - they can even delegate to each other using 'sub agents', which we'll explore later too.
There are MANY chat interfaces, such as Anthropic Claude Chat, OpenAI Codex, or Roo Code in VS Code. I recommend that after you learn one, you go learn another, and another. I'm going to recommend Claude Code using Sonnet 4.5 - you need to sign up with a credit card. There are free tools, and HONESTLY they are probably better, but they are harder to set up.
Using the same Pro subscription you can also access Claude via the CLI, but if you want to access Claude directly via an API, you have to set up a separate type of account.
https://www.claude.com/product/claude-code
npm install -g @anthropic-ai/claude-code
It's also worth mentioning that the Claude Code CLI agent running the Sonnet 4.5 model is perfectly capable of setting up most of the tooling required once the foundational system is in place.
You'll need to pick one or more MCP-compatible chat interfaces. MCP (Model Context Protocol) is like USB for AI: it lets agents send commands to software installed on your computer or in the cloud. To wire software together, each time we'll need to:
- install the application (ex: Blender or ComfyUI)
- install the MCP server (ex: blender-mcp)
  - sometimes (frequently) this will require 'yak shaving' of development tools & dependencies!
- interface the MCP server to the application
- interface the desktop agent to the MCP server (which is connected to the application)
The steps are similar but not entirely identical each time; usually the system will access the software via environment variables configured within the mcpServers.json (or similar) file.
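As a concrete illustration of the last two steps: the blender-mcp addon listens on a local TCP socket (port 9876 in the config shown later), and the MCP server relays agent commands to it as JSON. A minimal Python sketch to verify the Blender side is listening - the exact message schema belongs to blender-mcp, so treat the payload below as an assumption and check the addon docs:

import json
import socket

# Assumes the blender-mcp addon is running inside Blender on localhost:9876.
with socket.create_connection(("localhost", 9876), timeout=5) as sock:
    # blender-mcp style JSON command; the schema is an assumption.
    sock.sendall(json.dumps({"type": "get_scene_info", "params": {}}).encode())
    reply = sock.recv(65536)  # sketch only: a robust client would read to EOF
print(json.loads(reply))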
The sequentialthinking MCP server enables the model to perform step-by-step breakdowns using specialized sub-agent delegation:
https://github.com/modelcontextprotocol/servers/tree/main/src/sequentialthinking
{
"mcpServers": {
"sequentialthinking": {
"command": "docker",
"args": [
"run",
"--rm",
"-i",
"mcp/sequentialthinking"
]
}
}
}
Using ComfyUI we will explore AI capable of generating and simulating not just complex shapes in Constructive Solid Geometry, but also assemblies and assigned materials - and those simulation capabilities are only useful if you know how to wield them with the correct instructions.
You should install the docker version of ComfyUI; its nodes and AI processing pipelines can be integrated with other tools.
Most people mistake Blender for just a VFX tool, but as I said earlier, I'm a hacker.
Because of the robustness of the Blender plugin ecosystem since release 4.0, it is thoroughly extensible. Some people find the complexity of the Blender UI overwhelming when it hits you all at once. "DON'T PANIC" - just approach each problem step by step.
Python is not only easy to learn, it is widely regarded as the best "batteries included" language. Batteries included means there is ALMOST ALWAYS a library/package on pypi.org.
Anthropic's Claude Sonnet 4.5 is the best coding agent, IMHO, as I write this. Any chat agent is an LLM. When LLMs do 'function' or 'tool' calling, they use MCP to send commands into a piece of software. My basic argument is that software-trained AI agents are more capable assistants if you can translate your requirements into software-based actions - grounding the command set in a DSL (Domain Specific Language) and syntax shorthand for generative AI. This applies more broadly to being able to do anything with AI, while at the same time appreciating how some of it works by learning it.
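A toy sketch of what "grounding commands in a DSL" means in practice (all names here are illustrative): the agent emits short verbs with arguments, and a thin Python dispatcher turns them into real actions. In Blender the handlers would call bpy; here they just print, so the pattern is runnable anywhere:

def make_cube(size: float) -> str:
    return f"cube({size})"        # in Blender: bpy.ops.mesh.primitive_cube_add(size=size)

def make_cylinder(r: float, h: float) -> str:
    return f"cylinder({r},{h})"   # in Blender: bpy.ops.mesh.primitive_cylinder_add(...)

# The DSL: verb -> handler. The agent only needs to learn these verbs.
COMMANDS = {"cube": make_cube, "cylinder": make_cylinder}

def run(script: str) -> None:
    for line in script.strip().splitlines():
        verb, *args = line.split()
        print(COMMANDS[verb](*[float(a) for a in args]))

run("""
cube 2
cylinder 0.5 3
""")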
Open source software - since anybody can edit it - has gotten a lot of really great MCP tools. I believe we are at the beginning of an infinite take-off: we are building the open source that the AIs will someday build on. Open source and permissively licensed software gives your system agent the capability (if prompted) to leverage a positively 🤯 set of capabilities.
Some LLMs are better at some tasks than others, and it just takes experimentation. Some LLMs are trained to write code, others to generate images, some both. I won't spend a lot of time explaining all the differences right now - it's just important to know that a specialized agent almost always beats a generalized agent that can do a lot. The adage "jack of all trades, master of none" applies; LLMs also frequently exhibit Dunning-Kruger, overestimating their own skills.
By using the AI and watching it mess up, you will ultimately develop the skills and the tools. Tools, in this case, are software that the agent can access with MCP.
Software which can be accessed by other software is said to have an SDK (Software Development Kit). For services hosted in the cloud, or on your computer using docker, we'll use an API. There are literally thousands of services you can access with MCP via SDK or API - so once you figure that out for Blender, you'll be at least 80% of the way to having an AI do anything that can be done with a computer, a telephone, or even designing and printing a robot for itself!
Open source software is fully extensible and Agents are amazing coders.
It's probably easiest to start with a few prerequisites before we even get to Blender (working backwards).
The approach I will be using is opinionated tooling.
Commands:
- git init to create version control repositories
- git push to upload your changes to GitHub in case they get deleted or corrupted on your local computer
- 🤔 Register at Github.com - choose your username
- 🤔 I recommend install th
Download and install Blender from blender.org for Windows. We'll install the plugins into Blender, then install the MCP servers so the agents can use Blender. We also need to install Python and some of its tooling.
- open powershell
# to get Python Release Manager
winget install 9NQ7512CXL7T
# then type
py
# you should end up at a >>> prompt
print("hello from python")
# it should print, then press CTRL+D to exit.
# install uv for PowerShell
powershell -c "irm https://astral.sh/uv/install.ps1 | iex"
set Path=C:\Users\nntra\.local\bin;%Path%

MCP servers all speak the same protocol, HOWEVER every bloody application seems to have made its own configuration file to store them. It's a bit maddening. The blender-mcp tutorial explains (poorly, IMHO) how to set it up for different systems.
{
  "mcpServers": {
    "blender": {
      "env": {
        "BLENDER_HOST": "localhost",
        "BLENDER_PORT": "9876"
      },
      "command": "uvx",
      "args": [
        "blender-mcp"
      ]
    }
  }
}

Docker lets you run server containers on your computer. Containers are a convenient way of packaging a server or an application. If you want to be 100% open-source hackerman, I'd recommend podman, or k0s (pronounced kubernetes) if you want to build on the cloud. https://docs.docker.com/desktop/setup/install/windows-install/
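A quick way to confirm Docker is reachable before wiring it into MCP configs - a minimal sketch using the docker-py SDK (pip install docker), assuming Docker Desktop (or the daemon) is already running:

import docker  # pip install docker

client = docker.from_env()            # connects via the default socket/named pipe
print(client.version()["Version"])    # daemon version, proves connectivity
# Run the canonical smoke-test container and capture its output.
output = client.containers.run("hello-world", remove=True)
print(output.decode()[:80], "...")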
We'll spend a bit of time on composition of shapes and objects. In 3D digital design there are a bunch of different ways to make shapes - some very advanced - but you'll need to know that:
- Blender is (out of the box) a mesh editing tool, which is a fancy way of saying it's digital art made of polygons: dots forming triangles in space. A mesh model doesn't need to be "REAL" or constrained by real-world physics.
- If the cloud of triangles is fully enclosed in a mesh (such as a 3D-printing STL file), then it can be 3D printed. However, STL files can't be easily edited... BUT if all you have is an STL file, you can sometimes use Constructive Solid Geometry to make primitive changes... BUT fortunately, there's a better way (that no Blender tutorial will ever teach you except the one at blender-cad.org):
- Blender can be extended with plugins to do "parametric modelling", which roughly means models and shapes that COULD be fabricated or exist in the real world.
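Here's what CSG on a mesh looks like from Blender's Python console - a minimal sketch (run inside Blender; object handles are whatever Blender assigns) that cuts a cylindrical hole through a cube with a boolean modifier:

import bpy

# Base solid and a cylinder to subtract from it.
bpy.ops.mesh.primitive_cube_add(size=2)
cube = bpy.context.active_object
bpy.ops.mesh.primitive_cylinder_add(radius=0.4, depth=4)
cutter = bpy.context.active_object

# Boolean DIFFERENCE: cube minus cylinder, then bake the result into the mesh.
mod = cube.modifiers.new(name="hole", type='BOOLEAN')
mod.operation = 'DIFFERENCE'
mod.object = cutter
bpy.context.view_layer.objects.active = cube
bpy.ops.object.modifier_apply(modifier=mod.name)

# Hide the cutter; the cube now has a real cylindrical hole.
cutter.hide_set(True)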
We will be using a lot of plugins:
- node-wrangler: docs.blender.org/manual/en/latest/addons/node/node_wrangler.html
- For parametric CAD (Computer Aided Design):
  - Easy Parametric Generator: superhivemarket.com/products/easy-parametric-generator
  - CAD Sketcher: https://www.cadsketcher.com/ github.com/hlorus/CAD_Sketcher
  - TinyCAD: https://www.youtube.com/watch?v=AvM9XOj9M7s
- CNC:
  - Blender CAM (also known as FabexCNC)
- FEA:
  - BFEX (Blender FEA Exporter)
  - TAngra (multiphysics node-based engine for Blender)
- For CSG (Constructive Solid Geometry):
  - CadQuery - a Python library for parametric CAD (see the CadQuery sketch after this list)
  - OpenSCAD - "geometry as program code" https://openscad.org/
- Robotics (covered in depth later in this document)
- Blender awesome lists:
  - https://github.com/agmmnn/awesome-blender
  - https://www.blenderkit.com/
  - superhivemarket.com
- Building design & architecture (BIM): BIM stands for Building Information Modeling; in layman's terms it is like a database with information about all kinds of aspects of a (3D) building - useful information for the whole team of specialists that build a house: architect, engineer, construction company, and other professionals. It's a standard developed since the 1970s in the UK, and in 2019 it became the international standard ISO 19650.
  - https://blender-addons.org/blenderbim-addon/
  - https://www.youtube.com/watch?v=ORKNrj50nMY&list=PLmicK0pbENnZtkSJmFFlEYosmZxAVQKDs
- Discords:
  - MakerTales: discord.com/invite/Ezx9hUD
Parametric & precision modeling:
- Sverchok – node-based parametric geometry (great for algorithmic CAD + variant sweeps).
- Tissue – tessellations, paneling, and surface morphing driven by parameters.
- Sorcar – visual scripting for procedural modeling via nodes (an alternative to pure Geometry Nodes).
- Precision Drawing Tools (PDT) – exact numeric modeling, snaps, construction lines.
- MeasureIt – annotate dimensions directly in the viewport/renders.
- MeasureIt-ARCH – architectural dimensioning, sections, room tags (handy even for product docs).
- Archipack – parametric architectural primitives (walls, openings, stairs) useful for enclosure/fixture design.
- Curve Tools – advanced curve editing/fillets/offsets for precise profiles.
- BoltFactory – parametric nuts/bolts/washers (fasteners library; great for assemblies).
- Carver – sketch-based boolean/curve cutting workflow for quick CAD-style edits.
Hard-surface / boolean workflow (fast iteration):
- Fluent (Modelling Tool) – CAD-like booleans, bevels, chamfers, fillets with a clean UI.
- Hard Ops (HOPS) – power toolkit for non-destructive hard-surface operations.
- BoxCutter – rapid 2D-to-3D cutting/boolean sketching; pairs with Hard Ops.
- MESHmachine – fillets, plugs, surface slides and boolean cleanup for manufacturable geometry.
- DECALmachine – non-destructive detailing via decals (great for visual prototypes).
- Speedflow – modifier-centric modeling UI; quick parametric stacks and editing.
Topology / UV / remesh (pipeline health):
- RetopoFlow – guided retopology tools for clean, printable/FEA-ready surfaces.
- Instant Meshes Remesh (bridge) – sends meshes to Instant Meshes for quad remeshing.
- Quad Remesher (bridge) – high-quality auto-retopo (paid), excellent for simulation-friendly topology.
- UVPackmaster – best-in-class UV packing (paid), maximizes area/efficiency for textures/material tests.
- TexTools – UV and baking utilities; quick maps for visual QA.
Manufacturing / QA / interop helpers:
- 3D-Print Toolbox – manifold checks, wall-thickness, overhang reports; essential for unattended QA gates.
- Bool Tool – quick boolean ops wrapper (useful even if you prefer the heavier kits above).
- Export Paper Model – unfold nets for laser-cut/paper prototypes (fast physical checks).
- BlenderGIS – pull terrain/geo data; handy for site-aware product housings or large-object envelopes.
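Since CadQuery appears in the CSG list above, here's a minimal hedged sketch of "geometry as code" in CadQuery - a parametric plate with mounting holes, exported to lossless STEP for import into Blender (dimensions and file names are illustrative):

import cadquery as cq  # pip install cadquery

# Parameters an agent (or you) can tweak and regenerate deterministically.
length, width, thickness, hole_d = 40.0, 30.0, 5.0, 3.2

plate = (
    cq.Workplane("XY")
    .box(length, width, thickness)
    .faces(">Z").workplane()
    .rect(length - 8, width - 8, forConstruction=True)  # hole pattern corners
    .vertices()
    .hole(hole_d)
)

cq.exporters.export(plate, "plate.step")  # STEP preserves exact geometry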
We'll be using VS Code because it has a friendlier interface to git and for creating agents.
I am NOT going to teach electronics engineering in this document. KiCad has a huge array of Blender interfaces and integrations; Blender is fully extensible and can interface with KiCad, which also has a Python SDK.
NEVER put spaces in your filenames. Next, we're going to install Python in Linux bash:
# in WSL
sudo apt install python3 python3-dev
# On macOS and Linux.
curl -LsSf https://astral.sh/uv/install.sh | sh
# create a directory for your blender data and switch into it
mkdir blender && cd blender
# initialize a local git repo (connect it to a GitHub repo later) and change into it
git init genai-donut && cd genai-donut
# setup a python 'uv' env in your project
uv init
# "Initialized project `genai-donut`"
python3 hello.py
# Hello from genai-donut!
git add pyproject.toml README.md
git commit -m "chore: test git push"
git push
┌─────────────────────────────────────────────────────────┐
│ YOUR DESIGN (Blender)                                   │
│ ├─ Parametric CAD geometry │
│ ├─ Visual meshes (.dae with materials) │
│ └─ Phobos → URDF export │
└───────────────────┬─────────────────────────────────────┘
↓
┌─────────────────────────────────────────────────────────┐
│ MuBlE CORE STACK │
│ ┌────────────────┬──────────────┬────────────────┐ │
│ │ robosuite │ MuJoCo │ Blender │ │
│ │ (Framework) │ (Physics) │ (Rendering) │ │
│ └────────┬───────┴──────┬───────┴────────┬───────┘ │
│ └──────────────┴────────────────┘ │
│ Scene Graph │
│ (Synchronized object states) │
└───────────────────┬─────────────────────────────────────┘
↓
┌─────────────────────────────────────────────────────────┐
│ GYMNASIUM INTERFACE │
│ ├─ obs, reward, terminated, truncated, info │
│ ├─ step(action) → next_obs │
│ └─ Standard RL API │
└───────────────────┬─────────────────────────────────────┘
↓
┌─────────────────────────────────────────────────────────┐
│ RL TRAINING │
│ ├─ Stable-Baselines3 (PPO, SAC, TD3) │
│ ├─ Ray RLlib (distributed training) │
│ └─ Custom algorithms │
└─────────────────────────────────────────────────────────┘
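Tying the bottom two boxes together: a minimal, hedged sketch of the Gymnasium + Stable-Baselines3 loop. Pendulum-v1 stands in for the exported robot environment (the real env would wrap the MuJoCo model built from the URDF export):

import gymnasium as gym
from stable_baselines3 import PPO

# Stand-in task; swap for the Gymnasium env wrapping your exported model.
env = gym.make("Pendulum-v1")

model = PPO("MlpPolicy", env, verbose=1)
model.learn(total_timesteps=10_000)   # tiny budget, just to prove the loop

# Roll out the trained policy for one episode.
obs, info = env.reset()
done = False
while not done:
    action, _ = model.predict(obs, deterministic=True)
    obs, reward, terminated, truncated, info = env.step(action)
    done = terminated or truncated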
Operator: use https://pypi.org/project/blend-my-bot/
Agent: blend-my-bot is specifically for robotics URDF import, not general CAD - it's the WRONG tool for this project. Here's why.
What blend-my-bot actually does:
- Purpose: import and animate URDF robot files
- Workflow: URDF XML → parse joints/links → import meshes → animate robot motion
- Target: robotics visualization (humanoids, robot arms, mobile robots)
Why it doesn't fit this project:
- ❌ No URDF input: our coldhead is a mechanical assembly, not a robot. We'd need to design the CAD elsewhere first, export 47 STL files, then write a URDF to import them - completely backwards!
- ❌ Not parametric: blend-my-bot imports static meshes. It can't generate flexure geometry from equations or create configuration variants (LIGHT/NOMINAL/STIFF).
- ❌ Wrong abstraction: flexures are continuous constraints, not robot joints. The clearance-guided piston isn't a joint. Gas networks have no mechanical analogy.
- ❌ Can't modify after import: once imported, geometry is fixed. You can't adjust leaf_t from 0.25 to 0.30 mm without redesigning externally.
What was used instead: Blender MCP. Why it was correct:
- ✅ Live parametric modeling: create geometry directly from parameters
- ✅ Interactive iteration: engineering committee feedback → immediate updates
- ✅ Full Blender API: modifiers, booleans, materials, animation
- ✅ No file I/O: direct Python execution in running Blender
If Blender MCP weren't available: write a standalone Python script (coldhead_builder.py) to run inside Blender's script editor, creating all geometry programmatically - essentially the same code, just without MCP's live connection.
Bottom line: blend-my-bot is for robotics URDF visualization, not mechanical CAD design. Blender MCP was the optimal choice for this parametric mechanical assembly project.
Operator: I need to export to a physics dojo for simulation - it requires URDF, so I want to represent the URDF relationships in Blender with version control. Effectively, to design my relationships and interface mount ports, joints, etc., sufficient for simulation and training.
Agent: Ah - NOW blend-my-bot makes perfect sense! You want to:
- Design in Blender (visual CAD workflow)
- Define kinematic relationships (joints, constraints)
- Export to URDF for physics simulation/training
- Version control the assembly definition
This is a completely different use case. Let's redesign the approach.
Operator: Is there already a tools pattern to link Blender with e.g. Gymnasium or a MuJoCo dojo?
Agent: 🎯 Three main workflows:
- MuBlE (brand new, March 2025!) 🔥
  - Blender + MuJoCo + Gymnasium + robosuite
  - Photorealistic rendering + accurate physics
  - Built specifically for RL training with vision
- Gymnasium-Robotics (standard, mature)
  - Official Farama Foundation framework
  - URDF → MJCF → Gymnasium API
  - Large community, well documented
- Manual MuJoCo (full control)
  - Direct MJCF authoring
  - Custom Gymnasium wrappers
  - Maximum flexibility
🔄 Your complete pipeline:
Blender CAD (✅ done)
  ↓ Phobos addon
URDF export (joints, inertias, meshes)
  ↓ MuJoCo converter
MJCF physics model
  ↓ MuBlE or Gymnasium-Robotics
Gymnasium environment (obs, action, reward)
  ↓ Stable-Baselines3
RL training (PPO, SAC, TD3)
  ↓
Trained policy for control
🎓 Key tools:
- Phobos: Blender → URDF export (kinematic structure)
- MuJoCo: physics engine (80 Hz, contacts, constraints)
- Gymnasium: standard RL API (step, reset, render)
- MuBlE: Blender rendering + MuJoCo physics (sim2real)
- Stable-Baselines3: RL algorithms (PPO, SAC, TD3)
The workflow is established, mature, and actively used in robotics research! Your coldhead will fit perfectly into this pipeline for training thermal control policies.
Operator: Great. Let's focus on the tooling and libraries that might be used with option #1.
Agent: Let's deep-dive into MuBlE (option #1), the cutting-edge framework. (Note: the MuBlE repo at https://github.com/michaal94/MuBlE wasn't fetchable at the time, so this guide is based on the paper and the robosuite ecosystem.) Here's the executive summary of the MuBlE tooling stack:
🎯 Core stack (5 essential libraries):
- MuJoCo - physics engine (contact dynamics, 80+ Hz)
- robosuite - the framework MuBlE is built ON (robot manipulation tasks)
- Blender (bpy) - photorealistic Cycles rendering
- Gymnasium - standard RL environment API
- Stable-Baselines3 - RL algorithms (PPO, SAC, TD3)
📦 Quick install:
pip install mujoco robosuite gymnasium stable-baselines3[extra] bpy numpy torch matplotlib opencv-python
🏗️ Architecture Flow:
Your Blender CAD → Phobos (URDF)
↓
robosuite (framework) + MuJoCo (physics) + Blender (rendering)
↓
Gymnasium Environment (obs, action, reward)
↓
Stable-Baselines3 (PPO/SAC training)
↓
Trained Policy
🔑 Key points:
- MuBlE is NOT a separate package - it's a pattern combining robosuite + MuJoCo + Blender
- robosuite provides the base framework (controllers, robots, tasks)
- Blender integration is via bpy (Blender as a Python module)
- Scene synchronization happens between MuJoCo state and the Blender scene
- Two loops: fast physics (80 Hz) + slow rendering (as needed)
The guide includes complete installation, project structure, configuration files, and troubleshooting!
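A hedged sketch of that two-loop pattern - this is my own illustration of the idea, not MuBlE's actual code. Step MuJoCo physics at a high rate and only sync/render the (here, simulated) Blender scene every Nth step:

import mujoco

XML = """
<mujoco><worldbody>
  <body name="ball" pos="0 0 1">
    <freejoint/><geom size="0.05" mass="0.1"/>
  </body>
</worldbody></mujoco>
"""
model = mujoco.MjModel.from_xml_string(XML)
data = mujoco.MjData(model)

PHYSICS_HZ, RENDER_EVERY = 80, 16   # render at roughly 5 Hz
for step in range(240):
    mujoco.mj_step(model, data)      # fast loop: physics
    if step % RENDER_EVERY == 0:
        # slow loop: push the MuJoCo pose into Blender and render;
        # inside Blender that would be obj.location = data.body("ball").xpos
        print(f"t={data.time:.3f}s ball z={data.body('ball').xpos[2]:.3f}")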
Operator: Can I also incorporate OpenSCAD and/or CadQuery (PythonOCC/OCP) for geometric construction of objects as simple code in Blender?
Agent: Absolutely YES! Both OpenSCAD and CadQuery/OCP (pythonocc-core) can be integrated into your Blender workflow. Here are your three options:
🥇 Blender bpy (native Python) - what we did ✅
- Write parametric geometry directly in Blender Python
- No export/import cycle
- Perfect for the MuBlE/Phobos workflow
- THIS IS YOUR CURRENT APPROACH - keep using it!
🥈 CadQuery → STEP → Blender - recommended for complex parts
- Python-based parametric CAD
- Powerful OCCT kernel (NURBS, fillets, lofts)
- Export lossless STEP files → import to Blender
- Great for super-precise components
🥉 OpenSCAD → STL → Blender - simple parts only
- Easy to learn, custom syntax
- Good for basic CSG (boxes, cylinders, holes)
- Loses parametric structure on export; everything becomes a single mesh
💡 Hybrid workflow:
CadQuery (complex parts) → STEP → Blender
+ Blender bpy (assembly, visualization)
  ↓ Phobos (URDF export)
  ↓ MuBlE/Gymnasium (RL training)
For your coldhead: stick with Blender bpy for most work (what we did); optionally use CadQuery if you need super-precise CAD operations like complex fillets, then import the STEP into Blender.
The guide includes complete installation, code examples, and workflow patterns!
Operator: Is there a Blender URDF plugin? One that allows linkages?
Agent: Yes. Multiple URDF plugins exist:
- Phobos (dfki-ric/phobos) - most mature. WYSIWYG editor using Blender's bone system for kinematic chains. Supports URDF/SDF/SMURF import/export, joint constraints, batch editing, inertia calculation. Works with modern Blender 3+.
- urdf_importer (HoangGiang93/urdf_importer) - imports from ROS packages, auto-generates meshes/armatures, includes an FBX exporter with textures.
- blend-my-bot - Python library for programmatic URDF import and animation. Requires a matching Blender Python version.
- BlenderProc - programmatic URDF loading with forward-kinematics manipulation via its URDFObject class. Uses a urdfpy backend.
- blender_urdf_2.8 (pringithub) - basic plugin for generating and exporting URDF from Blender models.
Recommendation: Phobos for interactive work, blend-my-bot for scripted workflows.
Operator: (repeating the physics-dojo requirement above) Design the relationships, interface mount ports, and joints in Blender, version-controlled, and export URDF sufficient for simulation and training.
Agent: Workflow: Blender → URDF → physics dojo

Blender (source of truth)
├─ Phobos bones (kinematic chain)
├─ bpy scripted geometry (parametric)
└─ git version control
   ↓ export
URDF/SMURF (simulator format)
   ↓ load
Physics engine (MuJoCo/PyBullet/Isaac)
   ↓ interface
Gymnasium / RL training

Key setup: Phobos uses Blender's bone objects to store 3D coordinate systems and apply joint constraints (axis restrictions, range limits), replicating URDF links/joints in the hierarchical object tree.

Version control strategy:
repo/
├── blender/
│   ├── robot.blend          # Phobos source
│   ├── scripts/generate.py  # bpy parametric geometry
│   └── assets/meshes/       # STL/COLLADA exports
├── urdf/
│   ├── robot.urdf           # Phobos export (auto-generated)
│   └── config/              # joint limits, inertia
└── sims/
    └── mujoco_config.xml    # dojo-specific config

Critical Phobos features for simulation:
- Joint constraints → simulation joint limits
- Inertia tensor calculation → physics accuracy
- Collision mesh generation → contact detection
- Visual/collision separation → performance optimization
Export Pipeline:
1. Model in Blender with Phobos bones
2. Phobos → Export → SMURF (includes URDF + metadata)
3. Git commit the .blend + exported URDF
4. Load the URDF in the physics engine via urdfpy / pybullet.loadURDF()
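For step 4, a minimal hedged sketch of loading the exported URDF in PyBullet (the file path follows the repo layout above; pip install pybullet):

import pybullet as p
import pybullet_data

p.connect(p.DIRECT)                       # headless; use p.GUI to watch
p.setAdditionalSearchPath(pybullet_data.getDataPath())
p.setGravity(0, 0, -9.81)
p.loadURDF("plane.urdf")                  # ground plane shipped with pybullet_data
robot = p.loadURDF("urdf/robot.urdf")     # the Phobos export
for _ in range(240):                      # one second at the default 240 Hz
    p.stepSimulation()
pos, orn = p.getBasePositionAndOrientation(robot)
print("base position:", pos)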
Mount ports/interfaces: define them as Blender empties with custom properties:

# In a Blender bpy script
mount = bpy.data.objects.new("mount_point", None)
mount["interface_type"] = "coldhead_flange"
mount["bolt_pattern"] = "M6x4"

Phobos preserves custom properties and exports them to URDF tags.

Operator: How could I integrate ROS2 into this OSS Python generative workflow in Blender? What other plugins or tools should be in this generative robotics simulator?
Agent: ROS2 integration + generative robotics toolchain.
ROS2 ↔ Blender:
- StoglRobotics/sr_ros2_blender_tools - tools to interact with Blender from/to ROS2
- ROS MCP Server - transforms natural-language commands into ROS/ROS2 commands via a rosbridge WebSocket; works across Linux/Windows/macOS
Integration patterns: pattern 1 is socket-based (simplest).
Workflow:
Blender (Phobos)
  ↓ URDF export
ROS2 robot_state_publisher (TF tree)
  ↓ /joint_states topic
MuJoCo/Gazebo sim
  ↓ sensor data
ROS2 control pipeline

Complete generative robotics stack.
Physics engines (pick ONE):
- MuJoCo - open source, optimized for model-based optimization and contacts
- Isaac Lab (NVIDIA) - GPU-accelerated, thousands of parallel envs, high-fidelity PhysX, photorealistic rendering; requires a high-end NVIDIA GPU
- Genesis - NEW; claims 10-80x speedups over Isaac/MuJoCo, Python-native, generative (text → 3D worlds), comprehensive physics (rigid/fluid/soft bodies), cross-platform including Apple Silicon
Recommendation: Genesis if you need speed + generative features, MuJoCo for battle-tested RL workflows.
Dynamics libraries:
- Pinocchio - C++ with Python bindings, fast rigid-body dynamics + analytical derivatives, URDF support, used in Crocoddyl/HPP
- Drake (MIT/TRI) - C++/Python, optimization-based control, analytical gradients, sparsity exploitation, excellent for trajectory optimization
RL training frameworks:
- Gymnasium + Gymnasium-Robotics - standard RL API, MuJoCo integration, fetch/shadow-hand envs
Optimization & control:
- Crocoddyl - DDP solver for optimal control under contact, uses Pinocchio for dynamics
Procedural generation (illustrative pseudo-API; check the Genesis docs for the real one):
# for generative robot design (hypothetical call)
genesis.generate_robot(prompt="quadruped with compliant legs", output_urdf=True)
Recommended toolchain for your use case:
blender (modeling) + phobos (URDF export)
  ↓
genesis (physics sim) OR mujoco (if you need battle-tested)
  ↓
gymnasium (RL interface)
  ↓
stable-baselines3 / cleanrl (RL algorithms)
plus:
- pinocchio (fast C++ dynamics)
- ros2_control + rosbridge + MCP server
- git (track .blend + exported URDF + training configs)

Essential MCP tools for your workflow:
b00t learn blender     # Blender automation
b00t learn ros2        # ROS2 patterns
sequential-thinking    # task decomposition
taskmaster-ai          # epic tracking
context7               # library docs lookup

OSS Python stack:
# Install essentials
pip install genesis-world            # NEW: fastest physics
pip install mujoco gymnasium gymnasium-robotics
pip install pinocchio                # dynamics
pip install trimesh meshio           # mesh manipulation
pip install urdfpy                   # URDF parsing
pip install stable-baselines3        # RL

Git repo structure:
robot-rl/
├── design/
│   ├── robot.blend            # Phobos source
│   └── scripts/generate.py    # bpy parametric
├── urdf/
│   ├── robot.urdf             # auto-generated
│   └── meshes/                # STL/DAE
├── sim/
│   ├── genesis_env.py         # Genesis wrapper
│   └── mujoco_env.py          # MuJoCo wrapper
├── train/
│   ├── ppo_coldhead.py        # RL training
│   └── configs/               # hyperparams
└── deploy/
    └── ros2_control/          # real-robot interface

Key advantages:
- Genesis: claims 10-80x speedups, text→sim generation
- Single Python ecosystem
- URDF as interchange format
- Git tracks parametric source (.blend)
- ROS2 optional (deploy only)
Want me to scaffold initial scripts or demonstrate a specific integration?

Operator: What about guidance/pilot flight sims, in Python, Rust, or Blender?
Agent: Flight simulation: GNC + autopilot stack.
Core flight dynamics engines:
- JSBSim (C++ with Python bindings) - open-source FDM, XML-based aircraft config (mass, aerodynamics, propulsion, flight control, autopilot); used by FlightGear, PX4/ArduPilot SITL, and the DARPA AlphaDogfight Trials, where a JSBSim-based agent using hierarchical RL defeated a USAF F-16 Weapons Instructor.
Architecture:

# Python JSBSim
import jsbsim
fdm = jsbsim.FGFDMExec(root_dir='/path/to/jsbsim')
fdm.load_model('f16')
fdm.run_ic()          # initial conditions
while fdm.run():
    # get state: position, velocity, orientation
    # apply control inputs; dynamics propagate each step
    pass

Python UAV/drone simulation:
- PyFlyt - Gymnasium-compatible UAV envs (quadrotor/fixed-wing), PyBullet physics, modular components (motors, lifting surfaces), arbitrary configs, onboard cameras (RGB/depth/segmentation), multi-agent, collision tracking
- gym-pybullet-drones - quadcopter RL; supports Crazyflie firmware SITL (betaflight/crazyflie-firmware), PID controllers

import gymnasium
import PyFlyt.gym_envs
env = gymnasium.make("PyFlyt/QuadX-Hover-v2", render_mode="human")
from PyFlyt.pz_envs import MAFixedwingDogfightEnv
env = MAFixedwingDogfightEnv(render_mode="human")

Rust aerospace:
- Nyx Space - high-fidelity astrodynamics (orbital mechanics, spacecraft navigation, orbit determination); used in the Firefly Blue Ghost 1 lunar mission and NASA CAPSTONE; AGPLv3; Python bindings available
Key crates:
- nyx-space - trajectory optimization, Monte Carlo, orbit determination
- ANISE - modern SPICE replacement (ephemerides, attitude)
- hifitime - nanosecond-precision time (formally verified)
- RustyFlight (GaloisInc) - port of the CleanFlight drone firmware to Rust via the C2Rust transpiler; targets SITL + Cortex-M boards

// nyx-space example
use nyx_space::prelude::*;

let orbit = Orbit::keplerian(
    7000.0, 0.01, 45.0, 0.0, 0.0, 0.0,
    Epoch::now().unwrap(), EARTH_J2000,
);
let prop = Propagator::new(orbit, spacecraft, dynamics);

Blender integration:
- JSBSim2Blender - import JSBSim XML aircraft models into Blender 2.8-4.2 and visualize FDM metrics
- Workflow: Blender/MSFS for 3D models → JSBSim for flight dynamics → Unreal Engine for real-time visualization + control-surface animation (published Cessna T-37 case study)
- SPICE + Blender - NASA SPICE toolkit integration for spacecraft trajectory visualization; recreate orbital mechanics scenarios

Visualization pipeline:
JSBSim/PyFlyt (dynamics simulation)
  ↓ export trajectory data
Python script
  ↓ Blender Python API (bpy)
Blender (keyframe animation)
  ↓ render
Video/animation

# Blender bpy script
import bpy
import json
with open('trajectory.json') as f:
    data = json.load(f)
aircraft = bpy.data.objects['f16']
for frame, state in enumerate(data):
    aircraft.location = state['position']
    aircraft.rotation_euler = state['orientation']
    aircraft.keyframe_insert(data_path="location", frame=frame)
    aircraft.keyframe_insert(data_path="rotation_euler", frame=frame)

Complete flight sim stack:
┌──────────────────────────────────────────┐
│ Design (Blender/CAD)                     │
│ - Aircraft geometry                      │
│ - Mass properties                        │
└────────────────┬─────────────────────────┘
                 ↓ export URDF/mesh
┌──────────────────────────────────────────┐
│ Flight dynamics (JSBSim/PyFlyt)          │
│ - 6DOF equations of motion               │
│ - Aerodynamics                           │
│ - Propulsion                             │
└────────────────┬─────────────────────────┘
                 ↓ state feedback
┌──────────────────────────────────────────┐
│ Guidance, nav & control (Python/Rust)    │
│ - Autopilot (PID/LQR/MPC)                │
│ - Path planning                          │
│ - Sensor fusion (Kalman)                 │
└────────────────┬─────────────────────────┘
                 ↓ control commands
┌──────────────────────────────────────────┐
│ RL training (Gymnasium)                  │
│ - PPO/SAC/TD3                            │
│ - Multi-agent (dogfight)                 │
└────────────────┬─────────────────────────┘
                 ↓ visualization
┌──────────────────────────────────────────┐
│ Visualization (Blender/FlightGear)       │
│ - Trajectory replay                      │
│ - Real-time rendering                    │
└──────────────────────────────────────────┘

Recommended toolchain:
For fixed-wing/jets:
pip install jsbsim gymnasium
For drones/quadrotors:
pip install PyFlyt gymnasium stable-baselines3
For spacecraft:
cargo add nyx-space hifitime anise
Blender visualization:
# install the JSBSim2Blender addon
Git repo structure:
flight-sim/
├── aircraft/
│   ├── f16.blend            # Blender model
│   ├── f16_jsbsim.xml       # JSBSim config
│   └── meshes/
├── autopilot/
│   ├── pid_controller.py
│   ├── guidance_law.rs      # Rust GNC
│   └── path_planner.py
├── training/
│   ├── dogfight_env.py      # Gymnasium env
│   └── train_ppo.py
├── viz/
│   └── replay_trajectory.py # Blender bpy
└── tests/
    └── validate_dynamics.py

Key advantages:
- JSBSim: battle-tested, used by DARPA/NASA
- PyFlyt: fastest RL iteration for UAVs
- Nyx: spacecraft/orbital (proven on lunar missions)
- Blender: professional visualization
- Git tracks parametric source
Want guidance on autopilot implementation (PID/MPC) or RL training setup?
This project now includes automatic service startup and a web dashboard for managing Blender and related tools.
When you open the project in VSCode with the dev container, the following services will automatically start:
- Blender MCP Server (port 9876): AI agent integration for Blender automation
- Jupyter Lab (port 8888): Interactive notebooks for generative CAD workflows
- Web Dashboard (port 3000): Management interface for all services
The web dashboard provides:
- Service Status: Real-time monitoring of all running services
- Project Management: Browse and manage your Blender projects
- Git Integration: Check status and commit changes directly from the UI
- Blender Console: Execute Python scripts in Blender remotely
- Quick Actions: Direct links to Jupyter, ComfyUI, and other tools
Access the dashboard at: http://localhost:3000
- Open the project in VSCode
- The dev container will build and start services automatically
- Access the dashboard to monitor and control your environment
- Use the Blender console to execute parametric modeling scripts
- Commit changes through the dashboard or VSCode git integration
This setup is my own invention, intended to try to automate some of this.
This project includes comprehensive build and packaging support for creating distributable Blender extensions.
# Test the stdio MCP server
just test-mcp-stdio
# Package extension for distribution
just package-extension
# Install to system Blender
just install-extension
# Create GitHub release
just release 2.0.0

- BUILD.md - Complete build and packaging guide
- ARCHITECTURE.md - Technical architecture and design
- STDIO_IMPLEMENTATION.md - Implementation status and roadmap
- INTEGRATION.md - Integration patterns and workflows
All available build commands can be seen with:
just --list

Key recipes:
- prepare-extension - prepare the extension directory
- build-extension - create a distributable .zip
- build-extension-cli - use Blender's official CLI
- validate-extension - check package integrity
- package-extension - complete build pipeline
- clean-build - remove build artifacts
See BUILD.md for detailed instructions.