Active ragdoll training with Unity ML-Agents (PyTorch).
Based on the Walker example from the Unity ML-Agents GitHub repository. The Robot Kyle model from the Unity Asset Store is used for the ragdoll.
- Heuristic function included to drive the joints by user input (for development testing only).
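In ML-Agents, user-driven control comes from overriding `Agent.Heuristic` and writing into the continuous action buffer. A minimal sketch of that idea (the joint mapping and input axes below are illustrative, not the project's actual bindings):

```csharp
using Unity.MLAgents;
using Unity.MLAgents.Actuators;
using UnityEngine;

public class RagdollHeuristicAgent : Agent
{
    // Called when the Behavior Type is set to "Heuristic Only".
    public override void Heuristic(in ActionBuffers actionsOut)
    {
        var continuousActions = actionsOut.ContinuousActions;

        // Illustrative mapping: drive the first two joint targets from the arrow keys.
        continuousActions[0] = Input.GetAxis("Vertical");   // e.g. hip flexion target
        continuousActions[1] = Input.GetAxis("Horizontal"); // e.g. hip rotation target

        // Leave the remaining joints at their neutral targets.
        for (int i = 2; i < continuousActions.Length; i++)
        {
            continuousActions[i] = 0f;
        }
    }
}
```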
- Added a stabilizer to the hips and spine. The stabilizer applies torque to help the ragdoll balance.
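The stabilizer idea is a corrective torque that rotates the hips/spine body back toward upright; a rough sketch under assumed gains (the strength and damping values are not the project's actual tuning):

```csharp
using UnityEngine;

// Illustrative stabilizer: applies a corrective torque that rotates the
// attached body part (e.g. hips or spine) back toward an upright orientation.
public class Stabilizer : MonoBehaviour
{
    [SerializeField] float stabilizerStrength = 30f; // assumed tuning value
    Rigidbody rb;

    void Awake()
    {
        rb = GetComponent<Rigidbody>();
    }

    void FixedUpdate()
    {
        // Rotation needed to get from the current orientation back to upright.
        Quaternion toUpright = Quaternion.FromToRotation(transform.up, Vector3.up);
        toUpright.ToAngleAxis(out float angle, out Vector3 axis);

        // Torque proportional to the tilt angle, damped by the angular velocity.
        Vector3 torque = axis.normalized * angle * Mathf.Deg2Rad * stabilizerStrength
                         - rb.angularVelocity * 0.5f;
        rb.AddTorque(torque, ForceMode.Acceleration);
    }
}
```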
- Added an "earlyTraining" bool for initial balance/walking toward the target.
- Added a WallsAgent prefab for navigating around obstacles (using the Ray Perception Sensor 3D).
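The Ray Perception Sensor 3D is normally configured on the prefab in the Inspector; the snippet below only sketches the equivalent setup in code, with illustrative ray settings and tag names:

```csharp
using System.Collections.Generic;
using Unity.MLAgents.Sensors;
using UnityEngine;

// Illustrative setup of the ray sensor used by the WallsAgent prefab.
public class WallsSensorSetup : MonoBehaviour
{
    void Awake()
    {
        var sensor = gameObject.AddComponent<RayPerceptionSensorComponent3D>();
        sensor.RayLength = 20f;
        sensor.RaysPerDirection = 5;    // 11 rays total (2 * 5 + 1)
        sensor.MaxRayDegrees = 70f;
        sensor.DetectableTags = new List<string> { "wall", "target" }; // assumed tag names
    }
}
```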
- Added a StairsAgent prefab for navigating small and large steps.
- Added a curiosity reward signal to the trainer YAML config to improve walls and stairs training.
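In the ML-Agents trainer configuration, curiosity is an extra entry under `reward_signals`; a sketch of what that section can look like (the behavior name and hyperparameter values are illustrative):

```yaml
behaviors:
  WallsAgent:                  # behavior name is illustrative
    trainer_type: ppo
    reward_signals:
      extrinsic:
        gamma: 0.995
        strength: 1.0
      curiosity:               # intrinsic reward that encourages exploration
        gamma: 0.99
        strength: 0.02
        learning_rate: 3.0e-4
```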
- Added two environments: one for obstacle detection and another for terrain detection.
- Integrate YOLOv8 with Barracuda and pipe its outputs into the observation space of the RL agent.
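A rough sketch of that pipeline with Barracuda: load the exported YOLOv8 ONNX model, run it on the agent camera's frame, and add part of its output tensor as vector observations. The asset/field names and the choice to observe a fixed slice of the raw output are assumptions, not the project's final design:

```csharp
using Unity.Barracuda;
using Unity.MLAgents;
using Unity.MLAgents.Sensors;
using UnityEngine;

public class YoloObservationAgent : Agent
{
    public NNModel yoloModelAsset;     // YOLOv8 exported to ONNX, imported by Barracuda
    public RenderTexture cameraFrame;  // the agent camera renders into this texture
    IWorker worker;

    public override void Initialize()
    {
        worker = WorkerFactory.CreateWorker(WorkerFactory.Type.Auto,
                                            ModelLoader.Load(yoloModelAsset));
    }

    public override void CollectObservations(VectorSensor sensor)
    {
        // Run the detector on the current camera frame (3-channel input).
        using (var input = new Tensor(cameraFrame, 3))
        {
            worker.Execute(input);
            Tensor detections = worker.PeekOutput();

            // Assumption: expose only a fixed-size slice of the raw output
            // (e.g. the best box and its class score) as vector observations.
            for (int i = 0; i < 6; i++)
            {
                sensor.AddObservation(detections[i]);
            }
        }
    }

    void OnDestroy()
    {
        worker?.Dispose();
    }
}
```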