A drift-free, math-engine-powered control strategy for the Unitree G1. Bridging simulation and reality with PyBullet and Human-in-the-Loop perception.
This system uses a digital twin: a simulated robot in PyBullet mirrors the physical Unitree G1 and computes its control. The simulation acts as a real-time math engine.
Instead of analytical IK, we use PyBullet in `p.DIRECT` mode as a headless numerical solver, seeding each solve with the integrator's current joint state (`q_integrator`).
Typical IK solvers bias toward a nominal rest pose. Here, the system seeds the solver with the last valid configuration, so the arm stays fixed when idle instead of drifting (see the sketch after the table below).
| Mode | DoF | Function |
|---|---|---|
| Free Mode | 3 (XYZ) | Positioning without orientation constraint. |
| Locked Mode | 6 (Pose) | Full placement with fixed RPY. |
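A minimal sketch of this seeded-IK approach with both modes, assuming a headless PyBullet client; the URDF path, end-effector link index, and function name are placeholders rather than the project's actual code:

```python
import pybullet as p

# Headless physics client: no GUI, PyBullet used purely as an IK math engine.
p.connect(p.DIRECT)

# ASSUMPTION: URDF path and end-effector link index are placeholders;
# substitute the actual Unitree G1 model and wrist link.
robot = p.loadURDF("g1.urdf", useFixedBase=True)
EE_LINK = 20

# The solver expects one seed value per movable (non-fixed) joint.
movable = [j for j in range(p.getNumJoints(robot))
           if p.getJointInfo(robot, j)[2] != p.JOINT_FIXED]
q_integrator = [0.0] * len(movable)  # last valid configuration

def solve_ik(target_pos, target_orn=None):
    """Seed IK with the last solution so the arm holds pose when idle."""
    global q_integrator
    if target_orn is None:
        # Free Mode: XYZ only, orientation unconstrained.
        q = p.calculateInverseKinematics(
            robot, EE_LINK, target_pos, currentPositions=q_integrator)
    else:
        # Locked Mode: full 6-DoF pose; pass a quaternion for the fixed RPY.
        q = p.calculateInverseKinematics(
            robot, EE_LINK, target_pos, targetOrientation=target_orn,
            currentPositions=q_integrator)
    q_integrator = list(q)  # persist as the next seed: no drift toward a rest pose
    return q
```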
Combines slow semantic understanding (VLM) with fast geometric tracking (ArUco) to maintain real-time responsiveness.
src: `realsense_receive_rgb_min.py`
Answers "Where is the shampoo?" providing bounding boxes for initialization.
High-speed tracking of the hand marker for closed-loop control.
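A minimal tracking sketch, assuming OpenCV >= 4.7 (`opencv-contrib-python`); the marker dictionary is an assumption:

```python
import cv2

# ASSUMPTION: DICT_4X4_50 is a placeholder; use the dictionary the
# marker on the hand was actually printed from.
detector = cv2.aruco.ArucoDetector(
    cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50),
    cv2.aruco.DetectorParameters())

def track_hand(frame_bgr):
    """Return the (u, v) pixel center of the hand marker, or None if lost."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = detector.detectMarkers(gray)
    if ids is None:
        return None  # marker lost; the controller should freeze, not guess
    return corners[0][0].mean(axis=0)  # mean of the 4 corner points
```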
Figure 3.1: IBVS Architecture
```python
if state.auto_mode:
    # 1. Get error vector (pixels) from the vision pipeline
    dx, dy = received_error_px
    # 2. Hand-eye mapping (camera X -> robot Y)
    vy = -dx * GAIN_XY
    vz = -dy * GAIN_XY
    # 3. Depth correction along the approach axis
    vx = (target_depth - hand_depth) * GAIN_Z
    robot.set_velocity([vx, vy, vz])
```
15-point spline interpolation showing arm trajectory avoiding obstacles.
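The planner itself isn't shown; a sketch of densifying 15 sparse waypoints into a smooth path with a cubic spline (SciPy), using placeholder waypoint values:

```python
import numpy as np
from scipy.interpolate import CubicSpline

t = np.linspace(0.0, 1.0, 15)                    # 15 keyframe parameters
waypoints = np.stack([0.30 + 0.20 * t,           # x: reach forward
                      0.10 * np.sin(np.pi * t),  # y: swerve around an obstacle
                      0.20 + 0.10 * t], axis=1)  # z: lift
spline = CubicSpline(t, waypoints, axis=0)       # C2-continuous interpolation
path = spline(np.linspace(0.0, 1.0, 150))        # dense trajectory for execution
```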
Low-latency transport of the pixel error (`dpx`) plus depth data.
Static transform: +X_img → ±Y_robot
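The transport layer isn't pinned down here; a minimal sketch assuming raw UDP with struct-packed floats, where the wire format, host, and port are all assumptions:

```python
import socket
import struct

# ASSUMPTION: three little-endian float32s per packet (dx_px, dy_px, depth_m).
PACKET = struct.Struct("<fff")

def send_error(sock, dx_px, dy_px, depth_m, addr=("192.168.123.10", 9870)):
    sock.sendto(PACKET.pack(dx_px, dy_px, depth_m), addr)  # fire-and-forget

def recv_error(sock):
    data, _ = sock.recvfrom(PACKET.size)
    return PACKET.unpack(data)  # (dx_px, dy_px, depth_m)

# Receiver side: bind once, then poll recv_error() in the control loop.
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("0.0.0.0", 9870))
```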
Real-time environmental reconstruction for path planning. The system runs a background SLAM thread (`live_slam`).
Persistent point cloud colored by height (Turbo colormap) to distinguish floor from obstacles.
Uses percentile filtering and exponential smoothing to keep the view stable and readable for navigation.
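A sketch of the coloring pipeline, assuming NumPy and Matplotlib >= 3.5; the percentile bounds and smoothing factor are assumptions:

```python
import numpy as np
from matplotlib import colormaps

TURBO = colormaps["turbo"]
ALPHA = 0.1             # EMA smoothing factor (assumed value)
ema_lo = ema_hi = None  # smoothed height bounds carried across frames

def color_by_height(points):
    """Map z-height to Turbo RGB, with percentile clipping + EMA smoothing."""
    global ema_lo, ema_hi
    z = points[:, 2]
    lo, hi = np.percentile(z, [2, 98])  # drop outlier heights (bounds assumed)
    ema_lo = lo if ema_lo is None else ALPHA * lo + (1 - ALPHA) * ema_lo
    ema_hi = hi if ema_hi is None else ALPHA * hi + (1 - ALPHA) * ema_hi
    t = np.clip((z - ema_lo) / max(ema_hi - ema_lo, 1e-6), 0.0, 1.0)
    return TURBO(t)[:, :3]  # floor and obstacles separate cleanly by color
```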
Active feedback on fingertips
VLM inference currently takes ~0.98 s, too slow for dynamic objects.
The marker sits on the back of the hand, while the grasp center is ~10 cm forward of it.
Prevent unintended motion during vision spikes (see the sketch below).
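A minimal freeze-guard sketch; the jump threshold and integration point are assumptions:

```python
MAX_JUMP_PX = 80.0  # ASSUMPTION: max plausible per-frame error jump
_last_err = None

def gated_error(dx, dy):
    """Return the error vector, or None to freeze on an implausible spike."""
    global _last_err
    if _last_err is not None:
        jump = ((dx - _last_err[0]) ** 2 + (dy - _last_err[1]) ** 2) ** 0.5
        if jump > MAX_JUMP_PX:
            return None  # hold the last commanded pose instead of chasing noise
    _last_err = (dx, dy)
    return dx, dy
```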
Comprehensive archive of test runs and capabilities.
Arm trajectory safety testing.
Depth sensor calibration test.
Hybrid tracking system demo.
Standard manipulation cycle.
Real-time object tracking.
Remote operator interface.
Fine motor control test.
Speed and acceleration checks.
Digital twin synchronization.
Complex trajectory execution.
Specific object training set.
Raw RealSense output.
Language-to-coordinate mapping.
Human-Robot Interaction tests.
Startup and calibration sequence.