# Dual-Arm Robotic System Synchronisation

**Visual-Guided Decentralised Coordination in PyBullet**
## Overview

This project implements a dual-arm robotic system in simulation, where:

- A UR3 manipulator (Worker) performs object manipulation and trajectory execution
- A Franka Emika Panda (Observer) provides visual feedback via an eye-in-hand camera

The system demonstrates decentralised coordination: no direct joint/state sharing exists between the robots; coordination emerges purely through vision-based feedback and control loops.

The primary task involves:

- Picking a cube using visual estimation
- Executing a smooth infinity (∞) trajectory
- Maintaining continuous visual tracking under noise and occlusion
## Core Contributions

- Decentralised dual-arm coordination
- Visual servoing using image-space error
- Mathematical trajectory modelling (Lissajous curve)
- Physics-based simulation (PyBullet)
- Logging and performance evaluation pipeline
- Robust state-machine-driven control (both robots)
## System Architecture

    Camera (Franka - Eye-in-Hand)
              ↓
    RGB Frame → HSV Detection → Centroid (cx, cy)
              ↓
    Pixel Error (ex, ey)
              ↓
    v_cam → v_world (via R_cam)
              ↓
    Jacobian-based Control (DLS Inverse)
              ↓
    Franka Joint Velocity Update
              ↓
    ─────────────────────────────
              ↓
    Stable 3D Estimate (Y, Z)
              ↓
    UR3 State Machine
              ↓
    Pick → Lift → Infinity Trajectory
## Mathematical Foundations

### Infinity Trajectory (Lissajous Curve)

    y(t) = C_Y + A_Y sin(ωt)
    z(t) = C_Z + A_Z sin(2ωt)

The 1:2 frequency ratio between the y and z components traces the figure-eight (∞) shape.
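As a minimal sketch, the trajectory can be sampled directly from these equations; the centre, amplitude, and ω values below are illustrative placeholders, not the project's tuned parameters:

```python
import numpy as np

def infinity_trajectory(t, c_y=0.0, c_z=0.5, a_y=0.15, a_z=0.08, omega=0.5):
    """Lissajous figure-eight in the Y-Z plane (1:2 frequency ratio).
    Centre, amplitudes, and omega here are illustrative, not tuned values."""
    y = c_y + a_y * np.sin(omega * t)
    z = c_z + a_z * np.sin(2.0 * omega * t)
    return y, z

# Sample one full period T = 2*pi/omega of reference points
ts = np.linspace(0.0, 2.0 * np.pi / 0.5, 200)
path = [infinity_trajectory(t) for t in ts]
```

Because both components share the period T = 2π/ω, the sampled path closes on itself after one period.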
### Visual Servoing Control

Pixel error:

    e_x = c_x - W/2
    e_y = c_y - H/2

Camera-frame velocity:

    v_cam = [ 0,
              K_pix * e_x,
             -K_pix * e_y ]

Jacobian-based control (DLS inverse):

    q_dot = J^T (J J^T + λI)^(-1) v_world
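A minimal NumPy sketch of this chain, assuming a 3×7 Jacobian for the 7-DoF Franka; the gain `k_pix`, the damping `lam`, the random Jacobian, and the identity `R_cam` are all illustrative placeholders:

```python
import numpy as np

def pixel_to_camera_velocity(cx, cy, W, H, k_pix=1e-3):
    """Map centroid pixel error to a camera-frame velocity, following the
    convention above: no forward motion, e_x drives +y, e_y drives -z."""
    ex, ey = cx - W / 2.0, cy - H / 2.0
    return np.array([0.0, k_pix * ex, -k_pix * ey])

def dls_joint_velocities(J, v, lam=0.05):
    """Damped least-squares inverse: q_dot = J^T (J J^T + lam*I)^-1 v."""
    m = J.shape[0]
    return J.T @ np.linalg.solve(J @ J.T + lam * np.eye(m), v)

# Toy 3x7 Jacobian standing in for the Franka's; values are illustrative
J = np.random.default_rng(0).normal(size=(3, 7))
v_cam = pixel_to_camera_velocity(400, 260, 640, 480)
R_cam = np.eye(3)                       # placeholder camera-to-world rotation
q_dot = dls_joint_velocities(J, R_cam @ v_cam)
```

The damping term λI keeps the inverse well-conditioned near singular Jacobian configurations, at the cost of a small tracking bias.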
## Control Logic

### UR3 (Worker Arm)

Finite state machine:

- `WAIT_FOR_CAM` → waits for a stable visual estimate
- `APPROACH` → moves to the pick position
- `LIFT` → lifts the object to a safe pose
- `MOVE_∞` → executes the infinity trajectory
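The transitions above can be sketched as follows; `WorkerState`, `next_state`, and the condition flags are hypothetical names, with `MOVE_INFINITY` standing in for the `MOVE_∞` state:

```python
from enum import Enum, auto

class WorkerState(Enum):
    WAIT_FOR_CAM = auto()
    APPROACH = auto()
    LIFT = auto()
    MOVE_INFINITY = auto()   # stands in for the MOVE_∞ state

def next_state(state, cam_stable, at_pick_pose, lifted):
    """Advance the worker FSM one tick based on the listed conditions."""
    if state is WorkerState.WAIT_FOR_CAM and cam_stable:
        return WorkerState.APPROACH
    if state is WorkerState.APPROACH and at_pick_pose:
        return WorkerState.LIFT
    if state is WorkerState.LIFT and lifted:
        return WorkerState.MOVE_INFINITY
    return state
```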
Smoothing:

    p_smooth = α p_ref + (1 - α) p_smooth
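One update of this first-order filter, as a sketch; the default α = 0.1 is an illustrative value:

```python
import numpy as np

def smooth_step(p_ref, p_smooth, alpha=0.1):
    """One first-order smoothing update; smaller alpha means heavier filtering."""
    p_ref = np.asarray(p_ref, dtype=float)
    p_smooth = np.asarray(p_smooth, dtype=float)
    return alpha * p_ref + (1.0 - alpha) * p_smooth
```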
### Franka (Observer Arm)

Detection pipeline:

- RGB → HSV conversion
- Red colour thresholding
- Centroid detection
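A NumPy-only sketch of the thresholding and centroid steps, standing in for `cv2.inRange` plus `cv2.moments`; the HSV bounds are illustrative, and red is handled as two hue bands because it wraps around hue 0 on OpenCV's 0–179 scale:

```python
import numpy as np

def red_mask_centroid(hsv):
    """Threshold red in an HSV image and return the mask centroid (cx, cy),
    or None when nothing passes the threshold. Threshold values illustrative."""
    h, s, v = hsv[..., 0], hsv[..., 1], hsv[..., 2]
    mask = ((h <= 10) | (h >= 170)) & (s >= 100) & (v >= 100)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None                      # nothing detected this frame
    return float(xs.mean()), float(ys.mean())

# Synthetic 480x640 HSV frame with a red blob around pixel (300, 200)
frame = np.zeros((480, 640, 3), dtype=np.uint8)
frame[180:220, 280:320] = (5, 200, 200)  # hue ~5: red, saturated, bright
centroid = red_mask_centroid(frame)
```

Returning `None` on an empty mask is what lets the tracker distinguish a lost target from a valid detection.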
Tracking states:

- `SEARCH`
- `TRACKING`
- `TEMP_LOST`
- `LOST`

Control:

- Pixel error → velocity
- Jacobian inverse (DLS)
- Velocity clipping
## Movement_with_logs.py (Enhanced Implementation)

### Logging System

- Pixel error: e_x(t), e_y(t)
- End-effector error: e_EE(t) = || p_EE(t) - p_ref(t) ||
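A minimal logger for these two signals; `ErrorLogger` and its method names are hypothetical, not the project's actual API:

```python
import math

class ErrorLogger:
    """Collect per-step tracking errors for post-run plots (sketch)."""

    def __init__(self):
        self.pixel = []   # (t, e_x, e_y)
        self.ee = []      # (t, ||p_EE - p_ref||)

    def log(self, t, e_x, e_y, p_ee, p_ref):
        self.pixel.append((t, e_x, e_y))
        self.ee.append((t, math.dist(p_ee, p_ref)))

log = ErrorLogger()
log.log(0.0, 2.0, 1.0, (0.0, 0.0, 0.0), (3.0, 4.0, 0.0))
```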
### Stability Improvements

- First-order smoothing
- Buffered cube position estimation
- Cluster-based validation

### Robustness Features

- Handles noisy detections
- Handles occlusions
- Multi-frame consistency checks
- Kalman Filter for pose estimation
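One way to realise the Kalman filtering step is a per-axis constant-velocity filter over the cube's estimated position; the state model, `dt` (PyBullet's default 240 Hz step), and noise values below are illustrative assumptions, not the project's actual filter:

```python
import numpy as np

class ConstantVelocityKF:
    """Per-axis constant-velocity Kalman filter for the cube position.
    State x = [position, velocity]; only position is measured (sketch)."""

    def __init__(self, dt=1.0 / 240.0, q=1e-4, r=1e-3):
        self.F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition
        self.H = np.array([[1.0, 0.0]])              # measure position only
        self.Q = q * np.eye(2)                       # process noise
        self.R = np.array([[r]])                     # measurement noise
        self.x = np.zeros(2)
        self.P = np.eye(2)

    def step(self, z):
        # Predict
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Update with the position measurement z
        y = z - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + (K @ y).ravel()
        self.P = (np.eye(2) - K @ self.H) @ self.P
        return self.x[0]

kf = ConstantVelocityKF()
for _ in range(300):
    est = kf.step(1.0)   # noiseless constant measurement, for illustration
```

During occlusions the update step can simply be skipped, letting the predict step coast the estimate forward on the last velocity.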
## Tech Stack
- Python
- PyBullet
- NumPy
- OpenCV
## How to Run

    git clone https://github.com/DyutideeptaB/Dual-Arm_Robotic_System_Synchronisation.git
    cd Dual-Arm_Robotic_System_Synchronisation
    pip install -r requirements.txt
    python Final_with_logs.py
## Outputs
- Pixel error vs time
- End-effector tracking error
- Infinity trajectory (reference vs actual)
## Applications
- Collaborative robotics
- Industrial automation
- Visual servoing research
- Multi-agent systems
## Future Work
- Reinforcement learning integration
- Real robot deployment
- Multi-object tracking
- Dynamic environments
## Author
Dyutideepta Banerjee
Physics + AI | Simulation Driven Systems | Computer Vision
## Final Note
This project demonstrates how coordinated robotic behaviour can emerge purely from perception-driven feedback, without centralised control.