Human-Centric Spatial Augmented Reality for Interactive (Dis)assembly Operator Assistance

Developing a computer vision-enabled Spatial Augmented Reality framework for intelligent, privacy-preserving operator assistance in human-centric smart (dis)assembly.

Vision-enabled SAR setup for in-situ guidance and operator assistance.

Overview

This project develops a human-centric Spatial Augmented Reality (SAR) system that projects adaptive, light-guided (dis)assembly instructions directly onto the workspace and is controlled through AI-based hand gesture recognition. The system delivers real-time guidance and error feedback without requiring handheld or wearable devices. Beyond guidance, the framework is designed as an intelligent operator assistance system that augments physical and cognitive capabilities, informing users about posture-related physical risk and cognitive state (e.g., workload) while protecting operator identity via privacy-by-design sensing and data handling. User studies show significant reductions in task time, error rates, and perceived workload compared to conventional instruction methods.

Motivation

This project addresses limitations of conventional assembly instructions that rely on static manuals or wearable devices. By projecting guidance directly onto the workspace, SAR reduces cognitive load and improves task flow. The project further targets Industry 5.0 operator assistance by extending SAR from guidance-only to capability augmentation, combining task awareness with posture and cognitive-state feedback under privacy-preserving constraints suitable for real shop-floor deployment.

System Architecture

The system integrates computer vision, gesture recognition, and spatial projection to deliver adaptive (dis)assembly instructions in real time. A closed-loop operator assistance layer estimates task progress and deviations, then adapts projected cues and feedback. Optional operator-state modules provide posture-aware physical support and cognitive workload cues, enabling adaptive assistance without exposing operator identity.
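
The closed-loop behaviour described above can be thought of as a perceive–decide–project cycle. The sketch below is a minimal illustration of that decision step only, assuming a simplified workspace state; the part names, cue strings, and `StepState`/`next_cue` names are hypothetical and not part of the project's actual implementation.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical simplification: the real system derives workspace state from
# camera frames; here the perception result is represented directly.
@dataclass
class StepState:
    expected_part: str            # part the current instruction calls for
    observed_part: Optional[str]  # part the vision module detected, if any

def next_cue(state: StepState) -> str:
    """Map the estimated workspace state to a projected cue (illustrative)."""
    if state.observed_part is None:
        return f"highlight:{state.expected_part}"         # guide to next part
    if state.observed_part != state.expected_part:
        return f"error:wrong-part:{state.observed_part}"  # error feedback
    return "confirm:step-complete"                        # advance sequence

# One pass through three simulated perception results.
cues = [next_cue(s) for s in (
    StepState("bolt-M4", None),        # nothing placed yet
    StepState("bolt-M4", "bolt-M6"),   # wrong part detected
    StepState("bolt-M4", "bolt-M4"),   # correct placement
)]
```

In a deployed loop, each cue would be rendered by the projection-mapping layer and the state re-estimated on the next camera frame.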

Key Components

  • Vision-based workspace perception (parts/steps/progress verification)
  • Gesture-controlled SAR interface for touchless step navigation and confirmations
  • Projection mapping and calibration for spatially registered overlays
  • Error detection and feedback cues (missed step, wrong part/order, misalignment)
  • Operator assistance modules (posture/physical risk cues, cognitive workload inference)
  • Privacy-by-design pipeline (identity suppression, minimal logging, policy-based retention)
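
Spatially registered overlays, as in the projection mapping and calibration component above, rest on a mapping from camera coordinates to projector pixels. For a flat work surface this is a planar homography; the sketch below fits one from four correspondences via a direct linear transform. The point values are illustrative only, and a production system would typically use a library routine such as OpenCV's `findHomography` with many detected markers.

```python
import numpy as np

def fit_homography(src, dst):
    """Direct linear transform with h33 fixed to 1 (4 point pairs suffice)."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def project(H, pt):
    """Apply the homography in homogeneous coordinates."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return (x / w, y / w)

# Illustrative calibration points: camera-image corners of the work surface
# mapped to projector pixels (a typical keystone-corrected quadrilateral).
cam = [(0, 0), (640, 0), (640, 480), (0, 480)]
proj = [(100, 50), (900, 80), (880, 700), (120, 680)]
H = fit_homography(cam, proj)
```

With `H` in hand, any overlay anchored in camera coordinates can be warped into projector space with `project` before rendering.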

Evaluation

User studies compared SAR guidance with conventional instructions. Participants showed reduced completion time and fewer errors. Evaluation also considers cognitive ergonomics and operator acceptance, and, where enabled, assesses the feasibility of posture- and workload-aware assistance under privacy-preserving sensing constraints.
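
A study of this shape is commonly analysed with a within-subject paired comparison. The sketch below computes a paired t statistic on completion times using only the standard library; the numbers are fabricated for illustration and are not the study's data.

```python
import math

# Fabricated example values (seconds per task), NOT study data: each index
# is one participant completing the task under both instruction conditions.
sar   = [212, 198, 240, 225, 205, 231, 190, 218]
paper = [262, 255, 301, 270, 248, 290, 244, 266]

diffs = [p - s for p, s in zip(paper, sar)]   # positive = SAR was faster
n = len(diffs)
mean_d = sum(diffs) / n                       # mean time saved
sd_d = math.sqrt(sum((d - mean_d) ** 2 for d in diffs) / (n - 1))
t = mean_d / (sd_d / math.sqrt(n))            # paired t statistic, df = n - 1
```

The resulting t value (with df = n − 1) would then be compared against the t distribution to judge significance; error counts would use an analogous paired or non-parametric test.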

Key Findings

  • Faster task completion
  • Lower error rate
  • Reduced perceived workload
  • Improved interaction flow via touchless gesture control
  • Practical pathway to privacy-preserving operator assistance (capability augmentation)
Naimul Hasan

PhD Researcher

My research interests include smart assembly systems and Industry 5.0.

Louie Webb

PhD Researcher

My research interests include distributed robotics, mobile computing and programmable matter.

Malarvizhi Kaniappan Chinnathai

Lecturer in Modelling of Discrete Event Processes

My research interests include development and application of discrete event simulation for decision support in manufacturing scale-up, operations research, electric vehicle (EV) assembly, process planning, and intelligent logistics for manufacturing systems.

Bugra Alkan

Senior Lecturer in AI and Robotics

My research interests include human–robot collaboration, industrial AI and cyber-physical production systems.