Artist's illustration of a person wearing an AR headset, depicting the process of capturing and analyzing data from professional training sessions. Credit: NYU Tandon School of Engineering
In the high-stakes world of aviation, a pilot’s ability to respond under stress can mean the difference between a safe flight and disaster. Comprehensive and accurate training is crucial to equip pilots with the skills needed to handle these difficult situations.
Pilot trainers rely on augmented reality (AR) systems to teach, guiding pilots through various scenarios so they learn appropriate actions. But these systems work best when they are adapted to the mental states of each subject.
Enter HuBar, a new visual analytics tool designed to summarize and compare AR task execution sessions, such as AR-guided simulated flights, through analysis of performer behavior and cognitive workload.
By providing deep insights into pilot behavior and mental states, HuBar enables researchers and trainers to identify patterns, pinpoint areas of difficulty, and optimize AR-assisted training programs to improve learning outcomes and real-world performance.
HuBar was developed by a research team at the NYU Tandon School of Engineering who will present it at the IEEE Visualization and Visual Analytics 2024 conference on October 17, 2024.
“While pilot training is a potential use case, HuBar is not just for aviation,” explained Claudio Silva, NYU Tandon Institute Professor in the Department of Computer Science and Engineering (CSE), who led the research in collaboration with Northrop Grumman Corporation (NGC). “HuBar visualizes diverse data from AR-assisted tasks, and this comprehensive analysis leads to better performance and learning outcomes in a variety of complex scenarios.”
“HuBar could help improve training in surgery, military operations, and industrial tasks,” added Silva, who is also co-director of the Visualization and Data Analytics (VIDA) Center at NYU.
The team presented HuBar in an article published on the arXiv preprint server, which demonstrates its capabilities using aviation as a case study, analyzing data from multiple helicopter co-pilots in an AR flight simulation. The team also produced a video about the system.
Focusing on two pilot subjects, the system revealed striking differences: One subject maintained generally optimal attention states with few errors, while the other experienced underload states and made frequent errors.
HuBar’s detailed analysis, including video footage, showed that the underperforming co-pilot often consulted a manual, indicating less familiarity with the task. Ultimately, HuBar can enable trainers to identify specific areas where co-pilots are struggling and understand why, providing insights to improve AR-assisted training programs.
What makes HuBar unique is its ability to analyze non-linear tasks where different sequences of steps can lead to success, while simultaneously integrating and visualizing multiple complex data streams.
This includes brain activity (fNIRS), body movements (IMU), gaze tracking, task procedures, errors, and mental workload classifications. HuBar’s comprehensive approach enables holistic analysis of performer behavior in AR-assisted tasks, allowing researchers and trainers to identify correlations between cognitive states, physical actions, and task performance across different task execution paths.
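Integrating these streams requires aligning sensors that record on different clocks and at different rates onto a shared timeline before cross-stream patterns can be analyzed. The article does not describe HuBar's internals, so the following is only a minimal, hypothetical sketch of that alignment step: each stream is a list of timestamped samples, and a nearest-neighbor lookup resamples them onto one common time grid. All names (`align_streams`, the toy "workload" and "gaze" streams) are illustrative assumptions, not part of HuBar.

```python
from bisect import bisect_left

def nearest(timestamps, values, t):
    """Return the value whose timestamp is closest to t (timestamps sorted)."""
    i = bisect_left(timestamps, t)
    if i == 0:
        return values[0]
    if i == len(timestamps):
        return values[-1]
    before, after = timestamps[i - 1], timestamps[i]
    return values[i] if after - t < t - before else values[i - 1]

def align_streams(streams, step):
    """Resample each (timestamps, values) stream onto a shared timeline.

    streams: dict mapping stream name -> (sorted timestamps, values)
    step: sampling interval in seconds
    Returns a list of dicts, one per shared timestamp, covering the
    interval where all streams overlap.
    """
    start = max(ts[0] for ts, _ in streams.values())
    end = min(ts[-1] for ts, _ in streams.values())
    timeline, t = [], start
    while t <= end:
        row = {"t": round(t, 3)}
        for name, (ts, vs) in streams.items():
            row[name] = nearest(ts, vs, t)
        timeline.append(row)
        t += step
    return timeline

# Toy example: fNIRS-derived workload labels and gaze targets
# recorded on slightly different clocks.
streams = {
    "workload": ([0.0, 1.0, 2.0, 3.0],
                 ["optimal", "optimal", "underload", "underload"]),
    "gaze":     ([0.2, 0.9, 1.8, 2.7],
                 ["instrument", "manual", "manual", "manual"]),
}
aligned = align_streams(streams, step=1.0)
```

Once aligned this way, each row pairs a cognitive state with the concurrent behavior, which is the kind of correlation (e.g., underload coinciding with manual consultation) the article describes.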
HuBar’s interactive visualization system also facilitates comparison between different sessions and different performers, making it possible to discern patterns and anomalies in complex, non-sequential procedures that might otherwise go unnoticed in traditional analysis methods.
“We can now see exactly when and why a person might become mentally overloaded or dangerously underloaded during a task,” said Sonia Castelo, a research engineer and Ph.D. student at VIDA and lead author of the HuBar paper.
“This type of detailed analysis has never been possible before across such a wide range of applications. It’s like having x-ray vision into a person’s mind and body during a task, providing information to adapt AR assistance systems to meet an individual user’s needs.”
As augmented reality systems, including headsets like Microsoft HoloLens, Meta Quest, and Apple Vision Pro, become more sophisticated and ubiquitous, tools like HuBar will be crucial to understanding how these technologies affect human performance and cognitive load.
“The next generation of AR training systems could adapt in real time based on the user’s mental state,” said Joao Rulff, a Ph.D. student at VIDA who worked on the project. “HuBar helps us understand exactly how this could work in various applications and complex task structures.”
More information:
Sonia Castelo et al, HuBar: A Visual Analytics Tool to Explore Human Behavior Based on fNIRS in AR Guidance Systems, arXiv (2024). DOI: 10.48550/arxiv.2407.12260
Provided by NYU Tandon School of Engineering
Citation: New tool helps analyze pilot performance and mental workload in augmented reality (October 15, 2024) retrieved October 15, 2024 from
This document is subject to copyright. Except for fair use for private study or research purposes, no part may be reproduced without written permission. The content is provided for informational purposes only.