
\systemname: A Visual Analytics Tool to Explore Human Behaviour Based on fNIRS in AR Guidance Systems

Sonia Castelo (0000-0001-6881-3006), Joao Rulff (0000-0003-3341-7059), Parikshit Solunke (0009-0003-5546-0135), Erin McGowan (0000-0002-7565-3052), Guande Wu (0000-0002-9244-173X), Iran Roman (0000-0003-3781-7244), Roque Lopez (0000-0003-3484-1783), Bea Steers (0009-0007-2831-6460), Qi Sun (0000-0002-3094-5844), Juan Bello (0000-0001-8561-5204), Bradley Feest (0009-0008-7446-5368), Michael Middleton (0000-0002-7964-1139), Ryan Mckendrick, and Claudio Silva (0000-0003-2452-2295)
Abstract

The concept of an intelligent augmented reality (AR) assistant has significant, wide-ranging applications, with potential uses in medicine, military, and mechanics domains. Such an assistant must be able to perceive the environment and actions, reason about the environment state in relation to a given task, and seamlessly interact with the task performer. These interactions typically involve an AR headset equipped with sensors which capture video, audio, and haptic feedback. Previous works have sought to facilitate the development of intelligent AR assistants by visualizing these sensor data streams in conjunction with the assistant's perception and reasoning model outputs. However, existing visual analytics systems do not focus on user modeling or include biometric data, and are only capable of visualizing a single task session for a single performer at a time. Moreover, they typically assume a task involves linear progression from one step to the next. We propose a visual analytics system that allows users to compare performance during multiple task sessions, focusing on non-linear tasks where different step sequences can lead to success. In particular, we design visualizations for understanding user behavior through functional near-infrared spectroscopy (fNIRS) data as a proxy for perception, attention, and memory, as well as corresponding motion data (acceleration, angular velocity, and gaze). We distill these insights into embedding representations that allow users to easily select groups of sessions with similar behaviors. We provide two case studies that demonstrate how to use these visualizations to gain insights about task performance using data collected during helicopter copilot training tasks. Finally, we evaluate our approach by conducting an in-depth examination of a think-aloud experiment with five domain experts.

keywords:
Perception & Cognition, Application Motivated Visualization, Temporal Data, Image and Video Data, Mobile, AR/VR/Immersive, Specialized Input/Display Hardware
\onlineid 1833
\vgtccategory Research
\vgtcpapertype Application/design study
\authorfooter Authors are with New York University (NYU) and Northrop Grumman Corporation (NGC). E-mails: {s.castelo, jlrulff, pss442, erin.mcgowan, guandewu, irr2020, rlopez, bs3639, qs2053, jpbello, csilva}@nyu.edu; {bradley.feest, michael.middleton, ryan.mckendrick}@ngc.com.
\teaser \systemname is a visual analytics system that offers a hierarchical set of visualizations designed to analyze performer behavior in augmented reality (AR) assistance tasks by enabling multi-perspective analysis of multimodal time-series data. The Overview includes the Scatter Plot View (A), which offers three different projections of performer sessions, revealing multidimensional clusters and patterns while enabling filtering and selection of sessions. The Workload Aggregation View (B) summarizes performers' cognitive workloads, session durations, and workload-error correlations for the chosen groups. The Event Timeline View (C) aligns multiple time series (procedures, mental workload, errors, task phases) collected during performer sessions along a shared time axis, enabling comparison across sessions and exploration by brushing to update linked views. The Summary Matrix View (D) facilitates analysis of procedure frequency, error proportion, overall errors, and mental state distribution within and across sessions for selected workload categories. The Detail View (E) enables in-depth exploration of individual sessions with synchronized video and time series visualizations, supporting brushing for seamless navigation and analysis.
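To make the session-embedding idea behind the Scatter Plot View concrete, the sketch below shows one minimal way to turn multimodal session recordings (fNIRS workload channels plus motion channels) into fixed-length feature vectors and project them to 2-D for a scatter plot. This is an illustrative assumption, not the paper's actual pipeline: the helper name session_features, the channel counts, the random stand-in data, and the choice of PCA are all hypothetical, and \systemname may use different features and projection methods.

\begin{verbatim}
import numpy as np
from sklearn.decomposition import PCA

def session_features(fnirs, motion):
    """Summarize one session's multimodal time series into a fixed-length
    vector using per-channel means and standard deviations (illustrative)."""
    # fnirs: (T1, C1) array of workload-related fNIRS channels
    # motion: (T2, C2) array of acceleration / angular velocity / gaze channels
    parts = []
    for x in (fnirs, motion):
        parts.append(x.mean(axis=0))
        parts.append(x.std(axis=0))
    return np.concatenate(parts)

# Hypothetical data: 20 sessions of random values standing in for recordings.
rng = np.random.default_rng(0)
sessions = [
    (rng.normal(size=(600, 8)), rng.normal(size=(600, 7)))  # (fNIRS, motion)
    for _ in range(20)
]

X = np.stack([session_features(f, m) for f, m in sessions])
coords = PCA(n_components=2).fit_transform(X)  # 2-D layout for a scatter plot
print(coords.shape)  # (20, 2): one point per session
\end{verbatim}

Under this kind of projection, sessions with similar workload and motion statistics land near each other, which is what lets an analyst brush a cluster of points to select a group of behaviorally similar sessions for the linked views.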