HCI Deep Dives is your go-to podcast for exploring the latest trends, research, and innovations in Human-Computer Interaction (HCI). Auto-generated using the latest publications in the field, each episode dives into in-depth discussions on topics like wearable computing, augmented perception, cognitive augmentation, and digitalized emotions. Whether you’re a researcher, practitioner, or just curious about the intersection of technology and human senses, this podcast offers thought-provoking insights and ideas to keep you at the forefront of HCI.
ISMAR 2024: Filtering on the Go: Effect of Filters on Gaze Pointing Accuracy During Physical Locomotion in Extended Reality
HCI Deep Dives
20 minutes 26 seconds
1 year ago
Pavel Manakhov, Ludwig Sidenmark, Ken Pfeuffer, and Hans Gellersen. 2024. Filtering on the Go: Effect of Filters on Gaze Pointing Accuracy During Physical Locomotion in Extended Reality. IEEE Transactions on Visualization and Computer Graphics 30, 11 (Nov. 2024), 7234–7244. https://doi.org/10.1109/TVCG.2024.3456153
Eye-tracking filters have been shown to improve the accuracy of gaze estimation and input in stationary settings. However, their effectiveness during physical movement remains underexplored. In this work, we compare common online filters in the context of physical locomotion in extended reality and propose alterations to improve them for on-the-go settings. We conducted a computational experiment in which we simulated the performance of the online filters using data from participants attending to visual targets located in world-, path-, and two head-based reference frames while standing, walking, and jogging. Our results provide insights into the filters' effectiveness and the factors that affect it, such as the amount of noise caused by locomotion and differences in compensatory eye movements, and demonstrate that filters with saccade detection prove most useful for on-the-go settings. We discuss the implications of our findings and conclude with guidance on gaze data filtering for interaction in extended reality.
https://ieeexplore.ieee.org/document/10672561
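To make the abstract's main finding concrete, here is a minimal Python sketch of one common online filter, the One Euro filter (Casiez et al., CHI 2012), extended with a simple velocity-threshold saccade detector of the general kind the paper finds most useful on the go. This is an illustration, not the authors' implementation; the 100 °/s threshold and all filter parameters are illustrative assumptions.

```python
import math

class SaccadeAwareOneEuro:
    """One Euro filter with a simple velocity-threshold saccade detector.
    On a detected saccade the filter state is reset so smoothing does not
    lag behind the rapid eye movement. Parameter values are illustrative,
    not taken from the paper."""

    def __init__(self, min_cutoff=1.0, beta=0.01, d_cutoff=1.0,
                 saccade_threshold=100.0):  # assumed gaze-speed threshold, deg/s
        self.min_cutoff = min_cutoff  # minimum cutoff frequency (Hz)
        self.beta = beta              # speed coefficient: higher -> less lag
        self.d_cutoff = d_cutoff      # cutoff for the derivative filter (Hz)
        self.saccade_threshold = saccade_threshold
        self.t_prev = None
        self.x_prev = None
        self.dx_prev = 0.0

    @staticmethod
    def _alpha(cutoff, dt):
        # Exponential smoothing factor for a given cutoff frequency (Hz).
        tau = 1.0 / (2.0 * math.pi * cutoff)
        return 1.0 / (1.0 + tau / dt)

    def filter(self, t, x):
        """t: timestamp in seconds; x: 1D gaze angle in degrees.
        Apply one instance per axis for 2D gaze."""
        if self.x_prev is None:
            self.t_prev, self.x_prev = t, x
            return x
        dt = t - self.t_prev
        dx = (x - self.x_prev) / dt  # instantaneous gaze velocity (deg/s)

        if abs(dx) > self.saccade_threshold:
            # Saccade detected: pass the raw sample through and reset state,
            # so the output snaps to the new fixation instead of smoothing
            # across the saccade.
            self.t_prev, self.x_prev, self.dx_prev = t, x, 0.0
            return x

        # Standard One Euro update: smooth the derivative, then adapt the
        # cutoff to speed (faster movement -> higher cutoff -> less lag).
        a_d = self._alpha(self.d_cutoff, dt)
        dx_hat = a_d * dx + (1.0 - a_d) * self.dx_prev
        cutoff = self.min_cutoff + self.beta * abs(dx_hat)
        a = self._alpha(cutoff, dt)
        x_hat = a * x + (1.0 - a) * self.x_prev
        self.t_prev, self.x_prev, self.dx_prev = t, x_hat, dx_hat
        return x_hat
```

The saccade bypass captures the trade-off the episode discusses: heavier smoothing suppresses locomotion-induced noise during fixations, but without saccade detection it also smears genuine gaze shifts, hurting pointing accuracy while walking or jogging.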