
Effects of Shared Gaze Parameters on Visual Target Identification Task Performance in Augmented Reality

Published: 19 October 2019

Abstract

Augmented reality (AR) technologies provide a shared platform for users to collaborate in a physical context involving both real and virtual content. To enhance the quality of interaction between AR users, researchers have proposed augmenting users’ interpersonal space with embodied cues such as their gaze direction. While beneficial for interpersonal spatial communication, such shared gaze environments suffer from multiple types of eye tracking and networking errors that can reduce objective performance and subjective experience.
In this paper, we conducted a human-subject study to understand the impact of accuracy, precision, latency, and dropout errors on users’ performance when using shared gaze cues to identify a target among a crowd of people. We simulated varying error levels and target distances, and measured participants’ objective performance through their response time and error rate, and their subjective experience and cognitive load through questionnaires. We found significant differences suggesting that the simulated error levels had stronger effects on participants’ performance than target distance, with accuracy and latency having a high impact on participants’ error rate. We also observed that participants assessed their own performance as lower than it objectively was, and we discuss implications for practical shared gaze applications.
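The four manipulated parameters lend themselves to a simple post-processing model over a stream of gaze samples. The sketch below is an illustrative assumption, not the authors' implementation: the function name, parameter names, and the simplified per-axis error model (a constant offset for accuracy, Gaussian jitter for precision, a frame-delay buffer for latency, and held samples for dropout) are all hypothetical.

```python
import random
from collections import deque

def apply_gaze_errors(samples, accuracy_offset=0.0, precision_sd=0.0,
                      latency_frames=0, dropout_rate=0.0, seed=42):
    """Inject four error types into a stream of 2D gaze samples (in degrees).

    accuracy_offset -- constant bias added to x (systematic accuracy error)
    precision_sd    -- std. dev. of Gaussian jitter on both axes (precision)
    latency_frames  -- number of frames each sample is delayed (latency)
    dropout_rate    -- probability a frame holds the last valid sample (dropout)
    """
    rng = random.Random(seed)
    buffer = deque(maxlen=latency_frames + 1)  # oldest entry = delayed sample
    last_valid = samples[0]
    out = []
    for x, y in samples:
        # systematic offset plus random jitter
        noisy = (x + accuracy_offset + rng.gauss(0, precision_sd),
                 y + rng.gauss(0, precision_sd))
        buffer.append(noisy)
        delayed = buffer[0]  # lags by latency_frames once the buffer fills
        if rng.random() < dropout_rate:
            out.append(last_valid)  # dropout: repeat the last valid sample
        else:
            last_valid = delayed
            out.append(delayed)
    return out
```

For example, `apply_gaze_errors(samples, latency_frames=2)` reports each gaze point two frames behind its true position once the delay buffer fills, while a nonzero `dropout_rate` freezes the cue intermittently.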

Supplementary Material

MP4 File (a12-norouzi.mp4)





Published In

SUI '19: Symposium on Spatial User Interaction
October 2019
164 pages
ISBN: 9781450369756
DOI: 10.1145/3357251
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States


Badges

  • Best Paper

Author Tags

  1. Augmented Reality
  2. Eye Tracking
  3. Shared Gaze
  4. Target Identification

Qualifiers

  • Research-article
  • Research
  • Refereed limited


Conference

SUI '19
SUI '19: Symposium on Spatial User Interaction
October 19 - 20, 2019
New Orleans, LA, USA

Acceptance Rates

Overall Acceptance Rate 86 of 279 submissions, 31%


Article Metrics

  • Downloads (last 12 months): 166
  • Downloads (last 6 weeks): 38
Reflects downloads up to 10 Nov 2024

Cited By
  • (2024) Filtering on the Go: Effect of Filters on Gaze Pointing Accuracy During Physical Locomotion in Extended Reality. IEEE Transactions on Visualization and Computer Graphics 30(11), 7234–7244. DOI: 10.1109/TVCG.2024.3456153
  • (2023) GE-Simulator: An Open-Source Tool for Simulating Real-Time Errors for HMD-based Eye Trackers. Proceedings of the 2023 Symposium on Eye Tracking Research and Applications, 1–6. DOI: 10.1145/3588015.3588417
  • (2023) Virtual Big Heads in Extended Reality: Estimation of Ideal Head Scales and Perceptual Thresholds for Comfort and Facial Cues. ACM Transactions on Applied Perception 20(1), 1–31. DOI: 10.1145/3571074
  • (2023) A comprehensive survey on AR-enabled local collaboration. Virtual Reality 27(4), 2941–2966. DOI: 10.1007/s10055-023-00848-2
  • (2023) Augmented Reality for Cognitive Impairments. Springer Handbook of Augmented Reality, 765–793. DOI: 10.1007/978-3-030-67822-7_31
  • (2022) The Eye in Extended Reality: A Survey on Gaze Interaction and Eye Tracking in Head-worn Extended Reality. ACM Computing Surveys 55(3), 1–39. DOI: 10.1145/3491207
  • (2022) Weighted Pointer: Error-aware Gaze-based Interaction through Fallback Modalities. IEEE Transactions on Visualization and Computer Graphics 28(11), 3585–3595. DOI: 10.1109/TVCG.2022.3203096
  • (2020) Examining Whether Secondary Effects of Temperature-Associated Virtual Stimuli Influence Subjective Perception of Duration. 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 493–499. DOI: 10.1109/VR46266.2020.00070
  • (2020) Virtual Big Heads: Analysis of Human Perception and Comfort of Head Scales in Social Virtual Reality. 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 425–433. DOI: 10.1109/VR46266.2020.00063
