Research Article | Public Access
DOI: 10.1145/3281505.3281538

An evaluation of pupillary light response models for 2D screens and VR HMDs

Published: 28 November 2018

Abstract

Pupil diameter changes have been shown to be indicative of user engagement and cognitive load across a variety of tasks and environments. However, the pupil is still not the preferred physiological measure in applied settings. This reluctance to leverage the pupil as an index of user engagement stems from the problem that, in scenarios where scene brightness cannot be controlled, the pupillary light response confounds the cognitive-emotional response. What if we could predict the light response of an individual's pupil, thus creating the opportunity to factor it out of the measurement? In this work, we lay the groundwork for this research by evaluating three models of pupillary light response on a 2D screen and in a virtual reality (VR) environment. Our results show that either a linear or an exponential model can be fit to an individual participant with an easy-to-use calibration procedure. This work opens several new research directions in VR relating to performance analysis, and motivates uses of eye tracking beyond gaze-based pointing and foveated rendering.
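The calibration idea in the abstract — fit a per-participant model of pupil diameter as a function of scene luminance, then factor the predicted light response out of later pupil measurements — can be sketched roughly as follows. This is an illustrative sketch, not the paper's implementation: the specific model forms, the parameter values, and the synthetic calibration data are all assumptions for demonstration, and the fitting uses SciPy's `curve_fit`.

```python
import numpy as np
from scipy.optimize import curve_fit

def exponential_model(L, a, b, c):
    """Pupil diameter (mm) decays toward an asymptote c as luminance L grows."""
    return a * np.exp(-b * L) + c

def linear_model(L, m, k):
    """Simplest alternative: diameter falls linearly with luminance."""
    return m * L + k

# Hypothetical calibration sweep: screen luminance levels (cd/m^2) shown to
# one participant. All values here are synthetic, for illustration only.
luminance = np.array([1, 5, 10, 25, 50, 100, 150, 200], dtype=float)

# Simulate that participant's measured diameters from an assumed ground
# truth (a=4.0, b=0.02, c=2.5) plus measurement noise.
rng = np.random.default_rng(0)
diameter = exponential_model(luminance, 4.0, 0.02, 2.5)
diameter = diameter + rng.normal(0.0, 0.05, size=luminance.shape)

# Fit both candidate models to the individual's calibration data.
exp_params, _ = curve_fit(exponential_model, luminance, diameter,
                          p0=[4.0, 0.01, 2.0])
lin_params, _ = curve_fit(linear_model, luminance, diameter)

# The fitted light-response curve can then be predicted from scene luminance
# and subtracted from a live pupil trace; the residual is the component not
# explained by brightness (e.g., cognitive-emotional load).
predicted = exponential_model(luminance, *exp_params)
residual_rmse = float(np.sqrt(np.mean((diameter - predicted) ** 2)))
```

In practice one would fit both models during a short luminance sweep and keep whichever generalizes better for that participant, which is the kind of per-individual choice the paper's calibration procedure is meant to support.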

Supplementary Material

PDF File (a19-john-supp.pdf)
Supplemental file.
ZIP File (a19-john.zip)
Supplemental files.



Published In

VRST '18: Proceedings of the 24th ACM Symposium on Virtual Reality Software and Technology
November 2018, 570 pages
ISBN: 9781450360869
DOI: 10.1145/3281505

Publisher

Association for Computing Machinery, New York, NY, United States


Author Tags

  1. eyetracking
  2. light response
  3. pupil dilation
  4. videos
  5. virtual reality

Conference

VRST '18

Acceptance Rates

Overall Acceptance Rate: 66 of 254 submissions, 26%

Cited By

  • (2024) Measuring Cognitive Load in Virtual Reality Training via Pupillometry. IEEE Transactions on Learning Technologies 17, 704-710. DOI: 10.1109/TLT.2023.3326473
  • (2024) Emotion Prediction Through Eye Tracking in Affective Computing Systems. 2024 IEEE International Conference on Pervasive Computing and Communications Workshops and other Affiliated Events (PerCom Workshops), 326-332. DOI: 10.1109/PerComWorkshops59983.2024.10502422
  • (2024) Predicting Cognitive Failures in Virtual Reality Using Pupillometry. 2024 IEEE International Conference on Artificial Intelligence and eXtended and Virtual Reality (AIxVR), 261-264. DOI: 10.1109/AIxVR59861.2024.00043
  • (2024) Cognitive effort detection for tele-robotic surgery via personalized pupil response modeling. International Journal of Computer Assisted Radiology and Surgery 19, 6, 1113-1120. DOI: 10.1007/s11548-024-03108-z
  • (2023) Open-DPSM: An open-source toolkit for modeling pupil size changes to dynamic visual inputs. Behavior Research Methods 56, 6, 5605-5621. DOI: 10.3758/s13428-023-02292-1
  • (2023) When XR and AI Meet - A Scoping Review on Extended Reality and Artificial Intelligence. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-45. DOI: 10.1145/3544548.3581072
  • (2023) Acute Stress Classification with a Stroop task and in-office biophilic Relaxation in Virtual Reality based on Behavioral and Physiological data. 2023 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), 537-542. DOI: 10.1109/ISMAR-Adjunct60411.2023.00115
  • (2023) An integrated methodology for the assessment of stress and mental workload applied on virtual training. International Journal of Computer Integrated Manufacturing, 1-19. DOI: 10.1080/0951192X.2023.2189311
  • (2022) The Eye in Extended Reality: A Survey on Gaze Interaction and Eye Tracking in Head-worn Extended Reality. ACM Computing Surveys 55, 3, 1-39. DOI: 10.1145/3491207
  • (2022) Cognitive load Classification with a Stroop task in Virtual Reality based on Physiological data. 2022 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 656-666. DOI: 10.1109/ISMAR55827.2022.00083
