Int J Performability Eng, 2019, Vol. 15, Issue (3): 732-742. doi: 10.23940/ijpe.19.03.p2.732742


Object-based Visual Attention Quantification using Head Orientation in VR Applications

Honglei Han a,b,*, Aidong Lu b, Chanchan Xu a, and Unique Wells b

  a. School of Animation and Digital Arts, Communication University of China, Beijing, 100024, China;
  b. Department of Computer Science, University of North Carolina at Charlotte, Charlotte, 28223, USA
  • Contact: hanhonglei@cuc.edu.cn
  • About author: Honglei Han joined the Communication University of China after receiving his M.S. degree in computer science in 2006. He is currently an associate professor. He received his Ph.D. from the Institute of Software at the Chinese Academy of Sciences in 2015. He was a visiting scholar at the University of North Carolina at Charlotte. His research interests include computer graphics, virtual reality, and computer games.
  Aidong Lu received her Bachelor's and Master's degrees in computer science from Tsinghua University in 1999 and 2001, respectively, and her Ph.D. in electrical and computer engineering from Purdue University in 2005. She is currently an associate professor at the University of North Carolina at Charlotte. Her research interest is developing effective visualization approaches to improve visual communication in real-life applications and education.
  Chanchan Xu obtained her Master's degree in computer application technology from Beijing Forestry University in 2013. She is currently a graduate student at the Communication University of China. Her research interests are computer graphics and virtual reality.
  Unique Wells is a Master's degree candidate at the University of North Carolina at Charlotte. Her research interests are data visualization and virtual reality.

Abstract: This paper presents a method to measure what users perceive, and how deeply, when exploring virtual reality environments through a head-mounted display. A preliminary user study was conducted to verify that gaze behavior in immersive virtual reality environments differs in specific ways from gaze behavior in conventional, non-immersive virtual environments viewed on a desktop screen. The study results show that users in immersive virtual reality environments are more likely to adjust their head movement so that interesting objects are centered in their vision. Based on this finding, a quantitative method is proposed to measure the user's visual attention in such virtual reality environments. A personalized storyboard is designed to capture each user's most-regarded views as key frames that depict the user's exploration experience in immersive virtual reality environments.
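The abstract does not spell out how the head-orientation-based measure is computed, so the sketch below is only an illustrative reading of the idea: per-object attention is accumulated over recorded head poses, weighting each object by how close it sits to the center of the view. All function names, the data layout, and the Gaussian falloff are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def angular_offset(head_forward, head_position, object_position):
    """Angle (radians) between the head's forward vector and the direction to an object."""
    to_object = np.asarray(object_position, dtype=float) - np.asarray(head_position, dtype=float)
    to_object /= np.linalg.norm(to_object)
    forward = np.asarray(head_forward, dtype=float) / np.linalg.norm(head_forward)
    return np.arccos(np.clip(np.dot(forward, to_object), -1.0, 1.0))

def accumulate_attention(frames, objects, sigma=np.radians(10.0)):
    """
    Accumulate a per-object attention score over a recorded VR session.

    frames:  iterable of (head_position, head_forward, dt) samples
    objects: dict mapping object id -> world-space position
    sigma:   falloff (radians) controlling how sharply attention
             concentrates near the view center (assumed value)
    """
    attention = {obj_id: 0.0 for obj_id in objects}
    for head_position, head_forward, dt in frames:
        for obj_id, obj_pos in objects.items():
            theta = angular_offset(head_forward, head_position, obj_pos)
            # Gaussian falloff: objects centered in the view gain the most weight,
            # reflecting the finding that users center interesting objects by moving the head.
            attention[obj_id] += dt * np.exp(-(theta ** 2) / (2.0 * sigma ** 2))
    return attention
```

Under this reading, the frames in which the accumulated attention peaks could then be selected as the key frames of the personalized storyboard described above.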

Key words: gaze analysis, visual attention, virtual reality, interaction, eye tracking