Int J Performability Eng ›› 2019, Vol. 15 ›› Issue (3): 732-742.doi: 10.23940/ijpe.19.03.p2.732742
Honglei Han (a, b, *), Aidong Lu (b), Chanchan Xu (a), and Unique Wells (b)
Contact: hanhonglei@cuc.edu.cn
About the authors:

Honglei Han joined the Communication University of China after receiving his M.S. degree in computer science in 2006. He is currently an associate professor. He received his Ph.D. from the Institute of Software, Chinese Academy of Sciences in 2015 and was a visiting scholar at the University of North Carolina at Charlotte. His research interests include computer graphics, virtual reality, and computer games.

Aidong Lu received her Bachelor's and Master's degrees in computer science from Tsinghua University in 1999 and 2001, respectively, and her Ph.D. in electrical and computer engineering from Purdue University in 2005. She is currently an associate professor at the University of North Carolina at Charlotte. Her research interest is developing effective visualization approaches to improve visual communication in real-life applications and education.

Chanchan Xu obtained her Master's degree in computer application technology from Beijing Forestry University in 2013. She is currently a graduate student at the Communication University of China. Her research interests are computer graphics and virtual reality.

Unique Wells is a Master's degree candidate at the University of North Carolina at Charlotte. Her research interests are data visualization and virtual reality.
Honglei Han, Aidong Lu, Chanchan Xu, and Unique Wells. Object-based Visual Attention Quantification using Head Orientation in VR Applications [J]. Int J Performability Eng, 2019, 15(3): 732-742.