Mata: An Android Eye-Tracking Based User Interface Control Application

Yaya Heryadi
Michael James

Abstract

The advent of smartphone technology has provided us with intelligent devices for communication as well as for playing games. Unfortunately, applications that exploit the sensors available in smartphones are mostly designed for people without physical handicaps. This paper presents Mata, a game user interface that uses eye-tracking to operate and control games running on an Android smartphone. The system is designed to enhance the user experience and to help people with motor impairments use smartphones for playing games. Development and testing of the Mata system demonstrated that eye-tracking and eye-gazing can serve as a unimodal input for a game user interface.

