Emowars: Interactive Game Input Using Facial Expressions

Authors

  • Andry Chowanda, Bina Nusantara University

DOI:

https://doi.org/10.21512/comtech.v4i2.2542

Keywords:

affective game, facial expression recognition, interactive game input, affective computing

Abstract

Research in affective games has received attention from the research community over the last lustrum. As a crucial aspect of a game, emotions play an important role in the user experience, and game design should emphasize the user's emotional state. This improves the users' interactivity while they play the game. This research aims to discuss and analyze whether emotions can replace traditional game inputs (keyboard, mouse, and others). The methodology used in this research is divided into two main phases: game design and facial expression recognition. The results indicate that users still preferred a traditional input such as the mouse, and the users' interactivity with the game remains relatively low. However, this presents a great opportunity for researchers in affective games to build more interactive gameplay as well as rich and complex stories, which will hopefully improve the users' affective states and emotions in the game. The results also show that the happy emotion achieves a detection rate of 78%, while anger has the lowest detection rate at 44.4%. Moreover, users rated the mouse and FER (facial expression recognition) as the best inputs for this game.
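
To illustrate how a recognized facial expression could stand in for a keyboard or mouse event in a game loop, the sketch below shows one possible input pipeline in Python. It is a minimal sketch only: it assumes OpenCV's bundled Haar-cascade face detector and a webcam, and the classify_expression function and the EXPRESSION_TO_ACTION mapping are hypothetical placeholders rather than the classifier or game commands used in the paper.

```python
# Minimal sketch: translate a recognized facial expression into a game command.
# Face detection uses OpenCV's bundled Haar cascade; classify_expression() is a
# hypothetical stand-in for a trained expression classifier, and
# EXPRESSION_TO_ACTION is an illustrative mapping, not the paper's design.

import cv2

EXPRESSION_TO_ACTION = {
    "happy": "attack",    # illustrative mapping only
    "anger": "defend",
    "neutral": "idle",
}

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)


def classify_expression(face_img):
    """Hypothetical placeholder for a trained expression classifier."""
    return "neutral"


def next_game_action(frame):
    """Detect the largest face in a frame and map its expression to an action;
    fall back to 'idle' when no face is found."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return "idle"
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # largest detection
    expression = classify_expression(gray[y:y + h, x:x + w])
    return EXPRESSION_TO_ACTION.get(expression, "idle")


if __name__ == "__main__":
    cap = cv2.VideoCapture(0)           # webcam as the input device
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        print(next_game_action(frame))  # a game engine would consume this action
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
```

Polling one action per frame mirrors how a conventional key press or mouse click is read each tick, which is what would let FER slot in beside the traditional inputs compared in the study.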

Author Biography

Andry Chowanda, Bina Nusantara University

Computer Science Department, School of Computer Science

Published

2013-12-01

Issue

Vol. 4 No. 2 (2013)

Section

Articles