Emowars: Interactive Game Input Using Facial Expressions
DOI: https://doi.org/10.21512/comtech.v4i2.2542
Keywords: affective game, facial expression recognition, interactive game input, affective computing
Abstract
Research on affective games has received attention from the research community over the past five years. Emotions are a crucial aspect of a game: they shape the user experience, and accounting for the user's emotional state in game design can make play more interactive. This research discusses and analyzes whether emotions can replace traditional game inputs (keyboard, mouse, and others). The methodology is divided into two main phases: game design and facial expression recognition. The results indicate that users still preferred a traditional input such as the mouse, and that users' interactivity with the game remained relatively low. However, this presents a clear opportunity for affective-game researchers: more interactive gameplay and a richer, more complex story should improve the user's affective state and emotions in the game. The results show that the happy emotion achieves the highest detection rate at 78%, while the anger emotion has the lowest at 44.4%. Moreover, users rated the mouse and FER (facial expression recognition) as the best inputs for this game.
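To make the input pipeline concrete, below is a minimal sketch (in Python, using OpenCV) of how a facial-expression signal could replace a keyboard or mouse event in a game loop. This is not the paper's implementation: the Viola-Jones face detector is loaded from OpenCV's bundled Haar cascade, and the expression classifier and command mapping are placeholders, since the paper's trained Gabor-feature SVM model is not reproduced here.

# Sketch only: facial expression as game input.
# The classifier below is a stub; a real system would extract features
# from the detected face region and run a trained classifier.
import cv2

# Hypothetical mapping from a recognized expression to a game command.
EXPRESSION_TO_COMMAND = {
    "happy": "attack",
    "anger": "defend",
    "neutral": "idle",
}

def classify_expression(face_roi):
    """Placeholder classifier; always returns 'neutral' in this sketch."""
    return "neutral"

def main():
    # Viola-Jones frontal face detector shipped with OpenCV.
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    capture = cv2.VideoCapture(0)  # default webcam

    while True:
        ok, frame = capture.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        for (x, y, w, h) in faces:
            face_roi = gray[y:y + h, x:x + w]
            expression = classify_expression(face_roi)
            command = EXPRESSION_TO_COMMAND.get(expression, "idle")
            # In the game loop, `command` would replace a keyboard/mouse event.
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
            cv2.putText(frame, expression + " -> " + command, (x, y - 10),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 1)
        cv2.imshow("Emowars input sketch", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

    capture.release()
    cv2.destroyAllWindows()

if __name__ == "__main__":
    main()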
License
Authors who publish with this journal agree to the following terms:
a. Authors retain copyright and grant the journal right of first publication with the work simultaneously licensed under a Creative Commons Attribution-ShareAlike License that allows others to share the work with an acknowledgment of the work's authorship and initial publication in this journal.
b. Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the journal's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgment of its initial publication in this journal.
c. Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work.
USER RIGHTS
All articles published Open Access will be immediately and permanently free for everyone to read and download. We are continuously working with our author communities to select the best choice of license options, currently being defined for this journal as follows: