CISUC

Activity Recognition for Movement-Based Interaction in Mobile Games

Authors

Abstract

Although smartphones include a set of sensors that enable innovative interactions, current mobile game interaction is mostly touch-based. Some games also include tilt control based on the accelerometer sensor. However, sensors such as accelerometers and gyroscopes can be used to recognize full-body motions in real time. Exploring this can lead to innovative and immersive experiences while promoting physical activity. We present a proof-of-concept 3D endless running game called ActivRunner, which implements an activity recognition system that predicts, in real time, five activities: standing, moving left, moving right, squatting, and jumping. The goal is to replace traditional touch interaction with a more natural movement-based one, showing the potential of this kind of interaction to create innovative and immersive mobile experiences while promoting physical activity.
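The abstract does not detail how the recognition works, but a common approach for this kind of real-time activity recognition is to segment the accelerometer stream into short windows, extract simple statistical features, and feed them to a classifier. The sketch below illustrates that pipeline under stated assumptions: the feature set, thresholds, and rule-based classifier are hypothetical stand-ins for whatever trained model ActivRunner actually uses.

```python
import math
from statistics import mean, stdev

# The five activities recognized by the game (labels assumed from the abstract).
LABELS = ["standing", "move_left", "move_right", "squat", "jump"]

def extract_features(window):
    """Compute simple statistics over a window of (ax, ay, az) samples.

    In practice such features would feed a trained classifier; here they
    are illustrative only.
    """
    mags = [math.sqrt(ax * ax + ay * ay + az * az) for ax, ay, az in window]
    return {
        "mean_mag": mean(mags),   # average acceleration magnitude
        "std_mag": stdev(mags),   # variability: low when the user is still
        "mean_x": mean(s[0] for s in window),  # lateral lean/movement
    }

def classify(window):
    """Toy rule-based stand-in for a trained activity-recognition model."""
    f = extract_features(window)
    if f["std_mag"] < 0.5:      # barely any movement
        return "standing"
    if f["mean_x"] > 2.0:       # sustained lateral acceleration (right)
        return "move_right"
    if f["mean_x"] < -2.0:      # sustained lateral acceleration (left)
        return "move_left"
    if f["mean_mag"] > 12.0:    # strong vertical impulse
        return "jump"
    return "squat"
```

A real implementation would replace the hand-tuned thresholds with a model trained on labeled sensor recordings and would run the classification on overlapping windows to keep the in-game latency low.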

Keywords

H.5.2. [Information interfaces and presentation (e.g. HCI)]: User Interfaces - Input devices and strategies; Interaction styles; I.2.1 [Artificial Intelligence]: Applications and Expert Systems - Games

Related Project

URBY.SENSE - Urban mobility analysis and prediction for non-routine scenarios using digital footprints. PTDC/ECM-TRA/6803/2014

Conference

19th International Conference on Human-Computer Interaction with Mobile Devices and Services, Demo Panel, September 2017
