Datasets and Analyses for "Affect Recognition using Psychophysiological Correlates in High Intensity VR Exergaming"
Datasets and analyses for the paper "Affect Recognition using Psychophysiological Correlates in High Intensity VR Exergaming" published at CHI 2020.
We present the datasets of two experiments that investigate the use of different sensors for affect recognition in a VR exergame. The first experiment compares the impact of physical exertion and gamification on psychophysiological measurements during rest, conventional exercise, VR exergaming, and sedentary VR gaming. The second experiment compares underwhelming, overwhelming and optimal VR exergaming scenarios. We identify gaze fixations, eye blinks, pupil diameter and skin conductivity as psychophysiological measures suitable for affect recognition in VR exergaming and analyse their utility in determining affective valence and arousal. Our findings provide guidelines for researchers of affective VR exergames.
The datasets and analyses consist of the following:
1. two CSV sheets containing the quantitative and qualitative data of Experiments I and II;
2. two JASP files with ANOVAs and t-tests for Experiments I and II;
3. two R scripts with correlation and regression analyses for Experiments I and II.
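The CSV sheets can be loaded into most analysis environments; as a minimal illustration, a Python sketch (the filename below is hypothetical — use the actual sheet names from Supplement.zip):

```python
# Sketch: loading one of the dataset's CSV sheets into Python.
# The path "experiment1.csv" is a hypothetical placeholder.
import csv

def load_sheet(path):
    """Read a CSV sheet into a list of row dictionaries keyed by the header row."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))
```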
Cite this dataset as:
Barathi, S., Proulx, M., O'Neill, E., and Lutteroth, C., 2020. Datasets and Analyses for "Affect Recognition using Psychophysiological Correlates in High Intensity VR Exergaming". Bath: University of Bath Research Data Archive. Available from: https://doi.org/10.15125/BATH-00758.
Data
Supplement.zip
application/zip (364kB)
Creative Commons: Attribution 4.0
Creators
Soumya Chinnachamy Barathi
University of Bath
Michael Proulx
University of Bath
Eamonn O'Neill
University of Bath
Christof Lutteroth
University of Bath
Contributors
University of Bath (Rights Holder)
Documentation
Data collection method:
We used a Lode Excalibur Sport exercise bike and a FOVE HMD. They were connected to a PC running Unity with an Intel Xeon E5 2680 processor, 64 GB of RAM, and two NVIDIA Titan X graphics cards. We measured blink rate in blinks per minute (Blinks) with the eye gaze tracker built into the FOVE HMD, recording pupillometry data with FOVE's Unity plugin at 160 Hz and counting blinks as periods of zero pupil diameter. We measured tonic skin conductance (Conductivity) in microsiemens (μS) at 128 Hz using the Shimmer3 Consensys GSR development kit. Furthermore, we recorded the average power output (Power) in Watts during the sprint phases of each condition.
Experiment I: We collected ground truth data for affect based on validated post-condition questionnaires. We measured intrinsic motivation with the Intrinsic Motivation Inventory (IMI), using its main Interest/Enjoyment subscale (IMI Enjoy), which is scored from 1 to 7, with 7 being the highest intrinsic motivation score. Participants performed each of the four conditions: B (Baseline), G (Game), E (Exercise) and EG (Exergame). After conditions G, E and EG, participants completed the IMI and left qualitative feedback about their experience.
Experiment II: In addition to recording Conductivity, Blinks and Power to determine affective state, we recorded the total time of eye gaze fixations (Fixations) on visual components of the game: the competitor, the gap between the player and the competitor, the points, prompts, the displayed RPM and the timer. We used ray casting to detect the game components corresponding to a point of gaze. A low Fixations value indicates that the player was looking more at the peripheral VR environment or 'staring at nothing' instead of paying attention to the game. We also recorded a participant's pupil dilation (Pupil) during the warm-up and in each of the two sprints, taking their average.
Similar to Experiment I, we used the IMI Interest/Enjoyment subscale (IMI Enjoy) to measure intrinsic motivation. Lastly, we used the experience sampling method integrated into the exergame to collect ground truth values about the player's affective state, considering the average of all values measured in a condition (Affect). We matched the sensor data and the ground truth by averaging the sensor data and the experience sampling measures over a whole gameplay session.
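The blink-counting approach described above (periods of zero pupil diameter in the 160 Hz pupillometry stream) can be sketched as follows; the function name and the minimum-duration threshold are our own assumptions for illustration, not part of the published pipeline:

```python
# Sketch: counting blinks as contiguous runs of zero pupil diameter,
# assuming a sequence of diameter samples recorded at 160 Hz.
# The 50 ms minimum duration is a hypothetical noise filter.

def count_blinks(pupil_diameters, sample_rate_hz=160, min_duration_s=0.05):
    """Count contiguous zero-diameter runs lasting at least min_duration_s."""
    min_samples = int(min_duration_s * sample_rate_hz)
    blinks = 0
    run = 0  # length of the current run of zero-diameter samples
    for d in pupil_diameters:
        if d == 0:
            run += 1
        else:
            if run >= min_samples:
                blinks += 1
            run = 0
    if run >= min_samples:  # run that extends to the end of the recording
        blinks += 1
    return blinks
```

Dividing the result by the recording length in minutes would yield the Blinks measure (blinks per minute).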
Technical details and requirements:
We used a Lode Excalibur Sport exercise bike and a FOVE HMD. They were connected to a PC running Unity with an Intel Xeon E5 2680 processor, 64 GB of RAM, and two NVIDIA Titan X graphics cards. We measured blink rate in blinks per minute (Blinks) with the eye gaze tracker built into the FOVE HMD, recording pupillometry data with FOVE's Unity plugin at 160 Hz and counting blinks as periods of zero pupil diameter. We measured tonic skin conductance (Conductivity) in microsiemens (μS) at 128 Hz using the Shimmer3 Consensys GSR development kit. Furthermore, we recorded the average power output (Power) in Watts during the sprint phases of each condition. For Experiment II, in addition to recording Conductivity, Blinks and Power to determine affective state, we recorded the total time of eye gaze fixations (Fixations) on visual components of the game: the competitor, the gap between the player and the competitor, the points, prompts, the displayed RPM and the timer. JASP statistics software (https://jasp-stats.org/) and the R programming language were used for data analysis. For the R scripts, we recommend the RStudio IDE (https://rstudio.com/).
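The correlation analyses are provided as R scripts; purely as an illustration of the core computation they perform, a minimal Pearson correlation in Python (variable names and data are hypothetical):

```python
# Sketch: Pearson correlation coefficient between a sensor measure and a
# ground-truth measure, e.g. Conductivity vs. Affect (illustrative only).
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```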
Methodology link:
Barathi, S. C., Proulx, M., O'Neill, E., and Lutteroth, C., 2020. Affect Recognition using Psychophysiological Correlates in High Intensity VR Exergaming. In: Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. ACM. Available from: https://doi.org/10.1145/3313831.3376596.
Documentation Files
readme.txt
text/plain (1kB)
Creative Commons: Attribution 4.0
Funders
H2020 Marie Skłodowska-Curie Actions
https://doi.org/10.13039/100010665
Fellow for Industrial Research Enhancement (FIRE)
665992
Engineering and Physical Sciences Research Council
https://doi.org/10.13039/501100000266
EPSRC Centre for Doctoral Training in Digital Entertainment
EP/L016540/1
Engineering and Physical Sciences Research Council
https://doi.org/10.13039/501100000266
Centre for the Analysis of Motion, Entertainment Research and Applications (CAMERA)
EP/M023281/1
Publication details
Publication date: 15 January 2020
by: University of Bath
Version: 1
DOI: https://doi.org/10.15125/BATH-00758
URL for this record: https://researchdata.bath.ac.uk/id/eprint/758
Related papers and books
Barathi, S. C., Proulx, M., O'Neill, E., and Lutteroth, C., 2020. Affect Recognition using Psychophysiological Correlates in High Intensity VR Exergaming. In: Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. ACM. Available from: https://doi.org/10.1145/3313831.3376596.
Contact information
Please contact the Research Data Service in the first instance for all matters concerning this item.
Contact person: Christof Lutteroth
Faculty of Science
Computer Science
Research Centres & Institutes
Centre for Digital Entertainment (CDE)