Dataset for "Touché: Data-Driven Interactive Sword Fighting in Virtual Reality"

This is the data repository for the paper "Touché: Data-Driven Interactive Sword Fighting in Virtual Reality" by Javier Dehesa, Andrew Vidler, Christof Lutteroth and Julian Padget, presented at the CHI 2020 conference in Honolulu, HI, USA. See the publication for details.

The archives gesture_detection_data.zip and gesture_detection_code.zip contain the data and code for the gesture recognition component, respectively. Similarly, animation_data.zip and animation_code.zip contain the data and code for the animation component. Instructions on how to use them are provided within each archive.

The archive user_studies.zip contains information about our user studies. The files questionnaire_study.jasp and interactive_study.jasp contain the data and analysis of the questionnaire and interactive studies, respectively. They can be opened with the open-source tool JASP (https://jasp-stats.org/). The video questionnaire_conditions.mp4 shows the full videos used as the three conditions of the questionnaire study.

Keywords:
animation, gesture recognition, virtual reality, human-computer interaction, machine learning
Cite this dataset as:
Dehesa, J., Ninja Theory Ltd, 2020. Dataset for "Touché: Data-Driven Interactive Sword Fighting in Virtual Reality". Bath: University of Bath Research Data Archive. Available from: https://doi.org/10.15125/BATH-00754.

Data

gesture_detection_data.zip
application/zip (5MB)
Creative Commons: Attribution 4.0

Data for the gesture recognition module of the system.

animation_data.zip
application/zip (77MB)
Creative Commons: Attribution 4.0

Data for the animation synthesis module of the system.

user_studies.zip
application/zip (430MB)
Creative Commons: Attribution 4.0

Data from the user studies and video showing the three conditions used in the questionnaire study.

Code

gesture_detection_code.zip
application/zip (93kB)
Creative Commons: Attribution 4.0

Code for the gesture recognition module of the system.

animation_code.zip
application/zip (19MB)
Creative Commons: Attribution 4.0

Code for the animation synthesis module of the system.

Creators

Javier Dehesa
University of Bath

Ninja Theory Ltd

Contributors

Andrew Vidler
Supervisor
Ninja Theory Ltd

Christof Lutteroth
Supervisor
University of Bath

Julian Padget
Supervisor
University of Bath

University of Bath
Rights Holder

Documentation

Data collection method:

Gesture recognition data was collected with VR hardware in a custom-made virtual scenario: the subject was shown a signal indicating a gesture to perform, which they then performed while pressing a button on the hand controller. Animation data was motion-captured with specialised equipment at the facilities of Ninja Theory Ltd. User study data was collected through online forms completed after each condition of each study.

Data processing and preparation activities:

The user study data was preprocessed for convenience to produce an accessible JASP file. The preprocessing simply translated the raw text of the questions into short identifiers and mapped Likert points (e.g. "strongly agree/disagree") to numerical values. For this publication, free-text comment data was removed from the dataset for anonymisation purposes.
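The Likert mapping described above can be sketched as follows. This is an illustrative example only: the exact scale labels and numeric coding used in the published JASP file are assumptions, not taken from the dataset itself.

```python
# Hypothetical Likert-to-number mapping, illustrating the kind of
# preprocessing applied to the user study responses. The labels and
# 1-5 coding below are assumed, not read from the dataset.
LIKERT_SCALE = {
    "Strongly disagree": 1,
    "Disagree": 2,
    "Neutral": 3,
    "Agree": 4,
    "Strongly agree": 5,
}

def encode_responses(rows):
    """Map Likert labels to numbers; leave any other values untouched."""
    return [
        {question: LIKERT_SCALE.get(answer, answer)
         for question, answer in row.items()}
        for row in rows
    ]

print(encode_responses([{"Q1": "Agree", "Q2": "Strongly disagree"}]))
# → [{'Q1': 4, 'Q2': 1}]
```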

Technical details and requirements:

Gesture recognition data was captured with an Oculus Rift kit. Animation data was captured with Vicon Bonita hardware.

Additional information:

Gesture recognition data is stored in CSV files where each row contains the position and orientation of both hands and the gesture being performed on each frame. Animation data is stored in CSV files where each row contains the position and orientation of each joint of the skeleton. User studies data is stored in JASP files encoding the answers of each participant to each question in the study.
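A reader for the gesture recognition CSV files might look like the sketch below. The column names are purely illustrative (the archives' own instructions define the actual layout); what the example shows is the per-frame structure described above: position and orientation of both hands plus the active gesture label, one row per frame.

```python
import csv
import io

# Hypothetical column layout for a gesture recognition CSV; the real
# field names are documented inside gesture_detection_data.zip.
SAMPLE = io.StringIO(
    "left_px,left_py,left_pz,left_qx,left_qy,left_qz,left_qw,"
    "right_px,right_py,right_pz,right_qx,right_qy,right_qz,right_qw,gesture\n"
    "0.1,1.2,0.3,0,0,0,1,0.2,1.1,0.4,0,0,0,1,slash\n"
)

frames = []
for row in csv.DictReader(SAMPLE):
    frames.append({
        # Hand positions as (x, y, z) tuples of floats.
        "left_pos": tuple(float(row[k]) for k in ("left_px", "left_py", "left_pz")),
        "right_pos": tuple(float(row[k]) for k in ("right_px", "right_py", "right_pz")),
        # Gesture label active in this frame.
        "gesture": row["gesture"],
    })

print(frames[0]["gesture"])  # → slash
```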

Documentation Files

README.txt
text/plain (1kB)
Creative Commons: Attribution 4.0

General information.

Funders

Engineering and Physical Sciences Research Council (EPSRC)
https://doi.org/10.13039/501100000266

EPSRC Centre for Doctoral Training in Digital Entertainment
EP/L016540/1

Publication details

Publication date: 6 January 2020
by: University of Bath

Version: 1

DOI: https://doi.org/10.15125/BATH-00754

URL for this record: https://researchdata.bath.ac.uk/id/eprint/754

Contact information

Please contact the Research Data Service in the first instance for all matters concerning this item.

Contact person: Javier Dehesa

Departments:

Faculty of Science
Computer Science

Research Centres & Institutes
Centre for Digital Entertainment (CDE)