Dataset for "A Novel Neural Network Architecture with Applications to 3D Animation and Interaction in Virtual Reality"

This is the dataset for the doctoral thesis "A Novel Neural Network Architecture with Applications to 3D Animation and Interaction in Virtual Reality" by Javier de la Dehesa Cueto-Felgueroso. See the original document for details.

The dataset is structured in three parts. The files `gfnn_code.zip` and `gfnn_data.zip` contain the code and data for the experiments with grid-functioned neural networks discussed in chapter 3 of the thesis. The files `quadruped_code.zip` and `quadruped_data.zip` contain the code and data for the quadruped locomotion experiments and user study discussed in chapter 4. The files `framework_code.zip` and `framework_data.zip` contain the code and data for the human-character interaction framework experiments and user studies discussed in chapter 5. Each pair of files should be decompressed into the same directory, separate from the directories of the other parts. Further details and instructions for each part can be found within the corresponding compressed files.

Keywords:
machine learning, neural networks, animation, gesture recognition, virtual reality, human-computer interaction

Cite this dataset as:
De La Dehesa Cueto-Felgueroso, J., 2020. Dataset for "A Novel Neural Network Architecture with Applications to 3D Animation and Interaction in Virtual Reality". Bath: University of Bath Research Data Archive. Available from: https://doi.org/10.15125/BATH-00752.

Data

gfnn_data.zip
application/zip (89kB)
Creative Commons: Attribution 4.0

Data for the grid-functioned neural network experiments.

quadruped_data.zip
application/zip (667MB)
Creative Commons: Attribution 4.0

Data for the quadruped locomotion experiments and results of the user study.

framework_data.zip
application/zip (514MB)
Creative Commons: Attribution 4.0

Data for the human-character interaction framework experiments and results of the user studies.

Code

gfnn_code.zip
application/zip (32kB)
Software: MIT License

Code for the grid-functioned neural network experiments.

quadruped_code.zip
application/zip (21MB)
Software: MIT License

Code for the quadruped locomotion experiments.

framework_code.zip
application/zip (19MB)
Software: MIT License

Code for the human-character interaction framework experiments.

Contributors

Julian Padget
Supervisor
University of Bath

Christof Lutteroth
Supervisor
University of Bath

Andrew Vidler
Supervisor
Ninja Theory

University of Bath
Rights Holder

Documentation

Data collection method:

Synthetic evaluation data was generated programmatically. Quadruped locomotion data was extracted from the dataset published alongside the article "Mode-Adaptive Neural Networks for Quadruped Motion Control" by Zhang et al. (2018). Gesture recognition data was collected with VR hardware in a custom-made virtual scenario: the subject was shown a signal indicating a gesture to perform, which they then performed while pressing a button on the hand controller. Sword fighting animation data was motion captured with specialised equipment at the facilities of Ninja Theory, Ltd. User study data was collected through online forms completed after each condition of each study.

Data processing and preparation activities:

Quadruped locomotion data was extracted from the dataset published alongside the article "Mode-Adaptive Neural Networks for Quadruped Motion Control" by Zhang et al. (2018). A selection of the original dataset was converted from the original BVH format into CSV and TensorFlow TFRecord formats.
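
As a minimal sketch of this kind of conversion (not the exact pipeline or encoding used in the thesis), the Python snippet below packs per-frame joint values from a CSV file into a TFRecord file with TensorFlow. The file names and the single flat "frame" feature are illustrative assumptions.

# Minimal sketch: pack per-frame joint values from a CSV file into a TFRecord
# file. File names and the single "frame" feature are illustrative; they do
# not reflect the exact encoding described in the thesis.
import csv

import tensorflow as tf


def frame_to_example(values):
    """Wrap one frame (a list of floats) as a tf.train.Example."""
    feature = {
        "frame": tf.train.Feature(float_list=tf.train.FloatList(value=values)),
    }
    return tf.train.Example(features=tf.train.Features(feature=feature))


def csv_to_tfrecord(csv_path, tfrecord_path):
    """Write every CSV row (one frame per row) as a serialised Example."""
    with open(csv_path, newline="") as f, tf.io.TFRecordWriter(tfrecord_path) as writer:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        for row in reader:
            example = frame_to_example([float(v) for v in row])
            writer.write(example.SerializeToString())


if __name__ == "__main__":
    # Hypothetical file names, used here only for illustration.
    csv_to_tfrecord("quadruped_frames.csv", "quadruped_frames.tfrecord")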

Technical details and requirements:

Gesture recognition data was captured with an Oculus Rift kit. Animation data was captured with Vicon Bonita hardware.

Additional information:

Synthetic evaluation data is stored in TFRecord format and includes the input and output values for each example. Quadruped locomotion data is provided both in CSV, giving the configuration of the character joints in each frame, and in TFRecord format, encoded as described in the thesis. Gesture recognition data is stored in CSV files where each row contains the position and orientation of both hands and the gesture being performed in each frame. Animation data is stored in CSV files where each row contains the position and orientation of each joint of the skeleton. User study data is stored in JASP files encoding each participant's answers to each question in the study.
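
As a minimal sketch of how one of the per-frame CSV files might be loaded, the snippet below reads a gesture recording with pandas, assuming (only for illustration) that the last column holds the gesture label and the remaining columns are numeric pose values; the actual column layout is documented inside the corresponding compressed files.

# Minimal sketch: load one gesture-recognition CSV file. The assumption that
# the last column is the gesture label and all other columns are numeric pose
# values is illustrative; see the instructions inside the compressed files.
import pandas as pd


def load_gesture_file(path):
    """Read a per-frame gesture recording and split pose features from labels."""
    frames = pd.read_csv(path)
    features = frames.iloc[:, :-1].to_numpy(dtype=float)  # per-frame hand poses
    labels = frames.iloc[:, -1].to_numpy()                # per-frame gesture label
    return features, labels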

Funders

Engineering and Physical Sciences Research Council
https://doi.org/10.13039/501100000266

EPSRC Centre for Doctoral Training in Digital Entertainment
EP/L016540/1

Publication details

Publication date: 20 November 2020
by: University of Bath

Version: 1

DOI: https://doi.org/10.15125/BATH-00752

URL for this record: https://researchdata.bath.ac.uk/id/eprint/752

Related theses

Dehesa, J., 2021. A Novel Neural Network Architecture with Applications to 3D Animation and Interaction in Virtual Reality. Thesis (Doctor of Engineering (EngD)). University of Bath. Available from: https://researchportal.bath.ac.uk/en/studentTheses/a-novel-neural-network-architecture-with-applications-to-3d-anima.

Contact information

Please contact the Research Data Service in the first instance for all matters concerning this item.

Contact person: Javier De La Dehesa Cueto-Felgueroso

Departments:

Faculty of Science
Computer Science

Research Centres & Institutes:
Centre for Digital Entertainment (CDE)