Dataset for, "An RCT study showing few weeks of music lessons enhance audio-visual temporal processing"

This dataset includes behavioural outcome data for the audio-visual simultaneity judgement task and the emotion recognition task used in the publication, "An RCT study showing few weeks of music lessons enhance audio-visual temporal processing". In this study, the authors investigated the effect of eleven weeks of piano lessons on audio-visual temporal processing and emotion recognition abilities in adults. The data are organised to facilitate replication of the analyses carried out in the study and include the raw data for the two tasks, collected from each participant over seven data-collection sessions. A 'Read-me-first' file in each data folder introduces the structure of the data, the meaning of the file names, and how to interpret the raw data.

Keywords:
Audio-visual processing, Emotion recognition, Temporal processing, Music lessons
Subjects:
Psychology

Cite this dataset as:
Che, Y., 2025. Dataset for, "An RCT study showing few weeks of music lessons enhance audio-visual temporal processing". Bath: University of Bath Research Data Archive. Available from: https://doi.org/10.15125/BATH-01216.

Data

Simultaniety Judgement task.zip
application/zip (826kB)
Creative Commons: Attribution 4.0

This dataset includes data on behavioural outcomes for the audio-visual simultaneity judgement task used in the publication, "An RCT study showing few weeks of music lessons enhance audio-visual temporal processing".

Emotion recognition task.zip
application/zip (1MB)
Creative Commons: Attribution 4.0

This dataset includes data on behavioural outcomes for the emotion recognition task used in the publication, "An RCT study showing few weeks of music lessons enhance audio-visual temporal processing".

Creators

Yuqing Che
University of Bath

Contributors

Crescent Jicol
Researcher
University of Bath

Karin Petrini
Supervisor
University of Bath

Chris Ashwin
Supervisor
University of Bath

University of Bath
Rights Holder

Documentation

Data collection method:

This dataset contains data for behavioural outcomes from the audio-visual simultaneity judgement (SJ) and emotion recognition (ER) tasks described in the paper "An RCT study showing few weeks of music lessons enhance audio-visual temporal processing". In the SJ task, participants judged by key press whether the presented auditory and visual cues were synchronised. The SJ task included two types of audio-visual cue pairs: flash and beep, and face and voice. In the ER task, participants made emotional judgements about dynamic facial expression stimuli, classifying each as joy, sadness, fear, anger, disgust, surprise, or neutral with a speeded mouse click on the target emotion. Three levels of emotional intensity (low, medium, and high) were included for all emotions except neutral.

Participants were screened before being recruited into the study so that only non-musician adults with normal or corrected-to-normal vision and hearing were included. This study used a parallel-group RCT design. We did not include blinding, as the design required participants' active involvement in certain conditions: the experimenter had to know and run the sessions, and also served as the trainer. However, the experimenter had no control over group allocation, as participants were randomly assigned to their group at the beginning of the study.
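A common first step with simultaneity judgement data of this kind is to summarise, for each audio-visual offset (stimulus onset asynchrony, SOA), the proportion of trials judged synchronous. Purely as an illustration (this is not part of the archived dataset, and the hypothetical two-column trial layout assumed here should be checked against the Read-me-first.txt in the SJ folder), such a summary might be sketched as:

```python
# Illustrative sketch only: assumes a hypothetical tab-separated layout of
# one trial per line, "SOA_ms<TAB>judged_synchronous(0/1)". The real file
# format is documented in the SJ folder's Read-me-first.txt.
from collections import defaultdict

def proportion_synchronous(lines):
    """Proportion of 'synchronous' responses at each audio-visual offset."""
    counts = defaultdict(lambda: [0, 0])  # soa -> [n_synchronous, n_total]
    for line in lines:
        soa, judged = line.strip().split("\t")
        counts[int(soa)][0] += int(judged)
        counts[int(soa)][1] += 1
    return {soa: sync / total for soa, (sync, total) in sorted(counts.items())}

# Example with made-up trials
trials = ["0\t1", "0\t1", "200\t0", "200\t1", "-200\t0", "-200\t0"]
print(proportion_synchronous(trials))  # {-200: 0.0, 0: 1.0, 200: 0.5}
```

A response curve of this form is what downstream measures such as the point of subjective simultaneity are typically fitted to.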

Data processing and preparation activities:

All the data in this dataset has been anonymised prior to sharing.

Technical details and requirements:

The simultaneity judgement task data are stored as .txt files, so no special software is required to view them. The emotion recognition task data are also stored as .txt files; however, MATLAB is required to process these files and obtain the derived measures of interest (average accuracy and reaction time for correct responses). Details of the data processing are provided in the Read-me-first.txt in the emotion recognition task folder. The MATLAB script used for data processing (Emotion_analysis.m) is also included in the same folder.
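The summary measures named above (average accuracy, and mean reaction time over correct responses) can be sketched in a few lines. The sketch below is illustrative only: it is written in Python rather than MATLAB, is not the Emotion_analysis.m script shipped with the dataset, and assumes a hypothetical column layout; the actual layout is documented in the folder's Read-me-first.txt.

```python
# Illustrative sketch only: NOT the Emotion_analysis.m script in the dataset.
# Assumes a hypothetical tab-separated layout of one trial per line:
# "target_emotion<TAB>response<TAB>reaction_time_ms".
from statistics import mean

def summarise_trials(lines):
    """Return (accuracy, mean RT of correct responses in ms) for one file."""
    correct_rts, n_correct, n_total = [], 0, 0
    for line in lines:
        target, response, rt = line.strip().split("\t")
        n_total += 1
        if response == target:
            n_correct += 1
            correct_rts.append(float(rt))
    accuracy = n_correct / n_total
    mean_rt = mean(correct_rts) if correct_rts else float("nan")
    return accuracy, mean_rt

# Example with made-up trials: 2 of 3 correct, correct RTs 850 and 910 ms
trials = ["joy\tjoy\t850", "fear\tanger\t1020", "sadness\tsadness\t910"]
acc, rt = summarise_trials(trials)
print(acc, rt)  # 0.6666... 880.0
```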

Additional information:

Information on the layout of the data is provided in the Read-me-first.txt in each folder.

Funders

Self-funded

Publication details

Publication date: 22 November 2025
by: University of Bath

Version: 1

DOI: https://doi.org/10.15125/BATH-01216

URL for this record: https://researchdata.bath.ac.uk/1216

Related papers and books

Che, Y., Jicol, C., Ashwin, C., and Petrini, K., 2022. An RCT study showing few weeks of music lessons enhance audio-visual temporal processing. Scientific Reports, 12(1). Available from: https://doi.org/10.1038/s41598-022-23340-4.

Contact information

Please contact the Research Data Service in the first instance for all matters concerning this item.

Contact person: Yuqing Che

Departments:

Faculty of Humanities & Social Sciences
Psychology

Research Centres & Institutes
Centre for Applied Autism Research