Dataset for "Understanding Freehand Cursorless Pointing Variability and Its Impact on Selection Performance"
This dataset supports the journal article "Understanding Freehand Cursorless Pointing Variability and Its Impact on Selection Performance" (TOCHI, 2025). It contains motion capture data of the body and hands, captured during a range of pointing gestures performed by 23 participants.
The user study that captured this data systematically explored how target position (3 rows by 5 columns), task focus (pointing as a primary task vs. a secondary task), and user effort (accurate vs. casual pointing) affect pointing behaviour and performance.
The dataset includes:
- Motion capture data for each trial (grouped by participant). This contains body landmarks, captured via a markerless motion capture system, and finger landmarks, tracked with infrared markers.
- Trial Annotations. Metadata for each trial, such as the target position, labels for when pointing occurs, and observed behaviour labels.
- Encoded gesture statistics. For each trial for which a valid pointing gesture could be extracted, an encoding of the gesture performed, derived from the medians of body pose features (e.g. elbow flexion), fatigue measures (e.g. consumed endurance), and rays (e.g. vector and accuracy).
- Self-reported user data, including participant age, hand dominance, and fatigue measures (obtained after completing pointing within each condition).
- Code for 1) visualising the trials, including a subset of the rays used in our subsequent analysis, 2) generating our encoded gestures from the motion capture data and annotations, and 3) performing the analysis over our encoded gestures. A minimal loading sketch for the data files follows this list.
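The exact fields within each file are documented in the README. As a starting point, the snippet below is a minimal sketch of loading the three top-level data files with Python and pandas, assuming they sit in the current working directory; it only inspects the columns and keys rather than assuming any particular names.

```python
# Minimal sketch: load the top-level dataset files and inspect their contents.
# Assumes pandas is installed and the files are in the working directory;
# see the README for the authoritative field descriptions.
import json

import pandas as pd

# Encoded gesture statistics: one row per trial with a valid pointing gesture.
encoded = pd.read_csv("EncodedGestures_FINAL.csv")
print(encoded.shape, list(encoded.columns)[:10])

# Per-trial metadata: target position, pointing labels, observed behaviours.
with open("GestureAnnotations.json") as f:
    annotations = json.load(f)
print(type(annotations))

# Self-reported data: age, hand dominance, per-condition fatigue measures.
questionnaires = pd.read_csv("SubjectiveQuestionnaires.csv")
print(questionnaires.head())
```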
This dataset has been provided for two purposes:
1. For further investigation into pointing behaviour and for the development of pointing interaction systems. For this, please refer to the Pointing Dataset section of the README to understand the structure and dataset contents, and the Trial Visualiser section of the README for usage of a script for visualising the motion capture data.
2. For reproduction of the data used in the analysis of the accompanying paper (Understanding Freehand Cursorless Pointing Variability and Its Impact on Selection Performance). For this, please see the Pointing Dataset section of the README to understand the structure and dataset contents, along with the Gesture Encoder and Analysis Script sections of the README for the code used to perform our analysis.
Cite this dataset as:
Whiffing, J., Langlotz, T., Lutteroth, C., Sharma, A., and Clarke, C., 2025. Dataset for "Understanding Freehand Cursorless Pointing Variability and Its Impact on Selection Performance". Bath: University of Bath Research Data Archive. Available from: https://doi.org/10.15125/BATH-01594.
Data
AverageCameraTr … ngResiduals.txt
text/plain (1kB)
Creative Commons: Attribution 4.0
The camera tracking residuals, averaged over each run of the study, from the Qualisys motion capture system.
EncodedGestures_FINAL.csv
text/csv (5MB)
Creative Commons: Attribution 4.0
Encoded gesture statistics. For each trial for which a valid pointing gesture could be extracted, an encoding of the gesture performed, derived from the medians of body pose features (e.g. elbow flexion), fatigue measures (e.g. consumed endurance), and rays (e.g. vector and accuracy). See README for descriptions of specific fields.
GestureAnnotations.json
text/plain (11MB)
Creative Commons: Attribution 4.0
Trial Annotations. Metadata for each trial, such as the target position, labels for when pointing occurs, and observed behaviour labels. See README for description of file contents.
SubjectiveQuestionnaires.csv
text/csv (9kB)
Creative Commons: Attribution 4.0
Self-reported user data, including participant age, hand dominance, and fatigue measures (obtained after completing pointing within each condition). See README for descriptions of the fields within the file.
trials.zip
application/zip (8GB)
Creative Commons: Attribution 4.0
Motion capture data for each trial (grouped by participant). This contains body landmarks, captured via a markerless motion capture system, and finger landmarks, tracked with infrared markers.
Code
code.zip
application/zip (139kB)
Creative Commons: Attribution 4.0
A zip file containing the supporting code and scripts. This includes code for 1) visualising the trials, including a subset of the rays used in our subsequent analysis, 2) generating our encoded gestures from the motion capture data and annotations, and 3) performing the analysis over our encoded gestures.
Creators
James Whiffing
University of Bath
Tobias Langlotz
Aarhus University
Christof Lutteroth
University of Bath
Adwait Sharma
University of Bath
Christopher Clarke
University of Bath
Contributors
University of Bath (Rights Holder)
Coverage
Collection date(s):
From 18 March 2024 to 9 May 2024
Documentation
Data collection method:
For the complete methodology used in the data collection, please refer to the paper. We used a repeated measures within-subject design with Pointing Style (accurate or casual) and Focus (focused or distracted) as independent variables (IVs). We asked participants to "point as accurately and precisely as possible" for the accurate pointing condition. In contrast, in the casual pointing condition participants were instructed to "point as casually and relaxed as possible...". The distracted condition involved participants completing a Stroop effect test while pointing to targets, while the focused condition had pointing as the primary and only task.
Technical details and requirements:
Please refer to the README file for instructions on visualising the recorded pointing gestures or using any of the other provided scripts, and for an explanation of the dataset structure and fields within specific files. For the technical details of the study setup and data collection, please refer to the paper. In summary, we utilised a set of 135 targets, grouped into 15 clusters of 9 (3×3) and arranged into a 3 × 5 (rows × columns) array, with each target within a cluster spaced 8.4 cm apart. The middle row was located 1.4 m from the floor and 2 m away from the participant, and the top and bottom rows were pitched ±25° from the middle row. Each column was yawed ±35° relative to the adjacent column. We used 12 Arqus infrared (IR) tracking cameras and 10 Miqus cameras capturing RGB images at 1080p (4:3), with recording and marker tracking managed by QTM. The cameras provided coverage of a 4 m wide × 3 m deep × 2.5 m tall volume, within which the participant was placed 2 m from the shorter edges and ∼1 m from the long edge. The system was calibrated at the start of each day, with an average residual of 0.732 mm and standard deviation of 0.174 mm. We used twenty-eight 6.5 mm IR reflective markers to track all fingers on both hands; two for each finger and four to capture the palm and wrist. To aid in the tracking of the hands, we employed QTM's AIM models and skeleton-assisted labelling. All sensing apparatus sampled data at 100 Hz.
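For orientation, the sketch below reconstructs approximate cluster-centre positions implied by the layout description above. It is illustrative only: the assumption that pitch and yaw are applied about the participant's position at a constant 2 m radius, and the choice of coordinate axes, are ours rather than taken from the paper.

```python
# Illustrative sketch of the 3 x 5 cluster-centre layout described above.
# Assumptions not stated in this record: pitch and yaw rotate about the
# participant's position at a constant 2 m radius, the participant stands at
# the origin, and the middle row sits 1.4 m above the floor.
import math

RADIUS_M = 2.0                    # distance from participant to the middle row
MIDDLE_ROW_HEIGHT_M = 1.4
PITCHES_DEG = (25, 0, -25)        # top, middle, bottom rows
YAWS_DEG = (-70, -35, 0, 35, 70)  # columns, 35 degrees between neighbours

for row, pitch in enumerate(PITCHES_DEG):
    for col, yaw in enumerate(YAWS_DEG):
        p, y = math.radians(pitch), math.radians(yaw)
        # +Y points forward from the participant, +X to their right, +Z up.
        x = RADIUS_M * math.cos(p) * math.sin(y)
        fwd = RADIUS_M * math.cos(p) * math.cos(y)
        z = MIDDLE_ROW_HEIGHT_M + RADIUS_M * math.sin(p)
        print(f"row {row}, col {col}: x={x:+.2f} m, y={fwd:.2f} m, z={z:.2f} m")
```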
Additional information:
Motion capture data is grouped into zips for each participant, where each participant has been given a unique ID (GUUID). Each zip contains files representing each captured trial for the given participant, where the name of the file details the condition and trial number, e.g. ACCURATE_FOCUSED_3.csv, where 'ACCURATE_FOCUSED' is the condition and '3' is the trial number within that condition. Metadata for the trials is located within GestureAnnotations.json. Additionally, an encoding of each valid gesture can be found within EncodedGestures_FINAL.csv. Please refer to the dataset README file for an explanation of the dataset structure and fields within specific files.
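As an illustration of this structure, the sketch below iterates over the trial CSVs inside a single participant zip and recovers the condition and trial number from each file name. The zip name is a placeholder, and the structure of GestureAnnotations.json is documented in the README rather than assumed here.

```python
# Minimal sketch: read the trial CSVs from one participant's zip and parse the
# condition and trial number from each file name (e.g. ACCURATE_FOCUSED_3.csv).
import zipfile

import pandas as pd

participant_zip = "PARTICIPANT_ID.zip"  # placeholder; one zip per participant

with zipfile.ZipFile(participant_zip) as zf:
    for name in zf.namelist():
        if not name.endswith(".csv"):
            continue
        # "ACCURATE_FOCUSED_3.csv" -> condition "ACCURATE_FOCUSED", trial "3"
        stem = name.rsplit("/", 1)[-1][: -len(".csv")]
        condition, trial_number = stem.rsplit("_", 1)
        trial = pd.read_csv(zf.open(name))  # motion capture data for the trial
        print(condition, trial_number, trial.shape)

# Per-trial metadata (target position, pointing labels, observed behaviours)
# lives in GestureAnnotations.json; see the README for its fields.
```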
Methodology link:
Whiffing, J., Langlotz, T., Lutteroth, C., Sharma, A., and Clarke, C., 2025. Understanding Freehand Cursorless Pointing Variability and Its Impact on Selection Performance. ACM Transactions on Computer-Human Interaction. Available from: https://doi.org/10.1145/3770583.
Documentation Files
README.md
text/plain (22kB)
Creative Commons: Attribution 4.0
A README for the dataset, explaining the contents of the provided files and usage of the accompanying scripts.
Templates
PointingBlockQuestionnaire.pdf
application/pdf (136kB)
Creative Commons: Attribution 4.0
Questionnaires used for collecting self-reported fatigue measures after each condition was completed.
Funders
Engineering and Physical Sciences Research Council
https://doi.org/10.13039/501100000266
DTP 2022-2024 University of Bath
EP/W524712/1
Engineering and Physical Sciences Research Council
https://doi.org/10.13039/501100000266
Centre for the Analysis of Motion, Entertainment Research and Applications (CAMERA) - 2.0
EP/T022523/1
Publication details
Publication date: 10 October 2025
by: University of Bath
Version: 1
DOI: https://doi.org/10.15125/BATH-01594
URL for this record: https://researchdata.bath.ac.uk/1594
Related papers and books
Whiffing, J., Langlotz, T., Lutteroth, C., Sharma, A., and Clarke, C., 2025. Understanding Freehand Cursorless Pointing Variability and Its Impact on Selection Performance. ACM Transactions on Computer-Human Interaction. Available from: https://doi.org/10.1145/3770583.
Contact information
Please contact the Research Data Service in the first instance for all matters concerning this item.
Contact person: James Whiffing
Faculty of Science
Computer Science
Research Centres & Institutes
Human-Computer Interaction