# Pointing Dataset
The data is organised into the following structure:
```
|- trial_data/
|  |- ${PARTICIPANT0}.zip
|  |- ${PARTICIPANT1}.zip
|  |   ...
|  \- ${PARTICIPANT22}.zip
|      # each participant zip contains:
|      |- ${CONDITION0}_0.csv
|      |- ${CONDITION0}_1.csv
|      |   ...
|      |- ${CONDITION0}_134.csv
|      |- ${CONDITION1}_0.csv
|      |- ${CONDITION1}_1.csv
|      |   ...
|      \- ${CONDITION3}_134.csv
|
|- GestureAnnoations.json
|- encoded_gestures.csv
\- SubjectiveQuestionnaires.csv
```
## Motion Capture and Pointing Gesture Data
Motion capture data is grouped into zips, one per participant, where each participant has been given a unique ID (UUID, e.g. `081cf3d4-a0ee-4c4f-abc9-d28ca934c23e.zip`). Each zip contains one file per captured trial for that participant, where the file name gives the condition and trial number, e.g. `ACCURATE_FOCUSED_3.csv`, where 'ACCURATE_FOCUSED' is the condition and '3' is the trial number within that condition.
Trials for which the recording could not be saved or captured are not present within the dataset, i.e. there are no empty files.
For a detailed explanation of the data collection methodology, please refer to the accompanying paper.
Within each trial we have the following features (a loading sketch follows the list):
- Hand and Finger landmarks
- Body landmarks
- Eye and Gaze
- Pointing ray vectors
- Frame Number and Timestamp (seconds from recording start)
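A minimal sketch of reading a trial straight out of a participant zip, assuming `pandas` is installed (see the Python Environment section under Supporting Scripts); the example participant and trial names are those used above:
```python
import zipfile

import pandas as pd

participant_zip = "trial_data/081cf3d4-a0ee-4c4f-abc9-d28ca934c23e.zip"

with zipfile.ZipFile(participant_zip) as zf:
    # Each entry is one trial, named ${CONDITION}_${TRIAL}.csv.
    trial_names = [n for n in zf.namelist() if n.endswith(".csv")]
    # Load a single trial into a DataFrame.
    with zf.open("ACCURATE_FOCUSED_3.csv") as f:
        trial = pd.read_csv(f)

print(trial.columns.tolist())  # inspect the available feature columns
```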
### Hand Marker Data
These are the markers placed on both of the participant's hands, tracked by the Qualisys motion capture system (QTM). They are composed of:
Marker Label | Marker Placement
-|-
**Fingers** |
`IndexTip` | On the nail of the index finger
`Index2` | On the Distal Interphalangeal Joint (DIP) of the index finger (top knuckle)
`MiddleTip` | On the nail of the middle finger
`Middle2` | On the Distal Interphalangeal Joint (DIP) of the middle finger (top knuckle)
`RingTip` | On the nail of the ring finger
`Ring2` | On the Distal Interphalangeal Joint (DIP) of the ring finger (top knuckle)
`PinkyTip` | On the nail of the pinky finger
`Pinky2` | On the Distal Interphalangeal Joint (DIP) of the pinky finger (top knuckle)
`ThumbTip` | On the nail of the thumb
`Thumb1` | On the Metacarpophalangeal Joint (MCP) of the thumb (middle knuckle)
**Hand** |
`HandIn` | On the Metacarpophalangeal Joint (MCP) of the index finger (base knuckle)
`HandOut` | On the Metacarpophalangeal Joint (MCP) of the pinky finger (base knuckle)
`WristIn` | On the protrusion on the wrist from the Radius bone (on the thumb side of the wrist)
`WristOut` | On the protrusion on the wrist from the Ulna bone (on the pinky side of the wrist)
These labels are prefixed with `LH_L` for the left hand, or `RH_R` for the right hand, and each marker has a field per axis coordinate (x, y, & z). Coordinates are given in the lab coordinate system.
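As a sketch of extracting one marker's trajectory from a loaded trial DataFrame: the exact axis-column naming should be checked against the CSV header, so the helper below simply gathers, in order, every column that shares the marker's label as a prefix.
```python
import numpy as np
import pandas as pd

def marker_trajectory(trial: pd.DataFrame, label: str) -> np.ndarray:
    """Return an (n_frames, 3) array for a marker, e.g. 'RH_RIndexTip'.

    Assumes exactly three axis columns (x, y, z) share the marker label
    as a prefix; check the CSV header for the exact naming convention.
    """
    cols = [c for c in trial.columns if c.startswith(label)]
    assert len(cols) == 3, f"expected 3 axis columns for {label}, got {cols}"
    return trial[cols].to_numpy(dtype=float)

# e.g. the right-hand index fingertip, in lab coordinates:
# index_tip = marker_trajectory(trial, "RH_RIndexTip")
```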
### Body Landmarks
These are the landmarks tracked via the Theia3d markerless motion capture system.
Landmark Label | Joint
-|-
**Upper Body** |
`head` | Midpoint of ears
`torso` | Base of neck
`uarm` | Shoulder
`larm` | Elbow
`hand` | Wrist
**Lower Body** |
`pelvis` | Midpoint of pelvis plane
`thigh` | Hip
`shank` | Knee
`foot` | Ankle
`toes` | Mid-foot
These labels are prefixed with `l_` if the landmark is on the left side of the body (relative to the participant), or `r_` for the right side. Each landmark has x, y, & z coordinates, along with a 4x4 pose matrix (columns suffixed 0-15) from which the landmark coordinates were derived. Further information can be found in the Theia3d [documentation](https://docs.theiamarkerless.com/theia3d-documentation/theia-model-description/default-model-description).
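The 16 suffixed columns can be reassembled into the 4x4 matrix. A sketch, assuming row-major element order and `<landmark>_<i>` column names (verify both against the CSV header and the Theia3d documentation):
```python
import numpy as np
import pandas as pd

def landmark_pose(trial: pd.DataFrame, landmark: str, frame: int) -> np.ndarray:
    """Assemble the 4x4 matrix for e.g. 'l_hand' at the given frame.

    Assumes row-major element order and '<landmark>_<i>' column names;
    check both against the CSV header and the Theia3d model description.
    """
    values = [trial.loc[frame, f"{landmark}_{i}"] for i in range(16)]
    return np.asarray(values, dtype=float).reshape(4, 4)

# pose = landmark_pose(trial, "l_hand", frame=0)
# pose[:3, :3] is the segment orientation; pose[:3, 3] the position.
```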
### Eyes and Gaze
To track the Tobii Pro Glasses 3, we used marker set 2 from [Tobii](https://www.tobii.com/products/accessories/motion-capture#parts), with an additional marker added to the left side of the glasses. These are prefixed with `Tobii3-Set-L2-R2` and used to derive the cyclops eye.
The positions of the eyes in the lab coordinate system are tracked as `LEFT_[X|Y|Z]_POS` and `RIGHT_[X|Y|Z]_POS` for the left and right eye respectively.
The gaze vector for each eye is tracked as `[LEFT|RIGHT]_[X|Y|Z]_VEC`, a unit vector representing the direction of that eye's gaze.
These per-eye values are averaged to obtain the gaze ray (`CYCLOPS_[X|Y|Z]` for its origin and `CYCLOPS_[X|Y|Z]_VEC` for its direction).
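For illustration, given a trial DataFrame loaded as in the earlier sketch, the cyclops ray can be re-derived from the per-eye fields (the dataset already provides the `CYCLOPS_*` columns, so this is only a verification sketch):
```python
import numpy as np

def eye_cols(side: str, kind: str) -> list[str]:
    # e.g. eye_cols("LEFT", "POS") -> ["LEFT_X_POS", "LEFT_Y_POS", "LEFT_Z_POS"]
    return [f"{side}_{axis}_{kind}" for axis in "XYZ"]

left_pos = trial[eye_cols("LEFT", "POS")].to_numpy()
right_pos = trial[eye_cols("RIGHT", "POS")].to_numpy()
left_vec = trial[eye_cols("LEFT", "VEC")].to_numpy()
right_vec = trial[eye_cols("RIGHT", "VEC")].to_numpy()

cyclops_origin = (left_pos + right_pos) / 2.0
mean_vec = (left_vec + right_vec) / 2.0
# The mean of two unit vectors is generally shorter than unit length,
# so re-normalise to get a direction vector.
cyclops_vec = mean_vec / np.linalg.norm(mean_vec, axis=1, keepdims=True)
```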
### Tracking Rates
Below we provide a table of the average tracking rate for the marker-derived landmarks: the percentage of frames in which the landmark was labelled, averaged across participants, with the standard deviation in parentheses. Tracking primarily focused on the glasses, hand, and index finger markers; tracking for the other finger markers is noisier, with higher dropout.
Point | Capture Rate (SD)
--------------------------------------|-----------------
Cyclops Eye (derived from glasses) | 99.533% (3.626)
Thumb Tip | 97.813% (12.250)
Index Finger Tip | 99.886% (1.218)
Middle Finger Tip | 83.644% (32.694)
Ring Finger Tip | 81.917% (29.515)
Pinky Finger Tip | 80.394% (29.992)
Index Finger Base Knuckle | 99.969% (1.232)
Pinky Finger Base Knuckle | 98.819% (5.571)
Inner Wrist | 99.612% (3.664)
Outer Wrist | 98.399% (7.226)
The body landmarks, tracked via the markerless motion capture system, are labelled in 100% of frames when present.
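Capture rates like these can be recomputed per trial by treating unlabelled frames as missing values; a sketch, assuming dropped frames appear as NaNs in the exported CSVs:
```python
import pandas as pd

def capture_rate(trial: pd.DataFrame, label: str) -> float:
    """Percentage of frames in which every axis column of the marker is present."""
    cols = [c for c in trial.columns if c.startswith(label)]
    return 100.0 * trial[cols].notna().all(axis=1).mean()

# e.g. capture_rate(trial, "RH_RIndexTip")
```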
### Pointing Ray Vectors
## Gesture Annotations
## Encoded Gestures
## Subjective Questionnaires
# Supporting Scripts
## Setup
Prior to running the code included with the dataset, please complete the following steps.
### Dataset
Unzip the participant data that you want to run the scripts against. Do not alter the structure of the dataset beyond removing unused zip files.
Ensure that the code is placed in the same directory as the `encoded_gestures.csv` and `gesture_annoations.json` files and the `trial_data/` directory.
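For example, a minimal sketch to extract every participant archive in place (assuming each zip should unpack into a same-named directory under `trial_data/`, matching how the scripts discover participants; run it from the dataset root):
```python
import zipfile
from pathlib import Path

for zip_path in Path("trial_data").glob("*.zip"):
    with zipfile.ZipFile(zip_path) as zf:
        # Extract into trial_data/<PARTICIPANT_ID>/, leaving the rest
        # of the dataset layout untouched.
        zf.extractall(zip_path.with_suffix(""))
```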
### Python Environment
To ensure you have the relevant Python dependencies installed, please set up a virtual environment using the provided `requirements.txt` file.
```
# 1. Create the virtual environment.
python -m venv $PATH_TO_CREATE_VENV
# e.g. python -m venv ../.pointing - which will create a venv named .pointing
# ----
# 2. If the venv is not already activated, activate it:
#    Windows (PowerShell): ${PATH_TO_CREATE_VENV}/Scripts/Activate.ps1
#    Windows (cmd):        ${PATH_TO_CREATE_VENV}\Scripts\activate.bat
#    Linux/macOS:          source ${PATH_TO_CREATE_VENV}/bin/activate
# If successful you should see the venv name in parentheses at the start of your prompt
# ----
# 3. Install dependencies.
python -m pip install -r ./requirements.txt
```
You will need to ensure the environment is activated prior to running any scripts.
## Participant Selection
Both of the provided Python scripts let you choose which participants to process.
```
┏━━━━┳━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━┓
┃ # ┃ Selected ┃ Participant ID ┃ Pre-Processed ┃ Features Extracted ┃
┡━━━━╇━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━┩
│ 0 │ False │ 081cf3d4-a0ee-4c4f-abc9-d28ca934c23e │ True │ False │
│ 1 │ False │ 191b20e3-570d-4fdb-96a3-96f0c54c6b8a │ True │ False │
│ 2 │ False │ 26703bcf-214b-45b5-b90e-b25059b7b232 │ True │ False │
│ 3 │ False │ 29a3ccd2-87b6-4783-9078-c90c7777712b │ True │ False │
│ 4 │ False │ 373d3394-4880-4945-83a0-9c0d151846dc │ True │ False │
│ 5 │ False │ 384f8e59-92c5-436f-ba88-8e77a13506d7 │ True │ False │
│ 6 │ False │ 452ef6ea-d832-4798-b711-d78f53cde3aa │ True │ False │
│ 7 │ False │ 5148eabf-9910-415a-8d7b-d74965bb6bd9 │ True │ False │
│ 8 │ False │ 6c765be9-c48d-421b-96fc-ba00f5636444 │ True │ False │
│ 9 │ False │ 7418bba5-28e8-4c30-a1bf-34fbb4fca50a │ True │ False │
│ 10 │ False │ 7f5723fc-8072-4f3e-b1bb-259d35f6d92d │ True │ False │
│ 11 │ False │ 86f214ec-2fd2-4989-8089-4601df74e130 │ True │ False │
│ 12 │ False │ 8b84215b-eab2-4e36-8f1b-8ec0aba690e2 │ True │ False │
│ 13 │ False │ 8e5a0c62-3676-47f5-965b-a657ce0edc96 │ True │ False │
│ 14 │ False │ 951e1730-1cde-40e5-9f64-eccd4bd91fa0 │ True │ False │
│ 15 │ False │ 97b54f6b-2867-49e9-96a2-e1dd9315b11a │ True │ False │
│ 16 │ False │ a71bf683-c322-4dfa-be3b-ea47f42c664b │ True │ False │
│ 17 │ False │ aaa6374b-7925-4c8d-862c-cc82fb258e1c │ True │ False │
│ 18 │ False │ b89d790a-13d4-4d79-b891-8887a2dd273b │ True │ False │
│ 19 │ False │ c458fe4a-6fab-40ef-b2fd-bb3c942484e6 │ True │ False │
│ 20 │ False │ d65a5cb9-32a9-4fc3-a52f-f036e8d57462 │ True │ False │
│ 21 │ False │ e5b1b60e-7f4f-4a72-a5eb-d83f1750c21e │ True │ False │
│ 22 │ False │ eeb32ad2-85b5-4f7a-b3b6-22ebba8aade8 │ True │ False │
└────┴──────────┴──────────────────────────────────────┴───────────────┴────────────────────┘
Please select the participants that you wish to process, or confirm if you're happy with the present ones.
>
# You can enter specific participant IDs (or partial prefixes),
> 081cf3d4-a0ee-4c4f-abc9-d28ca934c23e 191b20e3
...
│ 0 │ True │ 081cf3d4-a0ee-4c4f-abc9-d28ca934c23e │ True │ False │
│ 1 │ True │ 191b20e3-570d-4fdb-96a3-96f0c54c6b8a │ True │ False │
...
# Or use the index of the participants provided in the leftmost column.
> 10
...
│ 10 │ True │ 7f5723fc-8072-4f3e-b1bb-259d35f6d92d │ True │ False │
...
# If using the index, you can also define a range (inclusive)
> 10-13
...
│ 10 │ True │ 7f5723fc-8072-4f3e-b1bb-259d35f6d92d │ True │ False │
│ 11 │ True │ 86f214ec-2fd2-4989-8089-4601df74e130 │ True │ False │
│ 12 │ True │ 8b84215b-eab2-4e36-8f1b-8ec0aba690e2 │ True │ False │
│ 13 │ True │ 8e5a0c62-3676-47f5-965b-a657ce0edc96 │ True │ False │
...
# Multiple selections can be made at once and mix-n-matched, e.g. 'a71bf683 10 18-20'
> a71bf683 10 18-20
...
│ 10 │ True │ 7f5723fc-8072-4f3e-b1bb-259d35f6d92d │ True │ False │
...
│ 16 │ True │ a71bf683-c322-4dfa-be3b-ea47f42c664b │ True │ False │
...
│ 18 │ True │ b89d790a-13d4-4d79-b891-8887a2dd273b │ True │ False │
│ 19 │ True │ c458fe4a-6fab-40ef-b2fd-bb3c942484e6 │ True │ False │
│ 20 │ True │ d65a5cb9-32a9-4fc3-a52f-f036e8d57462 │ True │ False │
...
# You can also type in 'a' to select all participants
> a
┏━━━━┳━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━┓
┃ # ┃ Selected ┃ Participant ID ┃ Pre-Processed ┃ Features Extracted ┃
┡━━━━╇━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━┩
│ 0 │ True │ 081cf3d4-a0ee-4c4f-abc9-d28ca934c23e │ True │ False │
│ 1 │ True │ 191b20e3-570d-4fdb-96a3-96f0c54c6b8a │ True │ False │
│ 2 │ True │ 26703bcf-214b-45b5-b90e-b25059b7b232 │ True │ False │
│ 3 │ True │ 29a3ccd2-87b6-4783-9078-c90c7777712b │ True │ False │
│ 4 │ True │ 373d3394-4880-4945-83a0-9c0d151846dc │ True │ False │
│ 5 │ True │ 384f8e59-92c5-436f-ba88-8e77a13506d7 │ True │ False │
│ 6 │ True │ 452ef6ea-d832-4798-b711-d78f53cde3aa │ True │ False │
│ 7 │ True │ 5148eabf-9910-415a-8d7b-d74965bb6bd9 │ True │ False │
│ 8 │ True │ 6c765be9-c48d-421b-96fc-ba00f5636444 │ True │ False │
│ 9 │ True │ 7418bba5-28e8-4c30-a1bf-34fbb4fca50a │ True │ False │
│ 10 │ True │ 7f5723fc-8072-4f3e-b1bb-259d35f6d92d │ True │ False │
│ 11 │ True │ 86f214ec-2fd2-4989-8089-4601df74e130 │ True │ False │
│ 12 │ True │ 8b84215b-eab2-4e36-8f1b-8ec0aba690e2 │ True │ False │
│ 13 │ True │ 8e5a0c62-3676-47f5-965b-a657ce0edc96 │ True │ False │
│ 14 │ True │ 951e1730-1cde-40e5-9f64-eccd4bd91fa0 │ True │ False │
│ 15 │ True │ 97b54f6b-2867-49e9-96a2-e1dd9315b11a │ True │ False │
│ 16 │ True │ a71bf683-c322-4dfa-be3b-ea47f42c664b │ True │ False │
│ 17 │ True │ aaa6374b-7925-4c8d-862c-cc82fb258e1c │ True │ False │
│ 18 │ True │ b89d790a-13d4-4d79-b891-8887a2dd273b │ True │ False │
│ 19 │ True │ c458fe4a-6fab-40ef-b2fd-bb3c942484e6 │ True │ False │
│ 20 │ True │ d65a5cb9-32a9-4fc3-a52f-f036e8d57462 │ True │ False │
│ 21 │ True │ e5b1b60e-7f4f-4a72-a5eb-d83f1750c21e │ True │ False │
│ 22 │ True │ eeb32ad2-85b5-4f7a-b3b6-22ebba8aade8 │ True │ False │
└────┴──────────┴──────────────────────────────────────┴───────────────┴────────────────────┘
# To remove a participant from the selection, re-enter it (entries toggle)
> 081cf3d4 5-10 20
┏━━━━┳━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━┓
┃ # ┃ Selected ┃ Participant ID ┃ Pre-Processed ┃ Features Extracted ┃
┡━━━━╇━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━┩
│ 0 │ False │ 081cf3d4-a0ee-4c4f-abc9-d28ca934c23e │ True │ False │
│ 1 │ True │ 191b20e3-570d-4fdb-96a3-96f0c54c6b8a │ True │ False │
│ 2 │ True │ 26703bcf-214b-45b5-b90e-b25059b7b232 │ True │ False │
│ 3 │ True │ 29a3ccd2-87b6-4783-9078-c90c7777712b │ True │ False │
│ 4 │ True │ 373d3394-4880-4945-83a0-9c0d151846dc │ True │ False │
│ 5 │ False │ 384f8e59-92c5-436f-ba88-8e77a13506d7 │ True │ False │
│ 6 │ False │ 452ef6ea-d832-4798-b711-d78f53cde3aa │ True │ False │
│ 7 │ False │ 5148eabf-9910-415a-8d7b-d74965bb6bd9 │ True │ False │
│ 8 │ False │ 6c765be9-c48d-421b-96fc-ba00f5636444 │ True │ False │
│ 9 │ False │ 7418bba5-28e8-4c30-a1bf-34fbb4fca50a │ True │ False │
│ 10 │ False │ 7f5723fc-8072-4f3e-b1bb-259d35f6d92d │ True │ False │
│ 11 │ True │ 86f214ec-2fd2-4989-8089-4601df74e130 │ True │ False │
│ 12 │ True │ 8b84215b-eab2-4e36-8f1b-8ec0aba690e2 │ True │ False │
│ 13 │ True │ 8e5a0c62-3676-47f5-965b-a657ce0edc96 │ True │ False │
│ 14 │ True │ 951e1730-1cde-40e5-9f64-eccd4bd91fa0 │ True │ False │
│ 15 │ True │ 97b54f6b-2867-49e9-96a2-e1dd9315b11a │ True │ False │
│ 16 │ True │ a71bf683-c322-4dfa-be3b-ea47f42c664b │ True │ False │
│ 17 │ True │ aaa6374b-7925-4c8d-862c-cc82fb258e1c │ True │ False │
│ 18 │ True │ b89d790a-13d4-4d79-b891-8887a2dd273b │ True │ False │
│ 19 │ True │ c458fe4a-6fab-40ef-b2fd-bb3c942484e6 │ True │ False │
│ 20 │ False │ d65a5cb9-32a9-4fc3-a52f-f036e8d57462 │ True │ False │
│ 21 │ True │ e5b1b60e-7f4f-4a72-a5eb-d83f1750c21e │ True │ False │
│ 22 │ True │ eeb32ad2-85b5-4f7a-b3b6-22ebba8aade8 │ True │ False │
└────┴──────────┴──────────────────────────────────────┴───────────────┴────────────────────┘
# You confirm your selection by typing 'y' or 'yes'
> y
Using the provided participants: ...
```
The available participants are derived from the uncompressed participant directories in the dataset.
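The selection grammar can be summarised with a small, purely illustrative re-implementation (this is not the shipped parser; note that all-numeric tokens are read as indices or ranges, so partial IDs should include at least one letter):
```python
def toggle_selection(tokens: list[str], ids: list[str], selected: set[int]) -> set[int]:
    """Illustrative sketch of the prompt grammar; not the shipped code."""
    for tok in tokens:
        if tok.lower() == "a":                            # select all
            selected |= set(range(len(ids)))
        elif tok.isdigit():                               # single index, toggles
            selected ^= {int(tok)}
        elif all(p.isdigit() for p in tok.split("-")):    # inclusive range, e.g. 10-13
            lo, hi = map(int, tok.split("-"))
            selected ^= set(range(lo, hi + 1))
        else:                                             # full or partial (prefix) ID
            selected ^= {i for i, pid in enumerate(ids) if pid.startswith(tok)}
    return selected
```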
## Trial Visualiser
This is a hacked-together tool that (mis)uses pyplot to render the motion capture data and to let us label the trials in the dataset.
In order to run the visualiser, run the following in your terminal of choice:
```
python ./trial_visualiser_and_annotator.py -d $PATH_TO_DATASET
```
We recommend loading only one participant per instance of the visualiser: you cannot navigate between participants, and the next selected participant is only presented once the window for the current one is closed.
## Gesture Encoder
For our analysis of the data, we encoded each gesture based on the trial properties (e.g. participant ID, condition, trial index, ...), along with statistics and values derived from the pointing gesture during the hold phase (refer to the paper or this README), such as the medians of the rays used to estimate where pointing is directed, the errors of these rays, and various fatigue measures.
This is performed with the `gesture_encoder.py` script, which produces an `encoded_gestures.csv`.
Trials for which pointing cannot be determined (not performed, pointed towards the wrong target, or affected by a technical issue) are skipped. Such trials are recorded in an `invalid_trials.json`; a trial may appear more than once if it had multiple issues.
This script can be run with the following:
```
python ./gesture_encoder.py -d $PATH_TO_DATASET [-o $PATH_TO_OUTPUT]
```
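After a run, a quick sanity check of the outputs might look like the following (a sketch; inspect the columns rather than assuming names):
```python
import json

import pandas as pd

gestures = pd.read_csv("encoded_gestures.csv")
print(f"{len(gestures)} encoded gestures")
print(gestures.columns.tolist())  # trial properties + derived measures

# Skipped trials; entries may repeat when a trial had multiple issues.
with open("invalid_trials.json") as f:
    invalid = json.load(f)
```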
## Analysis Script
To analyse the gestures captured in the dataset, we used the provided `StatisticalAnalysisScript.R` script. This performs statistical analysis over the `encoded_gestures.csv` file to produce the results used in our accompanying paper.
This is run by opening the script in [RStudio](https://posit.co/download/rstudio-desktop/) (see the linked installation instructions). Make sure that the working directory is set to the dataset root (the directory containing both the script and `encoded_gestures.csv`). This can be set via `Session > Set Working Directory > Choose Directory...` or `Ctrl+Shift+H`.
Once the directory is set, you can execute the script line by line with `Ctrl+Enter`, or run it all at once by selecting everything with `Ctrl+A` and pressing `Ctrl+Enter`.
Results will be printed to the console panel within RStudio.