Datasets for Bilateral Lower-Limb Neuromechanical Signals in Able-Bodied and Impaired Individuals with Wearable and Ambient Sensors (BLISS)
To address the challenges in human activity recognition (HAR) and advance research in lower-limb assistive devices and machine learning (ML) for HAR, we introduce the BLISS dataset (Bilateral Lower-Limb Neuromechanical Signals). This benchmark dataset includes bilateral EMG and limb kinematics from wearable sensors for 21 subjects, encompassing both healthy individuals and those with mild to severe gait abnormalities. The selected abnormalities, which are clinically prevalent, include reduced knee flexion due to stiffness and/or overweight conditions, and dorsiflexor weakness resulting from stroke, multiple sclerosis, and lumbar radiculopathy. Subjects perform free ground-level walking in an uncontrolled environment across multiple trials, each representing a complete gait bout. Additionally, marker-based motion capture and force plates provide ground-truth estimates of limb positions and ground reaction forces. Comprehensive analyses estimate angular accelerations, velocities, positions, and torques at individual joints. The dataset is fully annotated for gait cycle phases: loading response (LR), mid-stance (MST), terminal stance (TS), pre-swing (PSW), and swing (SW). This dataset complements existing benchmarks by offering detailed guidelines for sensor modality selection, analysis, and annotation, and by balancing data between healthy and impaired subjects.
Cite this dataset as:
Ahmed, S., Mohanna, M., Martinez Hernandez, U., Awad, M. and Mansour, A., 2025. Datasets for Bilateral Lower-Limb Neuromechanical Signals in Able-Bodied and Impaired Individuals with Wearable and Ambient Sensors (BLISS). Version 3. Bath: University of Bath Research Data Archive. Available from: https://doi.org/10.15125/BATH-01425.
Data
AB2930.zip
application/zip (431MB)
Creative Commons: Attribution 4.0
AB2931.zip
application/zip (405MB)
Creative Commons: Attribution 4.0
AB2932.zip
application/zip (367MB)
Creative Commons: Attribution 4.0
AB2933.zip
application/zip (412MB)
Creative Commons: Attribution 4.0
AB2934.zip
application/zip (509MB)
Creative Commons: Attribution 4.0
AB2935.zip
application/zip (438MB)
Creative Commons: Attribution 4.0
AB2936.zip
application/zip (935MB)
Creative Commons: Attribution 4.0
AB2937.zip
application/zip (400MB)
Creative Commons: Attribution 4.0
AB2938.zip
application/zip (361MB)
Creative Commons: Attribution 4.0
AB2939.zip
application/zip (525MB)
Creative Commons: Attribution 4.0
AB2940.zip
application/zip (482MB)
Creative Commons: Attribution 4.0
AB2941.zip
application/zip (423MB)
Creative Commons: Attribution 4.0
AB2942.zip
application/zip (25MB)
Creative Commons: Attribution 4.0
AB2943.zip
application/zip (461MB)
Creative Commons: Attribution 4.0
AB2944.zip
application/zip (99MB)
Creative Commons: Attribution 4.0
AB2945.zip
application/zip (75MB)
Creative Commons: Attribution 4.0
AB2946.zip
application/zip (296MB)
Creative Commons: Attribution 4.0
AB2947.zip
application/zip (470MB)
Creative Commons: Attribution 4.0
AB2948.zip
application/zip (645MB)
Creative Commons: Attribution 4.0
AB2949.zip
application/zip (185MB)
Creative Commons: Attribution 4.0
AB2950.zip
application/zip (609MB)
Creative Commons: Attribution 4.0
Code
MATLAB scripts.zip
application/zip (4MB)
Software: MIT License
Creators
Samer Ahmed
Data Collector
University of Bath
Mohamed Mohanna
Data Collector
Ain Shams University
Uriel Martinez Hernandez
Supervisor
University of Bath; University of Sheffield
Mohammed Awad
Ain Shams University
Alia Mansour
Ain Shams University
Contributors
University of Bath
Rights Holder
Documentation
Data collection method:
Each subject's weight and height were measured using analog devices. The motion capture cameras and force plates were calibrated prior to the experiment. 15 wearable sensors (integrated inertial measurement unit (IMU) and electromyography (EMG) units) were attached to the participants' lower limbs. 26 passive reflective markers were attached non-invasively over skeletal landmarks according to the IOR lower-body automated identification of markers (AIM) model. The placement of sensors and markers is illustrated in the additional metadata files associated with the dataset. The walking procedure involved moving forward for about 5 meters. Each subject performed this trial approximately 50 times, with administered breaks to prevent fatigue. The number of gait bouts in each direction was roughly equal, with each bout covering nearly 5 meters. Subjects walked at their self-selected speeds, and trials with pauses or trips within the force plate area were discarded. Data collection took about 3 hours for healthy subjects and extended to 4-5 hours for those with severe impairments.
Data processing and preparation activities:
The dataset was annotated using Visual3D. EMG and IMU signals were filtered according to literature recommendations. Visual3D analyzed the raw Qualisys signals, assigning joint reference frames and calculating kinematic and dynamic data. The processed data was then segmented into analysis windows to create additional feature-based datasets; for each window, 780 features were computed from the 30 sensors. MATLAB was used for gait analysis, comparing joint kinematic/kinetic data against normative data and processing EMG envelopes and gait-cycle data. Data from the test subjects was fully anonymized using unique codes.
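The filtering and windowed feature-extraction pipeline described above can be sketched as follows. This is a minimal illustration, not the authors' MATLAB implementation: the sampling rate, filter band, window length, and the four time-domain features shown here are common choices from the EMG literature and are assumptions, not values specified in this record (the full 780-feature set is documented in the dataset files).

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS_EMG = 2000  # Hz; assumed sampling rate, check the dataset documentation

def bandpass_emg(x, fs=FS_EMG, lo=20.0, hi=450.0, order=4):
    """Zero-phase Butterworth band-pass, a common EMG pre-filter."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def sliding_windows(x, win, step):
    """Segment a 1-D signal into overlapping analysis windows."""
    n = 1 + (len(x) - win) // step
    return np.stack([x[i * step : i * step + win] for i in range(n)])

def window_features(w):
    """Standard time-domain EMG features for one analysis window."""
    mav = np.mean(np.abs(w))                  # mean absolute value
    rms = np.sqrt(np.mean(w ** 2))            # root mean square
    wl = np.sum(np.abs(np.diff(w)))           # waveform length
    zc = np.sum(np.diff(np.signbit(w)) != 0)  # zero-crossing count
    return np.array([mav, rms, wl, zc])

# Example: 2 s of synthetic EMG, 200 ms windows with 50% overlap
emg = np.random.default_rng(0).standard_normal(FS_EMG * 2)
filtered = bandpass_emg(emg)
windows = sliding_windows(filtered, win=400, step=200)
features = np.apply_along_axis(window_features, 1, windows)
```

Repeating this per sensor channel and concatenating the per-window feature vectors yields a feature-based dataset analogous to the one distributed here.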
Technical details and requirements:
Qualisys software (Qualisys, Göteborg, Sweden), 2023; Visual3D software (HAS-Motion, Kingston, Ontario, Canada), 2023; MATLAB software (MathWorks, Natick, MA, USA), 2022; 15 Trigno Avanti™ sensors (Delsys, Natick, MA, USA). The motion capture system used (Qualisys, Göteborg, Sweden) includes 14 cameras (2 Miqus and 12 Arqus cameras) to track passive markers. Ground reaction forces are recorded using four force plates (BMS400600; AMTI, Watertown, MA, USA) fixed to the lab floor.
Additional Metadata
Dataset Description.docx
application/vnd.openxmlformats-officedocument.wordprocessingml.document (29kB)
Creative Commons: Attribution 4.0
INSTRUMENTATION SETUP 3.png
image/png (593kB)
Creative Commons: Attribution 4.0
Subject Information.xlsx
application/vnd.openxmlformats-officedocument.spreadsheetml.sheet (10kB)
Creative Commons: Attribution 4.0
Funders
Self-funded
Publication details
Publication date: 28 April 2025
by: University of Bath
Version: 3
DOI: https://doi.org/10.15125/BATH-01425
URL for this record: https://researchdata.bath.ac.uk/1425
Contact information
Please contact the Research Data Service in the first instance for all matters concerning this item.
Contact person: Samer Ahmed
Faculty of Engineering & Design
Electronic & Electrical Engineering
Research Centres & Institutes
Centre for Bioengineering & Biomedical Technologies (CBio)