Next level step monitoring in health, sports and medicine by augmenting gait with meaningful environmental information

Project Details

Description

This high-risk but calculated project will develop and validate wearable sensor signal analysis algorithms
capable of differentiating and classifying the surface conditions of real-life gait relevant to patients (walking)
and athletes (running), taking specific diseases or injury risk profiles into account. For instance, for patients with
knee osteoarthritis or soft-tissue-related instabilities, step counts will be identified and analysed particularly
on inclined surfaces or stairs, where the condition is known to create a specific functional challenge, while
normal steps require little to no specific effort. For runners, steps on asphalt (hard) versus off-road terrain
(softer/rougher) will be classified to differentiate the specific loading conditions.
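To illustrate the kind of signal analysis involved, the following is a minimal sketch, not the project's actual pipeline: a window-based surface classifier over tri-axial accelerometer data. The feature set, window length, and the random-forest model are illustrative assumptions, and the data here are synthetic stand-ins.

```python
# Minimal sketch (assumptions only): classify gait surface from accelerometer windows.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def extract_features(window: np.ndarray) -> np.ndarray:
    """Simple per-axis statistics over one window of shape (n_samples, 3)."""
    return np.concatenate([
        window.mean(axis=0),                            # mean acceleration per axis
        window.std(axis=0),                             # variability (roughness proxy)
        np.abs(np.diff(window, axis=0)).mean(axis=0),   # jerk-like measure
    ])

def windows(signal: np.ndarray, size: int, step: int):
    """Yield overlapping windows from a (n_samples, 3) accelerometer stream."""
    for start in range(0, len(signal) - size + 1, step):
        yield signal[start:start + size]

# Synthetic stand-in data: class 0 = level/hard surface, class 1 = stairs/off-road.
rng = np.random.default_rng(0)
flat = rng.normal(0.0, 0.1, (5000, 3))
rough = rng.normal(0.0, 0.3, (5000, 3))

X = np.array([extract_features(w) for sig in (flat, rough) for w in windows(sig, 200, 100)])
y = np.array([label for label, sig in enumerate((flat, rough)) for _ in windows(sig, 200, 100)])

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```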
This project will innovate next-level step monitoring with new, meaningful digital mobility biomarkers in an
original and continuous co-creation process together with the relevant user groups (patients/athletes,
treating physicians, clinical researchers) to a) establish wearable hardware usability for optimal wear
compliance, b) identify patient-centred biomarker definitions (understandable, meaningful, actionable) for
which the signal analysis algorithms will be developed, c) determine the biomarker definitions most
relevant for clinical diagnostics, outcome assessment and clinical trial endpoints (disease-specific, general
mobility), and d) establish user-centred biomarker and data visualisations for efficient comprehension,
self-assessment (patient/athlete) or decision making (treating physician, coach). The interdisciplinary
collaboration between partners from Québec and Luxembourg provides the ecological validity required for
such a project to deliver efficacy and authority in diverse populations and across health care systems.
This project will also generate a highly innovative method applying advanced computer image recognition
to automatically generate ground truth for algorithm training and validation with little to no human
intervention. This will overcome the need for time-consuming, tedious, and error-prone annotation of
lengthy video footage by trained human observers, a major obstacle in digital activity classification.
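As a hedged sketch of how video could yield such labels automatically (an assumption, not the project's actual method): sample frames from synchronised video with OpenCV and write timestamped surface labels that can later be matched to sensor windows. The `classify_surface` function, the file names, and the trivial brightness rule are placeholders for whatever image-recognition model is used.

```python
# Minimal sketch (assumptions only): per-second surface labels from video as ground truth.
import csv
import cv2  # pip install opencv-python

def classify_surface(frame) -> str:
    """Placeholder: plug in a trained image-recognition model here."""
    # A trivial brightness rule stands in for a real model's prediction.
    return "asphalt" if frame.mean() > 100 else "off-road"

def label_video(video_path: str, out_csv: str, every_s: float = 1.0) -> None:
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
    step = max(1, int(round(fps * every_s)))  # sample one frame per interval
    with open(out_csv, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["time_s", "label"])
        idx = 0
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            if idx % step == 0:
                # The timestamp lets each label be aligned with sensor windows later.
                writer.writerow([round(idx / fps, 2), classify_surface(frame)])
            idx += 1
    cap.release()

if __name__ == "__main__":
    label_video("walk_session.mp4", "surface_labels.csv")  # hypothetical file names
```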
Acronym: NextStep-EI
Status: Active
Effective start/end date: 1/10/23 – 30/09/25

Funding

  • FNR - Fonds National de la Recherche: €121,000.00
