Rationale: The movement of our eyes to specific locations is intimately tied to the demands of a task. Visual attention is an integral component of motor performance and is expected to change as sensory feedback becomes more accurate and motor control more intuitive. Hand function impairment also drives compensatory movements of the proximal body, which can result in poor performance and long-term complications.
Method: GaMA combines motion tracking and eye tracking during functional real-world tasks. The assessment quantifies the motion of the prosthesis (or hand and arm), compensatory body movements, and visual gaze behavior during standardized tasks. Output metrics include upper limb and hand kinematic measures as well as measures of visual attention.
Applicability: This assessment metric is relevant to many conditions of upper limb sensory-motor impairment, so long as the individual is able to grasp and move objects.
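The core of such a synchronized analysis is aligning gaze and hand trajectories on a common timeline and extracting timing metrics relative to task objects. The sketch below illustrates one such metric, an eye-to-hand arrival latency, from synchronized position streams. The function names, the 5 cm radius, and the data layout are illustrative assumptions for this sketch, not the published GaMA pipeline.

```python
import numpy as np

def arrival_time(timestamps, positions, target, radius):
    """First timestamp at which a tracked point (gaze intersection or
    hand marker) comes within `radius` of the target, or None if it
    never does. `positions` is an (N, 3) array aligned with `timestamps`."""
    dists = np.linalg.norm(positions - target, axis=1)
    hits = np.nonzero(dists <= radius)[0]
    return timestamps[hits[0]] if hits.size else None

def eye_hand_latency(timestamps, gaze_xyz, hand_xyz, target, radius=0.05):
    """Hand arrival time minus eye arrival time at the target, in seconds.
    Positive values mean the eyes reached the object before the hand did.
    The 0.05 m radius is an illustrative threshold, not a GaMA constant."""
    t_eye = arrival_time(timestamps, gaze_xyz, target, radius)
    t_hand = arrival_time(timestamps, hand_xyz, target, radius)
    if t_eye is None or t_hand is None:
        return None  # one of the streams never reached the target
    return t_hand - t_eye
```

In practice the same arrival logic can be applied per movement segment once the task is divided into reach, grasp, and transport phases, yielding one latency value per object interaction.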
Functional Metrics for Humans with Bi-Directionally Integrated Prosthetic Limbs
- Investigators: JS Hebert, CS Chapman, AH Vette, PM Pilarski
- Previous funding: Hand Proprioception & Touch Interfaces (HAPTIX), Defense Advanced Research Projects Agency (DARPA). Subcontracted through Dr. Paul Marasco, Cleveland Clinic.
Normative Validation Papers:
1. Valevicius AM, Jun PY, Hebert JS, Vette AH. Use of optical motion capture for the analysis of normative upper body kinematics during functional upper limb tasks: A systematic review. Journal of Electromyography and Kinesiology. 2018, Feb; 40: 1–15. https://doi.org/10.1016/j.jelekin.2018.02.011.
This review paper critically assessed kinematic model characteristics, the performed functional tasks, kinematic outcomes, and reported validity and reliability as background work prior to developing our protocols.
2. Boser QA, Valevicius AM, Lavoie EB, Chapman CS, Pilarski PM, Hebert JS, Vette AH. Cluster-Based Upper Body Marker Models for Three-Dimensional Kinematic Analysis: Comparison with an Anatomical Model and Reliability Analysis. Journal of Biomechanics. 2018, Apr; 72: 228–34. https://doi.org/10.1016/j.jbiomech.2018.02.028.
This paper compared three cluster-based upper body marker models against an anatomical model during performance of the two functional tasks used in GaMA, including an assessment of between-session reliability; the results informed the choice of the cluster marker model for GaMA.
3. Valevicius AM, Boser QA, Lavoie EB, Murgatroyd G, Chapman CS, Pilarski PM, Vette AH, Hebert JS. Characterization of normative hand movements during two functional upper limb tasks. PLoS ONE. 2018, Jun; 13(6): e0199549. https://doi.org/10.1371/journal.pone.0199549.
This paper disseminated the two standardized upper limb task protocols and quantitatively characterized the kinematics of normative hand movement using optical motion capture.
4. Lavoie EB, Valevicius AM, Boser QA, Kovic O, Vette AH, Pilarski PM, Hebert JS, Chapman CS. Using synchronized eye and motion tracking to determine high-precision eye movement patterns during object interaction tasks. Journal of Vision. 2018, Jun; 18(6): 1–20. https://doi.org/10.1167/18.6.18.
This paper established normative values for visual attention during the functional tasks, using head-mounted eye tracking and optical motion capture with kinematic segmentation.
5. Valevicius AM, Boser QA, Lavoie EB, Chapman CS, Pilarski PM, Hebert JS, Vette AH. Characterization of normative angular joint kinematics during two functional upper limb tasks. Gait & Posture. 2019, Jan; 69: 176–186. https://doi.org/10.1016/j.gaitpost.2019.01.037.
This paper quantitatively characterized normative joint movement kinematics during the GaMA standardized task protocols, including test-retest reliability.
6. Williams HE, Chapman CS, Pilarski PM, Vette AH, Hebert JS. Gaze and Movement Assessment (GaMA): Inter-site validation of a visuomotor upper limb functional protocol. PLoS ONE, under review. Preprint doi:10.1101/681437 posted on bioRxiv: http://biorxiv.org/cgi/content/short/681437v1.
This paper demonstrated the reproducibility of the GaMA testing protocol at a new site with new technology and a different rater.
Prosthesis User Validation Papers: currently under review.