2022 journal article

Reliable Vision-Based Grasping Target Recognition for Upper Limb Prostheses

IEEE TRANSACTIONS ON CYBERNETICS, 52(3), 1750–1762.

By: B. Zhong, H. Huang & E. Lobaton

co-author countries: United States of America 🇺🇸
author keywords: Grasping; Uncertainty; Prosthetics; Task analysis; Target recognition; Bayes methods; Prediction algorithms; Bayesian deep learning (BDL); grasping strategy; reliable computer vision; upper limb prosthesis
MeSH headings: Arm; Artificial Limbs; Bayes Theorem; Hand Strength; Humans; Robotics; Upper Extremity
Source: Web of Science
Added: May 10, 2022

Computer vision has shown promising potential in wearable robotics applications (e.g., human grasping target prediction and context understanding). However, in practice, the performance of computer vision algorithms is challenged by insufficient or biased training data, observation noise, cluttered backgrounds, and other factors. By leveraging Bayesian deep learning (BDL), we developed a novel, reliable vision-based framework to assist upper limb prosthesis grasping during arm reaching. The framework measures different types of uncertainty, arising from both the model and the data, for grasping target recognition in realistic and challenging scenarios. A probability calibration network was developed to fuse these uncertainty measures into a single calibrated probability for online decision making. We formulated the problem as predicting the grasping target during arm reaching. We also developed a 3-D simulation platform to analyze the performance of vision algorithms under several challenging scenarios that commonly arise in practice. In addition, we integrated our approach into a shared control framework for a prosthetic arm and demonstrated its potential for assisting human participants with fluent target reaching and grasping tasks.
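The abstract describes a pipeline built from a Bayesian deep network that yields uncertainty measures plus a calibration network that fuses them into one probability. The sketch below shows one common way such a pipeline can be assembled: Monte Carlo dropout produces a predictive distribution, from which total predictive entropy and mutual information (a model-uncertainty component) are computed, and a small calibration network maps the raw confidence and these uncertainty measures to a single calibrated probability. All class names, layer sizes, and feature choices here are illustrative assumptions; the paper's actual architecture and training details are not given in this record.

```python
# Hypothetical sketch of a BDL-style grasp-target classifier with uncertainty
# fusion via a calibration network. Names and architecture are illustrative
# assumptions, not the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class BayesianGraspClassifier(nn.Module):
    """Small classifier with dropout kept active at test time (MC dropout)."""

    def __init__(self, in_dim: int = 128, n_classes: int = 5, p_drop: float = 0.3):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Linear(in_dim, 256), nn.ReLU(), nn.Dropout(p_drop),
            nn.Linear(256, 128), nn.ReLU(), nn.Dropout(p_drop),
        )
        self.logits_head = nn.Linear(128, n_classes)

    def forward(self, x):
        return self.logits_head(self.backbone(x))


@torch.no_grad()
def mc_dropout_predict(model, x, n_samples: int = 20):
    """Stochastic forward passes giving a predictive distribution plus
    total (entropy) and model (mutual-information) uncertainty measures."""
    model.train()  # keep dropout active for Monte Carlo sampling
    probs = torch.stack([F.softmax(model(x), dim=-1) for _ in range(n_samples)])
    mean_probs = probs.mean(dim=0)  # predictive distribution per input
    total_entropy = -(mean_probs * mean_probs.clamp_min(1e-12).log()).sum(-1)
    expected_entropy = -(probs * probs.clamp_min(1e-12).log()).sum(-1).mean(0)
    mutual_info = total_entropy - expected_entropy  # epistemic component
    return mean_probs, total_entropy, mutual_info


class CalibrationNet(nn.Module):
    """Tiny network fusing confidence and uncertainty measures into a single
    calibrated probability for online decision making (assumed form)."""

    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(3, 16), nn.ReLU(), nn.Linear(16, 1))

    def forward(self, max_prob, entropy, mutual_info):
        feats = torch.stack([max_prob, entropy, mutual_info], dim=-1)
        return torch.sigmoid(self.net(feats)).squeeze(-1)


if __name__ == "__main__":
    model = BayesianGraspClassifier()
    calib = CalibrationNet()
    x = torch.randn(4, 128)               # stand-in for per-frame visual features
    mean_probs, entropy, mi = mc_dropout_predict(model, x)
    conf = mean_probs.max(dim=-1).values  # raw top-class confidence
    calibrated = calib(conf, entropy, mi)  # one calibrated probability per frame
    print(calibrated)
```

In a shared-control setting of the kind described above, the calibrated probability could be thresholded so that the prosthesis commits to a grasp configuration only when the vision-based prediction is deemed reliable; the specific decision rule used in the paper is not detailed in this record.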