Works (12)

Updated: July 19th, 2023 21:18

2022 journal article

Fusion of Human Gaze and Machine Vision for Predicting Intended Locomotion Mode

IEEE TRANSACTIONS ON NEURAL SYSTEMS AND REHABILITATION ENGINEERING, 30, 1103–1112.

By: M. Li, B. Zhong, E. Lobaton & H. Huang

Contributors: M. Li, B. Zhong, E. Lobaton & H. Huang

author keywords: Wearable robots; Feature extraction; Point cloud compression; Machine vision; Cameras; Legged locomotion; Visualization; Human gaze; machine vision; intent recognition; wearable robot; deep learning
MeSH headings : Algorithms; Humans; Intention; Locomotion; Lower Extremity; Walking
TL;DR: A novel system that fuses human gaze and machine vision for locomotion intent recognition of lower limb wearable robots is developed, showing high accuracy of intent recognition and reliable decision-making on locomotion transitions with adjustable lead time. (via Semantic Scholar) An illustrative fusion sketch follows this entry.
Sources: ORCID, Web Of Science, NC State University Libraries
Added: May 4, 2022
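
For readers wanting a concrete picture of decision-level fusion of the kind described above, here is a minimal sketch in Python. The mode list, weights, and weighted-average rule are illustrative assumptions, not the fusion method reported in the paper.

    import numpy as np

    # Candidate locomotion modes (illustrative set)
    MODES = ["level_walk", "stair_ascent", "stair_descent", "ramp_ascent", "ramp_descent"]

    def fuse(gaze_probs, vision_probs, w_gaze=0.4, w_vision=0.6):
        # Weighted average of the two classifiers' class-probability vectors
        fused = w_gaze * np.asarray(gaze_probs) + w_vision * np.asarray(vision_probs)
        fused = fused / fused.sum()
        return MODES[int(np.argmax(fused))], fused

    # Example: both sources favor stair ascent, so the fused decision does too
    mode, probs = fuse([0.10, 0.60, 0.10, 0.10, 0.10],
                       [0.20, 0.50, 0.10, 0.10, 0.10])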

2022 journal article

Improving Performance and Quantifying Uncertainty of Body-Rocking Detection Using Bayesian Neural Networks

Information, 13(7), 338.

By: R. da Silva, B. Zhong, Y. Chen & E. Lobaton

author keywords: Bayesian Neural Networks; uncertainty quantification; stereotypical motor movement; body rocking
Sources: ORCID, Web Of Science, NC State University Libraries
Added: July 13, 2022
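
The entry above uses Bayesian neural networks to quantify uncertainty in body-rocking detection. Below is a minimal sketch of one common approximation, Monte Carlo dropout, in PyTorch; the architecture, input size, and sample count are illustrative assumptions rather than the authors' model.

    import torch
    import torch.nn as nn

    class MCDropoutNet(nn.Module):
        # Small classifier with dropout layers that stay active at test time
        def __init__(self, in_dim=128, hidden=64, p=0.5):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(in_dim, hidden), nn.ReLU(), nn.Dropout(p),
                nn.Linear(hidden, hidden), nn.ReLU(), nn.Dropout(p),
                nn.Linear(hidden, 2),
            )

        def forward(self, x):
            return self.net(x)

    def predict_with_uncertainty(model, x, n_samples=30):
        # Keep dropout enabled and average the softmax outputs of several passes
        model.train()
        with torch.no_grad():
            probs = torch.stack(
                [torch.softmax(model(x), dim=-1) for _ in range(n_samples)]
            )
        mean = probs.mean(0)  # predictive probability
        entropy = -(mean * mean.clamp_min(1e-12).log()).sum(-1)  # uncertainty
        return mean, entropy

    # Usage: a batch of feature windows computed from inertial data
    model = MCDropoutNet()
    mean, entropy = predict_with_uncertainty(model, torch.randn(4, 128))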

2021 article

Efficient Environmental Context Prediction for Lower Limb Prostheses

Zhong, B., Silva, R. L., Tran, M., Huang, H., & Lobaton, E. (2021, June 7). IEEE TRANSACTIONS ON SYSTEMS MAN CYBERNETICS-SYSTEMS, Vol. 6.

By: B. Zhong, R. Silva, M. Tran, H. Huang & E. Lobaton

Contributors: B. Zhong, R. Silva, M. Tran, H. Huang & E. Lobaton

author keywords: Cameras; Wearable robots; Uncertainty; Neural networks; Real-time systems; Sensors; Hardware; Bayesian neural network (BNN); efficient deep learning system; environmental context prediction; prostheses; uncertainty quantification
TL;DR: An uncertainty-aware frame selection strategy is developed that dynamically selects frames for environment prediction according to lower limb motion and the uncertainty captured by Bayesian neural networks (BNNs), and the approach is extended to multimodal fusion. (via Semantic Scholar) An illustrative selection-policy sketch follows this entry.
Sources: ORCID, Web Of Science, NC State University Libraries
Added: June 10, 2021
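
A minimal sketch of an uncertainty-aware frame selection policy in the spirit of the entry above: run the expensive environment classifier only when recent motion or predictive uncertainty is high, otherwise reuse the last prediction. The thresholds, motion measure, and entropy criterion are illustrative assumptions, not the paper's tuned design.

    import numpy as np

    def select_and_predict(frames, imu_energy, classify,
                           motion_thresh=0.5, entropy_thresh=0.8):
        # classify(frame) is assumed to return a class-probability vector,
        # e.g. the mean of several Bayesian-network forward passes.
        last_probs, last_entropy = None, np.inf
        outputs = []
        for frame, motion in zip(frames, imu_energy):
            if motion > motion_thresh or last_entropy > entropy_thresh:
                probs = classify(frame)                 # run the expensive model
                last_entropy = -np.sum(probs * np.log(probs + 1e-12))
                last_probs = probs
            outputs.append(last_probs)                  # otherwise reuse the last prediction
        return outputs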

2020 journal article

Detection of driver manual distraction via image-based hand and ear recognition

ACCIDENT ANALYSIS AND PREVENTION, 137.

By: L. Li, B. Zhong, C. Hutmacher, Y. Liang, W. Horrey & X. Xu

author keywords: Driving distraction; Upper extremity kinematics; Deep learning; Computer vision; Multi-class classification
MeSH headings : Accidents, Traffic / prevention & control; Adult; Algorithms; Data Collection; Distracted Driving; Ear / physiology; Female; Hand / physiology; Humans; Male; Neural Networks, Computer; Pattern Recognition, Automated / methods
TL;DR: A novel algorithm for detecting drivers' manual distraction was proposed; it achieved overall accuracy comparable to that of similar research and was more efficient than other methods. (via Semantic Scholar)
UN Sustainable Development Goal Categories
3. Good Health and Well-being (OpenAlex)
Sources: Web Of Science, NC State University Libraries
Added: April 6, 2020

2020 journal article

Enhancing the morphological segmentation of microscopic fossils through Localized Topology-Aware Edge Detection

AUTONOMOUS ROBOTS, 45(5), 709–723.

By: Q. Ge, T. Richmond, B. Zhong, T. Marchitto & E. Lobaton

author keywords: Edge detection; Topological structure; Morphological segmentation
TL;DR: A homology-based detector of local structural difference between two edge maps with a tolerable deformation is employed as a new criterion for the training and design of data-driven approaches that focus on enhancing these structural differences. (via Semantic Scholar)
UN Sustainable Development Goal Categories
14. Life Below Water (OpenAlex)
Sources: Web Of Science, NC State University Libraries, ORCID
Added: November 30, 2020

2020 journal article

Environmental Context Prediction for Lower Limb Prostheses With Uncertainty Quantification

IEEE TRANSACTIONS ON AUTOMATION SCIENCE AND ENGINEERING, 18(2), 458–470.

By: B. Zhong, R. Silva, M. Li, H. Huang & E. Lobaton

Contributors: B. Zhong, R. Silva, M. Li, H. Huang & E. Lobaton

author keywords: Uncertainty; Neural networks; Bayes methods; Measurement uncertainty; Cameras; Microsoft Windows; Bayesian neural network (BNN); environmental context prediction; prosthesis; uncertainty quantification
TL;DR: A novel vision-based context prediction framework for lower limb prostheses is developed that simultaneously predicts the human's environmental context for multiple forecast windows by leveraging Bayesian neural networks (BNNs), producing a calibrated predicted probability for online decision-making. (via Semantic Scholar) An illustrative calibration sketch follows this entry.
Sources: ORCID, Web Of Science, NC State University Libraries
Added: May 28, 2020
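
The entry above produces a calibrated predicted probability for online decision-making. Below is a minimal sketch of post-hoc calibration via temperature scaling in PyTorch; this single-parameter form is an illustrative stand-in for the paper's calibration step, not its exact network.

    import torch
    import torch.nn.functional as F

    def fit_temperature(logits, labels, iters=200, lr=0.01):
        # Learn a single temperature T on held-out data by minimizing NLL.
        log_t = torch.zeros(1, requires_grad=True)
        opt = torch.optim.Adam([log_t], lr=lr)
        for _ in range(iters):
            opt.zero_grad()
            loss = F.cross_entropy(logits / log_t.exp(), labels)
            loss.backward()
            opt.step()
        return log_t.exp().item()

    def calibrated_probs(logits, temperature):
        # Calibrated class probabilities used for online decision-making
        return F.softmax(logits / temperature, dim=-1)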

2020 journal article

Reliable Vision-Based Grasping Target Recognition for Upper Limb Prostheses

IEEE TRANSACTIONS ON CYBERNETICS, 52(3), 1750–1762.

By: B. Zhong, H. Huang & E. Lobaton

Contributors: B. Zhong, H. Huang & E. Lobaton

author keywords: Grasping; Uncertainty; Prosthetics; Task analysis; Target recognition; Bayes methods; Prediction algorithms; Bayesian deep learning (BDL); grasping strategy; reliable computer vision; upper limb prosthesis
MeSH headings : Arm; Artificial Limbs; Bayes Theorem; Hand Strength; Humans; Robotics; Upper Extremity
TL;DR: A novel, reliable vision-based framework was developed to assist upper limb prosthesis grasping during arm reaching, using Bayesian deep learning (BDL) and a probability calibration network to fuse the uncertainty measures into one calibrated probability for online decision making. (via Semantic Scholar)
Sources: ORCID, Web Of Science, NC State University Libraries
Added: June 11, 2020

2019 journal article

Automated species-level identification of planktic foraminifera using convolutional neural networks, with comparison to human performance

MARINE MICROPALEONTOLOGY, 147, 16–24.

By: R. Mitra, T. Marchitto, Q. Ge, B. Zhong, B. Kanakiya, M. Cook, J. Fehrenbacher, J. Ortiz, A. Tripati, E. Lobaton

author keywords: Foraminifera; Identification; Automation; Artificial intelligence; Neural network
TL;DR: Using machine learning techniques to train convolutional neural networks to identify six species of extant planktic foraminifera that are widely used by paleoceanographers, and to distinguish the six species from other taxa, demonstrates that the approach can provide a versatile ‘brain’ for an eventual automated robotic picking system. (via Semantic Scholar)
UN Sustainable Development Goal Categories
13. Climate Action (Web of Science)
Sources: Web Of Science, NC State University Libraries, ORCID
Added: April 15, 2019

2017 conference paper

A comparative study of image classification algorithms for foraminifera identification

2017 IEEE Symposium Series on Computational Intelligence (SSCI), 3199–3206.

By: B. Zhong, Q. Ge, B. Kanakiya, R. Mitra, T. Marchitto & E. Lobaton

TL;DR: A foram identification pipeline is proposed to automatically identify forams based on computer vision and machine learning techniques, and the classification algorithms provide results competitive with human experts' labeling of the data set. (via Semantic Scholar)
UN Sustainable Development Goal Categories
14. Life Below Water (OpenAlex)
Sources: NC State University Libraries, ORCID
Added: August 6, 2018

2017 conference paper

Coarse-to-fine Foraminifera image segmentation through 3d and deep features

2017 IEEE Symposium Series on Computational Intelligence (SSCI).

By: Q. Ge, B. Zhong, B. Kanakiya, R. Mitra, T. Marchitto & E. Lobaton

TL;DR: A learning-based edge detection pipeline using a coarse-to-fine strategy is proposed to extract the vague edges from foraminifera images for segmentation with a relatively small training set, and it has the potential to provide useful features for species identification and other applications such as morphological study of foraminifera shells and foraminifera dataset labeling. (via Semantic Scholar)
UN Sustainable Development Goal Categories
14. Life Below Water (OpenAlex)
Sources: NC State University Libraries, ORCID
Added: August 6, 2018

2017 conference paper

Emotion recognition with facial expressions and physiological signals

2017 IEEE Symposium Series on Computational Intelligence (SSCI), 1170–1177.

By: B. Zhong, Z. Qin, S. Yang, J. Chen, N. Mudrick, M. Taub, R. Azevedo, E. Lobaton

TL;DR: A temporal-information-preserving multi-modal emotion recognition framework based on physiological and facial expression data streams is presented; recognition performance improves significantly when physiological signals are used, and the best performance is achieved when fusing facial expressions and physiological data. (via Semantic Scholar)
Sources: NC State University Libraries, ORCID
Added: August 6, 2018

2017 conference paper

Energy-efficient activity recognition via multiple time-scale analysis

2017 IEEE Symposium Series on Computational Intelligence (SSCI), 1466–1472.

By: N. Lokare, S. Samadi, B. Zhong, L. Gonzalez, F. Mohammadzadeh & E. Lobaton

TL;DR: This work proposes a novel power-efficient strategy for supervised human activity recognition using a multiple time-scale approach that takes various window sizes into account; the proposed Sequential Maximum-Likelihood approach achieves a high F1 score across all activities while consuming less power than the standard Maximum-Likelihood approach. (via Semantic Scholar) An illustrative sequential-decision sketch follows this entry.
UN Sustainable Development Goal Categories
7. Affordable and Clean Energy (OpenAlex)
Sources: NC State University Libraries, ORCID
Added: August 6, 2018
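
A minimal sketch of a sequential decision rule over multiple window sizes, in the spirit of the entry above: accumulate per-class log-likelihoods from progressively longer windows and stop early once one class is confident enough. The two-class setup and confidence threshold are illustrative assumptions, not the paper's configuration.

    import numpy as np

    def sequential_ml(window_loglikes, confidence=0.95):
        # window_loglikes: per-class log-likelihood vectors, one per window
        # size (ordered short to long). Stop once one class is confident.
        total = np.zeros_like(np.asarray(window_loglikes[0], dtype=float))
        post = None
        for ll in window_loglikes:
            total = total + np.asarray(ll, dtype=float)
            post = np.exp(total - total.max())
            post = post / post.sum()
            if post.max() >= confidence:    # early stop saves the longer windows
                break
        return int(np.argmax(post)), post

    # Example with two classes and two window sizes
    label, post = sequential_ml([np.log([0.6, 0.4]), np.log([0.8, 0.2])])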

Citation Index includes data from a number of different sources. If you have questions about the sources of data in the Citation Index or need a set of data which is free to re-distribute, please contact us.

Certain data included herein are derived from the Web of Science© and InCites© (2024) of Clarivate Analytics. All rights reserved. You may not copy or re-distribute this material in whole or in part without the prior written consent of Clarivate Analytics.