PA-Tran: Learning to Estimate 3D Hand Pose with Partial Annotation

Sensors (Basel). 2023 Jan 31;23(3):1555. doi: 10.3390/s23031555.

Abstract

This paper tackles a novel and challenging problem: 3D hand pose estimation (HPE) from a single RGB image using partial annotation. Most HPE methods ignore the fact that keypoints may be only partially visible (e.g., under occlusion). In contrast, we propose a deep-learning framework, PA-Tran, that jointly estimates the keypoint status and the 3D hand pose from a single RGB image with two dependent branches. The regression branch consists of a Transformer encoder that is trained to predict a set of target keypoints, given an input set of status, position, and visual-feature embeddings from a convolutional neural network (CNN); the classification branch adopts a CNN for estimating the keypoint status. One key idea of PA-Tran is a selective mask training (SMT) objective that uses a binary encoding scheme to represent the status of each keypoint as observed or unobserved during training. By explicitly encoding the label status (observed/unobserved), the proposed PA-Tran can efficiently handle the condition in which only partial annotation is available. Investigating annotation percentages ranging from 50% to 100%, we show that training with partial annotation is more efficient (e.g., achieving the best PA-MPJPE of 6.0 when using about 85% of the annotations). Moreover, we provide two new datasets: APDM-Hand, a synthetic hand dataset with APDM sensor accessories, designed for a specific hand task; and PD-APDM-Hand, a real hand dataset collected from Parkinson's Disease (PD) patients with partial annotation. The proposed PA-Tran achieves higher estimation accuracy when evaluated on both proposed datasets and on a more general hand dataset.
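To make the selective mask training (SMT) idea concrete, the following is a minimal sketch, assuming a PyTorch setup, of how a binary keypoint-status encoding might restrict the 3D regression loss to observed (annotated) keypoints. This is an illustrative assumption, not the authors' implementation; the function name, tensor shapes, and masking details are hypothetical.

```python
# Illustrative sketch (not the paper's code): a selective-mask-training style
# loss in which keypoint status is a binary vector (1 = observed/annotated,
# 0 = unobserved) and the 3D regression error is averaged only over
# observed keypoints.
import torch

def selective_mask_loss(pred_xyz, target_xyz, status):
    """
    pred_xyz, target_xyz: (B, K, 3) predicted / annotated 3D keypoints.
    status:               (B, K) binary mask, 1 where an annotation exists.
    """
    per_kp_err = ((pred_xyz - target_xyz) ** 2).sum(dim=-1)  # (B, K) squared error per keypoint
    masked_err = per_kp_err * status                         # zero out unobserved keypoints
    # Normalize by the number of observed keypoints (avoid division by zero).
    return masked_err.sum() / status.sum().clamp(min=1.0)

# Usage example with random tensors (B = 2 images, K = 21 hand keypoints):
pred = torch.randn(2, 21, 3)
gt = torch.randn(2, 21, 3)
status = (torch.rand(2, 21) > 0.15).float()  # roughly 85% of keypoints annotated
print(selective_mask_loss(pred, gt, status).item())
```

In this hedged reading, the same binary status vector that masks the regression loss could also serve as the training target for the classification branch, which is what would let the two branches share the observed/unobserved encoding described in the abstract.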

Keywords: 3D hand pose estimation; PD (Parkinson’s disease) hand dataset; partial annotation; single RGB image; synthetic dataset; transformer.

MeSH terms

  • Hand*
  • Humans
  • Neural Networks, Computer*

Grants and funding

This research was funded by the Natural Sciences and Engineering Research Council of Canada (NSERC, RGPIN-2022-03049) and a Canadian Institutes of Health Research (CIHR) / NSERC Collaborative Health Research Project (CPG-163986).