Title: Balanced Feature Fusion for Grouped 3D Pose Estimation
Authors: Peng, Jihua
Zhou, Yanghong
Mok, P.Y.
Citation: WSCG 2022: full papers proceedings: 30. International Conference in Central Europe on Computer Graphics, Visualization and Computer Vision, p. 103-108.
Issue Date: 2022
Publisher: Václav Skala - UNION Agency
Document type: conferenceObject
URI: http://hdl.handle.net/11025/49583
ISBN: 978-80-86943-33-6
ISSN: 2464-4617
Keywords: 3D human pose estimation;grouping feature fusion;anatomical relationships
Abstract: Estimating 3D human pose by grouping body joints according to their anatomical relationships is currently a popular and effective approach. For grouped pose estimation, effectively fusing the features of different groups is the key step in ensuring the integrity of the whole-body pose prediction. However, existing methods for feature fusion between groups require a large number of network parameters and are therefore computationally expensive. In this paper, we propose a simple yet efficient feature fusion method that improves the accuracy of pose estimation while requiring fewer parameters and less computation. Experiments show that the proposed network outperforms previous state-of-the-art results on the Human3.6M dataset.
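
The general idea of grouped feature fusion described in the abstract can be illustrated with a minimal sketch. The PyTorch code below is not the authors' published implementation: it assumes the standard 17-joint Human3.6M skeleton, an illustrative split into five anatomical groups (torso/head, two arms, two legs), and a deliberately lightweight fusion step (concatenating per-group features and mixing them with a single shared linear layer). The class name GroupFusion, the group indices, and the feature sizes are all hypothetical.

# Hypothetical sketch of grouped pose estimation with lightweight
# cross-group feature fusion (not the authors' published code).
import torch
import torch.nn as nn

# Assumed 17-joint Human3.6M skeleton split into 5 anatomical groups;
# the joint indices here are illustrative only.
GROUPS = [
    [0, 7, 8, 9, 10],   # torso and head
    [11, 12, 13],       # left arm
    [14, 15, 16],       # right arm
    [1, 2, 3],          # right leg
    [4, 5, 6],          # left leg
]

class GroupFusion(nn.Module):
    """Encode each joint group separately, then fuse with one shared layer."""
    def __init__(self, feat_dim=64):
        super().__init__()
        # One small encoder per group: 2D joint coordinates -> group feature.
        self.encoders = nn.ModuleList(
            [nn.Linear(2 * len(g), feat_dim) for g in GROUPS]
        )
        # Lightweight fusion: concatenate group features, mix them once.
        self.fuse = nn.Linear(feat_dim * len(GROUPS), feat_dim * len(GROUPS))
        # Per-group heads predicting 3D coordinates from the fused feature.
        self.heads = nn.ModuleList(
            [nn.Linear(feat_dim * len(GROUPS), 3 * len(g)) for g in GROUPS]
        )

    def forward(self, joints_2d):                   # joints_2d: (B, 17, 2)
        group_feats = [
            enc(joints_2d[:, g, :].flatten(1))      # (B, feat_dim) per group
            for enc, g in zip(self.encoders, GROUPS)
        ]
        fused = torch.relu(self.fuse(torch.cat(group_feats, dim=1)))
        out = joints_2d.new_zeros(joints_2d.shape[0], 17, 3)
        for head, g in zip(self.heads, GROUPS):
            out[:, g, :] = head(fused).view(-1, len(g), 3)
        return out                                  # (B, 17, 3) 3D pose

model = GroupFusion()
pose_3d = model(torch.randn(4, 17, 2))              # batch of 4 input poses
print(pose_3d.shape)                                # torch.Size([4, 17, 3])

Because every group shares a single fusion layer, the parameter count grows only with the total feature width rather than with the number of group pairs, which is the kind of efficiency trade-off the abstract refers to.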
Rights: © Václav Skala - UNION Agency
Appears in Collections:WSCG 2022: Full Papers Proceedings

Files in This Item:
File: B59-full.pdf | Description: Full text | Size: 1.4 MB | Format: Adobe PDF


