Real-time 3D human motion capture from binocular stereo vision and its use on human body animation
16 October 2019
Existing motion-capture technologies rely on expensive, specialized equipment and typically require wearable devices or markers attached to the body joints. This limits their accessibility and appeal for mass-market consumers. To address these problems, we present a pipeline for 3D human motion capture from binocular stereo vision, built on MobilePose, a supervised learning method for real-time 2D skeletal joint detection. Because both the 2D joint detection and the subsequent 3D reconstruction run quickly, our approach captures 3D human motion in real time. In addition, we use the captured 3D motion information to implement a simple human body animation application. Compared with Microsoft's Kinect, which relies on a depth camera, our method works with ordinary, widely available cameras at low cost.
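To make the reconstruction step concrete, the following is a minimal sketch of how 2D joint detections from the two views can be lifted to 3D by triangulation with OpenCV. It assumes calibrated cameras with known 3x4 projection matrices; the function name and the detector interface are hypothetical, and the abstract does not specify which triangulation method the authors use.

    import numpy as np
    import cv2

    def triangulate_joints(joints_left, joints_right, P_left, P_right):
        """Triangulate 3D joint positions from matched 2D detections.

        joints_left, joints_right: (N, 2) arrays of 2D joint coordinates
            in the left/right images (e.g., the per-view output of a 2D
            pose detector such as MobilePose).
        P_left, P_right: (3, 4) camera projection matrices from stereo
            calibration.
        Returns an (N, 3) array of joint positions in 3D space.
        """
        # cv2.triangulatePoints expects points as (2, N) arrays.
        pts_l = np.asarray(joints_left, dtype=np.float64).T
        pts_r = np.asarray(joints_right, dtype=np.float64).T
        # Homogeneous 3D points, shape (4, N).
        X_h = cv2.triangulatePoints(P_left, P_right, pts_l, pts_r)
        # Convert from homogeneous to Euclidean coordinates.
        return (X_h[:3] / X_h[3]).T

Running the 2D detector on each camera frame and passing the matched joints through a routine like this per frame would yield the real-time 3D skeleton stream that drives the animation application.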
Xupeng Wei, Zhongzhi Zhang, Shuang Li, Tao Yang, Xiang Zhou, "Real-time 3D human motion capture from binocular stereo vision and its use on human body animation," Proc. SPIE 11205, Seventh International Conference on Optical and Photonic Engineering (icOPEN 2019), 112050S (16 October 2019); https://doi.org/10.1117/12.2542228