  • Paper Information
  • Received: Jan. 14, 2020

    Accepted: Mar. 9, 2020

    Posted: Oct. 1, 2020

    Published Online: Oct. 17, 2020

The Author Email: Liu Yuhong

    DOI: 10.3788/LOP57.201509

  • Citation

    Xiankun Zhang, Rongfen Zhang, Yuhong Liu. Human Pose Estimation Based on Secondary Generation Adversary[J]. Laser & Optoelectronics Progress, 2020, 57(20): 201509


  • Category: Machine Vision
Laser & Optoelectronics Progress, Vol. 57, Issue 20, 201509 (2020)

Human Pose Estimation Based on Secondary Generation Adversary

Zhang Xiankun, Zhang Rongfen, and Liu Yuhong*

Author Affiliations

  • Key Laboratory of Big Data and Intelligent Technology, College of Big Data and Information Engineering, Guizhou University, Guiyang, Guizhou 550025, China



Aiming at the problem of inaccurate estimation results caused by the complexity of limbs and environment in human pose estimation, this work proposes a human pose estimation method based on secondary generative adversarial training. The stacked hourglass network (SHN) undergoes adversarial training in two stages. First, the SHN serves as the discriminator in the first generative adversarial network, where online adversarial data is used to strengthen training and improve the SHN's estimation performance. Then, the SHN serves as the generator in the second generative adversarial network, with limb geometric constraints acting as the discriminator. The second round of adversarial training improves the SHN's estimation performance again and yields the final network. The proposed method is evaluated on the public LSP and MPII datasets, and the results show that it effectively improves the estimation accuracy of the SHN.
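The two-stage schedule described in the abstract can be sketched as follows. This is an illustrative outline only, not the authors' implementation: the class and function names (`Hourglass`, `augment_adversarial`, `limb_constraint_score`) are hypothetical stand-ins, and real training would use heatmap losses and a learned geometric-constraint discriminator rather than the placeholders shown here.

```python
import random

class Hourglass:
    """Stand-in for the stacked hourglass network (SHN)."""
    def __init__(self):
        self.updates = 0

    def predict(self, image):
        # Placeholder: return dummy coordinates for 16 body joints.
        return [(random.random(), random.random()) for _ in range(16)]

    def update(self, loss):
        # Placeholder for a gradient step on the given loss.
        self.updates += 1

def augment_adversarial(image):
    """Stage 1: online adversarial data generation
    (e.g. perturbations/occlusions that currently fool the SHN)."""
    return image  # placeholder perturbation

def limb_constraint_score(pose):
    """Stage 2 discriminator stand-in: plausibility of limb
    geometry in [0, 1]; a real model would score bone lengths
    and joint-angle consistency."""
    return 1.0 if len(pose) == 16 else 0.0

def train_two_stage(shn, images, epochs=2):
    # Stage 1: SHN acts as the discriminator and is hardened
    # against online adversarial samples.
    for _ in range(epochs):
        for img in images:
            hard = augment_adversarial(img)
            shn.predict(hard)
            shn.update(loss=0.0)  # supervised heatmap loss would go here
    # Stage 2: SHN acts as the generator against the
    # limb-geometric-constraint discriminator.
    for _ in range(epochs):
        for img in images:
            pose = shn.predict(img)
            adv_loss = 1.0 - limb_constraint_score(pose)
            shn.update(loss=adv_loss)
    return shn

shn = train_two_stage(Hourglass(), images=[None] * 4)
print(shn.updates)  # 16: 2 epochs x 4 images x 2 stages
```

The key design point the abstract emphasizes is that the same SHN plays opposite adversarial roles across the two stages: it is the attacked discriminator in stage 1 and the scrutinized generator in stage 2, so each stage improves it from a different direction.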

