Journal of System Simulation

Abstract

Abstract: Many virtual fitting systems study only human-computer interaction and clothing simulation and cannot make the clothing model rotate 360 degrees along with the human body. To solve this problem, an improved Kinect-based virtual fitting method using motion prediction is proposed. With the help of Kinect, the user's skeleton feature points are obtained and tracked in real time. Based on the obtained head joint point and the color image, the face is detected to judge whether the user is facing forward or backward. The motion trajectories of the left and right shoulder joint points are predicted with a gray model, and when the depth coordinates of a joint point vary sharply, the data obtained by Kinect are corrected. The proposed method has two advantages: sense of reality, as the system realizes real-time 360-degree virtual fitting; and real-time performance, as the gray forecast produces predictions quickly, so the clothing model rotates with the human body in real time. Experimental results show that the 3D virtual fitting system achieves a better fitting effect.
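The abstract does not give the prediction equations, but the gray forecast it mentions is conventionally the GM(1,1) model. The following is a minimal sketch, not the authors' implementation, of how a GM(1,1) model could forecast the next value of a shoulder joint coordinate from a short window of recent Kinect samples; the function name, window length, and sample values are illustrative assumptions.

```python
import numpy as np

def gm11_predict(x0, steps=1):
    """Forecast future values of a short positive series with a GM(1,1) gray model.

    x0    : 1-D sequence of recent observations (e.g. a joint coordinate over
            the last few frames); GM(1,1) assumes the values are positive.
    steps : number of future samples to forecast.
    """
    x0 = np.asarray(x0, dtype=float)
    n = len(x0)
    x1 = np.cumsum(x0)                          # accumulated generating sequence
    z1 = 0.5 * (x1[1:] + x1[:-1])               # background (mean) values
    B = np.column_stack((-z1, np.ones(n - 1)))  # coefficient matrix
    Y = x0[1:]
    a, b = np.linalg.lstsq(B, Y, rcond=None)[0]  # developing coefficient a, gray input b

    def x1_hat(k):
        # Fitted accumulated value at 0-based index k.
        return (x0[0] - b / a) * np.exp(-a * k) + b / a

    # Inverse accumulation: consecutive differences of the accumulated fit.
    return np.array([x1_hat(k) - x1_hat(k - 1) for k in range(n, n + steps)])

# Example: hypothetical depth samples (in meters) of a shoulder joint over 5 frames.
recent_depth = [1.52, 1.49, 1.47, 1.46, 1.44]
print(gm11_predict(recent_depth, steps=2))
```

In a fitting pipeline of the kind the abstract describes, such a forecast could replace a depth reading whenever the new Kinect measurement deviates sharply from the predicted value, keeping the rotation of the clothing model smooth.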

First Page

2378

Revised Date

2016-07-11

Last Page

2385

CLC

TP391.9

Recommended Citation

Zhang Xiaoli, Yao Junfeng, Huang Ping. 360-Degree Virtual Fitting Based on Kinect[J]. Journal of System Simulation, 2016, 28(10): 2378-2385.
