Journal of System Simulation

Abstract

Abstract: The need for spatiotemporal consistency between multi-modal haptic and immersive visual feedback, together with the difficulty of integrating hardware devices into existing VR software frameworks, restricts the rapid development of visuo-haptic fusion scenarios in industries such as medicine, business, and entertainment. To tackle this problem, an adaptable VR software framework combining multi-modal haptic feedback and immersive visual display is proposed. To meet the differing requirements of visual rendering, haptic rendering, and hardware control, double-layer and three-layer architectures are devised according to the precision required of the visuo-haptic fusion feedback. Experimental results show that the framework supports synchronous visuo-haptic feedback, convenient integration of diverse haptic devices, simple scene development, and rapid replacement of interactive scenes, and that it can meet the requirements of different application fields.
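
The paper itself does not include source code; the following is a minimal, hypothetical C++ sketch of the kind of layered separation the abstract describes: a high-rate haptic rendering loop and a lower-rate visual rendering loop running concurrently and exchanging the interaction state so that the two modalities stay synchronized. All names (ProxyState, hapticLoop, visualLoop) are illustrative assumptions, not APIs from the framework.

// Hypothetical sketch (not the paper's code): a two-layer split in which the
// haptic layer updates forces at ~1 kHz and the visual layer redraws at ~60 Hz,
// sharing the proxy state through a mutex-guarded snapshot.
#include <atomic>
#include <chrono>
#include <mutex>
#include <thread>

struct ProxyState {            // pose of the haptic interaction point
    double x = 0.0, y = 0.0, z = 0.0;
};

std::mutex g_stateMutex;
ProxyState g_proxy;            // shared between haptic and visual layers
std::atomic<bool> g_running{true};

void hapticLoop() {            // haptic layer: high-rate force update (~1 kHz)
    while (g_running) {
        {
            std::lock_guard<std::mutex> lock(g_stateMutex);
            // ... read device position, compute contact force,
            //     command the haptic device, update g_proxy ...
        }
        std::this_thread::sleep_for(std::chrono::microseconds(1000));
    }
}

void visualLoop() {            // visual layer: scene update and redraw (~60 Hz)
    while (g_running) {
        ProxyState snapshot;
        {
            std::lock_guard<std::mutex> lock(g_stateMutex);
            snapshot = g_proxy;    // take a consistent snapshot for rendering
        }
        // ... move the proxy avatar in the scene graph and render the frame ...
        std::this_thread::sleep_for(std::chrono::milliseconds(16));
    }
}

int main() {
    std::thread haptic(hapticLoop);
    std::thread visual(visualLoop);
    std::this_thread::sleep_for(std::chrono::seconds(2));  // run briefly
    g_running = false;
    haptic.join();
    visual.join();
    return 0;
}

In such a split, the haptic layer owns the tight timing loop required for stable force feedback, while the visual layer only consumes periodic snapshots; the paper's three-layer variant additionally separates low-level hardware control from haptic rendering.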

First Page

1385

Revised Date

2019-11-14

Last Page

1392

CLC

TP391.9

Recommended Citation

Guo Yuan, Tong Qianqian, Zheng Yukai, Wang Ziqi, Zhang Yuru, Wang Dangxiao. An Adaptable VR Software Framework for Collaborative Multi-modal Haptic and Immersive Visual Display[J]. Journal of System Simulation, 2020, 32(7): 1385-1392.

Corresponding Author

Dangxiao Wang

DOI

10.16182/j.issn1004731x.joss.19-VR0440
