Journal: Frontiers in Physics
Title: Fusion Coding of 3D Real and Virtual Scenes Information for Augmented Reality-Based Holographic Stereogram
Authors: Xingpeng Yan1, Xi Wang1, Yunpeng Liu1, Song Chen1, Xiaoyu Jiang1, Xinlei Liu1, Tao Jing1, Min Lin1, Pei Li2
Affiliations: [1] Beijing, China; [2] Shenzhen, China
Keywords: holographic stereogram; augmented reality; instance segmentation; 3D display; fusion of 3D real and virtual scenes
DOI: 10.3389/fphy.2021.736268
Source: Frontiers
【 Abstract 】
In this paper, an optical field coding method for the fusion of real and virtual scenes is proposed to implement an augmented reality (AR)-based holographic stereogram. The occlusion relationship between the real and virtual scenes is analyzed, and a fusion strategy based on instance segmentation and depth determination is proposed. A real three-dimensional (3D) scene sampling system is built, and the foreground contours of the sampled perspective images are extracted with the Mask R-CNN instance segmentation algorithm. The virtual 3D scene is rendered by a computer to obtain the virtual sampled images as well as their depth maps. According to the occlusion relation of the fused scenes, a pseudo-depth map of the real scene is derived, and fusion coding of the 3D real and virtual scene information is implemented by comparing the depth information. The optical experiment indicates that the AR-based holographic stereogram fabricated with our coding method can reconstruct the fused real and virtual 3D scene with correct occlusion and depth cues under full parallax.
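The core of the fusion coding described above is a per-pixel depth comparison between the real foreground (assigned a pseudo-depth) and the rendered virtual scene. The following is a minimal sketch of that step only, assuming the foreground mask comes from Mask R-CNN and that a pseudo-depth value or map has already been assigned to the real foreground; the function and parameter names (e.g., fuse_perspective_pair, far_depth) are illustrative and not taken from the paper.

```python
import numpy as np

def fuse_perspective_pair(real_img, virtual_img, real_mask,
                          real_pseudo_depth, virtual_depth,
                          far_depth=np.inf):
    """Fuse one real and one virtual sampled perspective image by depth comparison.

    real_img, virtual_img : (H, W, 3) images captured/rendered from the same viewpoint.
    real_mask             : (H, W) boolean foreground mask from instance segmentation.
    real_pseudo_depth     : scalar or (H, W) pseudo-depth assigned to the real foreground.
    virtual_depth         : (H, W) depth map rendered with the virtual scene.
    """
    # Real-scene depth: pseudo-depth inside the foreground, "infinitely far" outside,
    # so the virtual scene is kept wherever there is no real foreground.
    real_depth = np.where(real_mask, real_pseudo_depth, far_depth)

    # Occlusion rule: at each pixel keep whichever scene is closer to the camera.
    take_real = real_depth <= virtual_depth
    return np.where(take_real[..., None], real_img, virtual_img)
```

The fused perspective images produced this way would then be resampled into hogel exposure patterns following the usual full-parallax holographic stereogram printing pipeline, which is beyond the scope of this sketch.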
【 License 】
CC BY
【 Preview 】
| Files | Size | Format |
|---|---|---|
| RO202110275017203ZK.pdf | 3857 KB | PDF |