Title: | Depth Assisted Fast Neural Radiance Fields |
Authors: | Dey, Arnab; Ahmine, Yassine; Comport, Andrew I. |
Citation: | Journal of WSCG. 2022, vol. 30, no. 1-2, p. 34-43. |
Issue Date: | 2022 |
Publisher: | Václav Skala - UNION Agency |
Document type: | article |
URI: | http://hdl.handle.net/11025/49392 |
ISSN: | 1213-6972 (print) 1213-6964 (on-line) |
Keywords: | computer vision;RGB-D NeRF;NeRF;neural scene representation;neural rendering;volume rendering |
Abstract: | Neural scene representations, such as Neural Radiance Fields (NeRF), are based on training a multilayer perceptron (MLP) using a set of color images with known poses. An increasing number of devices now produce RGB-D (color + depth) information, which has been shown to be very important for a wide range of tasks. Therefore, the aim of this paper is to investigate what improvements can be made to these promising implicit representations by incorporating depth information alongside the color images. In particular, the recently proposed Mip-NeRF approach, which uses conical frustums instead of rays for volume rendering, allows one to account for the varying area of a pixel with distance from the camera center. The proposed method additionally models depth uncertainty. This makes it possible to address major limitations of NeRF-based approaches, including inaccurate geometry, rendering artifacts, long training times, and slow prediction. Experiments are performed on well-known benchmark scenes, and comparisons show improved accuracy in scene geometry and photometric reconstruction while reducing the training time by a factor of 3 to 5. |
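To make the two ideas in the abstract concrete, the following is a minimal NumPy sketch of standard NeRF-style volume rendering along a single ray, plus an uncertainty-weighted depth term of the kind the abstract describes. This is an illustration, not the authors' implementation: the function names, the alpha-compositing formulation, and the Gaussian-style depth penalty are assumptions for exposition.

```python
import numpy as np

def render_ray(sigmas, colors, t_vals):
    """Volume rendering along one ray: alpha-composite sampled
    densities (sigmas) and colors at depths t_vals."""
    # Spacing between samples; the last interval is treated as unbounded.
    deltas = np.diff(t_vals, append=t_vals[-1] + 1e10)
    # Per-sample opacity from density and interval length.
    alphas = 1.0 - np.exp(-sigmas * deltas)
    # Transmittance: probability the ray reaches each sample unoccluded.
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alphas[:-1] + 1e-10]))
    weights = alphas * trans
    rgb = (weights[:, None] * colors).sum(axis=0)   # rendered pixel color
    depth = (weights * t_vals).sum()                # expected termination depth
    return rgb, depth, weights

def depth_loss(pred_depth, sensor_depth, sensor_std):
    """Illustrative uncertainty-weighted depth term: the squared residual
    is down-weighted where the depth sensor is noisy (large sensor_std)."""
    return ((pred_depth - sensor_depth) ** 2) / (2.0 * sensor_std ** 2)
```

The expected depth from `render_ray` can be supervised against a sensor measurement via `depth_loss`, so confident depth readings pull the geometry strongly while uncertain ones contribute little; this is one plausible way such a depth term speeds up convergence relative to photometric supervision alone.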
Rights: | © Václav Skala - UNION Agency |
Appears in Collections: | Volume 30, Number 1-2 (2021) |
Files in This Item:
File | Description | Size | Format |
---|---|---|---|
B02-full.pdf | Full text | 10.83 MB | Adobe PDF | View/Open