Full metadata record
DC Field: Value [Language]
dc.contributor.author: Pourmand, S.
dc.contributor.author: Merillou, N.
dc.contributor.author: Merillou, S.
dc.contributor.editor: Skala, Václav
dc.date.accessioned: 2022-09-01T10:31:10Z
dc.date.available: 2022-09-01T10:31:10Z
dc.date.issued: 2022
dc.identifier.citation: WSCG 2022: full papers proceedings: 30. International Conference in Central Europe on Computer Graphics, Visualization and Computer Vision, p. 135-141. [en]
dc.identifier.isbn: 978-80-86943-33-6
dc.identifier.issn: 2464-4617
dc.identifier.uri: http://hdl.handle.net/11025/49587
dc.format: 7 s. (7 pages) [cs]
dc.format.mimetype: application/pdf
dc.language.iso: en [en]
dc.publisher: Václav Skala - UNION Agency [en]
dc.rights: © Václav Skala - UNION Agency [en]
dc.subject: dokončení hloubky (depth completion) [cs]
dc.subject: RGB-D obrázky (RGB-D images) [cs]
dc.subject: syntetický datový soubor (synthetic dataset) [cs]
dc.subject: zrcadlové odrazy (specular reflections) [cs]
dc.title: Depth Completion for Close-Range Specular Objects [en]
dc.type: conferenceObject [en]
dc.rights.access: openAccess [en]
dc.type.version: publishedVersion [en]
dc.description.abstract-translated: Many objects in the real world exhibit specular reflections. Due to the limitations of basic RGB-D cameras, it is particularly challenging to accurately capture their 3D shapes. In this work, we present an approach to correct the depth of close-range specular objects using convolutional neural networks. We first generate a synthetic dataset containing such close-range objects. We then train a deep convolutional network to estimate normal and boundary maps from a single image. With these results, we propose an algorithm to detect the incorrect areas of the raw depth map. After removing these erroneous regions, we complete the depth channel. [en]
dc.subject.translated: depth completion [en]
dc.subject.translated: RGB-D images [en]
dc.subject.translated: synthetic dataset [en]
dc.subject.translated: specular reflections [en]
dc.identifier.doi: https://www.doi.org/10.24132/CSRN.3201.17
dc.type.status: Peer-reviewed [en]
Appears in Collections: WSCG 2022: Full Papers Proceedings

Files in This Item:
File          Description             Size      Format
B89-full.pdf  Full text (Plný text)   3.59 MB   Adobe PDF


Please use this identifier to cite or link to this item: http://hdl.handle.net/11025/49587

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.