Full metadata record
DC Field | Value | Language
dc.contributor.author | Discher, Sören | -
dc.contributor.author | Masopust, Leon | -
dc.contributor.author | Schulz, Sebastian | -
dc.contributor.editor | Skala, Václav | -
dc.date.accessioned | 2019-05-07T07:23:01Z | -
dc.date.available | 2019-05-07T07:23:01Z | -
dc.date.issued | 2018 | -
dc.identifier.citation | Journal of WSCG. 2018, vol. 26, no. 2, p. 76-84. | en
dc.identifier.issn | 1213-6972 (print) | -
dc.identifier.issn | 1213-6980 (CD-ROM) | -
dc.identifier.issn | 1213-6964 (on-line) | -
dc.identifier.uri | wscg.zcu.cz/WSCG2018/!_2018_Journal_WSCG-No-2.pdf | -
dc.identifier.uri | http://hdl.handle.net/11025/34593 | -
dc.format | 9 pages | cs
dc.format.mimetype | application/pdf | -
dc.language.iso | en | en
dc.publisher | Václav Skala - UNION Agency | cs
dc.relation.ispartofseries | Journal of WSCG | en
dc.rights | © Václav Skala - UNION Agency | cs
dc.subject | virtuální realita | cs
dc.subject | 3D bodová mračna | cs
dc.subject | vykreslování v reálném čase | cs
dc.title | A point-based and image-based multi-pass rendering technique for visualizing massive 3D point clouds in VR environments | en
dc.type | článek | cs
dc.type | article | en
dc.rights.access | openAccess | en
dc.type.version | publishedVersion | en
dc.description.abstract-translated | Real-time rendering of 3D point clouds allows for interactively exploring and inspecting real-world assets, sites, or regions on a broad range of devices but has to cope with their vastly different computing capabilities. Virtual reality (VR) applications rely on high frame rates (i.e., around 90 fps as opposed to 30-60 fps) and are highly sensitive to any kind of visual artifacts, which are typical for 3D point cloud depictions (e.g., holey surfaces or visual clutter due to inappropriate point sizes). We present a novel rendering system that allows for an immersive, nausea-free exploration of arbitrarily large 3D point clouds on state-of-the-art VR devices such as the HTC Vive and Oculus Rift. Our approach applies several point-based and image-based rendering techniques that are combined in a multi-pass rendering pipeline. The approach does not require deriving generalized, mesh-based representations in a preprocessing step and preserves the precision and density of the raw 3D point cloud data. The presented techniques have been implemented and evaluated with massive real-world data sets from aerial, mobile, and terrestrial acquisition campaigns containing up to 2.6 billion points, demonstrating the practicability and scalability of our approach. | en
dc.subject.translated | virtual reality | en
dc.subject.translated | 3D point clouds | en
dc.subject.translated | real-time rendering | en
dc.identifier.doi | https://doi.org/10.24132/JWSCG.2018.26.2.2 | -
dc.type.status | Peer-reviewed | en
Appears in Collections: Volume 26, Number 2 (2018)

Files in This Item:
File | Description | Size | Format
Discher.pdf | Full text | 16.07 MB | Adobe PDF


Use this identifier to cite or link to this item: http://hdl.handle.net/11025/34593

All items in DSpace are protected by copyright, with all rights reserved.