Full metadata record
DC Field: Value (Language)
dc.contributor.author: Klóska, Dominika
dc.contributor.author: Dziembowski, Adrian
dc.contributor.author: Samelak, Jarosław
dc.contributor.editor: Skala, Václav
dc.date.accessioned: 2023-10-17T16:05:17Z
dc.date.available: 2023-10-17T16:05:17Z
dc.date.issued: 2023
dc.identifier.citation: WSCG 2023: full papers proceedings: 31. International Conference in Central Europe on Computer Graphics, Visualization and Computer Vision, p. 268-276. (en)
dc.identifier.isbn: 978-80-86943-32-9
dc.identifier.issn: 2464-4617 (print)
dc.identifier.issn: 2464-4625 (CD/DVD)
dc.identifier.uri: http://hdl.handle.net/11025/54433
dc.description.sponsorship: This work was supported by the Ministry of Education and Science of the Republic of Poland. (en)
dc.format: 9 p. (cs)
dc.format.mimetype: application/pdf
dc.language.iso: en (en)
dc.publisher: Václav Skala - UNION Agency (en)
dc.rights: © Václav Skala - UNION Agency (en)
dc.subject: immersive video (cs)
dc.subject: virtual view synthesis (cs)
dc.subject: MPEG immersive video (cs)
dc.subject: MIV (cs)
dc.title: Versatile Input View Selection for Efficient Immersive Video Transmission (en)
dc.type: conference paper (cs)
dc.type: conferenceObject (en)
dc.rights.access: openAccess (en)
dc.type.version: publishedVersion (en)
dc.description.abstract-translated: In this paper we deal with the problem of optimally selecting the input views that are transmitted within an immersive video bitstream. Due to the limited bitrate and pixel rate, only a subset of the input views available on the encoder side can be fully transmitted to the decoder. The remaining views are, in the simplest approach, omitted, or, in the newest immersive video encoding standard (MPEG immersive video, MIV), pruned to remove less important information. Selecting the proper views for transmission is therefore crucial for the quality of the immersive video user's experience. In this paper we analyze which input views should be selected to provide the best possible quality of virtual views, independently of the viewport requested by the viewer. Moreover, we propose an algorithm that takes into account the non-uniform probability of the user's viewing direction, increasing the subjective quality of virtual navigation for omnidirectional content. (en)
dc.subject.translated: immersive video (en)
dc.subject.translated: virtual view synthesis (en)
dc.subject.translated: MPEG immersive video (en)
dc.subject.translated: MIV (en)
dc.identifier.doi: https://www.doi.org/10.24132/CSRN.3301.31
dc.type.status: Peer-reviewed (en)
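The abstract above only outlines the idea of probability-weighted view selection. The sketch below is a minimal illustration of that idea, not the authors' actual MIV algorithm (see the full paper via the DOI above): greedily pick the subset of input views that maximizes the expected virtual-view quality, weighting each viewing direction by how likely the user is to look there. All names, the per-direction quality scores, and the greedy strategy are assumptions made for this illustration.

import numpy as np

def select_views(quality, direction_prob, num_selected):
    """Greedy selection of input views (illustrative sketch).

    quality        -- (num_views, num_directions) array; quality[v, d] is an
                      assumed synthesis-quality score for viewing direction d
                      when view v is transmitted (hypothetical metric).
    direction_prob -- (num_directions,) probability that the user looks in
                      each direction (non-uniform for omnidirectional content).
    num_selected   -- how many views the bitrate / pixel-rate budget allows.
    """
    num_views, num_directions = quality.shape
    selected = []
    # Best quality achievable so far in each direction (nothing sent yet).
    best_per_direction = np.zeros(num_directions)

    for _ in range(num_selected):
        best_gain, best_view = -1.0, None
        for v in range(num_views):
            if v in selected:
                continue
            # Expected improvement if view v is added, weighted by where
            # the user is likely to look.
            improved = np.maximum(best_per_direction, quality[v])
            gain = float(direction_prob @ (improved - best_per_direction))
            if gain > best_gain:
                best_gain, best_view = gain, v
        selected.append(best_view)
        best_per_direction = np.maximum(best_per_direction, quality[best_view])

    return selected

# Example: 4 candidate views, 3 viewing directions, budget of 2 views.
quality = np.array([[30.0, 20.0, 10.0],
                    [10.0, 32.0, 15.0],
                    [12.0, 14.0, 33.0],
                    [25.0, 25.0, 25.0]])
direction_prob = np.array([0.6, 0.3, 0.1])  # user mostly looks "forward"
print(select_views(quality, direction_prob, num_selected=2))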
Appears in Collections: WSCG 2023: Full Papers Proceedings

Files in This Item:
File: G13-full.pdf
Description: Full text
Size: 2.18 MB
Format: Adobe PDF

