Full metadata record
DC field: Value [Language]
dc.contributor.author: Del-Tejo-Catala, Omar
dc.contributor.author: Perez, Javier
dc.contributor.author: Guardiola, Jose
dc.contributor.author: Perez, Alberto J.
dc.contributor.author: Perez-Cortes, Juan-Carlos
dc.contributor.editor: Skala, Václav
dc.date.accessioned: 2023-10-17T14:40:06Z
dc.date.available: 2023-10-17T14:40:06Z
dc.date.issued: 2023
dc.identifier.citation: WSCG 2023: full papers proceedings: 1. International Conference in Central Europe on Computer Graphics, Visualization and Computer Vision, p. 127-136. [en]
dc.identifier.isbn: 978-80-86943-32-9
dc.identifier.issn: 2464-4617 (print)
dc.identifier.issn: 2464-4625 (CD/DVD)
dc.identifier.uri: http://hdl.handle.net/11025/54418
dc.description.sponsorship: This work was funded with grants from the Generalitat Valenciana (grant number IMAMCA/2023/11), the European Regional Development Fund (ERDF) (grant number IMDEEA/2022/46), and CDTI's CEL.IA project (grant number CER-20211022). [en]
dc.format: 10 pages [cs]
dc.format.mimetype: application/pdf
dc.language.iso: en [en]
dc.publisher: Václav Skala - UNION Agency [en]
dc.rights: © Václav Skala - UNION Agency [en]
dc.subject: odhad pozice (pose estimation) [cs]
dc.subject: CycleGAN [cs]
dc.subject: obraz od obrazu (image-to-image) [cs]
dc.subject: graf neuronových sítí (Graph Neural Networks) [cs]
dc.subject: UNet [cs]
dc.subject: adaptace domény (domain adaptation) [cs]
dc.subject: CUT [cs]
dc.subject: Symmetry Robust Pose Estimation [cs]
dc.title: Synthetic-Real Domain Adaptation for Probabilistic Pose Estimation [en]
dc.type: konferenční příspěvek (conference paper) [cs]
dc.type: conferenceObject [en]
dc.rights.access: openAccess [en]
dc.type.version: publishedVersion [en]
dc.description.abstract-translated: Real samples are costly to acquire in many real-world problems, so synthetic samples are usually the primary means of training models that require large amounts of data. However, the difference between synthetically generated and real images, called the domain gap, is the most significant hindrance to this approach, as it limits the model's generalization capacity. Domain adaptation techniques are therefore crucial when training models on synthetic samples, and this article explores several of them for pose estimation from a probabilistic multiview perspective. Probabilistic multiview pose estimation addresses the problem of object symmetries: a single view may not determine an object's 6D pose uniquely, so the model must treat its prediction as a distribution over possible candidates. GANs are currently the state of the art in domain adaptation. In particular, this paper explores CUT and CycleGAN, whose distinct training losses address domain adaptation from different perspectives. This work also evaluates a patch-wise variation of CycleGAN that keeps local information in the same place. The datasets explored are a cylinder and a sphere extracted from a Kaggle challenge; both exhibit perspective-wise symmetries, although holistically they have unique 6D poses. One of the main findings is that probabilistic pose estimation trained on synthetic samples cannot succeed without addressing the domain gap between synthetic and real samples. CUT outperforms CycleGAN in feature adaptation, but it is less robust than CycleGAN at keeping keypoints intact after translation, leading to pose prediction errors for some objects. Moreover, training the models on synthetic-to-real images and evaluating them on real images improves the model's accuracy for datasets without complex features; this approach is also more suitable for industrial applications because it reduces inference overhead. [en]
dc.subject.translated: pose estimation [en]
dc.subject.translated: CycleGAN [en]
dc.subject.translated: image-to-image [en]
dc.subject.translated: Graph Neural Networks [en]
dc.subject.translated: UNet [en]
dc.subject.translated: domain adaptation [en]
dc.subject.translated: CUT [en]
dc.subject.translated: Symmetry Robust Pose Estimation [en]
dc.identifier.doi: https://www.doi.org/10.24132/CSRN.3301.16
dc.type.status: Peer-reviewed [en]
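
The abstract above attributes CycleGAN's ability to keep keypoints in place to its cycle-consistency objective. As a minimal sketch of that idea only (not the authors' implementation; it assumes PyTorch, and the TinyGenerator class and the random tensors are hypothetical stand-ins for the paper's UNet-style backbones and the cylinder/sphere datasets):

    import torch
    import torch.nn as nn

    class TinyGenerator(nn.Module):
        """Toy stand-in for a CycleGAN generator (hypothetical; the paper
        works with UNet-style image-to-image backbones)."""
        def __init__(self, channels: int = 3):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(channels, 32, kernel_size=3, padding=1),
                nn.ReLU(inplace=True),
                nn.Conv2d(32, channels, kernel_size=3, padding=1),
                nn.Tanh(),
            )

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return self.net(x)

    # G translates synthetic renders toward the real domain; F goes back.
    G = TinyGenerator()
    F = TinyGenerator()
    l1 = nn.L1Loss()

    # Placeholder batches; real code would load actual synthetic and real images.
    synthetic = torch.rand(4, 3, 64, 64)
    real = torch.rand(4, 3, 64, 64)

    # Cycle consistency: a round trip through both generators should
    # reproduce the input, which is what discourages the translation
    # from moving local structure such as keypoints.
    cycle_loss = l1(F(G(synthetic)), synthetic) + l1(G(F(real)), real)
    cycle_loss.backward()

The full CycleGAN objective adds adversarial losses for both translation directions on top of this cycle term, whereas CUT drops the cycle and instead enforces a patch-wise contrastive correspondence between input and output, which matches the abstract's observation that the two losses address domain adaptation from different perspectives.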
Appears in collections: WSCG 2023: Full Papers Proceedings

Files in this item:

File          Description  Size  Format
E83-full.pdf  Full text    2 MB  Adobe PDF


Use this identifier to cite or link to this item: http://hdl.handle.net/11025/54418

All items in DSpace are protected by copyright, with all rights reserved.