Full metadata record
DC field | Value | Language
dc.contributor.author | Veyret, Morgan | -
dc.contributor.author | Maisel, Eric | -
dc.contributor.editor | Jorge, Joaquim | -
dc.contributor.editor | Skala, Václav | -
dc.date.accessioned | 2013-12-16T12:45:27Z | -
dc.date.available | 2013-12-16T12:45:27Z | -
dc.date.issued | 2006 | -
dc.identifier.citation | WSCG '2006: Short Papers Proceedings: The 14-th International Conference in Central Europe on Computer Graphics, Visualization and Computer Vision 2006: University of West Bohemia, Plzen, Czech Republic, January 31 - February 2, 2006, p. 101-108. | en
dc.identifier.isbn | 80-86943-05-4 | -
dc.identifier.uri | http://wscg.zcu.cz/WSCG2006/Papers_2006/Short/!WSCG2006_Short_Proceedings_Final.pdf | -
dc.identifier.uri | http://hdl.handle.net/11025/6603 | -
dc.description.abstract | When visiting an aquarium, people may be disturbed or, at least, disappointed by the amount and diversity of the available information. Moreover, they can find it very difficult to match the information on the notices on the wall to the fish they actually see. We therefore propose a virtual guide, an autonomous teaching assistant embodied in the real world using augmented reality techniques, to help people during their visit to the aquarium. This virtual guide will interact with the real world using multiple modalities (e.g. speech, facial expression, ...). It should thus be aware of the aquarium's state and content, and use perceived information and prior knowledge to inform the visitor in a structured fashion. Because of the high mobility and unpredictable behaviour of the fish, our guide requires an adequate perception system. This camera-based system has to keep track of the fish and their behaviour. It is based on the focalisation of visual attention, which allows the system to select interesting information in the field of view. This is achieved by extracting a number of focuses of attention (FOA) using a saliency map and a multi-level memory system that is filled (or updated) with the extracted information. This allows our system to detect and track targets in the aquarium. This article describes how we use the saliency map and the memory system, along with their interactions, to set up the first part of our perception system. | en
dc.format | 8 s. | cs
dc.format.mimetype | application/pdf | -
dc.language.iso | en | en
dc.publisher | Václav Skala - UNION Agency | en
dc.relation.ispartofseries | WSCG '2006: Short Papers Proceedings | en
dc.rights | © Václav Skala - UNION Agency | cs
dc.subject | rozšířená realita | cs
dc.subject | virtuální průvodce | cs
dc.subject | počítačové vidění | cs
dc.title | Attention-Based Target Tracking for an Augmented Reality Application | en
dc.type | konferenční příspěvek | cs
dc.type | conferenceObject | en
dc.rights.access | openAccess | en
dc.type.version | publishedVersion | en
dc.subject.translated | augmented reality | en
dc.subject.translated | virtual guide | en
dc.subject.translated | computer vision | en
dc.type.status | Peer-reviewed | en
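
The abstract above outlines a two-stage perception loop: focuses of attention (FOA) are extracted from a saliency map, and a multi-level memory is filled or updated with them so that targets can be detected and tracked. The following Python fragment is a minimal, hypothetical sketch of such a loop, not the authors' implementation: the saliency computation itself is assumed to be already available as a 2-D array, and all names, parameters and thresholds (extract_foas, TargetMemory, radius, match_dist) are illustrative.

# Hypothetical sketch: FOA extraction from a given saliency map, plus a tiny
# proximity-based memory that matches new FOAs to known targets. Illustrative only.
import numpy as np

def extract_foas(saliency, n_foa=5, radius=15):
    """Return up to n_foa (row, col) peaks of the saliency map."""
    sal = saliency.copy()
    rows, cols = np.indices(sal.shape)
    foas = []
    for _ in range(n_foa):
        r, c = np.unravel_index(np.argmax(sal), sal.shape)
        if sal[r, c] <= 0:          # nothing salient left
            break
        foas.append((r, c))
        # inhibition of return: suppress a disc around the selected focus
        sal[(rows - r) ** 2 + (cols - c) ** 2 <= radius ** 2] = 0
    return foas

class TargetMemory:
    """Very small stand-in for the multi-level memory: match FOAs to targets by distance."""
    def __init__(self, match_dist=20.0):
        self.targets = {}            # target id -> last known (row, col)
        self.match_dist = match_dist
        self._next_id = 0

    def update(self, foas):
        for foa in foas:
            # find the closest already-tracked target, if any
            best_id, best_d = None, self.match_dist
            for tid, pos in self.targets.items():
                d = np.hypot(foa[0] - pos[0], foa[1] - pos[1])
                if d < best_d:
                    best_id, best_d = tid, d
            if best_id is None:      # otherwise a new target enters memory
                best_id = self._next_id
                self._next_id += 1
            self.targets[best_id] = foa
        return self.targets

# usage on a random array standing in for one frame's saliency map
memory = TargetMemory()
frame_saliency = np.random.rand(120, 160)
print(memory.update(extract_foas(frame_saliency)))
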
Appears in collections: WSCG '2006: Short Papers Proceedings

Files in this item:
File | Description | Size | Format
Veyret.pdf | Full text | 834.96 kB | Adobe PDF


Use this identifier to cite or link to this record: http://hdl.handle.net/11025/6603

All items in DSpace are protected by copyright, with all rights reserved.