Full metadata record
DC field: Value [Language]
dc.contributor.author: Picek, Lukáš
dc.contributor.author: Šulc, Milan
dc.contributor.author: Patel, Yash
dc.contributor.author: Matas, Jiří
dc.date.accessioned: 2023-01-30T11:00:29Z
dc.date.available: 2023-01-30T11:00:29Z
dc.date.issued: 2022
dc.identifier.citation: PICEK, L., ŠULC, M., PATEL, Y., MATAS, J. Plant recognition by AI: Deep neural nets, transformers, and kNN in deep embeddings. Frontiers in Plant Science, 2022, vol. 13, no. September, pp. 1-16. ISSN: 1664-462X [cs]
dc.identifier.issn: 1664-462X
dc.identifier.uri: 2-s2.0-85139567145
dc.identifier.uri: http://hdl.handle.net/11025/51180
dc.format: 16 pp. [cs]
dc.format.mimetype: application/pdf
dc.language.iso: en [en]
dc.publisher: Frontiers Media S.A. [en]
dc.relation.ispartofseries: Frontiers in Plant Science [en]
dc.rights: © authors [en]
dc.title: Plant recognition by AI: Deep neural nets, transformers, and kNN in deep embeddings [en]
dc.type: article [cs]
dc.type: article [en]
dc.rights.access: openAccess [en]
dc.type.version: publishedVersion [en]
dc.description.abstract-translated: The article reviews and benchmarks machine learning methods for automatic image-based plant species recognition and proposes a novel retrieval-based method for recognition by nearest neighbor classification in a deep embedding space. The image retrieval method relies on a model trained via the Recall@k surrogate loss. State-of-the-art approaches to image classification, based on Convolutional Neural Networks (CNN) and Vision Transformers (ViT), are benchmarked and compared with the proposed image retrieval-based method. The impact of performance-enhancing techniques, e.g., class prior adaptation, image augmentations, learning rate scheduling, and loss functions, is studied. The evaluation is carried out on the PlantCLEF 2017, ExpertLifeCLEF 2018, and iNaturalist 2018 datasets, the largest publicly available datasets for plant recognition. The evaluation of CNN and ViT classifiers shows a gradual improvement in classification accuracy. The current state-of-the-art Vision Transformer model, ViT-Large/16, achieves 91.15% and 83.54% accuracy on the PlantCLEF 2017 and ExpertLifeCLEF 2018 test sets, respectively; relative to the best CNN model (ResNeSt-269e), the error rate dropped by 22.91% and 28.34%. In addition, further training tricks increased the performance of ViT-Base/32 by 3.72% on ExpertLifeCLEF 2018 and by 4.67% on PlantCLEF 2017. The retrieval approach achieved superior performance in all measured scenarios, with accuracy margins of 0.28%, 4.13%, and 10.25% on ExpertLifeCLEF 2018, PlantCLEF 2017, and iNat2018-Plantae, respectively. [en]
dc.subject.translated: plant [en]
dc.subject.translated: species [en]
dc.subject.translated: classification [en]
dc.subject.translated: recognition [en]
dc.subject.translated: machine learning [en]
dc.subject.translated: computer vision [en]
dc.subject.translated: species recognition [en]
dc.subject.translated: fine-grained [en]
dc.identifier.doi: 10.3389/fpls.2022.787527
dc.type.status: Peer-reviewed [en]
dc.identifier.document-number: 868264700001
dc.identifier.obd: 43937116
dc.project.ID: SS05010008/Detection, identification and monitoring of animals using advanced computer vision methods [cs]
dc.project.ID: SGS-2022-017/Intelligent methods of machine perception and understanding 5 [cs]
dc.project.ID: 90140/Large research infrastructure (J) - e-INFRA CZ [cs]
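
The abstract above describes a retrieval-based recognition method: a test image is mapped to a deep embedding vector and its species is predicted by nearest-neighbor search over a database of training-image embeddings. The sketch below is a minimal, hypothetical illustration of that kNN classification step only; the random placeholder vectors, the helper names l2_normalize and knn_classify, the value of k, and the choice of cosine similarity are assumptions for illustration. The paper's actual embeddings come from a model trained with the Recall@k surrogate loss, which is not reproduced here.

```python
# Illustrative sketch (not the authors' implementation): k-nearest-neighbor species
# classification in a deep embedding space, using cosine similarity and majority voting.
import numpy as np


def l2_normalize(x: np.ndarray, eps: float = 1e-12) -> np.ndarray:
    """Scale each embedding to unit length so a dot product equals cosine similarity."""
    return x / np.maximum(np.linalg.norm(x, axis=1, keepdims=True), eps)


def knn_classify(db_emb: np.ndarray, db_labels: np.ndarray,
                 query_emb: np.ndarray, k: int = 5) -> np.ndarray:
    """Predict a label for each query embedding by majority vote among its k most
    similar database embeddings."""
    db = l2_normalize(db_emb)
    q = l2_normalize(query_emb)
    sims = q @ db.T                          # (n_queries, n_database) cosine similarities
    topk = np.argsort(-sims, axis=1)[:, :k]  # indices of the k nearest database items
    preds = []
    for neighbour_idx in topk:
        votes = np.bincount(db_labels[neighbour_idx])
        preds.append(int(np.argmax(votes)))  # majority class among the neighbours
    return np.array(preds)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Placeholder "deep embeddings": 1000 database images, 10 species, 512-D vectors.
    db_emb = rng.normal(size=(1000, 512))
    db_labels = rng.integers(0, 10, size=1000)
    query_emb = rng.normal(size=(5, 512))
    print(knn_classify(db_emb, db_labels, query_emb, k=5))
```
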
Appears in Collections:
Články / Articles (NTIS)
Články / Articles (KKY)
OBD

Files in This Item:
File: Picek_Plant_Recognition_by_AI_092022.pdf (3.4 MB, Adobe PDF)


Please use this identifier to cite or link to this item: http://hdl.handle.net/11025/51180

All items in DSpace are protected by copyright, with all rights reserved.
