Title: Fast and Precise Binary Instance Segmentation of 2D Objects for Automotive Applications
Authors: Ganganna Ravindra, Darshan
Dinges, Laslo
Al-Hamadi, Ayoub
Baranau, Vasili
Citation: WSCG 2022: full papers proceedings: 30th International Conference in Central Europe on Computer Graphics, Visualization and Computer Vision, pp. 302-305.
Issue Date: 2022
Publisher: Václav Skala - UNION Agency
Document type: conferenceObject
URI: http://hdl.handle.net/11025/49609
ISBN: 978-80-86943-33-6
ISSN: 2464-4617
Keywords: extreme points;IoU;encoder-decoder;binary instance segmentation
Abstract: In this paper, we focus on improving binary 2D instance segmentation to assist humans in labeling ground-truth datasets with polygons. Human labelers only have to draw boxes around objects, and polygons are generated automatically. To be useful, our system has to run on CPUs in real time. The most common approach to binary instance segmentation involves encoder-decoder networks. This report evaluates state-of-the-art encoder-decoder networks and proposes a method for improving instance segmentation quality using these networks. Alongside network architecture improvements, the proposed method relies on providing extra information to the network input, so-called "extreme points", i.e. the outermost points on the object silhouette. The user can label them instead of a bounding box almost as quickly, and the bounding box can be deduced from the extreme points as well. This method produces better IoU than other state-of-the-art encoder-decoder networks and still runs fast enough when deployed on a CPU.
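
Illustration (not from the paper): the abstract notes that the bounding box can be deduced from the extreme points and that the extreme points are fed to the network as extra input. The sketch below shows one plausible way to do both; the heatmap encoding, the sigma value, and the example coordinates are assumptions for illustration only, not the authors' exact pipeline.

import numpy as np

def bbox_from_extreme_points(points):
    # Deduce the axis-aligned bounding box (x_min, y_min, x_max, y_max)
    # from the four extreme points (leftmost, topmost, rightmost, bottommost).
    pts = np.asarray(points, dtype=np.float32)   # shape (4, 2), each row (x, y)
    x_min, y_min = pts.min(axis=0)
    x_max, y_max = pts.max(axis=0)
    return x_min, y_min, x_max, y_max

def extreme_point_heatmap(points, height, width, sigma=5.0):
    # Rasterize the extreme points into a single-channel Gaussian heatmap that
    # can be concatenated to the image crop as an extra input channel
    # (one possible encoding; the paper may use a different one).
    ys, xs = np.mgrid[0:height, 0:width]
    heatmap = np.zeros((height, width), dtype=np.float32)
    for x, y in points:
        g = np.exp(-((xs - x) ** 2 + (ys - y) ** 2) / (2.0 * sigma ** 2))
        heatmap = np.maximum(heatmap, g)
    return heatmap

# Hypothetical example: four user-clicked extreme points on a 128x128 crop.
points = [(12, 60), (70, 8), (120, 55), (64, 122)]   # left, top, right, bottom
print(bbox_from_extreme_points(points))              # bounding box: (12, 8, 120, 122)
extra_channel = extreme_point_heatmap(points, 128, 128)

Encoding the points as a soft heatmap rather than hard pixels is a common choice because it tolerates small labeling inaccuracies; whether the paper uses this particular encoding is not stated in this record.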
Rights: © Václav Skala - UNION Agency
Appears in Collections: WSCG 2022: Full Papers Proceedings

Files in This Item:
File          Description  Size     Format
B61-full.pdf  Full text    2.78 MB  Adobe PDF


