Matches in SemOpenAlex for { <https://semopenalex.org/work/W4384917045> ?p ?o ?g. }
- W4384917045 endingPage "76882" @default.
- W4384917045 startingPage "76869" @default.
- W4384917045 abstract "Increasingly, automation helps to minimize human involvement in many mundane aspects of life, especially retail. During the pandemic it became clear that shop automation helps not only to reduce labor and speedup service but also to reduce the spread of disease. The recognition of produce that has no barcode remains among the processes that are complicated to automate. The ability to distinguish weighted goods is necessary to correctly bill a customer at a self checkout station. A computer vision system can be deployed on either smart scales or smart cash registers. Such a system needs to recognize all the varieties of fruits, vegetables, groats and other commodities which are available for purchase unpacked. The difficulty of this problem is in the diversity of goods and visual variability of items within the same category. Furthermore, the produce at a shop frequently changes between seasons as different varieties are introduced. In this work, we present a computer vision approach that allows efficient scaling for new goods classes without any manual image labelling. To the best of our knowledge, this is the first approach that allows a smart checkout system to recognize new items without manual labelling. We provide open access to the collected dataset in conjunction with our methods. The proposed method uses top-view images of a new class, applies a pseudo-labelling algorithm to crop the samples, and uses object-based augmentation to create training data for neural networks. We test this approach to classify five fruits varieties, and show that when the number of natural training images is below 50, the baseline pipeline result is almost random guess (20% for 5 classes). PseudoAugment can achieve over 92% accuracy with only top-view images that have no pixel-level annotations. The substantial advantage of our approach remains when the number of original training images is below 250. In practice, it means that when a new fruit is introduced in a shop, we need just a handful of top-view images of containers filled with a new class for the system to start operating. The PseudoAugment method is well-suited for continual learning as it can effectively handle an ever-expanding set of classes. Other computer vision problems can be also addressed using the suggested approach." @default.
- W4384917045 created "2023-07-21" @default.
- W4384917045 creator A5004111307 @default.
- W4384917045 creator A5007982964 @default.
- W4384917045 creator A5029256765 @default.
- W4384917045 creator A5048027930 @default.
- W4384917045 creator A5069020946 @default.
- W4384917045 creator A5084140816 @default.
- W4384917045 creator A5092505969 @default.
- W4384917045 date "2023-01-01" @default.
- W4384917045 modified "2023-09-30" @default.
- W4384917045 title "PseudoAugment: Enabling Smart Checkout Adoption for New Classes Without Human Annotation" @default.
- W4384917045 cites W1508404128 @default.
- W4384917045 cites W1968729274 @default.
- W4384917045 cites W2015159529 @default.
- W4384917045 cites W2065429801 @default.
- W4384917045 cites W2108598243 @default.
- W4384917045 cites W2114808114 @default.
- W4384917045 cites W2116134417 @default.
- W4384917045 cites W2140116088 @default.
- W4384917045 cites W2164944168 @default.
- W4384917045 cites W2194775991 @default.
- W4384917045 cites W2292806858 @default.
- W4384917045 cites W2295214370 @default.
- W4384917045 cites W2565639579 @default.
- W4384917045 cites W2783616455 @default.
- W4384917045 cites W2809612736 @default.
- W4384917045 cites W2890902815 @default.
- W4384917045 cites W2891865018 @default.
- W4384917045 cites W2892336558 @default.
- W4384917045 cites W2897842311 @default.
- W4384917045 cites W2921056832 @default.
- W4384917045 cites W2954996726 @default.
- W4384917045 cites W2962843773 @default.
- W4384917045 cites W2963150697 @default.
- W4384917045 cites W2963271314 @default.
- W4384917045 cites W2963855133 @default.
- W4384917045 cites W2983005688 @default.
- W4384917045 cites W2992308087 @default.
- W4384917045 cites W2997199480 @default.
- W4384917045 cites W3000673855 @default.
- W4384917045 cites W3024764818 @default.
- W4384917045 cites W3036664904 @default.
- W4384917045 cites W3040318838 @default.
- W4384917045 cites W3043569052 @default.
- W4384917045 cites W3046541806 @default.
- W4384917045 cites W3083926560 @default.
- W4384917045 cites W3092032479 @default.
- W4384917045 cites W3092210341 @default.
- W4384917045 cites W3094336722 @default.
- W4384917045 cites W3105335547 @default.
- W4384917045 cites W3118794675 @default.
- W4384917045 cites W3126875673 @default.
- W4384917045 cites W3164581645 @default.
- W4384917045 cites W3171873561 @default.
- W4384917045 cites W3172742367 @default.
- W4384917045 cites W3176315233 @default.
- W4384917045 cites W3176659256 @default.
- W4384917045 cites W3208077096 @default.
- W4384917045 cites W3209773779 @default.
- W4384917045 cites W3211330693 @default.
- W4384917045 cites W3212659030 @default.
- W4384917045 cites W4205635493 @default.
- W4384917045 cites W4206008408 @default.
- W4384917045 cites W4206272272 @default.
- W4384917045 cites W4210248601 @default.
- W4384917045 cites W4212812394 @default.
- W4384917045 cites W4214530221 @default.
- W4384917045 cites W4221001881 @default.
- W4384917045 cites W4226414849 @default.
- W4384917045 cites W4229445738 @default.
- W4384917045 cites W4285193640 @default.
- W4384917045 cites W4287882763 @default.
- W4384917045 cites W4309717531 @default.
- W4384917045 cites W4310915349 @default.
- W4384917045 cites W4311194244 @default.
- W4384917045 cites W4312327301 @default.
- W4384917045 cites W4317877814 @default.
- W4384917045 cites W4323926613 @default.
- W4384917045 cites W4365146704 @default.
- W4384917045 cites W4367624768 @default.
- W4384917045 cites W4378078060 @default.
- W4384917045 doi "https://doi.org/10.1109/access.2023.3296854" @default.
- W4384917045 hasPublicationYear "2023" @default.
- W4384917045 type Work @default.
- W4384917045 citedByCount "1" @default.
- W4384917045 countsByYear W43849170452023 @default.
- W4384917045 crossrefType "journal-article" @default.
- W4384917045 hasAuthorship W4384917045A5004111307 @default.
- W4384917045 hasAuthorship W4384917045A5007982964 @default.
- W4384917045 hasAuthorship W4384917045A5029256765 @default.
- W4384917045 hasAuthorship W4384917045A5048027930 @default.
- W4384917045 hasAuthorship W4384917045A5069020946 @default.
- W4384917045 hasAuthorship W4384917045A5084140816 @default.
- W4384917045 hasAuthorship W4384917045A5092505969 @default.
- W4384917045 hasBestOaLocation W43849170451 @default.
- W4384917045 hasConcept C10138342 @default.
- W4384917045 hasConcept C111919701 @default.
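
The abstract above outlines a two-step data-generation pipeline: pseudo-label crops from top-view images of single-class containers, then paste those crops onto backgrounds with object-based augmentation. The following is a minimal illustrative sketch of that idea, not the authors' implementation; the OpenCV-based foreground segmentation, the thresholds, and the naive rectangular paste are all assumptions made here for clarity.

```python
# Illustrative sketch of the PseudoAugment pipeline described in the abstract.
# Function names, thresholds, and the segmentation/paste strategy are
# hypothetical simplifications, not the paper's actual implementation.
import random
import cv2
import numpy as np

def pseudo_label_crops(top_view_img, min_area=500):
    """Crop individual produce items from a top-view image of a container.

    Because every item in the container belongs to the same (known) class,
    simple foreground segmentation yields class-labelled crops without any
    pixel-level annotation.
    """
    gray = cv2.cvtColor(top_view_img, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    crops = []
    for c in contours:
        if cv2.contourArea(c) < min_area:
            continue  # skip small noise blobs
        x, y, w, h = cv2.boundingRect(c)
        crops.append(top_view_img[y:y + h, x:x + w])
    return crops

def object_based_augment(background, crops, n_objects=5):
    """Paste randomly scaled and rotated crops onto a background image to
    synthesize one training sample for the new class."""
    canvas = background.copy()
    bh, bw = canvas.shape[:2]
    for crop in random.choices(crops, k=n_objects):
        scale = random.uniform(0.7, 1.3)
        crop = cv2.resize(crop, None, fx=scale, fy=scale)
        h, w = crop.shape[:2]
        M = cv2.getRotationMatrix2D((w / 2, h / 2), random.uniform(0, 360), 1.0)
        crop = cv2.warpAffine(crop, M, (w, h))
        if h >= bh or w >= bw:
            continue  # crop larger than background; skip
        x = random.randint(0, bw - w)
        y = random.randint(0, bh - h)
        canvas[y:y + h, x:x + w] = crop  # naive paste; a real pipeline would blend with a mask
    return canvas
```

Synthetic images produced this way can then be fed to a standard classifier, which is how the approach can reach the reported >92% accuracy from only a handful of unannotated top-view images per new class.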