Matches in SemOpenAlex for { <https://semopenalex.org/work/W4306649930> ?p ?o ?g. }
Showing items 1 to 100 of 100, with 100 items per page.
- W4306649930 endingPage "107294" @default.
- W4306649930 startingPage "107294" @default.
- W4306649930 abstract "Highlights:
  • A light-weight defect classification model is proposed to achieve high-accuracy industrial defect classification under the few-shot setting.
  • Batch-size independent Group Normalization (GN) is integrated into the pre-trained model for feature normalization and to avoid over-fitting.
  • A two-stream architecture with a joint loss function is designed to perform defect classification of single images and similarity comparison of pairs of samples simultaneously.
  • Our method runs in real time and achieves significantly higher classification accuracy than state-of-the-art defect classifiers.
  Abstract: Visual surface defect inspection provides an important tool for product quality assessment in a wide range of industrial applications. In recent years, diverse convolutional neural network (CNN) models have been developed for high-accuracy image classification of natural objects. However, it remains a challenging task to develop deep-learning-based approaches for surface defect inspection of industrial products. The main challenge is that surface defects of industrial products occur with low probability; it is therefore impractical to build a large image dataset containing different types of defects to train/fine-tune deep CNN models to achieve satisfactory performance on industrial inspection tasks. To overcome this problem, we build a light-weight defect classification model based on the pre-trained SqueezeNet architecture and present three effective techniques to achieve high-accuracy defect classification using very few collected defective training samples. First, we experimentally demonstrate that, when there are not enough training samples, it is reasonable to freeze the shallower convolutional layers and fine-tune only the deeper layers in the last convolutional stage. Second, we integrate batch-size independent Group Normalization (GN) into SqueezeNet to compute stable statistics from a limited number of defective samples when training the classification model. Third, we add an auxiliary task that performs similarity comparison of two defective samples during the training phase, significantly increasing the size of the training data and generating more distinctive defect-specific representations. Experimental results on the NEU-CLS and USB-FS datasets verify the effectiveness of the proposed method. Our method obtains accurate defect classification results when very few defective training samples are provided, i.e., about 97.69% and 82.92% average classification accuracy on the NEU-CLS and USB-FS datasets, respectively, under the 5-shot setting (5 images per defect category)." @default.
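The abstract above describes three techniques: partial freezing of a pre-trained SqueezeNet, batch-size independent Group Normalization, and a two-stream joint loss combining single-image classification with pairwise similarity comparison. The sketch below is a minimal, hypothetical PyTorch illustration of those ideas, not the authors' released code; the group count, freezing cut-off, loss weight, and margin are assumptions, and GN is applied to the pooled features here as a simplification rather than inside the Fire modules.

```python
# Hypothetical sketch of the three techniques named in the abstract,
# using torchvision's pre-trained SqueezeNet 1.1 as the backbone.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision import models

NUM_CLASSES = 6   # e.g., the six defect categories of NEU-CLS
GN_GROUPS = 8     # assumed group count; the paper's setting may differ

backbone = models.squeezenet1_1(
    weights=models.SqueezeNet1_1_Weights.IMAGENET1K_V1
)

# 1) Freeze the shallower convolutional layers; only the last Fire modules
#    (assumed cut-off at feature index 10) remain trainable.
for name, param in backbone.features.named_parameters():
    layer_idx = int(name.split(".")[0])
    param.requires_grad = layer_idx >= 10

# 2) Batch-size independent Group Normalization on the extracted features.
#    (SqueezeNet 1.1 has no BatchNorm; here GN normalizes the 512-channel output.)
feature_dim = 512
norm = nn.GroupNorm(GN_GROUPS, feature_dim)

classifier = nn.Linear(feature_dim, NUM_CLASSES)

def embed(x):
    """Shared feature stream: partially frozen backbone + GN + global pooling."""
    f = backbone.features(x)                        # (B, 512, H, W)
    f = norm(f)
    return F.adaptive_avg_pool2d(f, 1).flatten(1)   # (B, 512)

def joint_loss(x1, y1, x2, y2, margin=1.0, lam=0.5):
    """3) Two-stream joint loss: cross-entropy on single images plus a
    contrastive similarity term on sample pairs (lam and margin are assumptions)."""
    z1, z2 = embed(x1), embed(x2)
    ce = F.cross_entropy(classifier(z1), y1) + F.cross_entropy(classifier(z2), y2)
    same = (y1 == y2).float()
    dist = F.pairwise_distance(z1, z2)
    contrastive = same * dist.pow(2) + (1 - same) * F.relu(margin - dist).pow(2)
    return ce + lam * contrastive.mean()
```

Pairing every few-shot training image with every other image is what lets the auxiliary similarity task enlarge the effective training set, since N images per class yield on the order of N² usable pairs; at inference time only the single-image classification stream is needed.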
- W4306649930 created "2022-10-18" @default.
- W4306649930 creator A5014111141 @default.
- W4306649930 creator A5025678265 @default.
- W4306649930 creator A5031390008 @default.
- W4306649930 creator A5039198895 @default.
- W4306649930 creator A5060316958 @default.
- W4306649930 creator A5083250669 @default.
- W4306649930 date "2023-02-01" @default.
- W4306649930 modified "2023-09-26" @default.
- W4306649930 title "An effective industrial defect classification method under the few-shot setting via two-stream training" @default.
- W4306649930 cites W2028526243 @default.
- W4306649930 cites W2034810702 @default.
- W4306649930 cites W2044465660 @default.
- W4306649930 cites W2044552234 @default.
- W4306649930 cites W2050202310 @default.
- W4306649930 cites W2052194651 @default.
- W4306649930 cites W2078087367 @default.
- W4306649930 cites W2092072518 @default.
- W4306649930 cites W2155903085 @default.
- W4306649930 cites W2163352848 @default.
- W4306649930 cites W2337601638 @default.
- W4306649930 cites W2589306531 @default.
- W4306649930 cites W2761796288 @default.
- W4306649930 cites W2765854388 @default.
- W4306649930 cites W2912069721 @default.
- W4306649930 cites W2945708832 @default.
- W4306649930 cites W2953868242 @default.
- W4306649930 cites W2966341653 @default.
- W4306649930 cites W2996701347 @default.
- W4306649930 cites W3016453002 @default.
- W4306649930 cites W3022345324 @default.
- W4306649930 cites W3048440790 @default.
- W4306649930 cites W3094625083 @default.
- W4306649930 cites W3131675218 @default.
- W4306649930 cites W3164289800 @default.
- W4306649930 cites W4205420848 @default.
- W4306649930 cites W4252684946 @default.
- W4306649930 doi "https://doi.org/10.1016/j.optlaseng.2022.107294" @default.
- W4306649930 hasPublicationYear "2023" @default.
- W4306649930 type Work @default.
- W4306649930 citedByCount "7" @default.
- W4306649930 countsByYear W43066499302023 @default.
- W4306649930 crossrefType "journal-article" @default.
- W4306649930 hasAuthorship W4306649930A5014111141 @default.
- W4306649930 hasAuthorship W4306649930A5025678265 @default.
- W4306649930 hasAuthorship W4306649930A5031390008 @default.
- W4306649930 hasAuthorship W4306649930A5039198895 @default.
- W4306649930 hasAuthorship W4306649930A5060316958 @default.
- W4306649930 hasAuthorship W4306649930A5083250669 @default.
- W4306649930 hasConcept C119857082 @default.
- W4306649930 hasConcept C121332964 @default.
- W4306649930 hasConcept C127413603 @default.
- W4306649930 hasConcept C153180895 @default.
- W4306649930 hasConcept C153294291 @default.
- W4306649930 hasConcept C154945302 @default.
- W4306649930 hasConcept C191897082 @default.
- W4306649930 hasConcept C192562407 @default.
- W4306649930 hasConcept C2777211547 @default.
- W4306649930 hasConcept C2778344882 @default.
- W4306649930 hasConcept C2992734406 @default.
- W4306649930 hasConcept C41008148 @default.
- W4306649930 hasConcept C51632099 @default.
- W4306649930 hasConcept C78519656 @default.
- W4306649930 hasConceptScore W4306649930C119857082 @default.
- W4306649930 hasConceptScore W4306649930C121332964 @default.
- W4306649930 hasConceptScore W4306649930C127413603 @default.
- W4306649930 hasConceptScore W4306649930C153180895 @default.
- W4306649930 hasConceptScore W4306649930C153294291 @default.
- W4306649930 hasConceptScore W4306649930C154945302 @default.
- W4306649930 hasConceptScore W4306649930C191897082 @default.
- W4306649930 hasConceptScore W4306649930C192562407 @default.
- W4306649930 hasConceptScore W4306649930C2777211547 @default.
- W4306649930 hasConceptScore W4306649930C2778344882 @default.
- W4306649930 hasConceptScore W4306649930C2992734406 @default.
- W4306649930 hasConceptScore W4306649930C41008148 @default.
- W4306649930 hasConceptScore W4306649930C51632099 @default.
- W4306649930 hasConceptScore W4306649930C78519656 @default.
- W4306649930 hasFunder F4320321001 @default.
- W4306649930 hasFunder F4320333596 @default.
- W4306649930 hasFunder F4320335777 @default.
- W4306649930 hasLocation W43066499301 @default.
- W4306649930 hasOpenAccess W4306649930 @default.
- W4306649930 hasPrimaryLocation W43066499301 @default.
- W4306649930 hasRelatedWork W2354455445 @default.
- W4306649930 hasRelatedWork W2363701422 @default.
- W4306649930 hasRelatedWork W2383320921 @default.
- W4306649930 hasRelatedWork W2981091784 @default.
- W4306649930 hasRelatedWork W3034096603 @default.
- W4306649930 hasRelatedWork W3099765033 @default.
- W4306649930 hasRelatedWork W4285194539 @default.
- W4306649930 hasRelatedWork W4306649930 @default.
- W4306649930 hasRelatedWork W4306975314 @default.
- W4306649930 hasRelatedWork W4312903001 @default.
- W4306649930 hasVolume "161" @default.
- W4306649930 isParatext "false" @default.
- W4306649930 isRetracted "false" @default.
- W4306649930 workType "article" @default.