Matches in SemOpenAlex for { <https://semopenalex.org/work/W4285131446> ?p ?o ?g. }
Showing items 1 to 77 of 77, with 100 items per page.
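The listing below is the result of the triple-pattern query shown in the header, run against the SemOpenAlex SPARQL endpoint. A minimal sketch of reproducing it programmatically, using only the Python standard library (the endpoint URL `https://semopenalex.org/sparql` and the helper names are assumptions, not part of this listing):

```python
import json
import urllib.parse
import urllib.request

# Assumed SemOpenAlex SPARQL endpoint.
SPARQL_ENDPOINT = "https://semopenalex.org/sparql"

def build_query(work_iri: str) -> str:
    """Build the triple-pattern query from the listing header."""
    return f"SELECT ?p ?o WHERE {{ <{work_iri}> ?p ?o . }}"

def fetch_triples(work_iri: str):
    """POST the query and return (predicate, object) value pairs."""
    data = urllib.parse.urlencode({"query": build_query(work_iri)}).encode()
    req = urllib.request.Request(
        SPARQL_ENDPOINT,
        data=data,
        headers={"Accept": "application/sparql-results+json"},
    )
    with urllib.request.urlopen(req) as resp:
        results = json.load(resp)
    return [
        (b["p"]["value"], b["o"]["value"])
        for b in results["results"]["bindings"]
    ]

# Example (requires network access):
# pairs = fetch_triples("https://semopenalex.org/work/W4285131446")
```

The query string mirrors the `{ <…> ?p ?o ?g. }` pattern in the header, minus the graph variable, which a plain SELECT over the default graph does not need.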
- W4285131446 endingPage "60557" @default.
- W4285131446 startingPage "60550" @default.
- W4285131446 abstract "As far as memory footprint and computational scale are concerned, lightweight Binary Neural Networks (BNNs) have great advantages on resource-limited platforms such as AIoT (Artificial Intelligence in Internet of Things) edge terminals and wearable and portable devices. However, the binarization process naturally incurs considerable information loss and thus deteriorates accuracy. In this article, three techniques are introduced to improve the accuracy of the binarized ReActNet at an even lower computational complexity. First, an improved Binarized Ghost Module (BGM) for ReActNet is proposed to enrich the feature-map information while keeping the computational scale of the structure at a very low level. Second, we propose a new Label-aware Loss Function (LLF) that supervises the penultimate layer using label information. This auxiliary loss function makes the feature vectors of each category more separable and accordingly improves the classification accuracy of the final fully connected layer. Third, the Normalization-based Attention Module (NAM) is adopted to regulate the activation flow, which helps avoid the gradient-saturation problem. With these three approaches, our improved binarized network outperforms other state-of-the-art methods, achieving 71.4% Top-1 accuracy on ImageNet and 86.45% accuracy on CIFAR-10. Meanwhile, its computational cost of 0.86×10<sup>8</sup> OPs is the lowest among mainstream BNN models. The experimental results prove the effectiveness of our proposals, and the study is promising for future low-power hardware implementations." @default.
- W4285131446 created "2022-07-14" @default.
- W4285131446 creator A5025273685 @default.
- W4285131446 creator A5032357435 @default.
- W4285131446 creator A5064762015 @default.
- W4285131446 date "2022-01-01" @default.
- W4285131446 modified "2023-10-14" @default.
- W4285131446 title "“Ghost” and Attention in Binary Neural Network" @default.
- W4285131446 cites W2884150179 @default.
- W4285131446 cites W2887447938 @default.
- W4285131446 cites W2896632102 @default.
- W4285131446 cites W2963351448 @default.
- W4285131446 cites W2982344224 @default.
- W4285131446 cites W3034297393 @default.
- W4285131446 cites W3035414587 @default.
- W4285131446 cites W3044553317 @default.
- W4285131446 cites W3104151879 @default.
- W4285131446 cites W3173275561 @default.
- W4285131446 cites W3175426148 @default.
- W4285131446 cites W3176211720 @default.
- W4285131446 cites W4212847409 @default.
- W4285131446 cites W4214888034 @default.
- W4285131446 doi "https://doi.org/10.1109/access.2022.3181192" @default.
- W4285131446 hasPublicationYear "2022" @default.
- W4285131446 type Work @default.
- W4285131446 citedByCount "1" @default.
- W4285131446 countsByYear W42851314462023 @default.
- W4285131446 crossrefType "journal-article" @default.
- W4285131446 hasAuthorship W4285131446A5025273685 @default.
- W4285131446 hasAuthorship W4285131446A5032357435 @default.
- W4285131446 hasAuthorship W4285131446A5064762015 @default.
- W4285131446 hasBestOaLocation W42851314461 @default.
- W4285131446 hasConcept C111919701 @default.
- W4285131446 hasConcept C11413529 @default.
- W4285131446 hasConcept C124101348 @default.
- W4285131446 hasConcept C136886441 @default.
- W4285131446 hasConcept C144024400 @default.
- W4285131446 hasConcept C153180895 @default.
- W4285131446 hasConcept C154945302 @default.
- W4285131446 hasConcept C19165224 @default.
- W4285131446 hasConcept C41008148 @default.
- W4285131446 hasConcept C45374587 @default.
- W4285131446 hasConcept C50644808 @default.
- W4285131446 hasConcept C74912251 @default.
- W4285131446 hasConceptScore W4285131446C111919701 @default.
- W4285131446 hasConceptScore W4285131446C11413529 @default.
- W4285131446 hasConceptScore W4285131446C124101348 @default.
- W4285131446 hasConceptScore W4285131446C136886441 @default.
- W4285131446 hasConceptScore W4285131446C144024400 @default.
- W4285131446 hasConceptScore W4285131446C153180895 @default.
- W4285131446 hasConceptScore W4285131446C154945302 @default.
- W4285131446 hasConceptScore W4285131446C19165224 @default.
- W4285131446 hasConceptScore W4285131446C41008148 @default.
- W4285131446 hasConceptScore W4285131446C45374587 @default.
- W4285131446 hasConceptScore W4285131446C50644808 @default.
- W4285131446 hasConceptScore W4285131446C74912251 @default.
- W4285131446 hasFunder F4320335777 @default.
- W4285131446 hasLocation W42851314461 @default.
- W4285131446 hasLocation W42851314462 @default.
- W4285131446 hasOpenAccess W4285131446 @default.
- W4285131446 hasPrimaryLocation W42851314461 @default.
- W4285131446 hasRelatedWork W1991269640 @default.
- W4285131446 hasRelatedWork W2016839265 @default.
- W4285131446 hasRelatedWork W2063185616 @default.
- W4285131446 hasRelatedWork W2090485996 @default.
- W4285131446 hasRelatedWork W2356313285 @default.
- W4285131446 hasRelatedWork W2381065783 @default.
- W4285131446 hasRelatedWork W2386387936 @default.
- W4285131446 hasRelatedWork W2516800609 @default.
- W4285131446 hasRelatedWork W2794115703 @default.
- W4285131446 hasRelatedWork W2508457823 @default.
- W4285131446 hasVolume "10" @default.
- W4285131446 isParatext "false" @default.
- W4285131446 isRetracted "false" @default.
- W4285131446 workType "article" @default.
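Each bullet above is a subject–predicate–object triple flattened into one text line. A small sketch (with hypothetical helper names) of parsing such lines back into a predicate → objects mapping for the single subject:

```python
import re
from collections import defaultdict

# Matches lines of the form: - W4285131446 cites W2884150179 @default.
LINE_RE = re.compile(r'^- (\S+) (\S+) (.+?) @default\.$')

def parse_listing(lines):
    """Group the flattened triples by predicate, unquoting literals."""
    triples = defaultdict(list)
    for line in lines:
        m = LINE_RE.match(line.strip())
        if m:
            _subject, predicate, obj = m.groups()
            triples[predicate].append(obj.strip('"'))
    return dict(triples)

sample = [
    '- W4285131446 hasVolume "10" @default.',
    '- W4285131446 cites W2884150179 @default.',
    '- W4285131446 cites W2887447938 @default.',
]
parsed = parse_listing(sample)
# parsed["cites"] → ["W2884150179", "W2887447938"]
```

Multi-valued predicates such as `cites`, `creator`, and `hasRelatedWork` naturally collect into lists, while single-valued ones like `hasVolume` become one-element lists; a consumer can flatten the latter as needed.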