Matches in SemOpenAlex for { <https://semopenalex.org/work/W4304183744> ?p ?o ?g. }
- W4304183744 endingPage "2055" @default.
- W4304183744 startingPage "2055" @default.
- W4304183744 abstract "Existing facial expression recognition methods have several drawbacks. For example, networks struggle to learn across datasets of facial expressions, multi-region learning on an image fails to capture the overall image information, and a frequency multiplication network does not take into account the inter-class and intra-class features in image classification. To deal with the above problems, in our current research we propose a symmetric scheme to extract the inter-class features and intra-class diversity features, and then propose a triple-structure network model based upon MobileNet V1, which is trained via a new multi-branch loss function. The proposed network consists of three structures, viz., a global branch network, an attention mechanism branch network, and a diversified feature learning branch network. To begin with, the global branch network is used to extract the global features of the facial expression images. Furthermore, the attention mechanism branch network concentrates on extracting inter-class features. In addition, the diversified feature learning branch network is utilized to extract intra-class diverse features. The network is trained using multiple loss functions to decrease intra-class differences and inter-class similarities. Finally, through ablation experiments and visualization, the intrinsic mechanism of our triple-structure network model is shown to be well founded. Experiments on the KDEF, MMI, and CK+ datasets show that the accuracy of facial expression recognition using the proposed model is 1.224%, 13.051%, and 3.085% higher, respectively, than that using MC-loss (VGG16). In addition, related comparison tests and analyses show that the proposed triple-structure network model achieves better performance than dozens of state-of-the-art methods." @default.
- W4304183744 created "2022-10-11" @default.
- W4304183744 creator A5016297325 @default.
- W4304183744 creator A5066988252 @default.
- W4304183744 creator A5071943346 @default.
- W4304183744 creator A5091586702 @default.
- W4304183744 date "2022-10-02" @default.
- W4304183744 modified "2023-09-25" @default.
- W4304183744 title "A Triple-Structure Network Model Based upon MobileNet V1 and Multi-Loss Function for Facial Expression Recognition" @default.
- W4304183744 cites W2012485643 @default.
- W4304183744 cites W2012945211 @default.
- W4304183744 cites W2093033615 @default.
- W4304183744 cites W2479639417 @default.
- W4304183744 cites W2606933083 @default.
- W4304183744 cites W2730601341 @default.
- W4304183744 cites W2755860313 @default.
- W4304183744 cites W2889978276 @default.
- W4304183744 cites W2890402983 @default.
- W4304183744 cites W2898278742 @default.
- W4304183744 cites W2900164843 @default.
- W4304183744 cites W2910673030 @default.
- W4304183744 cites W2912628469 @default.
- W4304183744 cites W2916690307 @default.
- W4304183744 cites W2931011950 @default.
- W4304183744 cites W2940314039 @default.
- W4304183744 cites W2943172630 @default.
- W4304183744 cites W2953766875 @default.
- W4304183744 cites W2983751658 @default.
- W4304183744 cites W3000577085 @default.
- W4304183744 cites W3008809756 @default.
- W4304183744 cites W3012325339 @default.
- W4304183744 cites W3031391834 @default.
- W4304183744 cites W3039664831 @default.
- W4304183744 cites W3049234687 @default.
- W4304183744 cites W3080425388 @default.
- W4304183744 cites W3084031222 @default.
- W4304183744 cites W3097020048 @default.
- W4304183744 cites W3118290699 @default.
- W4304183744 cites W3122955671 @default.
- W4304183744 cites W3124241913 @default.
- W4304183744 cites W3138598836 @default.
- W4304183744 cites W3140216465 @default.
- W4304183744 cites W3157999215 @default.
- W4304183744 cites W3181903047 @default.
- W4304183744 cites W3191969609 @default.
- W4304183744 cites W3205895757 @default.
- W4304183744 cites W3213195076 @default.
- W4304183744 cites W3215476595 @default.
- W4304183744 cites W4205991583 @default.
- W4304183744 cites W4214552004 @default.
- W4304183744 cites W4221101020 @default.
- W4304183744 cites W4281998067 @default.
- W4304183744 cites W4289315407 @default.
- W4304183744 cites W4294311511 @default.
- W4304183744 doi "https://doi.org/10.3390/sym14102055" @default.
- W4304183744 hasPublicationYear "2022" @default.
- W4304183744 type Work @default.
- W4304183744 citedByCount "2" @default.
- W4304183744 countsByYear W43041837442023 @default.
- W4304183744 crossrefType "journal-article" @default.
- W4304183744 hasAuthorship W4304183744A5016297325 @default.
- W4304183744 hasAuthorship W4304183744A5066988252 @default.
- W4304183744 hasAuthorship W4304183744A5071943346 @default.
- W4304183744 hasAuthorship W4304183744A5091586702 @default.
- W4304183744 hasBestOaLocation W43041837441 @default.
- W4304183744 hasConcept C108583219 @default.
- W4304183744 hasConcept C115961682 @default.
- W4304183744 hasConcept C138885662 @default.
- W4304183744 hasConcept C14036430 @default.
- W4304183744 hasConcept C144024400 @default.
- W4304183744 hasConcept C153180895 @default.
- W4304183744 hasConcept C154945302 @default.
- W4304183744 hasConcept C199360897 @default.
- W4304183744 hasConcept C2776401178 @default.
- W4304183744 hasConcept C2777212361 @default.
- W4304183744 hasConcept C2779304628 @default.
- W4304183744 hasConcept C2987714656 @default.
- W4304183744 hasConcept C31510193 @default.
- W4304183744 hasConcept C36289849 @default.
- W4304183744 hasConcept C41008148 @default.
- W4304183744 hasConcept C41895202 @default.
- W4304183744 hasConcept C78458016 @default.
- W4304183744 hasConcept C86803240 @default.
- W4304183744 hasConcept C90559484 @default.
- W4304183744 hasConceptScore W4304183744C108583219 @default.
- W4304183744 hasConceptScore W4304183744C115961682 @default.
- W4304183744 hasConceptScore W4304183744C138885662 @default.
- W4304183744 hasConceptScore W4304183744C14036430 @default.
- W4304183744 hasConceptScore W4304183744C144024400 @default.
- W4304183744 hasConceptScore W4304183744C153180895 @default.
- W4304183744 hasConceptScore W4304183744C154945302 @default.
- W4304183744 hasConceptScore W4304183744C199360897 @default.
- W4304183744 hasConceptScore W4304183744C2776401178 @default.
- W4304183744 hasConceptScore W4304183744C2777212361 @default.
- W4304183744 hasConceptScore W4304183744C2779304628 @default.
- W4304183744 hasConceptScore W4304183744C2987714656 @default.
- W4304183744 hasConceptScore W4304183744C31510193 @default.
- W4304183744 hasConceptScore W4304183744C36289849 @default.
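The quad pattern in the header above can be reproduced programmatically. The sketch below builds the same `{ <work> ?p ?o ?g }` query as a GRAPH-scoped SPARQL SELECT and prepares a GET URL for a SPARQL endpoint. The endpoint URL `https://semopenalex.org/sparql` is an assumption about the public SemOpenAlex deployment; adjust it if your instance differs.

```python
import urllib.parse

# Assumed public SPARQL endpoint for SemOpenAlex (verify before use).
ENDPOINT = "https://semopenalex.org/sparql"
WORK_IRI = "https://semopenalex.org/work/W4304183744"


def build_query(work_iri: str) -> str:
    """Build the quad-pattern query from the header: { <work> ?p ?o ?g }.

    The named graph ?g is bound via a GRAPH clause, which is the standard
    SPARQL way to expose the fourth (graph) component of a quad.
    """
    return (
        "SELECT ?p ?o ?g WHERE { "
        f"GRAPH ?g {{ <{work_iri}> ?p ?o . }} "
        "}"
    )


def request_url(work_iri: str) -> str:
    """Full GET URL for the endpoint, with the query URL-encoded."""
    return ENDPOINT + "?query=" + urllib.parse.quote(build_query(work_iri))


print(build_query(WORK_IRI))
print(request_url(WORK_IRI))
```

Sending the resulting URL with an `Accept: application/sparql-results+json` header (e.g. via `urllib.request` or `requests`) would return the predicate/object/graph rows listed above.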