Matches in SemOpenAlex for { <https://semopenalex.org/work/W2972818652> ?p ?o ?g. }
Showing items 1 to 89 of 89, with 100 items per page.
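The listing below was produced by a simple triple-pattern query over the SemOpenAlex knowledge graph. As a minimal sketch only, assuming the public SemOpenAlex SPARQL endpoint at https://semopenalex.org/sparql and the Python SPARQLWrapper package (neither is named in this listing; the graph variable ?g, shown as "@default" below, is dropped for brevity), the same matches could be fetched programmatically:

```python
# Minimal sketch: fetch all triples for work W2972818652 from SemOpenAlex.
# Assumptions not stated in the listing: the endpoint URL below and the
# SPARQLWrapper package (pip install sparqlwrapper).
from SPARQLWrapper import SPARQLWrapper, JSON

ENDPOINT = "https://semopenalex.org/sparql"  # assumed public endpoint

QUERY = """
SELECT ?p ?o
WHERE {
  <https://semopenalex.org/work/W2972818652> ?p ?o .
}
LIMIT 100
"""

sparql = SPARQLWrapper(ENDPOINT)
sparql.setQuery(QUERY)
sparql.setReturnFormat(JSON)

results = sparql.query().convert()
for row in results["results"]["bindings"]:
    # Each binding holds the predicate and object of one triple, as listed below.
    print(row["p"]["value"], row["o"]["value"])
```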
- W2972818652 endingPage "408" @default.
- W2972818652 startingPage "397" @default.
- W2972818652 abstract "Although deep convolutional neural networks (CNNs) have achieved state-of-the-art results for facial expression recognition (FER), FER remains challenging in two respects: class imbalance and hard expression examples. However, most existing FER methods recognize facial expression images by training CNN models with cross-entropy (CE) loss in a single stage, which has limited capability to deal with these problems because every expression example is assigned an equal loss weight. Inspired by the recently proposed focal loss, which reduces the relative loss for well-classified expression examples and pays more attention to misclassified ones, we mitigate these problems by introducing the focal loss into the existing FER system when facing imbalanced data or hard expression examples. Considering that the focal loss allows the network to further extract discriminative features based on the feature-separating capability it has already learned, we present a two-stage training strategy that uses CE loss in the first stage and focal loss in the second stage to boost FER performance. Extensive experiments have been conducted on two well-known FER datasets, CK+ and Oulu-CASIA. We obtain improvements over the common one-stage training strategy and achieve state-of-the-art results on both datasets in terms of average classification accuracy, which demonstrates the effectiveness of the proposed two-stage training strategy." @default. (see the focal-loss / two-stage training sketch after this listing)
- W2972818652 created "2019-09-19" @default.
- W2972818652 creator A5002795838 @default.
- W2972818652 creator A5032186548 @default.
- W2972818652 creator A5076252609 @default.
- W2972818652 date "2019-01-01" @default.
- W2972818652 modified "2023-09-24" @default.
- W2972818652 title "Discriminative Feature Learning Using Two-Stage Training Strategy for Facial Expression Recognition" @default.
- W2972818652 cites W132448360 @default.
- W2972818652 cites W1951380363 @default.
- W2972818652 cites W1974210421 @default.
- W2972818652 cites W2003238582 @default.
- W2972818652 cites W2024868105 @default.
- W2972818652 cites W2035372623 @default.
- W2972818652 cites W2097117768 @default.
- W2972818652 cites W2103943262 @default.
- W2972818652 cites W2134860945 @default.
- W2972818652 cites W2194775991 @default.
- W2972818652 cites W2198512331 @default.
- W2972818652 cites W2217426128 @default.
- W2972818652 cites W2244142460 @default.
- W2972818652 cites W2253728219 @default.
- W2972818652 cites W2325939864 @default.
- W2972818652 cites W2737398044 @default.
- W2972818652 cites W2754447548 @default.
- W2972818652 cites W2963351448 @default.
- W2972818652 cites W3101998545 @default.
- W2972818652 doi "https://doi.org/10.1007/978-3-030-30508-6_32" @default.
- W2972818652 hasPublicationYear "2019" @default.
- W2972818652 type Work @default.
- W2972818652 sameAs 2972818652 @default.
- W2972818652 citedByCount "0" @default.
- W2972818652 crossrefType "book-chapter" @default.
- W2972818652 hasAuthorship W2972818652A5002795838 @default.
- W2972818652 hasAuthorship W2972818652A5032186548 @default.
- W2972818652 hasAuthorship W2972818652A5076252609 @default.
- W2972818652 hasConcept C121332964 @default.
- W2972818652 hasConcept C138885662 @default.
- W2972818652 hasConcept C153180895 @default.
- W2972818652 hasConcept C153294291 @default.
- W2972818652 hasConcept C154945302 @default.
- W2972818652 hasConcept C195704467 @default.
- W2972818652 hasConcept C199360897 @default.
- W2972818652 hasConcept C2776401178 @default.
- W2972818652 hasConcept C2777211547 @default.
- W2972818652 hasConcept C28490314 @default.
- W2972818652 hasConcept C2987714656 @default.
- W2972818652 hasConcept C31510193 @default.
- W2972818652 hasConcept C41008148 @default.
- W2972818652 hasConcept C41895202 @default.
- W2972818652 hasConcept C59404180 @default.
- W2972818652 hasConcept C90559484 @default.
- W2972818652 hasConcept C97931131 @default.
- W2972818652 hasConceptScore W2972818652C121332964 @default.
- W2972818652 hasConceptScore W2972818652C138885662 @default.
- W2972818652 hasConceptScore W2972818652C153180895 @default.
- W2972818652 hasConceptScore W2972818652C153294291 @default.
- W2972818652 hasConceptScore W2972818652C154945302 @default.
- W2972818652 hasConceptScore W2972818652C195704467 @default.
- W2972818652 hasConceptScore W2972818652C199360897 @default.
- W2972818652 hasConceptScore W2972818652C2776401178 @default.
- W2972818652 hasConceptScore W2972818652C2777211547 @default.
- W2972818652 hasConceptScore W2972818652C28490314 @default.
- W2972818652 hasConceptScore W2972818652C2987714656 @default.
- W2972818652 hasConceptScore W2972818652C31510193 @default.
- W2972818652 hasConceptScore W2972818652C41008148 @default.
- W2972818652 hasConceptScore W2972818652C41895202 @default.
- W2972818652 hasConceptScore W2972818652C59404180 @default.
- W2972818652 hasConceptScore W2972818652C90559484 @default.
- W2972818652 hasConceptScore W2972818652C97931131 @default.
- W2972818652 hasLocation W29728186521 @default.
- W2972818652 hasOpenAccess W2972818652 @default.
- W2972818652 hasPrimaryLocation W29728186521 @default.
- W2972818652 hasRelatedWork W130490334 @default.
- W2972818652 hasRelatedWork W1982770690 @default.
- W2972818652 hasRelatedWork W2005051400 @default.
- W2972818652 hasRelatedWork W2016461833 @default.
- W2972818652 hasRelatedWork W2050806332 @default.
- W2972818652 hasRelatedWork W2507989420 @default.
- W2972818652 hasRelatedWork W2782592381 @default.
- W2972818652 hasRelatedWork W2886198169 @default.
- W2972818652 hasRelatedWork W2970216048 @default.
- W2972818652 hasRelatedWork W3018375584 @default.
- W2972818652 isParatext "false" @default.
- W2972818652 isRetracted "false" @default.
- W2972818652 magId "2972818652" @default.
- W2972818652 workType "book-chapter" @default.
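The abstract above describes a two-stage schedule: standard cross-entropy (CE) loss in the first stage, then focal loss in the second stage so that well-classified examples are down-weighted and hard or minority-class examples receive more attention. As a hedged illustration only, not the authors' implementation (the paper's architecture, optimizer, and alpha/gamma settings are not given in this listing; the model, loader, and epoch names below are placeholders), the idea can be sketched in PyTorch:

```python
# Sketch of the two-stage training idea from the abstract: CE loss first,
# focal loss second. All names and values (model, train_loader, gamma=2.0,
# epoch counts, learning rate) are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class FocalLoss(nn.Module):
    """Focal loss: FL(p_t) = -alpha * (1 - p_t)^gamma * log(p_t)."""

    def __init__(self, gamma: float = 2.0, alpha: float = 1.0):
        super().__init__()
        self.gamma = gamma
        self.alpha = alpha

    def forward(self, logits: torch.Tensor, targets: torch.Tensor) -> torch.Tensor:
        # log p_t for the true class of each example.
        log_pt = F.log_softmax(logits, dim=1).gather(1, targets.unsqueeze(1)).squeeze(1)
        pt = log_pt.exp()
        # (1 - p_t)^gamma down-weights easy, well-classified examples.
        loss = -self.alpha * (1.0 - pt) ** self.gamma * log_pt
        return loss.mean()


def train_two_stage(model, train_loader, device, stage1_epochs=20, stage2_epochs=10):
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)

    # Stage 1: learn a general feature-separating capability with CE loss.
    # Stage 2: switch to focal loss so hard / minority-class examples dominate.
    schedule = [(nn.CrossEntropyLoss(), stage1_epochs), (FocalLoss(gamma=2.0), stage2_epochs)]
    for criterion, epochs in schedule:
        for _ in range(epochs):
            for images, labels in train_loader:
                images, labels = images.to(device), labels.to(device)
                optimizer.zero_grad()
                loss = criterion(model(images), labels)
                loss.backward()
                optimizer.step()
    return model
```

With gamma = 0 and alpha = 1 the focal loss reduces to standard cross-entropy, which is why switching losses between the two stages is a drop-in change to the training loop rather than a change to the network itself.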