Matches in SemOpenAlex for { <https://semopenalex.org/work/W4225319377> ?p ?o ?g. }
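The pattern above is a SPARQL quad pattern (subject, predicate, object, graph). As a minimal sketch, the listing below could be reproduced programmatically; the endpoint URL (`https://semopenalex.org/sparql`) and the rewrite of the quad pattern into a standard `GRAPH` clause are assumptions, not part of this listing.

```python
# Minimal sketch: reproduce the triple listing below with a SPARQL query.
# Assumptions: the public SemOpenAlex endpoint lives at
# https://semopenalex.org/sparql, and the quad pattern { ?s ?p ?o ?g. }
# is expressed with a standard SPARQL 1.1 GRAPH clause.
from SPARQLWrapper import SPARQLWrapper, JSON

ENDPOINT = "https://semopenalex.org/sparql"  # assumed endpoint URL

QUERY = """
SELECT ?p ?o ?g WHERE {
  GRAPH ?g {
    <https://semopenalex.org/work/W4225319377> ?p ?o .
  }
}
"""

sparql = SPARQLWrapper(ENDPOINT)
sparql.setQuery(QUERY)
sparql.setReturnFormat(JSON)

# Each binding corresponds to one "- W4225319377 <p> <o>" line below.
for row in sparql.query().convert()["results"]["bindings"]:
    print(row["p"]["value"], row["o"]["value"])
```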
- W4225319377 abstract "Emotion recognition is defined as identifying human emotion and is directly related to different fields such as human-computer interfaces, human emotional processing, irrational analysis, medical diagnostics, data-driven animation, human-robot communication, and many more. This paper proposes a new facial emotional recognition model using a convolutional neural network. Our proposed model, ConvNet, detects seven specific emotions from image data including anger, disgust, fear, happiness, neutrality, sadness, and surprise. The features extracted by the Local Binary Pattern (LBP), region based Oriented FAST and rotated BRIEF (ORB) and Convolutional Neural network (CNN) from facial expressions images were fused to develop the classification model through training by our proposed CNN model (ConvNet). Our method can converge quickly and achieves good performance which the authors can develop a real-time schema that can easily fit the model and sense emotions. Furthermore, this study focuses on the mental or emotional stuff of a man or woman using the behavioral aspects. To complete the training of the CNN network model, we use the FER2013 databases at first, and then apply the generalization techniques to the JAFFE and CK+ datasets respectively in the testing stage to evaluate the performance of the model. In the generalization approach on the JAFFE dataset, we get a 92.05% accuracy, while on the CK+ dataset, we acquire a 98.13% accuracy which achieve the best performance among existing methods. We also test the system's success by identifying facial expressions in real-time. ConvNet consists of four layers of convolution together with two fully connected layers. The experimental results show that the ConvNet is able to achieve 96% training accuracy which is much better than current existing models. However, when compared to other validation methods, the suggested technique was more accurate. ConvNet also achieved validation accuracy of 91.01% for the FER2013 dataset. We also made all the materials publicly accessible for the research community at: https://github.com/Tanoy004/Emotion-recognition-through-CNN ." @default.
- W4225319377 created "2022-05-05" @default.
- W4225319377 creator A5006293953 @default.
- W4225319377 creator A5007781426 @default.
- W4225319377 creator A5019852514 @default.
- W4225319377 creator A5056293251 @default.
- W4225319377 creator A5072851920 @default.
- W4225319377 creator A5080737790 @default.
- W4225319377 date "2022-04-28" @default.
- W4225319377 modified "2023-09-26" @default.
- W4225319377 title "Four-layer ConvNet to facial emotion recognition with minimal epochs and the significance of data diversity" @default.
- W4225319377 cites W1572063013 @default.
- W4225319377 cites W1989188126 @default.
- W4225319377 cites W1995734290 @default.
- W4225319377 cites W2041616772 @default.
- W4225319377 cites W2081666543 @default.
- W4225319377 cites W2103943262 @default.
- W4225319377 cites W2139916508 @default.
- W4225319377 cites W2145310492 @default.
- W4225319377 cites W2152473410 @default.
- W4225319377 cites W2244142460 @default.
- W4225319377 cites W2246249023 @default.
- W4225319377 cites W2326873259 @default.
- W4225319377 cites W2330045552 @default.
- W4225319377 cites W2476124734 @default.
- W4225319377 cites W2490049321 @default.
- W4225319377 cites W2512118432 @default.
- W4225319377 cites W2523270796 @default.
- W4225319377 cites W2525436393 @default.
- W4225319377 cites W2729796907 @default.
- W4225319377 cites W2744909235 @default.
- W4225319377 cites W2759166014 @default.
- W4225319377 cites W2764197504 @default.
- W4225319377 cites W2774147932 @default.
- W4225319377 cites W2794717016 @default.
- W4225319377 cites W2805810266 @default.
- W4225319377 cites W2893935103 @default.
- W4225319377 cites W2900046143 @default.
- W4225319377 cites W2903547951 @default.
- W4225319377 cites W2905373149 @default.
- W4225319377 cites W2922183696 @default.
- W4225319377 cites W2931011950 @default.
- W4225319377 cites W2963092169 @default.
- W4225319377 cites W2971794874 @default.
- W4225319377 cites W2974660039 @default.
- W4225319377 cites W2982383280 @default.
- W4225319377 cites W2985108241 @default.
- W4225319377 cites W2988478392 @default.
- W4225319377 cites W2993503860 @default.
- W4225319377 cites W2997514191 @default.
- W4225319377 cites W3005529595 @default.
- W4225319377 cites W3011242243 @default.
- W4225319377 cites W3012325339 @default.
- W4225319377 cites W3018040156 @default.
- W4225319377 cites W3032135501 @default.
- W4225319377 cites W3033283024 @default.
- W4225319377 cites W3085252772 @default.
- W4225319377 cites W3091973516 @default.
- W4225319377 cites W3107335993 @default.
- W4225319377 cites W3119213473 @default.
- W4225319377 cites W3125625827 @default.
- W4225319377 cites W3158712146 @default.
- W4225319377 doi "https://doi.org/10.1038/s41598-022-11173-0" @default.
- W4225319377 hasPubMedId "https://pubmed.ncbi.nlm.nih.gov/35484318" @default.
- W4225319377 hasPublicationYear "2022" @default.
- W4225319377 type Work @default.
- W4225319377 citedByCount "12" @default.
- W4225319377 countsByYear W42253193772022 @default.
- W4225319377 countsByYear W42253193772023 @default.
- W4225319377 crossrefType "journal-article" @default.
- W4225319377 hasAuthorship W4225319377A5006293953 @default.
- W4225319377 hasAuthorship W4225319377A5007781426 @default.
- W4225319377 hasAuthorship W4225319377A5019852514 @default.
- W4225319377 hasAuthorship W4225319377A5056293251 @default.
- W4225319377 hasAuthorship W4225319377A5072851920 @default.
- W4225319377 hasAuthorship W4225319377A5080737790 @default.
- W4225319377 hasBestOaLocation W42253193771 @default.
- W4225319377 hasConcept C118552586 @default.
- W4225319377 hasConcept C153180895 @default.
- W4225319377 hasConcept C154945302 @default.
- W4225319377 hasConcept C15744967 @default.
- W4225319377 hasConcept C195704467 @default.
- W4225319377 hasConcept C206310091 @default.
- W4225319377 hasConcept C2777375102 @default.
- W4225319377 hasConcept C2779302386 @default.
- W4225319377 hasConcept C2779812673 @default.
- W4225319377 hasConcept C2780343955 @default.
- W4225319377 hasConcept C41008148 @default.
- W4225319377 hasConcept C77805123 @default.
- W4225319377 hasConcept C81363708 @default.
- W4225319377 hasConceptScore W4225319377C118552586 @default.
- W4225319377 hasConceptScore W4225319377C153180895 @default.
- W4225319377 hasConceptScore W4225319377C154945302 @default.
- W4225319377 hasConceptScore W4225319377C15744967 @default.
- W4225319377 hasConceptScore W4225319377C195704467 @default.
- W4225319377 hasConceptScore W4225319377C206310091 @default.
- W4225319377 hasConceptScore W4225319377C2777375102 @default.
- W4225319377 hasConceptScore W4225319377C2779302386 @default.
- W4225319377 hasConceptScore W4225319377C2779812673 @default.
- W4225319377 hasConceptScore W4225319377C2780343955 @default.
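The abstract above describes ConvNet as four convolutional layers followed by two fully connected layers, trained on FER2013 (48x48 grayscale images) to classify seven emotions. Below is a minimal Keras sketch of such an architecture; the filter counts, kernel sizes, pooling, and dropout are illustrative assumptions, not the authors' published hyperparameters (their actual implementation is in the linked GitHub repository).

```python
# Hedged sketch of a four-conv-layer, two-FC-layer emotion classifier.
# Assumptions: FER2013-style 48x48 grayscale input, seven output classes;
# all layer widths and regularization choices below are illustrative only.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_convnet(num_classes: int = 7) -> tf.keras.Model:
    model = models.Sequential([
        layers.Input(shape=(48, 48, 1)),  # FER2013: 48x48 grayscale
        # Four convolutional layers, as the abstract describes.
        layers.Conv2D(32, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(128, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(256, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        # Two fully connected layers: a hidden layer and the softmax output.
        layers.Dense(256, activation="relu"),
        layers.Dropout(0.5),  # assumed regularization, not from the paper
        layers.Dense(num_classes, activation="softmax"),  # 7 emotions
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```

With one-hot-encoded labels, `build_convnet().fit(x_train, y_train, epochs=...)` would train such a model; the accuracies reported in the abstract come from the authors' own setup (including LBP/ORB feature fusion), not from this sketch.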