Matches in SemOpenAlex for { <https://semopenalex.org/work/W3175554999> ?p ?o ?g. }
Showing items 1 to 80 of 80, with 100 items per page.
- W3175554999 endingPage "334" @default.
- W3175554999 startingPage "319" @default.
- W3175554999 abstract "With the convergence of the Internet of Things (IoT) and artificial intelligence, closed-circuit television, wearable devices, and artificial neural networks have been combined and applied to crime prevention and follow-up measures against crimes. However, these IoT devices face various limitations imposed by the physical environment, as well as the fundamental problem of privacy violations. In this study, to overcome these limitations, voice data are collected and emotions are classified using an acoustic sensor that is free of privacy violations and insensitive to changes in the external environment. To classify emotions in the voice, the data generated by the acoustic sensor are combined with a convolutional neural network. The short-time Fourier transform and the wavelet transform, both frequency-spectrum representation methods, are used as preprocessing techniques for analyzing patterns in the acoustic data. The preprocessed spectrum is represented as a 2D image of the emotional pattern perceived through hearing, which is fed to an image-classification learning model based on ResNet. The network internally uses various forms of gradient descent to compare the learning of each node and analyzes the pattern through feature maps. The classification model sorts voice data into three emotion types: angry, fearful, and surprised. Thus, a system that can detect situations around sensors and predict danger can be established. Despite the different emotional intensities of the base data and the sentence-based learning data, the voice classification model achieved an accuracy of more than 77.2%. This model is applicable to various areas, including the prediction of crime situations and the management of work environments involving emotional labor." @default.
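The abstract above describes converting sensor audio into a log-scaled short-time Fourier transform (STFT) spectrogram that is then treated as a 2D image for a ResNet classifier. A minimal sketch of that preprocessing step, assuming 16 kHz audio and a 512-sample window (the paper's actual parameters are not given in this record):

```python
import numpy as np
from scipy.signal import stft

def voice_to_spectrogram(signal, sample_rate=16000, n_per_seg=512):
    """Turn a 1-D audio signal into a log-magnitude STFT spectrogram,
    i.e. a 2-D array usable as a grayscale image for a CNN."""
    freqs, times, Zxx = stft(signal, fs=sample_rate, nperseg=n_per_seg)
    # dB scaling compresses the dynamic range, as is typical for
    # spectrogram images fed to image-classification networks.
    return 20.0 * np.log10(np.abs(Zxx) + 1e-10)

# One second of synthetic 440 Hz tone standing in for acoustic-sensor data.
t = np.linspace(0.0, 1.0, 16000, endpoint=False)
audio = np.sin(2.0 * np.pi * 440.0 * t)

spec = voice_to_spectrogram(audio)
print(spec.shape)  # (frequency bins, time frames); 257 bins for nperseg=512
```

The resulting 2D array would then be resized or tiled into the input shape a ResNet expects; the function name and parameter values here are illustrative, not taken from the cited work.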
- W3175554999 created "2021-07-05" @default.
- W3175554999 creator A5039378204 @default.
- W3175554999 creator A5046012749 @default.
- W3175554999 creator A5089226921 @default.
- W3175554999 date "2021-01-01" @default.
- W3175554999 modified "2023-09-27" @default.
- W3175554999 title "CNN-Based Voice Emotion Classification Model for Risk Detection" @default.
- W3175554999 cites W1973057208 @default.
- W3175554999 cites W1994616650 @default.
- W3175554999 cites W2162540381 @default.
- W3175554999 cites W2305750651 @default.
- W3175554999 cites W2576107800 @default.
- W3175554999 cites W2577361510 @default.
- W3175554999 cites W2613371493 @default.
- W3175554999 cites W2760981611 @default.
- W3175554999 cites W2795771732 @default.
- W3175554999 cites W2798138305 @default.
- W3175554999 cites W2803193013 @default.
- W3175554999 cites W2885756821 @default.
- W3175554999 cites W2943011355 @default.
- W3175554999 cites W2944461164 @default.
- W3175554999 cites W2950439520 @default.
- W3175554999 cites W2967011433 @default.
- W3175554999 cites W2971147635 @default.
- W3175554999 cites W2996844929 @default.
- W3175554999 cites W3002289107 @default.
- W3175554999 cites W3014009303 @default.
- W3175554999 cites W3043664831 @default.
- W3175554999 cites W3049554854 @default.
- W3175554999 cites W3127179589 @default.
- W3175554999 doi "https://doi.org/10.32604/iasc.2021.018115" @default.
- W3175554999 hasPublicationYear "2021" @default.
- W3175554999 type Work @default.
- W3175554999 sameAs 3175554999 @default.
- W3175554999 citedByCount "0" @default.
- W3175554999 crossrefType "journal-article" @default.
- W3175554999 hasAuthorship W3175554999A5039378204 @default.
- W3175554999 hasAuthorship W3175554999A5046012749 @default.
- W3175554999 hasAuthorship W3175554999A5089226921 @default.
- W3175554999 hasBestOaLocation W31755549991 @default.
- W3175554999 hasConcept C138885662 @default.
- W3175554999 hasConcept C153180895 @default.
- W3175554999 hasConcept C154945302 @default.
- W3175554999 hasConcept C2776401178 @default.
- W3175554999 hasConcept C28490314 @default.
- W3175554999 hasConcept C41008148 @default.
- W3175554999 hasConcept C41895202 @default.
- W3175554999 hasConcept C50644808 @default.
- W3175554999 hasConcept C81363708 @default.
- W3175554999 hasConceptScore W3175554999C138885662 @default.
- W3175554999 hasConceptScore W3175554999C153180895 @default.
- W3175554999 hasConceptScore W3175554999C154945302 @default.
- W3175554999 hasConceptScore W3175554999C2776401178 @default.
- W3175554999 hasConceptScore W3175554999C28490314 @default.
- W3175554999 hasConceptScore W3175554999C41008148 @default.
- W3175554999 hasConceptScore W3175554999C41895202 @default.
- W3175554999 hasConceptScore W3175554999C50644808 @default.
- W3175554999 hasConceptScore W3175554999C81363708 @default.
- W3175554999 hasIssue "2" @default.
- W3175554999 hasLocation W31755549991 @default.
- W3175554999 hasOpenAccess W3175554999 @default.
- W3175554999 hasPrimaryLocation W31755549991 @default.
- W3175554999 hasRelatedWork W2175746458 @default.
- W3175554999 hasRelatedWork W2732542196 @default.
- W3175554999 hasRelatedWork W2738221750 @default.
- W3175554999 hasRelatedWork W2760085659 @default.
- W3175554999 hasRelatedWork W2767651786 @default.
- W3175554999 hasRelatedWork W2883200793 @default.
- W3175554999 hasRelatedWork W2912288872 @default.
- W3175554999 hasRelatedWork W2940661641 @default.
- W3175554999 hasRelatedWork W3012978760 @default.
- W3175554999 hasRelatedWork W3093612317 @default.
- W3175554999 hasVolume "29" @default.
- W3175554999 isParatext "false" @default.
- W3175554999 isRetracted "false" @default.
- W3175554999 magId "3175554999" @default.
- W3175554999 workType "article" @default.