Matches in SemOpenAlex for { <https://semopenalex.org/work/W2938404524> ?p ?o ?g. }
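The header above is the quad pattern these rows answer. As a minimal, hedged sketch of reproducing the listing programmatically: the endpoint URL https://semopenalex.org/sparql is an assumption (it is not stated in this dump), and the `?g` graph slot is dropped here because every match below is reported against the default graph, which a plain triple pattern already queries.

```python
# Minimal sketch: fetch all properties/values of this work from SemOpenAlex
# via SPARQL over HTTP. The endpoint URL below is an assumption, not stated
# in this dump. The ?g graph variable from the header pattern is omitted
# since all rows here sit in the default graph.
import requests

ENDPOINT = "https://semopenalex.org/sparql"  # assumed endpoint URL
WORK = "https://semopenalex.org/work/W2938404524"

QUERY = f"""
SELECT ?p ?o WHERE {{
  <{WORK}> ?p ?o .
}}
"""

resp = requests.get(
    ENDPOINT,
    params={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
    timeout=30,
)
resp.raise_for_status()

# Standard SPARQL 1.1 JSON results layout: results -> bindings -> var -> value.
for row in resp.json()["results"]["bindings"]:
    print(row["p"]["value"], row["o"]["value"])
```

The matches returned for this work are listed below.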
- W2938404524 endingPage "1863" @default.
- W2938404524 startingPage "1863" @default.
- W2938404524 abstract "Facial Expression Recognition (FER) can be widely applied to various research areas, such as mental diseases diagnosis and human social/physiological interaction detection. With the emerging advanced technologies in hardware and sensors, FER systems have been developed to support real-world application scenes, instead of laboratory environments. Although the laboratory-controlled FER systems achieve very high accuracy, around 97%, the technical transferring from the laboratory to real-world applications faces a great barrier of very low accuracy, approximately 50%. In this survey, we comprehensively discuss three significant challenges in the unconstrained real-world environments, such as illumination variation, head pose, and subject-dependence, which may not be resolved by only analysing images/videos in the FER system. We focus on those sensors that may provide extra information and help the FER systems to detect emotion in both static images and video sequences. We introduce three categories of sensors that may help improve the accuracy and reliability of an expression recognition system by tackling the challenges mentioned above in pure image/video processing. The first group is detailed-face sensors, which detect a small dynamic change of a face component, such as eye-trackers, which may help differentiate the background noise and the feature of faces. The second is non-visual sensors, such as audio, depth, and EEG sensors, which provide extra information in addition to visual dimension and improve the recognition reliability for example in illumination variation and position shift situation. The last is target-focused sensors, such as infrared thermal sensors, which can facilitate the FER systems to filter useless visual contents and may help resist illumination variation. Also, we discuss the methods of fusing different inputs obtained from multimodal sensors in an emotion system. We comparatively review the most prominent multimodal emotional expression recognition approaches and point out their advantages and limitations. We briefly introduce the benchmark data sets related to FER systems for each category of sensors and extend our survey to the open challenges and issues. Meanwhile, we design a framework of an expression recognition system, which uses multimodal sensor data (provided by the three categories of sensors) to provide complete information about emotions to assist the pure face image/video analysis. We theoretically analyse the feasibility and achievability of our new expression recognition system, especially for the use in the wild environment, and point out the future directions to design an efficient, emotional expression recognition system." @default.
- W2938404524 created "2019-04-25" @default.
- W2938404524 creator A5039790126 @default.
- W2938404524 creator A5051297830 @default.
- W2938404524 creator A5054720605 @default.
- W2938404524 creator A5058644115 @default.
- W2938404524 creator A5066831688 @default.
- W2938404524 creator A5081879366 @default.
- W2938404524 creator A5090419741 @default.
- W2938404524 date "2019-04-18" @default.
- W2938404524 modified "2023-10-06" @default.
- W2938404524 title "A Review on Automatic Facial Expression Recognition Systems Assisted by Multimodal Sensor Data" @default.
- W2938404524 cites W1536775035 @default.
- W2938404524 cites W1666243891 @default.
- W2938404524 cites W1854318472 @default.
- W2938404524 cites W1963599662 @default.
- W2938404524 cites W1965947362 @default.
- W2938404524 cites W1979189411 @default.
- W2938404524 cites W1981918162 @default.
- W2938404524 cites W1995265288 @default.
- W2938404524 cites W1997135319 @default.
- W2938404524 cites W1997714027 @default.
- W2938404524 cites W2007350404 @default.
- W2938404524 cites W2009250875 @default.
- W2938404524 cites W2010796033 @default.
- W2938404524 cites W2032254851 @default.
- W2938404524 cites W2036291125 @default.
- W2938404524 cites W2056049575 @default.
- W2938404524 cites W2067789110 @default.
- W2938404524 cites W2069762684 @default.
- W2938404524 cites W2073942619 @default.
- W2938404524 cites W2079742827 @default.
- W2938404524 cites W2096027770 @default.
- W2938404524 cites W2106390385 @default.
- W2938404524 cites W2106947945 @default.
- W2938404524 cites W2117645142 @default.
- W2938404524 cites W2122788387 @default.
- W2938404524 cites W2143829622 @default.
- W2938404524 cites W2145511413 @default.
- W2938404524 cites W2156503193 @default.
- W2938404524 cites W2162720330 @default.
- W2938404524 cites W2164985412 @default.
- W2938404524 cites W2167232714 @default.
- W2938404524 cites W2297337743 @default.
- W2938404524 cites W2345305417 @default.
- W2938404524 cites W2410803725 @default.
- W2938404524 cites W2418112915 @default.
- W2938404524 cites W2479639417 @default.
- W2938404524 cites W2568045728 @default.
- W2938404524 cites W2571743746 @default.
- W2938404524 cites W2591940486 @default.
- W2938404524 cites W2594075324 @default.
- W2938404524 cites W2600389231 @default.
- W2938404524 cites W2608598365 @default.
- W2938404524 cites W2609211153 @default.
- W2938404524 cites W2617151543 @default.
- W2938404524 cites W2624323315 @default.
- W2938404524 cites W2744078350 @default.
- W2938404524 cites W2750692136 @default.
- W2938404524 cites W2762323924 @default.
- W2938404524 cites W2765906447 @default.
- W2938404524 cites W2782360958 @default.
- W2938404524 cites W2791220068 @default.
- W2938404524 cites W2792191740 @default.
- W2938404524 cites W2792578050 @default.
- W2938404524 cites W2795913243 @default.
- W2938404524 cites W2800428102 @default.
- W2938404524 cites W2804789474 @default.
- W2938404524 cites W2805375425 @default.
- W2938404524 cites W2808505323 @default.
- W2938404524 cites W2891399311 @default.
- W2938404524 cites W2907678173 @default.
- W2938404524 cites W2908758932 @default.
- W2938404524 cites W2909070361 @default.
- W2938404524 cites W2912990735 @default.
- W2938404524 cites W2941914178 @default.
- W2938404524 cites W3150894447 @default.
- W2938404524 cites W4255380470 @default.
- W2938404524 doi "https://doi.org/10.3390/s19081863" @default.
- W2938404524 hasPubMedCentralId "https://www.ncbi.nlm.nih.gov/pmc/articles/6514576" @default.
- W2938404524 hasPubMedId "https://pubmed.ncbi.nlm.nih.gov/31003522" @default.
- W2938404524 hasPublicationYear "2019" @default.
- W2938404524 type Work @default.
- W2938404524 sameAs 2938404524 @default.
- W2938404524 citedByCount "110" @default.
- W2938404524 countsByYear W29384045242019 @default.
- W2938404524 countsByYear W29384045242020 @default.
- W2938404524 countsByYear W29384045242021 @default.
- W2938404524 countsByYear W29384045242022 @default.
- W2938404524 countsByYear W29384045242023 @default.
- W2938404524 crossrefType "journal-article" @default.
- W2938404524 hasAuthorship W2938404524A5039790126 @default.
- W2938404524 hasAuthorship W2938404524A5051297830 @default.
- W2938404524 hasAuthorship W2938404524A5054720605 @default.
- W2938404524 hasAuthorship W2938404524A5058644115 @default.
- W2938404524 hasAuthorship W2938404524A5066831688 @default.
- W2938404524 hasAuthorship W2938404524A5081879366 @default.
- W2938404524 hasAuthorship W2938404524A5090419741 @default.
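The bare object IDs in the rows above (creators such as A5039790126, the countsByYear entries, the cited works) are themselves dereferenceable SemOpenAlex entities. A hedged follow-up sketch, resolving the creator IDs to author names: the endpoint URL and the dcterms:creator / foaf:name predicates are assumptions based on SemOpenAlex's reuse of standard vocabularies, not facts stated in this listing; adjust the prefixes if the ontology differs.

```python
# Follow-up sketch: resolve the creator IDs listed above to author names.
# Assumptions (not stated in this dump): the endpoint URL, and that
# SemOpenAlex links works to authors via dcterms:creator and names authors
# via foaf:name. Swap in the actual predicates if the ontology differs.
import requests

ENDPOINT = "https://semopenalex.org/sparql"  # assumed endpoint URL

QUERY = """
PREFIX dcterms: <http://purl.org/dc/terms/>
PREFIX foaf:    <http://xmlns.com/foaf/0.1/>
SELECT ?author ?name WHERE {
  <https://semopenalex.org/work/W2938404524> dcterms:creator ?author .
  ?author foaf:name ?name .
}
"""

resp = requests.get(
    ENDPOINT,
    params={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
    timeout=30,
)
resp.raise_for_status()

for row in resp.json()["results"]["bindings"]:
    print(row["author"]["value"], "->", row["name"]["value"])
```

The same pattern applies to the other linked entities, e.g. dereferencing each countsByYear ID to recover the per-year slice of the citedByCount shown above.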