Matches in SemOpenAlex for { <https://semopenalex.org/work/W3128121831> ?p ?o ?g. }
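The quad pattern above can be run against the public SemOpenAlex SPARQL endpoint to reproduce the listing below. A minimal sketch, assuming the endpoint URL https://semopenalex.org/sparql and the Python SPARQLWrapper package (both are assumptions, not part of the original listing):

```python
from SPARQLWrapper import SPARQLWrapper, JSON

# Assumed endpoint URL for the public SemOpenAlex SPARQL service.
endpoint = SPARQLWrapper("https://semopenalex.org/sparql")

# GRAPH ?g mirrors the ?g position in the quad pattern in the header above.
endpoint.setQuery("""
    SELECT ?p ?o ?g WHERE {
      GRAPH ?g { <https://semopenalex.org/work/W3128121831> ?p ?o . }
    }
""")
endpoint.setReturnFormat(JSON)
results = endpoint.query().convert()

for row in results["results"]["bindings"]:
    # Each binding corresponds to one "- W3128121831 <p> <o>" line below.
    print(row["p"]["value"], row["o"]["value"])
```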
- W3128121831 endingPage "214" @default.
- W3128121831 startingPage "214" @default.
- W3128121831 abstract "Artificial Neural Networks (ANNs) were inspired by the neural networks of the human brain and have been widely applied to speech processing. Application areas of ANNs include speech recognition, speech emotion recognition, language identification, speech enhancement, and speech separation, among others. Because human speech processing involves complex cognitive processes known as auditory attention, a growing number of papers propose ANNs supported by deep learning algorithms in conjunction with some mechanism that mirrors the human attention process. However, while these ANN approaches include attention, there is no categorization of the attention mechanisms integrated into deep learning algorithms or of their relationship to human auditory attention. We therefore consider it necessary to review the different attention-inspired ANN approaches, to show both academic and industry experts the models available for a wide variety of applications. Following the PRISMA methodology, we present a systematic review of the literature published since 2000 in which deep learning algorithms are applied to diverse speech processing problems. In this paper, 133 research works are selected and the following aspects are described: (i) the most relevant features, (ii) the ways in which attention has been implemented, (iii) their hypothetical relationship with human attention, and (iv) the evaluation metrics used. Additionally, the four publications most closely related to human attention were analyzed, and their strengths and weaknesses were determined." @default.
- W3128121831 created "2021-02-15" @default.
- W3128121831 creator A5002575060 @default.
- W3128121831 creator A5028040623 @default.
- W3128121831 creator A5028817934 @default.
- W3128121831 creator A5046636421 @default.
- W3128121831 date "2021-01-28" @default.
- W3128121831 modified "2023-09-30" @default.
- W3128121831 title "Attention-Inspired Artificial Neural Networks for Speech Processing: A Systematic Review" @default.
- W3128121831 cites W1587353960 @default.
- W3128121831 cites W1597988475 @default.
- W3128121831 cites W1685780174 @default.
- W3128121831 cites W2161374186 @default.
- W3128121831 cites W2327501763 @default.
- W3128121831 cites W2333091651 @default.
- W3128121831 cites W2563356726 @default.
- W3128121831 cites W2618099328 @default.
- W3128121831 cites W2750666523 @default.
- W3128121831 cites W2751179137 @default.
- W3128121831 cites W2754219472 @default.
- W3128121831 cites W2766219058 @default.
- W3128121831 cites W2774422254 @default.
- W3128121831 cites W2786835190 @default.
- W3128121831 cites W2791616807 @default.
- W3128121831 cites W2800921391 @default.
- W3128121831 cites W2805662243 @default.
- W3128121831 cites W2807005208 @default.
- W3128121831 cites W2808479354 @default.
- W3128121831 cites W2809271438 @default.
- W3128121831 cites W2883651221 @default.
- W3128121831 cites W2890197052 @default.
- W3128121831 cites W2891980359 @default.
- W3128121831 cites W2892370324 @default.
- W3128121831 cites W2894651928 @default.
- W3128121831 cites W2897132394 @default.
- W3128121831 cites W2899877258 @default.
- W3128121831 cites W2900130877 @default.
- W3128121831 cites W2901215769 @default.
- W3128121831 cites W2904581126 @default.
- W3128121831 cites W2912581782 @default.
- W3128121831 cites W2912728762 @default.
- W3128121831 cites W2913196735 @default.
- W3128121831 cites W2913718171 @default.
- W3128121831 cites W2914465356 @default.
- W3128121831 cites W2916997151 @default.
- W3128121831 cites W2920796740 @default.
- W3128121831 cites W2921193205 @default.
- W3128121831 cites W2921879147 @default.
- W3128121831 cites W2934533499 @default.
- W3128121831 cites W2935934262 @default.
- W3128121831 cites W2936123380 @default.
- W3128121831 cites W2936302822 @default.
- W3128121831 cites W2937033898 @default.
- W3128121831 cites W2937952020 @default.
- W3128121831 cites W2938486422 @default.
- W3128121831 cites W2938836228 @default.
- W3128121831 cites W2938873567 @default.
- W3128121831 cites W2938974662 @default.
- W3128121831 cites W2939111082 @default.
- W3128121831 cites W2939129695 @default.
- W3128121831 cites W2940073716 @default.
- W3128121831 cites W2940259008 @default.
- W3128121831 cites W2944384275 @default.
- W3128121831 cites W2946508582 @default.
- W3128121831 cites W2947078770 @default.
- W3128121831 cites W2953687843 @default.
- W3128121831 cites W2955805055 @default.
- W3128121831 cites W2962759037 @default.
- W3128121831 cites W2962824709 @default.
- W3128121831 cites W2962826786 @default.
- W3128121831 cites W2963032538 @default.
- W3128121831 cites W2963050656 @default.
- W3128121831 cites W2963087748 @default.
- W3128121831 cites W2963701934 @default.
- W3128121831 cites W2963827914 @default.
- W3128121831 cites W2963929227 @default.
- W3128121831 cites W2968981126 @default.
- W3128121831 cites W2970737019 @default.
- W3128121831 cites W2973320017 @default.
- W3128121831 cites W2974501683 @default.
- W3128121831 cites W2975938118 @default.
- W3128121831 cites W2976196855 @default.
- W3128121831 cites W2980326204 @default.
- W3128121831 cites W2980874206 @default.
- W3128121831 cites W2982600052 @default.
- W3128121831 cites W2982640351 @default.
- W3128121831 cites W2987119394 @default.
- W3128121831 cites W2990074387 @default.
- W3128121831 cites W2990809848 @default.
- W3128121831 cites W2990825125 @default.
- W3128121831 cites W2990836636 @default.
- W3128121831 cites W2992341299 @default.
- W3128121831 cites W2995265915 @default.
- W3128121831 cites W2995819149 @default.
- W3128121831 cites W2996122240 @default.
- W3128121831 cites W2997887270 @default.
- W3128121831 cites W2998072428 @default.
- W3128121831 cites W2998678989 @default.
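The abstract above centers on attention mechanisms integrated into deep learning models for speech. As an illustrative point of reference only (this is not code from the reviewed paper, and the paper surveys many attention variants), here is a minimal NumPy sketch of one common formulation, scaled dot-product self-attention, applied to a toy sequence of acoustic frames:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention (Vaswani et al., 2017).

    Q, K, V: arrays of shape (seq_len, d_model). The softmax over the
    query-key similarities yields the weights that decide which time
    steps of V the model attends to.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # (seq_len, seq_len) similarities
    # Numerically stable row-wise softmax.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V  # attention-weighted sum of the values

# Toy example: 5 frames of 8-dimensional acoustic features (hypothetical data).
rng = np.random.default_rng(0)
frames = rng.normal(size=(5, 8))
context = scaled_dot_product_attention(frames, frames, frames)
print(context.shape)  # (5, 8)
```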