Matches in SemOpenAlex for { <https://semopenalex.org/work/W3200339936> ?p ?o ?g. }
Showing items 1 to 85 of 85, with 100 items per page.
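The listing below is the result of the basic graph pattern shown above, with the work IRI fixed as the subject. A minimal sketch of how such a query could be issued against the SemOpenAlex SPARQL endpoint over HTTP GET (the endpoint URL, the JSON `format` parameter, and the helper name are assumptions, not part of this listing; the `?g` graph variable from the pattern above is dropped for simplicity):

```python
from urllib.parse import urlencode

# Assumed public SPARQL endpoint for SemOpenAlex (not stated in the listing).
ENDPOINT = "https://semopenalex.org/sparql"

def build_query_url(work_iri: str, limit: int = 100) -> str:
    """Build a GET URL for a SELECT query that matches every triple
    whose subject is the given work IRI, mirroring the pattern
    { <work> ?p ?o ?g. } shown above (minus the graph variable)."""
    query = (
        "SELECT ?p ?o WHERE { "
        f"<{work_iri}> ?p ?o . "
        f"}} LIMIT {limit}"
    )
    return ENDPOINT + "?" + urlencode({"query": query, "format": "json"})

url = build_query_url("https://semopenalex.org/work/W3200339936")
```

Fetching `url` with any HTTP client would return the property/value pairs enumerated below.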
- W3200339936 abstract "The transformer models and their variations currently are considered the prime model architectures in speech recognition since they yield state-of-the-art results on several datasets. Their main strength lies in the self-attention mechanism, where the models gain the ability to calculate a score over the whole input sequence and focus on essential aspects of the sequence. However, the attention score has some flaws. It is heavily global-dependent since it takes the whole sequence into account and normalizes along the sequence length. Our work presents a novel approach for a dynamic fusion between the global and a local attention score based on a Gaussian mask. The small networks for learning the fusion process and the Gaussian masks require only a few additional parameters and are simple to add to current transformer architectures. With our exhaustive evaluation, we determine the effect of localness in the encoder layers and examine the most effective fusion approach. The results on the TEDLIUMv2 dataset demonstrate a steady improvement on the dev and test sets for the base transformer model equipped with our proposed fusion procedure for local attention." @default.
- W3200339936 created "2021-09-27" @default.
- W3200339936 creator A5035094320 @default.
- W3200339936 creator A5039092855 @default.
- W3200339936 creator A5079663771 @default.
- W3200339936 creator A5085411179 @default.
- W3200339936 date "2021-01-01" @default.
- W3200339936 modified "2023-10-17" @default.
- W3200339936 title "Induced Local Attention for Transformer Models in Speech Recognition" @default.
- W3200339936 cites W1902237438 @default.
- W3200339936 cites W2127141656 @default.
- W3200339936 cites W2183341477 @default.
- W3200339936 cites W2407080277 @default.
- W3200339936 cites W2911291251 @default.
- W3200339936 cites W2936774411 @default.
- W3200339936 cites W2962780374 @default.
- W3200339936 cites W2962784628 @default.
- W3200339936 cites W2963925437 @default.
- W3200339936 cites W2964089206 @default.
- W3200339936 cites W2964302946 @default.
- W3200339936 cites W2976556660 @default.
- W3200339936 cites W3034608103 @default.
- W3200339936 cites W3097777922 @default.
- W3200339936 doi "https://doi.org/10.1007/978-3-030-87802-3_71" @default.
- W3200339936 hasPublicationYear "2021" @default.
- W3200339936 type Work @default.
- W3200339936 sameAs 3200339936 @default.
- W3200339936 citedByCount "1" @default.
- W3200339936 countsByYear W32003399362022 @default.
- W3200339936 crossrefType "book-chapter" @default.
- W3200339936 hasAuthorship W3200339936A5035094320 @default.
- W3200339936 hasAuthorship W3200339936A5039092855 @default.
- W3200339936 hasAuthorship W3200339936A5079663771 @default.
- W3200339936 hasAuthorship W3200339936A5085411179 @default.
- W3200339936 hasConcept C111919701 @default.
- W3200339936 hasConcept C118505674 @default.
- W3200339936 hasConcept C119599485 @default.
- W3200339936 hasConcept C119857082 @default.
- W3200339936 hasConcept C121332964 @default.
- W3200339936 hasConcept C127413603 @default.
- W3200339936 hasConcept C153180895 @default.
- W3200339936 hasConcept C154945302 @default.
- W3200339936 hasConcept C163716315 @default.
- W3200339936 hasConcept C165801399 @default.
- W3200339936 hasConcept C2778112365 @default.
- W3200339936 hasConcept C28490314 @default.
- W3200339936 hasConcept C41008148 @default.
- W3200339936 hasConcept C54355233 @default.
- W3200339936 hasConcept C62520636 @default.
- W3200339936 hasConcept C66322947 @default.
- W3200339936 hasConcept C86803240 @default.
- W3200339936 hasConceptScore W3200339936C111919701 @default.
- W3200339936 hasConceptScore W3200339936C118505674 @default.
- W3200339936 hasConceptScore W3200339936C119599485 @default.
- W3200339936 hasConceptScore W3200339936C119857082 @default.
- W3200339936 hasConceptScore W3200339936C121332964 @default.
- W3200339936 hasConceptScore W3200339936C127413603 @default.
- W3200339936 hasConceptScore W3200339936C153180895 @default.
- W3200339936 hasConceptScore W3200339936C154945302 @default.
- W3200339936 hasConceptScore W3200339936C163716315 @default.
- W3200339936 hasConceptScore W3200339936C165801399 @default.
- W3200339936 hasConceptScore W3200339936C2778112365 @default.
- W3200339936 hasConceptScore W3200339936C28490314 @default.
- W3200339936 hasConceptScore W3200339936C41008148 @default.
- W3200339936 hasConceptScore W3200339936C54355233 @default.
- W3200339936 hasConceptScore W3200339936C62520636 @default.
- W3200339936 hasConceptScore W3200339936C66322947 @default.
- W3200339936 hasConceptScore W3200339936C86803240 @default.
- W3200339936 hasLocation W32003399361 @default.
- W3200339936 hasOpenAccess W3200339936 @default.
- W3200339936 hasPrimaryLocation W32003399361 @default.
- W3200339936 hasRelatedWork W10412386 @default.
- W3200339936 hasRelatedWork W11512698 @default.
- W3200339936 hasRelatedWork W13112433 @default.
- W3200339936 hasRelatedWork W2580338 @default.
- W3200339936 hasRelatedWork W2607572 @default.
- W3200339936 hasRelatedWork W2883085 @default.
- W3200339936 hasRelatedWork W5979161 @default.
- W3200339936 hasRelatedWork W7120470 @default.
- W3200339936 hasRelatedWork W7655147 @default.
- W3200339936 hasRelatedWork W9761094 @default.
- W3200339936 isParatext "false" @default.
- W3200339936 isRetracted "false" @default.
- W3200339936 magId "3200339936" @default.
- W3200339936 workType "book-chapter" @default.
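Entries in the `- <subject> <predicate> <object> @default.` form above are straightforward to post-process. A small illustrative parser that groups object values by predicate (the line format is inferred from this listing; the function name and sample lines are hypothetical):

```python
from collections import defaultdict

def parse_listing(lines):
    """Group '- <subject> <predicate> <object...> @default.' lines
    by predicate. The object may contain spaces (quoted literals),
    so we split the body into at most three fields."""
    triples = defaultdict(list)
    for line in lines:
        line = line.strip()
        if not line.startswith("- "):
            continue  # skip headers and other non-triple lines
        body = line[2:]
        if body.endswith(" @default."):
            body = body[: -len(" @default.")]
        subject, predicate, obj = body.split(" ", 2)
        triples[predicate].append(obj)
    return dict(triples)

# Sample lines copied from the listing above.
sample = [
    '- W3200339936 cites W1902237438 @default.',
    '- W3200339936 cites W2127141656 @default.',
    '- W3200339936 citedByCount "1" @default.',
]
parsed = parse_listing(sample)
```

Grouping by predicate makes multi-valued properties such as `cites`, `creator`, and `hasConcept` come out as lists, while single-valued ones remain one-element lists.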