Matches in SemOpenAlex for { <https://semopenalex.org/work/W4296567006> ?p ?o ?g. }
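The quad pattern in the header can be reproduced programmatically. The sketch below is a minimal example, assuming the public SemOpenAlex SPARQL endpoint at https://semopenalex.org/sparql and the SPARQLWrapper package; the `GRAPH ?g` clause is the standard-SPARQL rendering of the `?g` position in the pattern shown above.

```python
# Minimal sketch: retrieve the same ?p ?o ?g matches for W4296567006.
# Assumes the public SemOpenAlex endpoint at https://semopenalex.org/sparql
# and the SPARQLWrapper package (pip install sparqlwrapper).
from SPARQLWrapper import SPARQLWrapper, JSON

ENDPOINT = "https://semopenalex.org/sparql"  # assumed endpoint URL

QUERY = """
SELECT ?p ?o ?g
WHERE {
  GRAPH ?g {
    <https://semopenalex.org/work/W4296567006> ?p ?o .
  }
}
"""

sparql = SPARQLWrapper(ENDPOINT)
sparql.setQuery(QUERY)
sparql.setReturnFormat(JSON)

results = sparql.query().convert()
for row in results["results"]["bindings"]:
    # Each binding carries the predicate, object, and named graph of one match.
    print(row["p"]["value"], row["o"]["value"], row["g"]["value"])
```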
- W4296567006 abstract "This study introduces a new normalization layer termed Batch Layer Normalization (BLN) to reduce the problem of internal covariate shift in deep neural network layers. As a combined version of batch and layer normalization, BLN adaptively puts appropriate weight on mini-batch and feature normalization based on the inverse size of mini-batches to normalize the input to a layer during the learning process. It also performs the exact computation with a minor change at inference times, using either mini-batch statistics or population statistics. The decision process to either use statistics of mini-batch or population gives BLN the ability to play a comprehensive role in the hyper-parameter optimization process of models. The key advantage of BLN is the support of the theoretical analysis of being independent of the input data, and its statistical configuration heavily depends on the task performed, the amount of training data, and the size of batches. Test results indicate the application potential of BLN and its faster convergence than batch normalization and layer normalization in both Convolutional and Recurrent Neural Networks. The code of the experiments is publicly available online (https://github.com/A2Amir/Batch-Layer-Normalization)." @default.
- W4296567006 created "2022-09-21" @default.
- W4296567006 creator A5032826838 @default.
- W4296567006 creator A5089364715 @default.
- W4296567006 date "2022-09-19" @default.
- W4296567006 modified "2023-10-17" @default.
- W4296567006 title "Batch Layer Normalization, A new normalization layer for CNNs and RNN" @default.
- W4296567006 doi "https://doi.org/10.48550/arxiv.2209.08898" @default.
- W4296567006 hasPublicationYear "2022" @default.
- W4296567006 type Work @default.
- W4296567006 citedByCount "0" @default.
- W4296567006 crossrefType "posted-content" @default.
- W4296567006 hasAuthorship W4296567006A5032826838 @default.
- W4296567006 hasAuthorship W4296567006A5089364715 @default.
- W4296567006 hasBestOaLocation W42965670061 @default.
- W4296567006 hasLocation W42965670061 @default.
- W4296567006 hasOpenAccess W4296567006 @default.
- W4296567006 hasPrimaryLocation W42965670061 @default.
- W4296567006 hasRelatedWork W1965343560 @default.
- W4296567006 hasRelatedWork W2063185616 @default.
- W4296567006 hasRelatedWork W2211820962 @default.
- W4296567006 hasRelatedWork W2354062721 @default.
- W4296567006 hasRelatedWork W2356313285 @default.
- W4296567006 hasRelatedWork W2751888684 @default.
- W4296567006 hasRelatedWork W2794115703 @default.
- W4296567006 hasRelatedWork W2899084033 @default.
- W4296567006 hasRelatedWork W3107474891 @default.
- W4296567006 hasRelatedWork W180874098 @default.
- W4296567006 isParatext "false" @default.
- W4296567006 isRetracted "false" @default.
- W4296567006 workType "article" @default.
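The abstract above describes BLN as a combination of batch and layer normalization whose weighting depends on the inverse mini-batch size. The record does not reproduce the paper's exact formulation, so the following is only a rough sketch of that idea, assuming a simple convex combination of the two normalized outputs with a weight of 1/batch_size on the layer-normalized term; learnable scale/shift parameters and the inference-time choice between mini-batch and population statistics are omitted. See the linked GitHub repository for the authors' implementation.

```python
# Rough sketch of the idea in the abstract: blend batch-wise and feature-wise
# (layer) normalization, weighted by the inverse mini-batch size. The weighting
# rule below is an illustrative assumption, not the paper's exact formulation.
import numpy as np

def batch_layer_norm_sketch(x: np.ndarray, eps: float = 1e-5) -> np.ndarray:
    """x has shape (batch_size, num_features)."""
    batch_size = x.shape[0]

    # Batch normalization: normalize each feature over the mini-batch.
    bn = (x - x.mean(axis=0, keepdims=True)) / np.sqrt(x.var(axis=0, keepdims=True) + eps)

    # Layer normalization: normalize each sample over its features.
    ln = (x - x.mean(axis=1, keepdims=True)) / np.sqrt(x.var(axis=1, keepdims=True) + eps)

    # Assumed weighting: small batches lean on feature (layer) statistics,
    # large batches lean on mini-batch statistics.
    w = 1.0 / batch_size
    return (1.0 - w) * bn + w * ln

# Usage example on a toy mini-batch of 4 samples with 16 features each.
x = np.random.randn(4, 16).astype(np.float32)
y = batch_layer_norm_sketch(x)
```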