Matches in SemOpenAlex for { <https://semopenalex.org/work/W3092168082> ?p ?o ?g. }
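A minimal sketch of how the quad pattern above might be retrieved programmatically, assuming SemOpenAlex exposes a public SPARQL endpoint (commonly https://semopenalex.org/sparql; the URL is an assumption, not confirmed by this listing). The `SPARQLWrapper` usage is illustrative only.

```python
from SPARQLWrapper import SPARQLWrapper, JSON

ENDPOINT = "https://semopenalex.org/sparql"            # assumed endpoint URL
WORK = "https://semopenalex.org/work/W3092168082"      # the work queried above

sparql = SPARQLWrapper(ENDPOINT)
sparql.setReturnFormat(JSON)
# SPARQL form of the quad pattern { <work> ?p ?o ?g . }
sparql.setQuery(f"""
SELECT ?p ?o ?g WHERE {{
  GRAPH ?g {{ <{WORK}> ?p ?o }}
}}
""")

# Each result binding corresponds to one line of the listing below.
for row in sparql.query().convert()["results"]["bindings"]:
    print(row["p"]["value"], row["o"]["value"], row["g"]["value"])
```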
- W3092168082 abstract "Deep neural networks (DNNs) have achieved outstanding performance in a wide range of applications such as image classification and natural language processing. Despite this, the huge number of parameters in DNNs makes efficient training difficult and hinders deployment on low-end devices with limited computing resources. In this paper, we explore the correlations in the weight matrices and approximate them with low-rank block-term tensors. We name the resulting structure block-term tensor layers (BT-layers), which can be easily adapted to neural network models such as CNNs and RNNs. In particular, the inputs and outputs of BT-layers are reshaped into low-dimensional high-order tensors with similar or improved representation power. Extensive experiments demonstrate that BT-layers in CNNs and RNNs achieve a very large compression ratio on the number of parameters while preserving or improving the representation power of the original DNNs." @default.
- W3092168082 created "2020-10-15" @default.
- W3092168082 creator A5024663093 @default.
- W3092168082 creator A5031574868 @default.
- W3092168082 creator A5043901777 @default.
- W3092168082 creator A5051227924 @default.
- W3092168082 creator A5054050815 @default.
- W3092168082 creator A5081662508 @default.
- W3092168082 date "2020-10-10" @default.
- W3092168082 modified "2023-09-26" @default.
- W3092168082 title "Block-term Tensor Neural Networks" @default.
- W3092168082 cites W1498436455 @default.
- W3092168082 cites W1724438581 @default.
- W3092168082 cites W1798945469 @default.
- W3092168082 cites W1821462560 @default.
- W3092168082 cites W1825959699 @default.
- W3092168082 cites W1861492603 @default.
- W3092168082 cites W1895577753 @default.
- W3092168082 cites W1963826206 @default.
- W3092168082 cites W1993482030 @default.
- W3092168082 cites W1997076489 @default.
- W3092168082 cites W2000045479 @default.
- W3092168082 cites W2000215628 @default.
- W3092168082 cites W2040006565 @default.
- W3092168082 cites W2040376026 @default.
- W3092168082 cites W2076063813 @default.
- W3092168082 cites W2100916003 @default.
- W3092168082 cites W2104636679 @default.
- W3092168082 cites W2112796928 @default.
- W3092168082 cites W2117539524 @default.
- W3092168082 cites W2126707628 @default.
- W3092168082 cites W2130942839 @default.
- W3092168082 cites W2131524184 @default.
- W3092168082 cites W2150355110 @default.
- W3092168082 cites W2163605009 @default.
- W3092168082 cites W2167215970 @default.
- W3092168082 cites W2172166488 @default.
- W3092168082 cites W2172806452 @default.
- W3092168082 cites W2194775991 @default.
- W3092168082 cites W2267635276 @default.
- W3092168082 cites W2294543795 @default.
- W3092168082 cites W2431890537 @default.
- W3092168082 cites W2511730936 @default.
- W3092168082 cites W2512245234 @default.
- W3092168082 cites W2551156993 @default.
- W3092168082 cites W2594739706 @default.
- W3092168082 cites W2608554408 @default.
- W3092168082 cites W2619959423 @default.
- W3092168082 cites W2707890836 @default.
- W3092168082 cites W2733236492 @default.
- W3092168082 cites W2740645440 @default.
- W3092168082 cites W2753545915 @default.
- W3092168082 cites W2754526845 @default.
- W3092168082 cites W2766839578 @default.
- W3092168082 cites W2767785892 @default.
- W3092168082 cites W2779200694 @default.
- W3092168082 cites W2788717888 @default.
- W3092168082 cites W2800040198 @default.
- W3092168082 cites W2807812398 @default.
- W3092168082 cites W2808455355 @default.
- W3092168082 cites W2888987473 @default.
- W3092168082 cites W2889186417 @default.
- W3092168082 cites W2911892705 @default.
- W3092168082 cites W2942782905 @default.
- W3092168082 cites W2950967261 @default.
- W3092168082 cites W2962698165 @default.
- W3092168082 cites W2962766718 @default.
- W3092168082 cites W2963334029 @default.
- W3092168082 cites W2963674932 @default.
- W3092168082 cites W2963733622 @default.
- W3092168082 cites W2963838731 @default.
- W3092168082 cites W2963923362 @default.
- W3092168082 cites W2963991999 @default.
- W3092168082 cites W2966234499 @default.
- W3092168082 cites W2969855422 @default.
- W3092168082 cites W2970213198 @default.
- W3092168082 cites W2978451921 @default.
- W3092168082 cites W3099497510 @default.
- W3092168082 cites W3103713775 @default.
- W3092168082 cites W3103894541 @default.
- W3092168082 cites W3118608800 @default.
- W3092168082 cites W3121797243 @default.
- W3092168082 doi "https://doi.org/10.48550/arxiv.2010.04963" @default.
- W3092168082 hasPublicationYear "2020" @default.
- W3092168082 type Work @default.
- W3092168082 sameAs 3092168082 @default.
- W3092168082 citedByCount "0" @default.
- W3092168082 crossrefType "posted-content" @default.
- W3092168082 hasAuthorship W3092168082A5024663093 @default.
- W3092168082 hasAuthorship W3092168082A5031574868 @default.
- W3092168082 hasAuthorship W3092168082A5043901777 @default.
- W3092168082 hasAuthorship W3092168082A5051227924 @default.
- W3092168082 hasAuthorship W3092168082A5054050815 @default.
- W3092168082 hasAuthorship W3092168082A5081662508 @default.
- W3092168082 hasBestOaLocation W30921680821 @default.
- W3092168082 hasConcept C105339364 @default.
- W3092168082 hasConcept C111919701 @default.
- W3092168082 hasConcept C11413529 @default.
- W3092168082 hasConcept C114614502 @default.
- W3092168082 hasConcept C121332964 @default.
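The abstract above describes the paper's block-term tensor (BT) layers, which approximate a dense weight matrix by a sum of low-rank Tucker blocks over reshaped input and output modes. Below is a minimal NumPy sketch of that idea, not the authors' implementation; all sizes, ranks, and names are illustrative assumptions.

```python
import numpy as np

# Toy sizes (assumptions): a dense layer mapping I = I1*I2 inputs to O = O1*O2
# outputs, approximated by N Tucker blocks with core ranks (R1, R2).
I1, I2, O1, O2 = 4, 8, 5, 6
R1, R2, N = 2, 3, 2

rng = np.random.default_rng(0)

# Block-term factors: per block n, a core G_n of shape (R1, R2) and one factor
# per mode, A_n^(k) of shape (I_k, O_k, R_k).
cores = rng.standard_normal((N, R1, R2))
A1 = rng.standard_normal((N, I1, O1, R1))
A2 = rng.standard_normal((N, I2, O2, R2))

def bt_weight(cores, A1, A2):
    """Rebuild the (I1*I2, O1*O2) weight matrix from the block-term factors.

    W[i1, i2, o1, o2] = sum_n sum_{r1, r2} A1[n, i1, o1, r1]
                                           * A2[n, i2, o2, r2]
                                           * cores[n, r1, r2]
    """
    W = np.einsum("niar,njbs,nrs->ijab", A1, A2, cores)   # (I1, I2, O1, O2)
    return W.reshape(I1 * I2, O1 * O2)

x = rng.standard_normal((16, I1 * I2))        # a batch of 16 reshaped inputs
y = x @ bt_weight(cores, A1, A2)              # forward pass through the BT-layer
print(y.shape)                                # (16, 30)

# Parameter counts: dense matrix vs. block-term factors.
dense_params = (I1 * I2) * (O1 * O2)
bt_params = N * (R1 * R2 + I1 * O1 * R1 + I2 * O2 * R2)
print(dense_params, bt_params)                # 960 vs. 380 for these toy sizes
```

Reconstructing the full matrix keeps the sketch easy to check; an efficient layer would instead contract the reshaped input tensor directly with the cores and factor matrices, in line with the reshaping of inputs and outputs described in the abstract.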