Matches in SemOpenAlex for { <https://semopenalex.org/work/W4386937882> ?p ?o ?g. }
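The triple pattern above can be issued against a public SPARQL endpoint. A minimal sketch, assuming SemOpenAlex exposes its endpoint at `https://semopenalex.org/sparql` (an assumption, not confirmed here) and using only the Python standard library:

```python
# Hypothetical sketch: build and URL-encode a SELECT form of the quad
# pattern { <work> ?p ?o ?g . } shown above. The ?g position suggests a
# named-graph (quad) store, so the pattern is wrapped in GRAPH ?g.
from urllib.parse import urlencode

WORK = "https://semopenalex.org/work/W4386937882"

query = f"""
SELECT ?p ?o ?g WHERE {{
  GRAPH ?g {{ <{WORK}> ?p ?o . }}
}}
LIMIT 100
"""

# Assumed endpoint location; adjust if the service publishes another URL.
endpoint = "https://semopenalex.org/sparql"
request_url = endpoint + "?" + urlencode({"query": query, "format": "json"})
print(request_url[:60], "...")
```

An HTTP GET on `request_url` would then return the 81 predicate/object pairs listed below in the requested serialization.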
Showing items 1 to 81 of 81, with 100 items per page.
- W4386937882 endingPage "151" @default.
- W4386937882 startingPage "140" @default.
- W4386937882 abstract "While deep neural networks have excelled at static data such as images, temporal data, the data that these networks will need to process in the real world, remains an open challenge. Handling temporal data with neural networks requires one of three options: backpropagation through time using recurrent neural networks (RNNs), treating the time series as static data for a convolutional neural network (CNN), or attention-based transformer architectures. RNNs are an elegant autoregressive network type that naturally keep a memory of the past while performing computations. Although recurrent networks such as LSTMs have shown strong success across a multitude of fields and tasks such as natural language processing, they can be difficult to train. Transformers and 1-D CNNs, two feed-forward alternatives for temporal data, have gained popularity but in their base forms lack memory to keep track of past activations. Random recurrent networks, also known as Reservoir Computing, have shown that one need not necessarily backpropagate the error through the recurrent component. Here, we propose a novel hybrid approach that brings together the temporal memory capabilities of a random recurrent network with the powerful learning capacity of a deep temporal convolutional readout, which we call t-ConvESN. We experimentally verify that although the recurrent component remains random and unlearned, its combination with a deep readout achieves superior accuracy on a number of datasets from the UCR time series classification dataset collection compared to other state-of-the-art deep learning architectures. Our experiments also show that our proposed method excels in datasets in the low-data regime." @default.
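The abstract describes a random, untrained recurrent reservoir whose state sequence is fed to a learned readout. A minimal pure-Python sketch of the reservoir half of that idea, with toy sizes and fixed random weights (the paper's deep temporal-convolutional readout is only indicated in a comment, not implemented):

```python
import math
import random

random.seed(0)

N_IN, N_RES, T = 2, 8, 20  # toy sizes: input dim, reservoir dim, timesteps

# Fixed random weight matrices -- never trained, as in reservoir computing.
W_in = [[random.uniform(-0.5, 0.5) for _ in range(N_IN)] for _ in range(N_RES)]
W_res = [[random.uniform(-0.3, 0.3) for _ in range(N_RES)] for _ in range(N_RES)]

def step(x, u):
    """One reservoir update: x' = tanh(W_res @ x + W_in @ u)."""
    return [
        math.tanh(
            sum(W_res[i][j] * x[j] for j in range(N_RES))
            + sum(W_in[i][k] * u[k] for k in range(N_IN))
        )
        for i in range(N_RES)
    ]

# Drive the reservoir with a toy input sequence and collect all states.
# A learned readout (e.g. the temporal-convolutional stack of t-ConvESN)
# would then be trained on this state sequence; only the readout learns.
x = [0.0] * N_RES
states = []
for t in range(T):
    u = [math.sin(0.3 * t), math.cos(0.2 * t)]
    x = step(x, u)
    states.append(x)
```

This is a sketch of the general echo-state-network idea, not the authors' implementation; details such as spectral-radius scaling, leak rates, and the exact readout architecture come from the paper itself.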
- W4386937882 created "2023-09-22" @default.
- W4386937882 creator A5017906200 @default.
- W4386937882 creator A5020641794 @default.
- W4386937882 creator A5020655064 @default.
- W4386937882 creator A5021632634 @default.
- W4386937882 creator A5036912867 @default.
- W4386937882 creator A5083553427 @default.
- W4386937882 date "2023-01-01" @default.
- W4386937882 modified "2023-09-28" @default.
- W4386937882 title "t-ConvESN: Temporal Convolution-Readout for Random Recurrent Neural Networks" @default.
- W4386937882 cites W119403003 @default.
- W4386937882 cites W2043487837 @default.
- W4386937882 cites W2048060899 @default.
- W4386937882 cites W2064675550 @default.
- W4386937882 cites W2068470708 @default.
- W4386937882 cites W2112796928 @default.
- W4386937882 cites W2145339207 @default.
- W4386937882 cites W2150355110 @default.
- W4386937882 cites W2159006998 @default.
- W4386937882 cites W2194775991 @default.
- W4386937882 cites W2371946849 @default.
- W4386937882 cites W2555077524 @default.
- W4386937882 cites W2897647030 @default.
- W4386937882 cites W2902613830 @default.
- W4386937882 cites W2937484199 @default.
- W4386937882 cites W2949676527 @default.
- W4386937882 cites W2950032177 @default.
- W4386937882 cites W2952136840 @default.
- W4386937882 cites W2952744660 @default.
- W4386937882 cites W2962911926 @default.
- W4386937882 cites W2975924256 @default.
- W4386937882 cites W3096347622 @default.
- W4386937882 cites W4223588071 @default.
- W4386937882 doi "https://doi.org/10.1007/978-3-031-44223-0_12" @default.
- W4386937882 hasPublicationYear "2023" @default.
- W4386937882 type Work @default.
- W4386937882 citedByCount "0" @default.
- W4386937882 crossrefType "book-chapter" @default.
- W4386937882 hasAuthorship W4386937882A5017906200 @default.
- W4386937882 hasAuthorship W4386937882A5020641794 @default.
- W4386937882 hasAuthorship W4386937882A5020655064 @default.
- W4386937882 hasAuthorship W4386937882A5021632634 @default.
- W4386937882 hasAuthorship W4386937882A5036912867 @default.
- W4386937882 hasAuthorship W4386937882A5083553427 @default.
- W4386937882 hasConcept C108583219 @default.
- W4386937882 hasConcept C119857082 @default.
- W4386937882 hasConcept C124101348 @default.
- W4386937882 hasConcept C147168706 @default.
- W4386937882 hasConcept C154945302 @default.
- W4386937882 hasConcept C41008148 @default.
- W4386937882 hasConcept C50644808 @default.
- W4386937882 hasConcept C77277458 @default.
- W4386937882 hasConcept C81363708 @default.
- W4386937882 hasConceptScore W4386937882C108583219 @default.
- W4386937882 hasConceptScore W4386937882C119857082 @default.
- W4386937882 hasConceptScore W4386937882C124101348 @default.
- W4386937882 hasConceptScore W4386937882C147168706 @default.
- W4386937882 hasConceptScore W4386937882C154945302 @default.
- W4386937882 hasConceptScore W4386937882C41008148 @default.
- W4386937882 hasConceptScore W4386937882C50644808 @default.
- W4386937882 hasConceptScore W4386937882C77277458 @default.
- W4386937882 hasConceptScore W4386937882C81363708 @default.
- W4386937882 hasLocation W43869378821 @default.
- W4386937882 hasOpenAccess W4386937882 @default.
- W4386937882 hasPrimaryLocation W43869378821 @default.
- W4386937882 hasRelatedWork W2731899572 @default.
- W4386937882 hasRelatedWork W2999805992 @default.
- W4386937882 hasRelatedWork W3116150086 @default.
- W4386937882 hasRelatedWork W3133861977 @default.
- W4386937882 hasRelatedWork W4200173597 @default.
- W4386937882 hasRelatedWork W4223943233 @default.
- W4386937882 hasRelatedWork W4291897433 @default.
- W4386937882 hasRelatedWork W4312417841 @default.
- W4386937882 hasRelatedWork W4321369474 @default.
- W4386937882 hasRelatedWork W4380075502 @default.
- W4386937882 isParatext "false" @default.
- W4386937882 isRetracted "false" @default.
- W4386937882 workType "book-chapter" @default.