Matches in SemOpenAlex for { <https://semopenalex.org/work/W2112796928> ?p ?o ?g. }
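The pattern in the header is a quad-style shorthand (subject, predicate, object, graph). As a rough sketch, the same matches could be retrieved with plain SPARQL; the public endpoint URL used below is an assumption, not something stated in this listing, and the graph variable is dropped because every match shown here sits in the default graph:

```sparql
# Sketch only: list every predicate/object pair asserted for this work.
# Assumes the public SemOpenAlex SPARQL endpoint at https://semopenalex.org/sparql.
# The ?g position of the original quad pattern (the named graph) is omitted,
# since the listing below reports all matches as "@default".
SELECT ?p ?o
WHERE {
  <https://semopenalex.org/work/W2112796928> ?p ?o .
}
LIMIT 500
```

Replacing the fixed work IRI with a variable, or wrapping the triple pattern in `GRAPH ?g { ... }`, generalizes the query to other works or to explicitly named graphs.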
- W2112796928 endingPage "2324" @default.
- W2112796928 startingPage "2278" @default.
- W2112796928 abstract "Multilayer neural networks trained with the back-propagation algorithm constitute the best example of a successful gradient based learning technique. Given an appropriate network architecture, gradient-based learning algorithms can be used to synthesize a complex decision surface that can classify high-dimensional patterns, such as handwritten characters, with minimal preprocessing. This paper reviews various methods applied to handwritten character recognition and compares them on a standard handwritten digit recognition task. Convolutional neural networks, which are specifically designed to deal with the variability of 2D shapes, are shown to outperform all other techniques. Real-life document recognition systems are composed of multiple modules including field extraction, segmentation recognition, and language modeling. A new learning paradigm, called graph transformer networks (GTN), allows such multimodule systems to be trained globally using gradient-based methods so as to minimize an overall performance measure. Two systems for online handwriting recognition are described. Experiments demonstrate the advantage of global training, and the flexibility of graph transformer networks. A graph transformer network for reading a bank cheque is also described. It uses convolutional neural network character recognizers combined with global training techniques to provide record accuracy on business and personal cheques. It is deployed commercially and reads several million cheques per day." @default.
- W2112796928 created "2016-06-24" @default.
- W2112796928 creator A5001226970 @default.
- W2112796928 creator A5019206666 @default.
- W2112796928 creator A5026605560 @default.
- W2112796928 creator A5086198262 @default.
- W2112796928 date "1998-01-01" @default.
- W2112796928 modified "2023-10-18" @default.
- W2112796928 title "Gradient-based learning applied to document recognition" @default.
- W2112796928 cites W1676820704 @default.
- W2112796928 cites W1761621746 @default.
- W2112796928 cites W1877570817 @default.
- W2112796928 cites W1991133427 @default.
- W2112796928 cites W1992774725 @default.
- W2112796928 cites W1994065256 @default.
- W2112796928 cites W1998255116 @default.
- W2112796928 cites W1999497791 @default.
- W2112796928 cites W2006544565 @default.
- W2112796928 cites W2007857129 @default.
- W2112796928 cites W2010315761 @default.
- W2112796928 cites W2010581677 @default.
- W2112796928 cites W2030781528 @default.
- W2112796928 cites W2035526030 @default.
- W2112796928 cites W2041460909 @default.
- W2112796928 cites W2042492924 @default.
- W2112796928 cites W2046485094 @default.
- W2112796928 cites W2053176763 @default.
- W2112796928 cites W2055075080 @default.
- W2112796928 cites W2056695679 @default.
- W2112796928 cites W2056763477 @default.
- W2112796928 cites W2057200159 @default.
- W2112796928 cites W2057619148 @default.
- W2112796928 cites W2060604179 @default.
- W2112796928 cites W2063541597 @default.
- W2112796928 cites W2087347434 @default.
- W2112796928 cites W2090614046 @default.
- W2112796928 cites W2091987367 @default.
- W2112796928 cites W2095757522 @default.
- W2112796928 cites W2095891604 @default.
- W2112796928 cites W2097316030 @default.
- W2112796928 cites W2099070536 @default.
- W2112796928 cites W2103496339 @default.
- W2112796928 cites W2107878631 @default.
- W2112796928 cites W2113292028 @default.
- W2112796928 cites W2116360511 @default.
- W2112796928 cites W2117671523 @default.
- W2112796928 cites W2124351082 @default.
- W2112796928 cites W2125838338 @default.
- W2112796928 cites W2128652941 @default.
- W2112796928 cites W2131877510 @default.
- W2112796928 cites W2132131403 @default.
- W2112796928 cites W2132793646 @default.
- W2112796928 cites W2132904398 @default.
- W2112796928 cites W2134267682 @default.
- W2112796928 cites W2135936685 @default.
- W2112796928 cites W2140005396 @default.
- W2112796928 cites W2144354855 @default.
- W2112796928 cites W2144405074 @default.
- W2112796928 cites W2147800946 @default.
- W2112796928 cites W2148099973 @default.
- W2112796928 cites W2151871503 @default.
- W2112796928 cites W2156909104 @default.
- W2112796928 cites W2158670134 @default.
- W2112796928 cites W2162794177 @default.
- W2112796928 cites W2165668746 @default.
- W2112796928 cites W2165959773 @default.
- W2112796928 cites W2170599822 @default.
- W2112796928 cites W2171590421 @default.
- W2112796928 cites W2566703758 @default.
- W2112796928 cites W3004732066 @default.
- W2112796928 cites W413857758 @default.
- W2112796928 cites W811578723 @default.
- W2112796928 doi "https://doi.org/10.1109/5.726791" @default.
- W2112796928 hasPublicationYear "1998" @default.
- W2112796928 type Work @default.
- W2112796928 sameAs 2112796928 @default.
- W2112796928 citedByCount "38852" @default.
- W2112796928 countsByYear W21127969282012 @default.
- W2112796928 countsByYear W21127969282013 @default.
- W2112796928 countsByYear W21127969282014 @default.
- W2112796928 countsByYear W21127969282015 @default.
- W2112796928 countsByYear W21127969282016 @default.
- W2112796928 countsByYear W21127969282017 @default.
- W2112796928 countsByYear W21127969282018 @default.
- W2112796928 countsByYear W21127969282019 @default.
- W2112796928 countsByYear W21127969282020 @default.
- W2112796928 countsByYear W21127969282021 @default.
- W2112796928 countsByYear W21127969282022 @default.
- W2112796928 countsByYear W21127969282023 @default.
- W2112796928 crossrefType "journal-article" @default.
- W2112796928 hasAuthorship W2112796928A5001226970 @default.
- W2112796928 hasAuthorship W2112796928A5019206666 @default.
- W2112796928 hasAuthorship W2112796928A5026605560 @default.
- W2112796928 hasAuthorship W2112796928A5086198262 @default.
- W2112796928 hasBestOaLocation W21127969282 @default.
- W2112796928 hasConcept C108583219 @default.
- W2112796928 hasConcept C112640561 @default.
- W2112796928 hasConcept C115961682 @default.