Matches in SemOpenAlex for { <https://semopenalex.org/work/W2750822272> ?p ?o ?g. }
- W2750822272 abstract "Segments that span contiguous parts of inputs, such as phonemes in speech, named entities in sentences, and actions in videos, occur frequently in sequence prediction problems. Segmental models, a class of models that explicitly hypothesizes segments, have allowed the exploration of rich segment features for sequence prediction. However, segmental models suffer from slow decoding, hampering the use of computationally expensive features. In this thesis, we introduce discriminative segmental cascades, a multi-pass inference framework that allows us to improve accuracy by adding higher-order features and neural segmental features while maintaining efficiency. We also show that instead of including more features to obtain better accuracy, segmental cascades can be used to speed up training and decoding. Segmental models, similarly to conventional speech recognizers, are typically trained in multiple stages. In the first stage, a frame classifier is trained with manual alignments, and then in the second stage, segmental models are trained with manual alignments and the outputs of the frame classifier. However, obtaining manual alignments is time-consuming and expensive. We explore end-to-end training for segmental models with various loss functions, and show how end-to-end training with marginal log loss can eliminate the need for detailed manual alignments. We draw the connections between the marginal log loss and a popular end-to-end training approach called connectionist temporal classification. We present a unifying framework for various end-to-end graph search-based models, such as hidden Markov models, connectionist temporal classification, and segmental models. Finally, we discuss possible extensions of segmental models to large-vocabulary sequence prediction tasks." @default.
- W2750822272 created "2017-09-15" @default.
- W2750822272 creator A5030898980 @default.
- W2750822272 date "2017-09-05" @default.
- W2750822272 modified "2023-09-27" @default.
- W2750822272 title "Sequence Prediction with Neural Segmental Models" @default.
- W2750822272 cites W1491188634 @default.
- W2750822272 cites W1508165687 @default.
- W2750822272 cites W1524333225 @default.
- W2750822272 cites W152459128 @default.
- W2750822272 cites W1533861849 @default.
- W2750822272 cites W1562289873 @default.
- W2750822272 cites W1583497301 @default.
- W2750822272 cites W1583757088 @default.
- W2750822272 cites W1686810756 @default.
- W2750822272 cites W175164482 @default.
- W2750822272 cites W1778492285 @default.
- W2750822272 cites W1828163288 @default.
- W2750822272 cites W1860547055 @default.
- W2750822272 cites W1872806465 @default.
- W2750822272 cites W1877154798 @default.
- W2750822272 cites W1877570817 @default.
- W2750822272 cites W1889843584 @default.
- W2750822272 cites W1942713348 @default.
- W2750822272 cites W1970533835 @default.
- W2750822272 cites W1975638594 @default.
- W2750822272 cites W1990005915 @default.
- W2750822272 cites W1992153276 @default.
- W2750822272 cites W2009540473 @default.
- W2750822272 cites W2010291496 @default.
- W2750822272 cites W2022058071 @default.
- W2750822272 cites W2039679461 @default.
- W2750822272 cites W2048182576 @default.
- W2750822272 cites W2053567709 @default.
- W2750822272 cites W2064218608 @default.
- W2750822272 cites W2064675550 @default.
- W2750822272 cites W2070696251 @default.
- W2750822272 cites W2074546930 @default.
- W2750822272 cites W2077804127 @default.
- W2750822272 cites W2083393647 @default.
- W2750822272 cites W2090823093 @default.
- W2750822272 cites W2093231248 @default.
- W2750822272 cites W2102113734 @default.
- W2750822272 cites W2105080323 @default.
- W2750822272 cites W2111478553 @default.
- W2750822272 cites W2111551391 @default.
- W2750822272 cites W2111732304 @default.
- W2750822272 cites W2112861996 @default.
- W2750822272 cites W2113042716 @default.
- W2750822272 cites W2114347655 @default.
- W2750822272 cites W2115328410 @default.
- W2750822272 cites W2119070704 @default.
- W2750822272 cites W2119155265 @default.
- W2750822272 cites W2121647670 @default.
- W2750822272 cites W2122228338 @default.
- W2750822272 cites W2125528794 @default.
- W2750822272 cites W2125838338 @default.
- W2750822272 cites W2127141656 @default.
- W2750822272 cites W2130647295 @default.
- W2750822272 cites W2131033001 @default.
- W2750822272 cites W2131870124 @default.
- W2750822272 cites W2132714218 @default.
- W2750822272 cites W2134797427 @default.
- W2750822272 cites W2137095888 @default.
- W2750822272 cites W2137143056 @default.
- W2750822272 cites W2137807925 @default.
- W2750822272 cites W2142303499 @default.
- W2750822272 cites W2143612262 @default.
- W2750822272 cites W2143908786 @default.
- W2750822272 cites W2144165558 @default.
- W2750822272 cites W2146502635 @default.
- W2750822272 cites W2146871184 @default.
- W2750822272 cites W2150907703 @default.
- W2750822272 cites W2151484683 @default.
- W2750822272 cites W2152263452 @default.
- W2750822272 cites W2152790380 @default.
- W2750822272 cites W2153568660 @default.
- W2750822272 cites W2158069733 @default.
- W2750822272 cites W2158510249 @default.
- W2750822272 cites W2161562001 @default.
- W2750822272 cites W2162152253 @default.
- W2750822272 cites W2164714022 @default.
- W2750822272 cites W2166765763 @default.
- W2750822272 cites W2166866311 @default.
- W2750822272 cites W2169819436 @default.
- W2750822272 cites W2170585560 @default.
- W2750822272 cites W2196580907 @default.
- W2750822272 cites W2265877413 @default.
- W2750822272 cites W2291022022 @default.
- W2750822272 cites W2293858598 @default.
- W2750822272 cites W232191560 @default.
- W2750822272 cites W2395342389 @default.
- W2750822272 cites W2401167848 @default.
- W2750822272 cites W2403708103 @default.
- W2750822272 cites W2407151108 @default.
- W2750822272 cites W2520160253 @default.
- W2750822272 cites W2521779032 @default.
- W2750822272 cites W2521999726 @default.
- W2750822272 cites W2525734268 @default.
- W2750822272 cites W2543899171 @default.
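A listing like the one above can be reproduced by issuing the triple pattern from the first line against a SPARQL endpoint. Below is a minimal sketch in Python using only the standard library; the endpoint URL `https://semopenalex.org/sparql`, the helper names `build_query` and `fetch_triples`, and the JSON results format parameter are assumptions for illustration, not part of the listing itself.

```python
# Sketch: fetch all (predicate, object) pairs for a SemOpenAlex work,
# mirroring the pattern { <work> ?p ?o } from the listing above.
import json
import urllib.parse
import urllib.request

# Assumed public SPARQL endpoint for SemOpenAlex (verify before use).
ENDPOINT = "https://semopenalex.org/sparql"


def build_query(work_uri: str) -> str:
    """Return a SPARQL SELECT matching the pattern { <work> ?p ?o . }."""
    return f"SELECT ?p ?o WHERE {{ <{work_uri}> ?p ?o . }}"


def fetch_triples(work_uri: str):
    """Run the query against the endpoint and yield (predicate, object) pairs."""
    query = build_query(work_uri)
    url = ENDPOINT + "?" + urllib.parse.urlencode({"query": query})
    req = urllib.request.Request(
        url, headers={"Accept": "application/sparql-results+json"}
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    # SPARQL JSON results: bindings is a list of {var: {"value": ...}} rows.
    for row in data["results"]["bindings"]:
        yield row["p"]["value"], row["o"]["value"]
```

For example, `fetch_triples("https://semopenalex.org/work/W2750822272")` would yield rows such as the `cites` and `title` triples shown above, assuming the endpoint speaks the standard SPARQL 1.1 JSON results protocol.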