Matches in SemOpenAlex for { <https://semopenalex.org/work/W2897963792> ?p ?o ?g. }
- W2897963792 endingPage "220" @default.
- W2897963792 startingPage "193" @default.
- W2897963792 abstract "Stochastic gradient descent (SGD), also known as stochastic approximation, refers to certain simple iterative structures used for solving stochastic optimization and root-finding problems. The identifying feature of SGD is that, much like gradient descent for deterministic optimization, each successive iterate in the recursion is determined by adding an appropriately scaled gradient estimate to the prior iterate. Owing to several factors, SGD has become the leading method to solve optimization problems arising within large-scale machine learning and “big data” contexts such as classification and regression. In this tutorial, we cover the basics of SGD with an emphasis on modern developments. The tutorial starts with stochastic optimization examples and problem variations where SGD is applicable, and then it details important flavors of SGD that are currently in use. The oral presentation of this tutorial will include numerical examples. Video of this TutORial from the 2018 INFORMS Annual Meeting in Phoenix, Arizona, November 6, 2018, is available at https://youtu.be/wKTH81w9hqE." @default.
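The abstract above describes the SGD recursion: each iterate is formed by adding an appropriately scaled stochastic gradient estimate to the prior iterate. A minimal illustrative sketch of that recursion (not code from the tutorial; the mean-estimation objective, step-size choice, and function name are assumptions made here for demonstration):

```python
import random

def sgd_mean(data, steps=5000, seed=0):
    """Estimate the mean of `data` by SGD on f(x) = E[(x - D)^2].

    Each step draws one sample d; grad = 2*(x - d) is an unbiased
    estimate of f'(x), and the step size a_k = 1/(2k) satisfies the
    classic Robbins-Monro conditions (sum a_k = inf, sum a_k^2 < inf).
    """
    rng = random.Random(seed)
    x = 0.0  # initial iterate x_1
    for k in range(1, steps + 1):
        d = rng.choice(data)
        grad = 2.0 * (x - d)            # stochastic gradient estimate g_k
        x = x - (1.0 / (2.0 * k)) * grad  # x_{k+1} = x_k - a_k * g_k
    return x

est = sgd_mean([1.0, 2.0, 3.0, 4.0])  # converges toward the mean, 2.5
```

With this particular objective and step size, the recursion reduces to a running sample average, which is why the iterates settle near the true mean; general SGD replaces `grad` with any unbiased gradient estimator of the objective.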
- W2897963792 created "2018-10-26" @default.
- W2897963792 creator A5003164670 @default.
- W2897963792 creator A5055129023 @default.
- W2897963792 creator A5055704220 @default.
- W2897963792 date "2018-10-01" @default.
- W2897963792 modified "2023-10-03" @default.
- W2897963792 title "Stochastic Gradient Descent: Recent Trends" @default.
- W2897963792 cites W1523985187 @default.
- W2897963792 cites W1594004154 @default.
- W2897963792 cites W1602773783 @default.
- W2897963792 cites W1832379062 @default.
- W2897963792 cites W1946620893 @default.
- W2897963792 cites W1947202642 @default.
- W2897963792 cites W1968154520 @default.
- W2897963792 cites W1972711404 @default.
- W2897963792 cites W1975768153 @default.
- W2897963792 cites W1987083649 @default.
- W2897963792 cites W1988720110 @default.
- W2897963792 cites W1988795359 @default.
- W2897963792 cites W1992208280 @default.
- W2897963792 cites W1994616650 @default.
- W2897963792 cites W2000769684 @default.
- W2897963792 cites W2009702064 @default.
- W2897963792 cites W2009797711 @default.
- W2897963792 cites W2013850411 @default.
- W2897963792 cites W2018497421 @default.
- W2897963792 cites W2019441776 @default.
- W2897963792 cites W2019569173 @default.
- W2897963792 cites W2022922836 @default.
- W2897963792 cites W2023901033 @default.
- W2897963792 cites W2026272724 @default.
- W2897963792 cites W2039050532 @default.
- W2897963792 cites W2039957107 @default.
- W2897963792 cites W2040358553 @default.
- W2897963792 cites W2042650576 @default.
- W2897963792 cites W2043382637 @default.
- W2897963792 cites W2043819123 @default.
- W2897963792 cites W2045192889 @default.
- W2897963792 cites W2060777387 @default.
- W2897963792 cites W2061570747 @default.
- W2897963792 cites W2066623346 @default.
- W2897963792 cites W2080335539 @default.
- W2897963792 cites W2086161653 @default.
- W2897963792 cites W2086996405 @default.
- W2897963792 cites W2087714402 @default.
- W2897963792 cites W2092554297 @default.
- W2897963792 cites W2093417350 @default.
- W2897963792 cites W2094364653 @default.
- W2897963792 cites W2095984592 @default.
- W2897963792 cites W2098840904 @default.
- W2897963792 cites W2100556411 @default.
- W2897963792 cites W2110505738 @default.
- W2897963792 cites W2115706991 @default.
- W2897963792 cites W2117686388 @default.
- W2897963792 cites W2118550318 @default.
- W2897963792 cites W2121402967 @default.
- W2897963792 cites W2124541940 @default.
- W2897963792 cites W2131448432 @default.
- W2897963792 cites W2137385155 @default.
- W2897963792 cites W2158252006 @default.
- W2897963792 cites W2162746441 @default.
- W2897963792 cites W2163786124 @default.
- W2897963792 cites W2273889207 @default.
- W2897963792 cites W2336104608 @default.
- W2897963792 cites W2480492758 @default.
- W2897963792 cites W2621667117 @default.
- W2897963792 cites W2750616763 @default.
- W2897963792 cites W2763081248 @default.
- W2897963792 cites W2963155955 @default.
- W2897963792 cites W2963433607 @default.
- W2897963792 cites W2963465983 @default.
- W2897963792 cites W2963541115 @default.
- W2897963792 cites W3022380717 @default.
- W2897963792 cites W3103657382 @default.
- W2897963792 cites W3106438735 @default.
- W2897963792 cites W4206742934 @default.
- W2897963792 cites W4211042066 @default.
- W2897963792 cites W4236139273 @default.
- W2897963792 cites W4245104967 @default.
- W2897963792 cites W4247165901 @default.
- W2897963792 cites W4292022450 @default.
- W2897963792 cites W4301621763 @default.
- W2897963792 doi "https://doi.org/10.1287/educ.2018.0191" @default.
- W2897963792 hasPublicationYear "2018" @default.
- W2897963792 type Work @default.
- W2897963792 sameAs 2897963792 @default.
- W2897963792 citedByCount "16" @default.
- W2897963792 countsByYear W28979637922019 @default.
- W2897963792 countsByYear W28979637922020 @default.
- W2897963792 countsByYear W28979637922021 @default.
- W2897963792 countsByYear W28979637922022 @default.
- W2897963792 countsByYear W28979637922023 @default.
- W2897963792 crossrefType "book-chapter" @default.
- W2897963792 hasAuthorship W2897963792A5003164670 @default.
- W2897963792 hasAuthorship W2897963792A5055129023 @default.
- W2897963792 hasAuthorship W2897963792A5055704220 @default.
- W2897963792 hasConcept C113324615 @default.