Matches in SemOpenAlex for { <https://semopenalex.org/work/W3017083044> ?p ?o ?g. }
- W3017083044 endingPage "210" @default.
- W3017083044 startingPage "187" @default.
- W3017083044 abstract "In this chapter, we approach the unsupervised learning problem in both space and time head-on, by considering both spatial and temporal dimensions from the very beginning. We couple, from the start, the appearance of objects, which defines their spatial properties, with their motion, which defines their existence in time, and provide a unique graph clustering formulation in both space and time for the problem of unsupervised object discovery in video. Again we resort to a graph formulation: there is a one-to-one correspondence between graph nodes and video pixels. Graph nodes that belong to the main object of interest should form a strong cluster: they are linked through trajectories defined by pixels that belong to the same physical object point, and they should also have similar appearance features along those trajectories. The clustering problem aims to maximize both their agreement in motion and their similarity in appearance. Therefore, we regard this object discovery problem as a graph clustering problem, for which we provide an efficient spectral approach: the object segmentation in the space-time volume is captured by the principal eigenvector of a Feature-Motion matrix, which we introduce for the first time. Even though the matrix is huge, we never build it explicitly; nevertheless, our segmentation algorithm is guaranteed to converge to its principal eigenvector. The eigenvector is the optimal solution of an eigenvalue problem that does not depend on initialization; for this reason it is very robust in practice and able to find the foreground object in the video as the strongest cluster in the space-time graph. The approach in this chapter naturally relates to the principles of unsupervised learning presented in the introduction and also provides a first, lower-level approach to coupling space and time processing, which is further explored at a higher semantic level in the final chapter, where we present our recurrent space-time graph neural network model (Nicolicioiu et al., Neural Information Processing Systems, [1]). In practice, our method, termed GO-VOS, achieves state-of-the-art results on three difficult datasets in the literature: DAVIS, SegTrack, and YouTube-Objects." @default. (see the illustrative sketch after this listing)
- W3017083044 created "2020-04-24" @default.
- W3017083044 creator A5086392501 @default.
- W3017083044 date "2020-01-01" @default.
- W3017083044 modified "2023-09-25" @default.
- W3017083044 title "Coupling Appearance and Motion: Unsupervised Clustering for Object Segmentation Through Space and Time" @default.
- W3017083044 cites W1496571393 @default.
- W3017083044 cites W1861492603 @default.
- W3017083044 cites W1901129140 @default.
- W3017083044 cites W1920142129 @default.
- W3017083044 cites W1951289974 @default.
- W3017083044 cites W1954128991 @default.
- W3017083044 cites W1973054923 @default.
- W3017083044 cites W1989348325 @default.
- W3017083044 cites W2060565253 @default.
- W3017083044 cites W2108598243 @default.
- W3017083044 cites W2113708607 @default.
- W3017083044 cites W2138682569 @default.
- W3017083044 cites W2148835469 @default.
- W3017083044 cites W2152953631 @default.
- W3017083044 cites W2166820607 @default.
- W3017083044 cites W2194775991 @default.
- W3017083044 cites W2197046994 @default.
- W3017083044 cites W2322739735 @default.
- W3017083044 cites W2460260369 @default.
- W3017083044 cites W2470139095 @default.
- W3017083044 cites W2564998703 @default.
- W3017083044 cites W2566030665 @default.
- W3017083044 cites W2610147486 @default.
- W3017083044 cites W2737008123 @default.
- W3017083044 cites W2777774479 @default.
- W3017083044 cites W2794847483 @default.
- W3017083044 cites W2799239273 @default.
- W3017083044 cites W2895293811 @default.
- W3017083044 cites W2895340898 @default.
- W3017083044 cites W2955084925 @default.
- W3017083044 cites W2957408986 @default.
- W3017083044 cites W2962699453 @default.
- W3017083044 cites W2963253279 @default.
- W3017083044 cites W2963395775 @default.
- W3017083044 cites W2963408063 @default.
- W3017083044 cites W2963732700 @default.
- W3017083044 cites W2963983744 @default.
- W3017083044 cites W2964157492 @default.
- W3017083044 cites W2964218467 @default.
- W3017083044 cites W2964226882 @default.
- W3017083044 cites W2967199722 @default.
- W3017083044 cites W3100388886 @default.
- W3017083044 cites W764651262 @default.
- W3017083044 doi "https://doi.org/10.1007/978-3-030-42128-1_6" @default.
- W3017083044 hasPublicationYear "2020" @default.
- W3017083044 type Work @default.
- W3017083044 sameAs 3017083044 @default.
- W3017083044 citedByCount "2" @default.
- W3017083044 countsByYear W30170830442020 @default.
- W3017083044 countsByYear W30170830442022 @default.
- W3017083044 crossrefType "book-chapter" @default.
- W3017083044 hasAuthorship W3017083044A5086392501 @default.
- W3017083044 hasConcept C104114177 @default.
- W3017083044 hasConcept C111919701 @default.
- W3017083044 hasConcept C131584629 @default.
- W3017083044 hasConcept C153180895 @default.
- W3017083044 hasConcept C154945302 @default.
- W3017083044 hasConcept C191897082 @default.
- W3017083044 hasConcept C192562407 @default.
- W3017083044 hasConcept C2778572836 @default.
- W3017083044 hasConcept C2781238097 @default.
- W3017083044 hasConcept C31972630 @default.
- W3017083044 hasConcept C41008148 @default.
- W3017083044 hasConcept C73555534 @default.
- W3017083044 hasConcept C89600930 @default.
- W3017083044 hasConceptScore W3017083044C104114177 @default.
- W3017083044 hasConceptScore W3017083044C111919701 @default.
- W3017083044 hasConceptScore W3017083044C131584629 @default.
- W3017083044 hasConceptScore W3017083044C153180895 @default.
- W3017083044 hasConceptScore W3017083044C154945302 @default.
- W3017083044 hasConceptScore W3017083044C191897082 @default.
- W3017083044 hasConceptScore W3017083044C192562407 @default.
- W3017083044 hasConceptScore W3017083044C2778572836 @default.
- W3017083044 hasConceptScore W3017083044C2781238097 @default.
- W3017083044 hasConceptScore W3017083044C31972630 @default.
- W3017083044 hasConceptScore W3017083044C41008148 @default.
- W3017083044 hasConceptScore W3017083044C73555534 @default.
- W3017083044 hasConceptScore W3017083044C89600930 @default.
- W3017083044 hasLocation W30170830441 @default.
- W3017083044 hasOpenAccess W3017083044 @default.
- W3017083044 hasPrimaryLocation W30170830441 @default.
- W3017083044 hasRelatedWork W103252489 @default.
- W3017083044 hasRelatedWork W1963494852 @default.
- W3017083044 hasRelatedWork W2004370856 @default.
- W3017083044 hasRelatedWork W2019566805 @default.
- W3017083044 hasRelatedWork W2052670720 @default.
- W3017083044 hasRelatedWork W2101128524 @default.
- W3017083044 hasRelatedWork W2383464976 @default.
- W3017083044 hasRelatedWork W2895616727 @default.
- W3017083044 hasRelatedWork W2975200075 @default.
- W3017083044 hasRelatedWork W1967061043 @default.
- W3017083044 isParatext "false" @default.
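
The abstract above describes a spectral approach in which the principal eigenvector of a huge, never explicitly built Feature-Motion matrix yields the space-time segmentation. Below is a minimal, hypothetical Python sketch of that general idea using power iteration: the matrix is only ever touched through a matrix-vector product. The names `apply_feature_motion_matrix`, `flow_neighbors`, and `feature_similarity` are illustrative assumptions, not the actual GO-VOS operator or data structures from the chapter.

```python
# Hedged sketch: power iteration on an implicitly defined, non-negative matrix.
# The "matrix" is represented only by a function that applies it to a vector,
# so it is never materialized, mirroring the idea described in the abstract.

import numpy as np

def apply_feature_motion_matrix(x, flow_neighbors, feature_similarity):
    """Apply the implicit matrix to vector x (one entry per space-time pixel).

    flow_neighbors     : list of index arrays; flow_neighbors[i] holds the pixels
                         linked to pixel i by motion trajectories (assumed given)
    feature_similarity : list of weight arrays aligned with flow_neighbors,
                         encoding appearance similarity along those links
    """
    y = np.zeros_like(x)
    for i, (nbrs, sims) in enumerate(zip(flow_neighbors, feature_similarity)):
        # Each implicit row combines motion links with appearance similarity,
        # so applying it is a weighted sum over the linked pixels.
        y[i] = np.dot(sims, x[nbrs])
    return y

def principal_eigenvector(flow_neighbors, feature_similarity, num_pixels, iters=50):
    """Power iteration: for a non-negative matrix it converges to the principal
    eigenvector regardless of the (positive) initialization."""
    x = np.ones(num_pixels)
    for _ in range(iters):
        x = apply_feature_motion_matrix(x, flow_neighbors, feature_similarity)
        x /= np.linalg.norm(x) + 1e-12  # renormalize to keep values bounded
    return x  # per-pixel soft foreground scores; threshold to obtain a mask
```

The returned vector can be interpreted as a soft segmentation over all space-time pixels; thresholding it would give the foreground mask, with the strongest cluster in the graph corresponding to the main object of interest.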