Matches in SemOpenAlex for { <https://semopenalex.org/work/W76496840> ?p ?o ?g. }
Showing items 1 to 85 of 85, with 100 items per page.
- W76496840 abstract "Tracking pedestrians in surveillance videos is an important task, not only in itself but also as a component of pedestrian counting, activity and event recognition, and scene understanding in general. Robust tracking in crowded environments remains a major challenge, mainly due to occlusions and interactions between pedestrians. Methods to detect humans in a single frame are becoming increasingly accurate. Therefore, the majority of multi-target tracking algorithms in crowds follow a tracking-by-detection approach, along with models of individual and group behaviour, and various types of features to re-identify any given pedestrian (and discriminate them from the remainder). The aim is, given a Closed-Circuit Television (CCTV) camera view (moving or static) of a crowded scene, to produce tracks that indicate which pedestrians are entering and leaving the scene, to be used in further applications (e.g. a multi-camera tracking scenario). Therefore, this output should be accurate in terms of position and have few false alarms and identity changes (i.e. tracks should neither be fragmented nor switch identity). Consequently, the presented algorithm concentrates on two important characteristics. Firstly, production of a real-time or near real-time output, to be practically usable in further applications without penalising the final system. Secondly, management of occlusions, which are the main challenge in crowds. The methodology presented, based on a tracking-by-detection approach, proposes an advance over those two aspects through a hierarchical framework that solves short and long occlusions with two novel methods. First, at a fine temporal scale, kinematic features and appearance features based on non-occluded parts are combined to generate short and reliable 'tracklets'. More specifically, this part uses an occlusion map which attributes a local measurement (by searching over the non-occluded parts) to a target without a global measurement (i.e. 
a measurement generated by the global detector), and demonstrates better results in terms of tracklet length without generating more false alarms or identity changes. Over a longer scale, these tracklets are associated with each other to build up longer tracks for each pedestrian in the scene. This tracklet data association is based on a novel approach that uses dynamic time warping to locate and measure possible similarities of appearance between tracklets, by varying the time step and phase of the frame-based visual feature. The method, which does not require any target initialisations or camera calibrations, shows significant improvements in terms of false alarms and identity changes, the latter being a critical point for evaluating tracking algorithms. The evaluation framework, based on different metrics introduced in the literature, consists of a set of new track-based metrics (in contrast to frame-based ones) which enable the failure modes of a tracker to be identified and algorithms to be compared using a single value. Finally, the dual method proposed to solve long and short occlusions simultaneously reduces track fragmentation and identity switches, and is naturally extensible to a multi-camera scenario. Results are presented as a tag-and-track system over a network of moving and static cameras. The new methodology introduced (i.e. building tracklets based on non-occluded pedestrian parts plus re-identification with dynamic time warping) shows significant improvements on public datasets for multi-target tracking in crowds (e.g. the Oxford Town Centre (OTC) dataset). In addition, two new datasets are introduced to test the robustness of the proposed algorithm in more challenging scenarios. Firstly, a CCTV view of a shopping centre is used to demonstrate the effectiveness of the algorithm in a more crowded scenario. 
Secondly, a dataset with a network of CCTV Pan-Tilt-Zoom (PTZ) cameras tracking a single pedestrian demonstrates the capability of the algorithm to handle a very difficult scenario (abrupt motion and non-overlapping camera views), and therefore its applicability as a component of a multi-target tracker in a network of static and PTZ cameras. The thesis concludes with a critical analysis of the work and presents future research opportunities (notably the use of this framework in a non-overlapping network of static and PTZ cameras)." @default.
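The abstract describes associating tracklets by dynamic time warping (DTW) over frame-based visual features. The thesis's own implementation is not given here; the following is only a minimal generic DTW sketch over one-dimensional feature sequences (a simplification of the multi-dimensional appearance features the thesis would use), illustrating how DTW tolerates tracklets of different lengths and phases when measuring similarity.

```python
import numpy as np

def dtw_distance(seq_a, seq_b):
    """Dynamic time warping distance between two 1-D feature sequences.

    Classic O(n*m) dynamic programme: cost[i][j] holds the minimal
    cumulative cost of aligning seq_a[:i] with seq_b[:j], allowing
    stretches and compressions of the time axis.
    """
    n, m = len(seq_a), len(seq_b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(seq_a[i - 1] - seq_b[j - 1])   # local feature distance
            cost[i, j] = d + min(cost[i - 1, j],        # stretch seq_b
                                 cost[i, j - 1],        # stretch seq_a
                                 cost[i - 1, j - 1])    # one-to-one match
    return cost[n, m]
```

Because the warping path may match one frame of a tracklet against several frames of another, two tracklets observed at different effective frame rates (or with a phase offset) can still score as similar, which is the property the abstract exploits for re-identification across occlusion gaps.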
- W76496840 created "2016-06-24" @default.
- W76496840 creator A5011035821 @default.
- W76496840 date "2012-10-01" @default.
- W76496840 modified "2023-09-25" @default.
- W76496840 title "Detecting and tracking humans in crowded scenes based on 2D image understanding" @default.
- W76496840 hasPublicationYear "2012" @default.
- W76496840 type Work @default.
- W76496840 sameAs 76496840 @default.
- W76496840 citedByCount "0" @default.
- W76496840 crossrefType "dissertation" @default.
- W76496840 hasAuthorship W76496840A5011035821 @default.
- W76496840 hasConcept C121332964 @default.
- W76496840 hasConcept C126042441 @default.
- W76496840 hasConcept C127413603 @default.
- W76496840 hasConcept C136764020 @default.
- W76496840 hasConcept C153180895 @default.
- W76496840 hasConcept C154945302 @default.
- W76496840 hasConcept C15744967 @default.
- W76496840 hasConcept C19417346 @default.
- W76496840 hasConcept C22212356 @default.
- W76496840 hasConcept C2775936607 @default.
- W76496840 hasConcept C2777113093 @default.
- W76496840 hasConcept C2777852691 @default.
- W76496840 hasConcept C2779662365 @default.
- W76496840 hasConcept C2780615836 @default.
- W76496840 hasConcept C31972630 @default.
- W76496840 hasConcept C38652104 @default.
- W76496840 hasConcept C39920418 @default.
- W76496840 hasConcept C41008148 @default.
- W76496840 hasConcept C56461940 @default.
- W76496840 hasConcept C57501372 @default.
- W76496840 hasConcept C62520636 @default.
- W76496840 hasConcept C74650414 @default.
- W76496840 hasConcept C76155785 @default.
- W76496840 hasConceptScore W76496840C121332964 @default.
- W76496840 hasConceptScore W76496840C126042441 @default.
- W76496840 hasConceptScore W76496840C127413603 @default.
- W76496840 hasConceptScore W76496840C136764020 @default.
- W76496840 hasConceptScore W76496840C153180895 @default.
- W76496840 hasConceptScore W76496840C154945302 @default.
- W76496840 hasConceptScore W76496840C15744967 @default.
- W76496840 hasConceptScore W76496840C19417346 @default.
- W76496840 hasConceptScore W76496840C22212356 @default.
- W76496840 hasConceptScore W76496840C2775936607 @default.
- W76496840 hasConceptScore W76496840C2777113093 @default.
- W76496840 hasConceptScore W76496840C2777852691 @default.
- W76496840 hasConceptScore W76496840C2779662365 @default.
- W76496840 hasConceptScore W76496840C2780615836 @default.
- W76496840 hasConceptScore W76496840C31972630 @default.
- W76496840 hasConceptScore W76496840C38652104 @default.
- W76496840 hasConceptScore W76496840C39920418 @default.
- W76496840 hasConceptScore W76496840C41008148 @default.
- W76496840 hasConceptScore W76496840C56461940 @default.
- W76496840 hasConceptScore W76496840C57501372 @default.
- W76496840 hasConceptScore W76496840C62520636 @default.
- W76496840 hasConceptScore W76496840C74650414 @default.
- W76496840 hasConceptScore W76496840C76155785 @default.
- W76496840 hasLocation W764968401 @default.
- W76496840 hasOpenAccess W76496840 @default.
- W76496840 hasPrimaryLocation W764968401 @default.
- W76496840 hasRelatedWork W1542930596 @default.
- W76496840 hasRelatedWork W1577235989 @default.
- W76496840 hasRelatedWork W1988765588 @default.
- W76496840 hasRelatedWork W2039174113 @default.
- W76496840 hasRelatedWork W2118767885 @default.
- W76496840 hasRelatedWork W2148958980 @default.
- W76496840 hasRelatedWork W2321212537 @default.
- W76496840 hasRelatedWork W2322297670 @default.
- W76496840 hasRelatedWork W2505784706 @default.
- W76496840 hasRelatedWork W2724574847 @default.
- W76496840 hasRelatedWork W2794124357 @default.
- W76496840 hasRelatedWork W2901012461 @default.
- W76496840 hasRelatedWork W2914819564 @default.
- W76496840 hasRelatedWork W2931561020 @default.
- W76496840 hasRelatedWork W2980284240 @default.
- W76496840 hasRelatedWork W3015961813 @default.
- W76496840 hasRelatedWork W3084046410 @default.
- W76496840 hasRelatedWork W3153062234 @default.
- W76496840 hasRelatedWork W3175782503 @default.
- W76496840 hasRelatedWork W3184453945 @default.
- W76496840 isParatext "false" @default.
- W76496840 isRetracted "false" @default.
- W76496840 magId "76496840" @default.
- W76496840 workType "dissertation" @default.