Matches in SemOpenAlex for { <https://semopenalex.org/work/W1568405145> ?p ?o ?g. }
Showing items 1 to 88 of 88, with 100 items per page.
- W1568405145 endingPage "4" @default.
- W1568405145 startingPage "3" @default.
- W1568405145 abstract "In recent years, researchers in the field of computational vision have been developing new methods of image segmentation for human motion analysis. Computational techniques that automatically identify the structures involved are needed to obtain more representative and robust features for the subsequent analysis of human motion from image sequences. The first step of human motion analysis from image sequences is strongly related to image segmentation: the first goal of any system designed for this purpose is to identify, in the image frames, the features of the structures to be analysed. While identifying moving structures in images is more or less trivial for a human, computationally the task has proved far from simple. Image segmentation methods must cope with several challenges posed by the image sequences themselves: lighting conditions that change along the sequence; occlusion, when a structure leaves the workspace or is partially hidden; dynamic backgrounds, when the camera is in motion or the scene is changing; and multiple moving structures, when more than one structure moves in the workspace at the same time. Developing methods that deal with all of these problems at once is not straightforward, so assumptions and simplifications are commonly made; nevertheless, increasingly robust and accurate methods continue to appear. The use of an edge detection algorithm by itself is obviously not enough to identify a structure in an image sequence. The most common image segmentation method is background subtraction, which involves computing a reference image, subtracting each frame of the sequence from that reference, and thresholding the result. In its simplest form, a time-averaged background image serves as the reference, but this requires a training period free of foreground objects. Another possibility is to describe each pixel in the scene by a mixture of Gaussian distributions, where the mixture weights represent the proportions of time that the corresponding colours remain in the scene, so the background components are those with the most probable colours. This method, however, usually fails in busy environments where a clean background is rare. A further option is a method based on Bayes decision theory for detecting foreground structures in complex image sequences. In the first step of this method, unchanged pixels in the image stream are filtered out by simple background and temporal differencing. The detected changes are then separated into pixels belonging to stationary and to moving structures according to inter-frame changes. Next, these pixels are classified as background or foreground by applying the Bayes decision rule to learned colour statistics. Finally, foreground structures are segmented by fusing the results from the stationary and motion pixels, and the background model is updated. This method has been shown to work well against complex backgrounds, including sequences with variable lighting conditions and shadows of moving structures.
In this work we explore the two segmentation methods referred to above in more detail, present some experimental results, and address possible practical applications related to human motion." @default.
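The time-averaged background subtraction described in the abstract above can be illustrated with a minimal sketch. This is not the authors' implementation; the function names (`build_reference`, `segment_frame`) and the threshold value are hypothetical choices for illustration only.

```python
import numpy as np

def build_reference(training_frames):
    # Time-averaged background: the mean over a training period that is
    # assumed to contain no foreground objects, as the abstract notes.
    return np.stack(training_frames).mean(axis=0)

def segment_frame(frame, reference, threshold=30.0):
    # Subtract the reference image from the frame and threshold the
    # absolute difference to obtain a binary foreground mask.
    diff = np.abs(frame.astype(np.float64) - reference)
    if diff.ndim == 3:          # colour frame: reduce over channels
        diff = diff.max(axis=2)
    return diff > threshold     # True where a pixel is foreground
```

The threshold trades missed foreground (too high) against noise in the mask (too low); in practice it would be tuned per sequence.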
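For the mixture-of-Gaussians model the abstract mentions, OpenCV ships a standard implementation (MOG2); the sketch below shows how it might be applied to a sequence, assuming a hypothetical input file `sequence.mp4`. Colours that persist longest accumulate the highest mixture weights and are classified as background, which matches the weighting idea the abstract describes.

```python
import cv2

# Each pixel is modelled by a mixture of Gaussians; long-lived colours
# receive high mixture weights and are treated as background.
subtractor = cv2.createBackgroundSubtractorMOG2(
    history=500,         # number of frames the mixture weights adapt over
    varThreshold=16,     # match threshold on squared Mahalanobis distance
    detectShadows=True,  # shadow pixels are marked 127 in the mask
)

capture = cv2.VideoCapture("sequence.mp4")
while True:
    ok, frame = capture.read()
    if not ok:
        break
    mask = subtractor.apply(frame)  # 255 = foreground, 0 = background
capture.release()
```

As the abstract points out, such per-pixel mixtures tend to fail in busy scenes where a clean background is rarely visible.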
- W1568405145 created "2016-06-24" @default.
- W1568405145 creator A5033364047 @default.
- W1568405145 creator A5064741301 @default.
- W1568405145 date "2009-03-01" @default.
- W1568405145 modified "2023-09-24" @default.
- W1568405145 title "Segmentation methods for human motion analysis from image sequences" @default.
- W1568405145 doi "https://doi.org/10.3970/icces.2009.010.003" @default.
- W1568405145 hasPublicationYear "2009" @default.
- W1568405145 type Work @default.
- W1568405145 sameAs 1568405145 @default.
- W1568405145 citedByCount "0" @default.
- W1568405145 crossrefType "journal-article" @default.
- W1568405145 hasAuthorship W1568405145A5033364047 @default.
- W1568405145 hasAuthorship W1568405145A5064741301 @default.
- W1568405145 hasConcept C104114177 @default.
- W1568405145 hasConcept C115961682 @default.
- W1568405145 hasConcept C116834253 @default.
- W1568405145 hasConcept C124504099 @default.
- W1568405145 hasConcept C124774092 @default.
- W1568405145 hasConcept C127413603 @default.
- W1568405145 hasConcept C146159030 @default.
- W1568405145 hasConcept C154945302 @default.
- W1568405145 hasConcept C201995342 @default.
- W1568405145 hasConcept C202444582 @default.
- W1568405145 hasConcept C2777036941 @default.
- W1568405145 hasConcept C2780451532 @default.
- W1568405145 hasConcept C31972630 @default.
- W1568405145 hasConcept C33923547 @default.
- W1568405145 hasConcept C41008148 @default.
- W1568405145 hasConcept C58581272 @default.
- W1568405145 hasConcept C59822182 @default.
- W1568405145 hasConcept C86803240 @default.
- W1568405145 hasConcept C89600930 @default.
- W1568405145 hasConcept C90509273 @default.
- W1568405145 hasConcept C9652623 @default.
- W1568405145 hasConceptScore W1568405145C104114177 @default.
- W1568405145 hasConceptScore W1568405145C115961682 @default.
- W1568405145 hasConceptScore W1568405145C116834253 @default.
- W1568405145 hasConceptScore W1568405145C124504099 @default.
- W1568405145 hasConceptScore W1568405145C124774092 @default.
- W1568405145 hasConceptScore W1568405145C127413603 @default.
- W1568405145 hasConceptScore W1568405145C146159030 @default.
- W1568405145 hasConceptScore W1568405145C154945302 @default.
- W1568405145 hasConceptScore W1568405145C201995342 @default.
- W1568405145 hasConceptScore W1568405145C202444582 @default.
- W1568405145 hasConceptScore W1568405145C2777036941 @default.
- W1568405145 hasConceptScore W1568405145C2780451532 @default.
- W1568405145 hasConceptScore W1568405145C31972630 @default.
- W1568405145 hasConceptScore W1568405145C33923547 @default.
- W1568405145 hasConceptScore W1568405145C41008148 @default.
- W1568405145 hasConceptScore W1568405145C58581272 @default.
- W1568405145 hasConceptScore W1568405145C59822182 @default.
- W1568405145 hasConceptScore W1568405145C86803240 @default.
- W1568405145 hasConceptScore W1568405145C89600930 @default.
- W1568405145 hasConceptScore W1568405145C90509273 @default.
- W1568405145 hasConceptScore W1568405145C9652623 @default.
- W1568405145 hasIssue "1" @default.
- W1568405145 hasLocation W15684051451 @default.
- W1568405145 hasOpenAccess W1568405145 @default.
- W1568405145 hasPrimaryLocation W15684051451 @default.
- W1568405145 hasRelatedWork W104662191 @default.
- W1568405145 hasRelatedWork W105997796 @default.
- W1568405145 hasRelatedWork W1516641347 @default.
- W1568405145 hasRelatedWork W1520802838 @default.
- W1568405145 hasRelatedWork W176903447 @default.
- W1568405145 hasRelatedWork W2145396557 @default.
- W1568405145 hasRelatedWork W2232711945 @default.
- W1568405145 hasRelatedWork W2274039799 @default.
- W1568405145 hasRelatedWork W2280044960 @default.
- W1568405145 hasRelatedWork W228506742 @default.
- W1568405145 hasRelatedWork W245293689 @default.
- W1568405145 hasRelatedWork W246815355 @default.
- W1568405145 hasRelatedWork W27436710 @default.
- W1568405145 hasRelatedWork W28233186 @default.
- W1568405145 hasRelatedWork W314121311 @default.
- W1568405145 hasRelatedWork W354181973 @default.
- W1568405145 hasRelatedWork W45655808 @default.
- W1568405145 hasRelatedWork W63684730 @default.
- W1568405145 hasRelatedWork W67686417 @default.
- W1568405145 hasRelatedWork W97530353 @default.
- W1568405145 hasVolume "10" @default.
- W1568405145 isParatext "false" @default.
- W1568405145 isRetracted "false" @default.
- W1568405145 magId "1568405145" @default.
- W1568405145 workType "article" @default.