Matches in SemOpenAlex for { <https://semopenalex.org/work/W2069797029> ?p ?o ?g. }
- W2069797029 endingPage "48" @default.
- W2069797029 startingPage "35" @default.
- W2069797029 abstract "Plant phenotyping investigates how a plant's genome, interacting with the environment, affects the observable traits of a plant (phenome). It is becoming increasingly important in our quest towards efficient and sustainable agriculture. While sequencing the genome is becoming increasingly efficient, acquiring phenotype information has remained largely of low throughput. Current solutions for automated image-based plant phenotyping, rely either on semi-automated or manual analysis of the imaging data, or on expensive and proprietary software which accompanies costly hardware infrastructure. While some attempts have been made to create software applications that enable the analysis of such images in an automated fashion, most solutions are tailored to particular acquisition scenarios and restrictions on experimental design. In this paper we propose and test, a method for the segmentation and the automated analysis of time-lapse plant images from phenotyping experiments in a general laboratory setting, that can adapt to scene variability. The method involves minimal user interaction, necessary to establish the statistical experiments that may follow. At every time instance (i.e., a digital photograph), it segments the plants in images that contain many specimens of the same species. For accurate plant segmentation we propose a vector valued level set formulation that incorporates features of color intensity, local texture, and prior knowledge. Prior knowledge is incorporated using a plant appearance model implemented with Gaussian mixture models, which utilizes incrementally information from previously segmented instances. The proposed approach is tested on Arabidopsis plant images acquired with a static camera capturing many subjects at the same time. Our validation with ground truth segmentations and comparisons with state-of-the-art methods in the literature shows that the proposed method is able to handle images with complicated and changing background in an automated fashion. An accuracy of 96.7% (dice similarity coefficient) was observed, which was higher than other methods used for comparison. While here it was tested on a single plant species, the fact that we do not employ shape driven models and we do not rely on fully supervised classification (trained on a large dataset) increases the ease of deployment of the proposed solution for the study of different plant species in a variety of laboratory settings. Our solution will be accompanied by an easy to use graphical user interface and, to facilitate adoption, we will make the software available to the scientific community." @default.
- W2069797029 created "2016-06-24" @default.
- W2069797029 creator A5012776735 @default.
- W2069797029 creator A5054437456 @default.
- W2069797029 creator A5079533939 @default.
- W2069797029 date "2014-09-01" @default.
- W2069797029 modified "2023-10-16" @default.
- W2069797029 title "Image-based plant phenotyping with incremental learning and active contours" @default.
- W2069797029 cites W1506879382 @default.
- W2069797029 cites W1524686487 @default.
- W2069797029 cites W1964276621 @default.
- W2069797029 cites W1984407331 @default.
- W2069797029 cites W2002618468 @default.
- W2069797029 cites W2003451183 @default.
- W2069797029 cites W2005287355 @default.
- W2069797029 cites W2007743779 @default.
- W2069797029 cites W2010670785 @default.
- W2069797029 cites W2017101219 @default.
- W2069797029 cites W2026356722 @default.
- W2069797029 cites W2028766141 @default.
- W2069797029 cites W2034110104 @default.
- W2069797029 cites W2045928667 @default.
- W2069797029 cites W2052257689 @default.
- W2069797029 cites W2065350495 @default.
- W2069797029 cites W2069895633 @default.
- W2069797029 cites W2074886162 @default.
- W2069797029 cites W2081379350 @default.
- W2069797029 cites W2086330580 @default.
- W2069797029 cites W2089506586 @default.
- W2069797029 cites W2090950329 @default.
- W2069797029 cites W2103903744 @default.
- W2069797029 cites W2110158442 @default.
- W2069797029 cites W2112740731 @default.
- W2069797029 cites W2114937946 @default.
- W2069797029 cites W2119144239 @default.
- W2069797029 cites W2119531662 @default.
- W2069797029 cites W2132363464 @default.
- W2069797029 cites W2133059825 @default.
- W2069797029 cites W2133353275 @default.
- W2069797029 cites W2135192916 @default.
- W2069797029 cites W2156262565 @default.
- W2069797029 cites W2157295583 @default.
- W2069797029 cites W2166800206 @default.
- W2069797029 cites W2167354555 @default.
- W2069797029 cites W2168036630 @default.
- W2069797029 cites W2173414649 @default.
- W2069797029 doi "https://doi.org/10.1016/j.ecoinf.2013.07.004" @default.
- W2069797029 hasPublicationYear "2014" @default.
- W2069797029 type Work @default.
- W2069797029 sameAs 2069797029 @default.
- W2069797029 citedByCount "101" @default.
- W2069797029 countsByYear W20697970292013 @default.
- W2069797029 countsByYear W20697970292014 @default.
- W2069797029 countsByYear W20697970292015 @default.
- W2069797029 countsByYear W20697970292016 @default.
- W2069797029 countsByYear W20697970292017 @default.
- W2069797029 countsByYear W20697970292018 @default.
- W2069797029 countsByYear W20697970292019 @default.
- W2069797029 countsByYear W20697970292020 @default.
- W2069797029 countsByYear W20697970292021 @default.
- W2069797029 countsByYear W20697970292022 @default.
- W2069797029 countsByYear W20697970292023 @default.
- W2069797029 crossrefType "journal-article" @default.
- W2069797029 hasAuthorship W2069797029A5012776735 @default.
- W2069797029 hasAuthorship W2069797029A5054437456 @default.
- W2069797029 hasAuthorship W2069797029A5079533939 @default.
- W2069797029 hasBestOaLocation W20697970292 @default.
- W2069797029 hasConcept C104317684 @default.
- W2069797029 hasConcept C119857082 @default.
- W2069797029 hasConcept C124101348 @default.
- W2069797029 hasConcept C124504099 @default.
- W2069797029 hasConcept C141231307 @default.
- W2069797029 hasConcept C153180895 @default.
- W2069797029 hasConcept C154945302 @default.
- W2069797029 hasConcept C177264268 @default.
- W2069797029 hasConcept C199360897 @default.
- W2069797029 hasConcept C2777904410 @default.
- W2069797029 hasConcept C31972630 @default.
- W2069797029 hasConcept C41008148 @default.
- W2069797029 hasConcept C55493867 @default.
- W2069797029 hasConcept C7952369 @default.
- W2069797029 hasConcept C86803240 @default.
- W2069797029 hasConcept C89600930 @default.
- W2069797029 hasConceptScore W2069797029C104317684 @default.
- W2069797029 hasConceptScore W2069797029C119857082 @default.
- W2069797029 hasConceptScore W2069797029C124101348 @default.
- W2069797029 hasConceptScore W2069797029C124504099 @default.
- W2069797029 hasConceptScore W2069797029C141231307 @default.
- W2069797029 hasConceptScore W2069797029C153180895 @default.
- W2069797029 hasConceptScore W2069797029C154945302 @default.
- W2069797029 hasConceptScore W2069797029C177264268 @default.
- W2069797029 hasConceptScore W2069797029C199360897 @default.
- W2069797029 hasConceptScore W2069797029C2777904410 @default.
- W2069797029 hasConceptScore W2069797029C31972630 @default.
- W2069797029 hasConceptScore W2069797029C41008148 @default.
- W2069797029 hasConceptScore W2069797029C55493867 @default.
- W2069797029 hasConceptScore W2069797029C7952369 @default.
- W2069797029 hasConceptScore W2069797029C86803240 @default.
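
The abstract above names two concrete computational ingredients: a plant appearance model built from Gaussian mixture models that is updated incrementally from previously segmented frames, and validation against ground truth via the Dice similarity coefficient. The sketch below is a minimal illustration of both ideas, not the authors' implementation; the class name, the use of raw RGB pixels as features, the component count, and the subsampling cap are all assumptions made for the example.

```python
# Minimal sketch (NOT the paper's code) of a GMM appearance model that is
# refit as new segmented frames arrive, plus the Dice similarity coefficient.
import numpy as np
from sklearn.mixture import GaussianMixture


class PlantAppearanceModel:
    """GMM over foreground pixel colors, updated from each segmented frame.

    Feature choice (raw RGB) and n_components=3 are illustrative assumptions.
    """

    def __init__(self, n_components=3):
        self.n_components = n_components
        self.samples = np.empty((0, 3))  # accumulated foreground RGB pixels
        self.gmm = None

    def update(self, image, mask, max_samples=50_000):
        """Accumulate foreground pixels of a newly segmented frame and refit."""
        fg = image[mask.astype(bool)].astype(float)  # (n_pixels, 3)
        self.samples = np.vstack([self.samples, fg])
        if len(self.samples) > max_samples:  # keep memory and fit time bounded
            idx = np.random.choice(len(self.samples), max_samples, replace=False)
            self.samples = self.samples[idx]
        self.gmm = GaussianMixture(self.n_components).fit(self.samples)

    def log_likelihood(self, image):
        """Per-pixel log-likelihood map under the current appearance model."""
        h, w, _ = image.shape
        flat = image.reshape(-1, 3).astype(float)
        return self.gmm.score_samples(flat).reshape(h, w)


def dice(pred, truth):
    """Dice similarity coefficient between two binary masks."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    denom = pred.sum() + truth.sum()
    return 2.0 * np.logical_and(pred, truth).sum() / denom if denom else 1.0
```

A per-pixel likelihood map like the one returned by `log_likelihood` is the kind of prior-knowledge term that could be combined with color-intensity and local-texture features inside a vector-valued level set energy, which is how the abstract describes the segmentation being driven.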