Matches in SemOpenAlex for { <https://semopenalex.org/work/W2149471024> ?p ?o ?g. }
- W2149471024 endingPage "2379" @default.
- W2149471024 startingPage "2367" @default.
- W2149471024 abstract "Hyperspectral imaging, which records a detailed spectrum of light for each pixel, provides an invaluable source of information about the physical nature of the different materials, opening the way to more accurate classification. However, the high dimensionality of hyperspectral data, usually coupled with the limited reference data available, limits the performance of supervised classification techniques. Moreover, the commonly used pixel-wise classification ignores the spatial structures present in the image. To increase classification performance, spatial information must be integrated into the classification process. In this paper, we propose to extend the watershed segmentation algorithm to hyperspectral images in order to extract information about spatial structures. In particular, several approaches to computing a one-band gradient function from hyperspectral images are proposed and investigated. The accuracy of the watershed algorithms is demonstrated by incorporating the resulting segmentation maps into a classifier. A new spectral-spatial classification scheme for hyperspectral images is proposed, based on pixel-wise Support Vector Machines classification followed by majority voting within the watershed regions. Experimental segmentation and classification results are presented for two hyperspectral images. The experiments show that, as the number of spectral bands increases, feature extraction and the use of multidimensional gradients become preferable to vectorial gradients. Integrating the spatial information from the watershed segmentation into the hyperspectral image classifier improves classification accuracy and yields classification maps with more homogeneous regions than pixel-wise classification and previously proposed spectral-spatial classification techniques. The developed method is especially suitable for classifying images with large spatial structures." @default.
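The core spectral-spatial combination step described in the abstract is majority voting: every pixel in a watershed region is reassigned the most frequent class among the pixel-wise SVM predictions inside that region. A minimal sketch of that voting step, independent of any particular SVM or watershed implementation; the function name and the flat per-pixel input arrays are assumptions for illustration, not the authors' code.

```python
from collections import Counter

def majority_vote(pixel_labels, region_map):
    """Combine a pixel-wise classification map with a segmentation map.

    pixel_labels: flat sequence of per-pixel class predictions
                  (e.g. from a pixel-wise SVM classifier).
    region_map:   flat sequence of the same length giving each pixel's
                  watershed region id.
    Returns a flat list where every pixel carries the majority class
    of its region.
    """
    # Tally pixel-wise class votes per segmentation region.
    votes = {}
    for cls, region in zip(pixel_labels, region_map):
        votes.setdefault(region, Counter())[cls] += 1
    # Pick each region's most frequent class and map it back to pixels.
    winners = {r: counts.most_common(1)[0][0] for r, counts in votes.items()}
    return [winners[r] for r in region_map]

# Two regions of three pixels each: region 0 votes {1: 2, 2: 1},
# region 1 votes {2: 2, 3: 1}, so the smoothed map is [1, 1, 1, 2, 2, 2].
print(majority_vote([1, 1, 2, 2, 2, 3], [0, 0, 0, 1, 1, 1]))
```

In practice the two input maps would come from a trained per-pixel classifier and from the watershed transform applied to a one-band gradient of the hyperspectral cube; this step is what produces the more homogeneous classification maps the abstract reports.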
- W2149471024 created "2016-06-24" @default.
- W2149471024 creator A5002687387 @default.
- W2149471024 creator A5025954012 @default.
- W2149471024 creator A5035508615 @default.
- W2149471024 date "2010-07-01" @default.
- W2149471024 modified "2023-10-18" @default.
- W2149471024 title "Segmentation and classification of hyperspectral images using watershed transformation" @default.
- W2149471024 cites W1571150687 @default.
- W2149471024 cites W163640015 @default.
- W2149471024 cites W2019249717 @default.
- W2149471024 cites W2020321855 @default.
- W2149471024 cites W2022470997 @default.
- W2149471024 cites W2025818287 @default.
- W2149471024 cites W2028933144 @default.
- W2149471024 cites W2030088149 @default.
- W2149471024 cites W2033849769 @default.
- W2149471024 cites W2034803951 @default.
- W2149471024 cites W2069231830 @default.
- W2149471024 cites W2082691086 @default.
- W2149471024 cites W2082880010 @default.
- W2149471024 cites W2092071303 @default.
- W2149471024 cites W2095190520 @default.
- W2149471024 cites W2097900616 @default.
- W2149471024 cites W2098057602 @default.
- W2149471024 cites W2104269704 @default.
- W2149471024 cites W2109853861 @default.
- W2149471024 cites W2112981956 @default.
- W2149471024 cites W2114256843 @default.
- W2149471024 cites W2114819256 @default.
- W2149471024 cites W2115451191 @default.
- W2149471024 cites W2117922990 @default.
- W2149471024 cites W2124260943 @default.
- W2149471024 cites W2126517171 @default.
- W2149471024 cites W2130463078 @default.
- W2149471024 cites W2131697388 @default.
- W2149471024 cites W2136625467 @default.
- W2149471024 cites W2146532890 @default.
- W2149471024 cites W2151351889 @default.
- W2149471024 cites W2152276863 @default.
- W2149471024 cites W2156635885 @default.
- W2149471024 cites W2162587881 @default.
- W2149471024 cites W2163994077 @default.
- W2149471024 cites W2164769329 @default.
- W2149471024 cites W4251431888 @default.
- W2149471024 doi "https://doi.org/10.1016/j.patcog.2010.01.016" @default.
- W2149471024 hasPublicationYear "2010" @default.
- W2149471024 type Work @default.
- W2149471024 sameAs 2149471024 @default.
- W2149471024 citedByCount "491" @default.
- W2149471024 countsByYear W21494710242012 @default.
- W2149471024 countsByYear W21494710242013 @default.
- W2149471024 countsByYear W21494710242014 @default.
- W2149471024 countsByYear W21494710242015 @default.
- W2149471024 countsByYear W21494710242016 @default.
- W2149471024 countsByYear W21494710242017 @default.
- W2149471024 countsByYear W21494710242018 @default.
- W2149471024 countsByYear W21494710242019 @default.
- W2149471024 countsByYear W21494710242020 @default.
- W2149471024 countsByYear W21494710242021 @default.
- W2149471024 countsByYear W21494710242022 @default.
- W2149471024 countsByYear W21494710242023 @default.
- W2149471024 crossrefType "journal-article" @default.
- W2149471024 hasAuthorship W2149471024A5002687387 @default.
- W2149471024 hasAuthorship W2149471024A5025954012 @default.
- W2149471024 hasAuthorship W2149471024A5035508615 @default.
- W2149471024 hasBestOaLocation W21494710242 @default.
- W2149471024 hasConcept C104317684 @default.
- W2149471024 hasConcept C124504099 @default.
- W2149471024 hasConcept C127313418 @default.
- W2149471024 hasConcept C150547873 @default.
- W2149471024 hasConcept C153180895 @default.
- W2149471024 hasConcept C154945302 @default.
- W2149471024 hasConcept C159078339 @default.
- W2149471024 hasConcept C185592680 @default.
- W2149471024 hasConcept C204241405 @default.
- W2149471024 hasConcept C31972630 @default.
- W2149471024 hasConcept C41008148 @default.
- W2149471024 hasConcept C55493867 @default.
- W2149471024 hasConcept C62649853 @default.
- W2149471024 hasConcept C89600930 @default.
- W2149471024 hasConceptScore W2149471024C104317684 @default.
- W2149471024 hasConceptScore W2149471024C124504099 @default.
- W2149471024 hasConceptScore W2149471024C127313418 @default.
- W2149471024 hasConceptScore W2149471024C150547873 @default.
- W2149471024 hasConceptScore W2149471024C153180895 @default.
- W2149471024 hasConceptScore W2149471024C154945302 @default.
- W2149471024 hasConceptScore W2149471024C159078339 @default.
- W2149471024 hasConceptScore W2149471024C185592680 @default.
- W2149471024 hasConceptScore W2149471024C204241405 @default.
- W2149471024 hasConceptScore W2149471024C31972630 @default.
- W2149471024 hasConceptScore W2149471024C41008148 @default.
- W2149471024 hasConceptScore W2149471024C55493867 @default.
- W2149471024 hasConceptScore W2149471024C62649853 @default.
- W2149471024 hasConceptScore W2149471024C89600930 @default.
- W2149471024 hasIssue "7" @default.
- W2149471024 hasLocation W21494710241 @default.
- W2149471024 hasLocation W21494710242 @default.