Matches in SemOpenAlex for { <https://semopenalex.org/work/W4226337301> ?p ?o ?g. }
- W4226337301 endingPage "384" @default.
- W4226337301 startingPage "372" @default.
- W4226337301 abstract "Natural history collections are invaluable resources for understanding biotic response to global change. Museums around the world are currently imaging specimens, capturing specimen data and making them freely available online. In parallel to the digitisation effort, there have been great advancements in computer vision: the computer trained automated recognition/detection, and measurement of features in digital images. Applying computer vision to digitised natural history collections has the potential to greatly accelerate the use of these collections for biotic response to global change research. In this paper, we apply computer vision to a very large, digitised collection to test hypotheses in an established area of biotic response to climate change research: temperature-size responses. We develop a computer vision pipeline (Mothra) and apply it to the NHM collection of British butterflies (>180,000 imaged specimens). Mothra automatically detects the specimen and other objects in the image, sets the scale, measures wing features (e.g. forewing length), determines the orientation of the specimen (pinned ventrally or dorsally) and identifies the sex. We pair these measurements and specimen collection data with temperature records for 17,726 specimens across a subset of 24 species to test how adult size varies with temperature during the immature stages of species. We also assess patterns of sexual size dimorphism across species and families for 32 species trained for automated sex ID. Mothra accurately measures the forewing lengths of butterfly specimens compared to manual measurements and accurately determines the sex of specimens, with females as the larger sex in most species. An increase in adult body size with warmer monthly temperatures during the late larval stages is the most common temperature-size response. These results confirm suspected patterns and support hypotheses based on recent studies using a smaller dataset of manually measured specimens. We show that computer vision can be a powerful tool to efficiently and accurately extract phenotypic data from a very large collection of digital natural history collections. In the future, computer vision will become widely applied to digital collections to advance ecological and evolutionary research and to accelerate their use to investigate biotic response to global change." @default.
- W4226337301 created "2022-05-05" @default.
- W4226337301 creator A5004196368 @default.
- W4226337301 creator A5008028784 @default.
- W4226337301 creator A5009785618 @default.
- W4226337301 creator A5026466407 @default.
- W4226337301 creator A5026927479 @default.
- W4226337301 creator A5060634309 @default.
- W4226337301 creator A5086388106 @default.
- W4226337301 date "2022-04-05" @default.
- W4226337301 modified "2023-10-10" @default.
- W4226337301 title "Applying computer vision to digitised natural history collections for climate change research: Temperature‐size responses in British butterflies" @default.
- W4226337301 cites W1512370820 @default.
- W4226337301 cites W1901129140 @default.
- W4226337301 cites W1970457066 @default.
- W4226337301 cites W2011301426 @default.
- W4226337301 cites W2015159529 @default.
- W4226337301 cites W2022571253 @default.
- W4226337301 cites W2058153205 @default.
- W4226337301 cites W2108598243 @default.
- W4226337301 cites W2110980529 @default.
- W4226337301 cites W2123386026 @default.
- W4226337301 cites W2124588557 @default.
- W4226337301 cites W2130189419 @default.
- W4226337301 cites W2133059825 @default.
- W4226337301 cites W2150275040 @default.
- W4226337301 cites W2153315796 @default.
- W4226337301 cites W2157740231 @default.
- W4226337301 cites W2166939977 @default.
- W4226337301 cites W2194775991 @default.
- W4226337301 cites W2280564750 @default.
- W4226337301 cites W2292240279 @default.
- W4226337301 cites W2520064935 @default.
- W4226337301 cites W2520641384 @default.
- W4226337301 cites W2551284535 @default.
- W4226337301 cites W2559830671 @default.
- W4226337301 cites W2603897890 @default.
- W4226337301 cites W2774320778 @default.
- W4226337301 cites W2792627933 @default.
- W4226337301 cites W2901057860 @default.
- W4226337301 cites W2901476362 @default.
- W4226337301 cites W2901568210 @default.
- W4226337301 cites W2901960907 @default.
- W4226337301 cites W2909956747 @default.
- W4226337301 cites W2949889661 @default.
- W4226337301 cites W2959883904 @default.
- W4226337301 cites W2966928417 @default.
- W4226337301 cites W3006436762 @default.
- W4226337301 cites W3007938255 @default.
- W4226337301 cites W3099319035 @default.
- W4226337301 cites W3099878876 @default.
- W4226337301 cites W3101181415 @default.
- W4226337301 cites W3103145119 @default.
- W4226337301 cites W3118571930 @default.
- W4226337301 cites W3120258623 @default.
- W4226337301 cites W3155215330 @default.
- W4226337301 cites W3156855227 @default.
- W4226337301 cites W3171496489 @default.
- W4226337301 doi "https://doi.org/10.1111/2041-210x.13844" @default.
- W4226337301 hasPublicationYear "2022" @default.
- W4226337301 type Work @default.
- W4226337301 citedByCount "5" @default.
- W4226337301 countsByYear W42263373012022 @default.
- W4226337301 countsByYear W42263373012023 @default.
- W4226337301 crossrefType "journal-article" @default.
- W4226337301 hasAuthorship W4226337301A5004196368 @default.
- W4226337301 hasAuthorship W4226337301A5008028784 @default.
- W4226337301 hasAuthorship W4226337301A5009785618 @default.
- W4226337301 hasAuthorship W4226337301A5026466407 @default.
- W4226337301 hasAuthorship W4226337301A5026927479 @default.
- W4226337301 hasAuthorship W4226337301A5060634309 @default.
- W4226337301 hasAuthorship W4226337301A5086388106 @default.
- W4226337301 hasBestOaLocation W42263373012 @default.
- W4226337301 hasConcept C105795698 @default.
- W4226337301 hasConcept C133462117 @default.
- W4226337301 hasConcept C154945302 @default.
- W4226337301 hasConcept C16345878 @default.
- W4226337301 hasConcept C18903297 @default.
- W4226337301 hasConcept C197352329 @default.
- W4226337301 hasConcept C199360897 @default.
- W4226337301 hasConcept C2524010 @default.
- W4226337301 hasConcept C31972630 @default.
- W4226337301 hasConcept C33923547 @default.
- W4226337301 hasConcept C41008148 @default.
- W4226337301 hasConcept C43521106 @default.
- W4226337301 hasConcept C59822182 @default.
- W4226337301 hasConcept C86803240 @default.
- W4226337301 hasConceptScore W4226337301C105795698 @default.
- W4226337301 hasConceptScore W4226337301C133462117 @default.
- W4226337301 hasConceptScore W4226337301C154945302 @default.
- W4226337301 hasConceptScore W4226337301C16345878 @default.
- W4226337301 hasConceptScore W4226337301C18903297 @default.
- W4226337301 hasConceptScore W4226337301C197352329 @default.
- W4226337301 hasConceptScore W4226337301C199360897 @default.
- W4226337301 hasConceptScore W4226337301C2524010 @default.
- W4226337301 hasConceptScore W4226337301C31972630 @default.
- W4226337301 hasConceptScore W4226337301C33923547 @default.
- W4226337301 hasConceptScore W4226337301C41008148 @default.
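The listing above is the result of the quad pattern `{ <https://semopenalex.org/work/W4226337301> ?p ?o ?g. }` run against SemOpenAlex. As a minimal sketch of reproducing such a listing programmatically, the function below builds an equivalent SPARQL query for any subject IRI and (optionally) sends it to an endpoint. The endpoint URL `https://semopenalex.org/sparql` and the JSON result format are assumptions about the service, not taken from this listing:

```python
import json
import urllib.parse
import urllib.request


def build_triples_query(iri: str) -> str:
    """Build a SPARQL query listing every predicate/object/graph
    for the given subject IRI, mirroring { <iri> ?p ?o ?g. }."""
    return (
        "SELECT ?p ?o ?g WHERE { "
        f"GRAPH ?g {{ <{iri}> ?p ?o . }} "
        "}"
    )


def fetch_triples(iri: str,
                  endpoint: str = "https://semopenalex.org/sparql"):
    """Send the query to a SPARQL endpoint (assumed URL) and return
    the parsed JSON result bindings."""
    url = endpoint + "?" + urllib.parse.urlencode(
        {"query": build_triples_query(iri)}
    )
    req = urllib.request.Request(
        url, headers={"Accept": "application/sparql-results+json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["results"]["bindings"]
```

Each binding in the returned list would carry `p`, `o` and `g` keys, corresponding to the predicate, object and named graph columns of the listing above.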