Matches in SemOpenAlex for { <https://semopenalex.org/work/W2951374906> ?p ?o ?g. }
Showing items 1 to 95 of 95, with 100 items per page.
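The listing below can also be retrieved programmatically. The following is a minimal sketch in Python, using the SPARQLWrapper library, that runs essentially the same graph pattern as the query above against SemOpenAlex's public SPARQL endpoint; the endpoint URL (https://semopenalex.org/sparql) is an assumption for illustration, and the named-graph variable ?g from the original pattern is omitted for simplicity.

```python
# Minimal sketch: fetch all property/value pairs for this work from SemOpenAlex.
# The endpoint URL below is assumed; adjust it if the service location differs.
from SPARQLWrapper import SPARQLWrapper, JSON

ENDPOINT = "https://semopenalex.org/sparql"  # assumed public SPARQL endpoint
WORK_IRI = "https://semopenalex.org/work/W2951374906"

sparql = SPARQLWrapper(ENDPOINT)
sparql.setQuery(f"""
    SELECT ?p ?o
    WHERE {{ <{WORK_IRI}> ?p ?o . }}
""")
sparql.setReturnFormat(JSON)

results = sparql.query().convert()
for binding in results["results"]["bindings"]:
    # Each binding maps a variable name to a dict with "type" and "value".
    print(binding["p"]["value"], binding["o"]["value"])
```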
- W2951374906 abstract "Advances in machine vision technology are rapidly enabling new and innovative uses within the field of biodiversity. Computers are now able to use images to identify tens of thousands of species across a wide range of taxonomic groups in real time, notably demonstrated by iNaturalist.org, which suggests species IDs to users (https://www.inaturalist.org/pages/computer_vision_demo) as they create observation records. Soon it will be commonplace to detect species in video feeds or use the camera in a mobile device to search for species-related content on the Internet. The Global Biodiversity Information Facility (GBIF) has an important role to play in advancing and improving this technology, whether in terms of data, collaboration across teams, or citation practice. But in the short term, the most important role may relate to initiating a cultural shift in accepted practices for the use of GBIF-mediated data for training of artificial intelligence (AI). “Training datasets” play a critical role in achieving species recognition capability in any machine vision system. These datasets compile representative images containing the explicit, verifiable identifications of the species they include. High-powered computers run algorithms on these training datasets, analysing the imagery and building complex models that characterize defining features for each species or taxonomic group. Researchers can, in turn, apply the resulting models to new images, determining what species or group they likely contain. Current research in machine vision is exploring (a) the use of location and date information to further improve model results, (b) identification methods beyond species-level into attribute, character, trait, or part-level ID, with an eye toward human interpretability, and (c) expertise modeling for improved determination of “research grade” images and metadata. The GBIF community has amassed one of the largest datasets of labelled species images available on the internet: more than 33 million species occurrence records in GBIF.org have one or more images (https://www.gbif.org/occurrence/gallery). Machine vision models, when integrated into the data collection tools in use across the GBIF network, can improve the user experience. For example, in citizen science applications like iNaturalist, automated species suggestion helps even novice users contribute occurrence records to GBIF. Perhaps most importantly, GBIF has implemented uniform (and open) data licensing, established guidelines on citation and provided consistent methods for tracking data use through the Digital Object Identifiers (DOI) citation chain. GBIF would like to build on the lessons learned in these activities while striving to assist with this technology research and increase its power and availability. We envisage an approach as follows: To assist in developing and refining machine vision models, GBIF plans to provide training datasets, taking effort to ensure license and citation practice are respected. The training datasets will be issued with a DOI, and the contributing datasets will be linked through the DOI citation graph. To assist application developers, Google and Visipedia plan to build and publish openly-licensed models and tutorials for how to adapt them for localized use. 
Together we will strive to ensure that data is being used responsibly and transparently, to close the gap between machine vision scientists, application developers, and users and to share taxonomic trees capturing the taxon rank to which machine vision models can identify with confidence based on an image’s visual characteristics." @default.
- W2951374906 created "2019-06-27" @default.
- W2951374906 creator A5000739848 @default.
- W2951374906 creator A5002422319 @default.
- W2951374906 creator A5012770174 @default.
- W2951374906 creator A5013534461 @default.
- W2951374906 creator A5015007109 @default.
- W2951374906 creator A5018609918 @default.
- W2951374906 creator A5024685173 @default.
- W2951374906 creator A5025004380 @default.
- W2951374906 creator A5026825380 @default.
- W2951374906 creator A5030402556 @default.
- W2951374906 creator A5046819662 @default.
- W2951374906 creator A5063600180 @default.
- W2951374906 creator A5071788911 @default.
- W2951374906 creator A5078355227 @default.
- W2951374906 creator A5078989352 @default.
- W2951374906 creator A5081545080 @default.
- W2951374906 date "2019-06-19" @default.
- W2951374906 modified "2023-10-11" @default.
- W2951374906 title "Training Machines to Identify Species using GBIF-mediated Datasets" @default.
- W2951374906 doi "https://doi.org/10.3897/biss.3.37230" @default.
- W2951374906 hasPublicationYear "2019" @default.
- W2951374906 type Work @default.
- W2951374906 sameAs 2951374906 @default.
- W2951374906 citedByCount "4" @default.
- W2951374906 countsByYear W29513749062020 @default.
- W2951374906 countsByYear W29513749062022 @default.
- W2951374906 crossrefType "journal-article" @default.
- W2951374906 hasAuthorship W2951374906A5000739848 @default.
- W2951374906 hasAuthorship W2951374906A5002422319 @default.
- W2951374906 hasAuthorship W2951374906A5012770174 @default.
- W2951374906 hasAuthorship W2951374906A5013534461 @default.
- W2951374906 hasAuthorship W2951374906A5015007109 @default.
- W2951374906 hasAuthorship W2951374906A5018609918 @default.
- W2951374906 hasAuthorship W2951374906A5024685173 @default.
- W2951374906 hasAuthorship W2951374906A5025004380 @default.
- W2951374906 hasAuthorship W2951374906A5026825380 @default.
- W2951374906 hasAuthorship W2951374906A5030402556 @default.
- W2951374906 hasAuthorship W2951374906A5046819662 @default.
- W2951374906 hasAuthorship W2951374906A5063600180 @default.
- W2951374906 hasAuthorship W2951374906A5071788911 @default.
- W2951374906 hasAuthorship W2951374906A5078355227 @default.
- W2951374906 hasAuthorship W2951374906A5078989352 @default.
- W2951374906 hasAuthorship W2951374906A5081545080 @default.
- W2951374906 hasBestOaLocation W29513749061 @default.
- W2951374906 hasConcept C116834253 @default.
- W2951374906 hasConcept C119857082 @default.
- W2951374906 hasConcept C154945302 @default.
- W2951374906 hasConcept C159985019 @default.
- W2951374906 hasConcept C18903297 @default.
- W2951374906 hasConcept C189592816 @default.
- W2951374906 hasConcept C192562407 @default.
- W2951374906 hasConcept C202444582 @default.
- W2951374906 hasConcept C204323151 @default.
- W2951374906 hasConcept C2522767166 @default.
- W2951374906 hasConcept C33923547 @default.
- W2951374906 hasConcept C41008148 @default.
- W2951374906 hasConcept C71640776 @default.
- W2951374906 hasConcept C86803240 @default.
- W2951374906 hasConcept C9652623 @default.
- W2951374906 hasConceptScore W2951374906C116834253 @default.
- W2951374906 hasConceptScore W2951374906C119857082 @default.
- W2951374906 hasConceptScore W2951374906C154945302 @default.
- W2951374906 hasConceptScore W2951374906C159985019 @default.
- W2951374906 hasConceptScore W2951374906C18903297 @default.
- W2951374906 hasConceptScore W2951374906C189592816 @default.
- W2951374906 hasConceptScore W2951374906C192562407 @default.
- W2951374906 hasConceptScore W2951374906C202444582 @default.
- W2951374906 hasConceptScore W2951374906C204323151 @default.
- W2951374906 hasConceptScore W2951374906C2522767166 @default.
- W2951374906 hasConceptScore W2951374906C33923547 @default.
- W2951374906 hasConceptScore W2951374906C41008148 @default.
- W2951374906 hasConceptScore W2951374906C71640776 @default.
- W2951374906 hasConceptScore W2951374906C86803240 @default.
- W2951374906 hasConceptScore W2951374906C9652623 @default.
- W2951374906 hasLocation W29513749061 @default.
- W2951374906 hasLocation W29513749062 @default.
- W2951374906 hasOpenAccess W2951374906 @default.
- W2951374906 hasPrimaryLocation W29513749061 @default.
- W2951374906 hasRelatedWork W2961085424 @default.
- W2951374906 hasRelatedWork W3046775127 @default.
- W2951374906 hasRelatedWork W3170094116 @default.
- W2951374906 hasRelatedWork W3209574120 @default.
- W2951374906 hasRelatedWork W4205958290 @default.
- W2951374906 hasRelatedWork W4285260836 @default.
- W2951374906 hasRelatedWork W4286629047 @default.
- W2951374906 hasRelatedWork W4306321456 @default.
- W2951374906 hasRelatedWork W4306674287 @default.
- W2951374906 hasRelatedWork W4224009465 @default.
- W2951374906 hasVolume "3" @default.
- W2951374906 isParatext "false" @default.
- W2951374906 isRetracted "false" @default.
- W2951374906 magId "2951374906" @default.
- W2951374906 workType "article" @default.
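The abstract above describes assembling training datasets from GBIF-mediated occurrence records that carry images. As a rough illustration only, the sketch below pulls labelled image references from GBIF's public occurrence search API; the taxonKey value and the response field names are assumptions based on that API and should be checked against its documentation, and this is not presented as the procedure GBIF plans to use for issuing DOI-tracked training datasets.

```python
# Minimal sketch: collect labelled occurrence images from GBIF's public
# occurrence API, along the lines of the training datasets described above.
# The taxonKey (212 = Aves in the GBIF backbone, assumed here) is illustrative,
# and the exact response fields may differ from what is shown.
import requests

GBIF_SEARCH = "https://api.gbif.org/v1/occurrence/search"

params = {
    "taxonKey": 212,            # example taxon group; swap in the group of interest
    "mediaType": "StillImage",  # only records that carry one or more images
    "limit": 50,                # page size; GBIF caps this per request
}

response = requests.get(GBIF_SEARCH, params=params, timeout=30)
response.raise_for_status()

training_examples = []
for occurrence in response.json().get("results", []):
    label = occurrence.get("scientificName")
    for media in occurrence.get("media", []):
        url = media.get("identifier")
        if label and url:
            # Keep (image URL, species label, licence) together so licence and
            # citation practice can be respected downstream, as the abstract stresses.
            training_examples.append((url, label, media.get("license")))

print(f"collected {len(training_examples)} labelled image references")
```

Carrying the media licence alongside each image URL in this sketch reflects the abstract's emphasis on uniform open licensing and traceable citation when GBIF-mediated data are reused for model training.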