Matches in SemOpenAlex for { <https://semopenalex.org/work/W4289518618> ?p ?o ?g. }
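A query along the following lines should reproduce the listing below. This is a minimal sketch, assuming the public SemOpenAlex SPARQL endpoint (https://semopenalex.org/sparql); the endpoint URL and the use of a GRAPH clause to bind ?g are assumptions on my part, and only the subject IRI is taken from the match pattern above.

    # List every predicate (?p), object (?o), and named graph (?g)
    # attached to the work W4289518618 in SemOpenAlex
    SELECT ?p ?o ?g
    WHERE {
      GRAPH ?g {
        <https://semopenalex.org/work/W4289518618> ?p ?o .
      }
    }
    ORDER BY ?p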
- W4289518618 abstract "Machine-assisted object detection and classification of fish species from Baited Remote Underwater Video Station (BRUVS) surveys using deep learning algorithms presents an opportunity for optimising analysis time and rapid reporting of marine ecosystem statuses. Training object detection algorithms for BRUVS analysis presents significant challenges: the model requires training datasets with bounding boxes already applied identifying the location of all fish individuals in a scene, and it requires training datasets identifying species with labels. In both cases, substantial volumes of data are required and this is currently a manual, labour-intensive process, resulting in a paucity of the labelled data currently required for training object detection models for species detection. Here, we present a “machine-assisted” approach for i) a generalised model to automate the application of bounding boxes to any underwater environment containing fish and ii) fish detection and classification to species identification level, up to 12 target species. A catch-all “fish” classification is applied to fish individuals that remain unidentified due to a lack of available training and validation data. Machine-assisted bounding box annotation was shown to detect and label fish on out-of-sample datasets with a recall between 0.70 and 0.89 and automated labelling of 12 targeted species with an F1 score of 0.79. On average, 12% of fish were given a bounding box with species labels and 88% of fish were located and given a fish label and identified for manual labelling. Taking a combined, machine-assisted approach presents a significant advancement towards the applied use of deep learning for fish species detection in fish analysis and workflows and has potential for future fish ecologist uptake if integrated into video analysis software. Manual labelling and classification effort is still required, and a community effort to address the limitation presented by a severe paucity of training data would improve automation accuracy and encourage increased uptake." @default.
- W4289518618 created "2022-08-03" @default.
- W4289518618 creator A5009593802 @default.
- W4289518618 creator A5015260697 @default.
- W4289518618 creator A5020135998 @default.
- W4289518618 creator A5022059094 @default.
- W4289518618 creator A5039633120 @default.
- W4289518618 creator A5064419121 @default.
- W4289518618 creator A5072023526 @default.
- W4289518618 date "2022-08-02" @default.
- W4289518618 modified "2023-10-08" @default.
- W4289518618 title "Accelerating Species Recognition and Labelling of Fish From Underwater Video With Machine-Assisted Deep Learning" @default.
- W4289518618 cites W1861492603 @default.
- W4289518618 cites W1964650584 @default.
- W4289518618 cites W1965718792 @default.
- W4289518618 cites W1980434677 @default.
- W4289518618 cites W1996253987 @default.
- W4289518618 cites W2010489120 @default.
- W4289518618 cites W2019989105 @default.
- W4289518618 cites W2031340449 @default.
- W4289518618 cites W2039885040 @default.
- W4289518618 cites W2075891217 @default.
- W4289518618 cites W2079732814 @default.
- W4289518618 cites W2084325257 @default.
- W4289518618 cites W2131251535 @default.
- W4289518618 cites W2522256182 @default.
- W4289518618 cites W2529780245 @default.
- W4289518618 cites W2573501226 @default.
- W4289518618 cites W2578353911 @default.
- W4289518618 cites W2622826443 @default.
- W4289518618 cites W2767556927 @default.
- W4289518618 cites W2797538836 @default.
- W4289518618 cites W2807875791 @default.
- W4289518618 cites W2891182582 @default.
- W4289518618 cites W2897654901 @default.
- W4289518618 cites W2920741983 @default.
- W4289518618 cites W2941221368 @default.
- W4289518618 cites W2981272655 @default.
- W4289518618 cites W2995304823 @default.
- W4289518618 cites W3017145925 @default.
- W4289518618 cites W3019461574 @default.
- W4289518618 cites W3022241204 @default.
- W4289518618 cites W3033927182 @default.
- W4289518618 cites W3074310958 @default.
- W4289518618 cites W3083557050 @default.
- W4289518618 cites W3086644986 @default.
- W4289518618 cites W3092213319 @default.
- W4289518618 cites W3106306218 @default.
- W4289518618 cites W3134739465 @default.
- W4289518618 cites W3137631935 @default.
- W4289518618 cites W3161884313 @default.
- W4289518618 cites W3206292130 @default.
- W4289518618 cites W3215753146 @default.
- W4289518618 cites W3092320086 @default.
- W4289518618 doi "https://doi.org/10.3389/fmars.2022.944582" @default.
- W4289518618 hasPublicationYear "2022" @default.
- W4289518618 type Work @default.
- W4289518618 citedByCount "6" @default.
- W4289518618 countsByYear W42895186182022 @default.
- W4289518618 countsByYear W42895186182023 @default.
- W4289518618 crossrefType "journal-article" @default.
- W4289518618 hasAuthorship W4289518618A5009593802 @default.
- W4289518618 hasAuthorship W4289518618A5015260697 @default.
- W4289518618 hasAuthorship W4289518618A5020135998 @default.
- W4289518618 hasAuthorship W4289518618A5022059094 @default.
- W4289518618 hasAuthorship W4289518618A5039633120 @default.
- W4289518618 hasAuthorship W4289518618A5064419121 @default.
- W4289518618 hasAuthorship W4289518618A5072023526 @default.
- W4289518618 hasBestOaLocation W42895186181 @default.
- W4289518618 hasConcept C115961682 @default.
- W4289518618 hasConcept C119857082 @default.
- W4289518618 hasConcept C147037132 @default.
- W4289518618 hasConcept C153180895 @default.
- W4289518618 hasConcept C154945302 @default.
- W4289518618 hasConcept C166957645 @default.
- W4289518618 hasConcept C205649164 @default.
- W4289518618 hasConcept C2776151529 @default.
- W4289518618 hasConcept C2780523633 @default.
- W4289518618 hasConcept C2781238097 @default.
- W4289518618 hasConcept C2909208804 @default.
- W4289518618 hasConcept C41008148 @default.
- W4289518618 hasConcept C505870484 @default.
- W4289518618 hasConcept C55493867 @default.
- W4289518618 hasConcept C63584917 @default.
- W4289518618 hasConcept C86803240 @default.
- W4289518618 hasConcept C98083399 @default.
- W4289518618 hasConceptScore W4289518618C115961682 @default.
- W4289518618 hasConceptScore W4289518618C119857082 @default.
- W4289518618 hasConceptScore W4289518618C147037132 @default.
- W4289518618 hasConceptScore W4289518618C153180895 @default.
- W4289518618 hasConceptScore W4289518618C154945302 @default.
- W4289518618 hasConceptScore W4289518618C166957645 @default.
- W4289518618 hasConceptScore W4289518618C205649164 @default.
- W4289518618 hasConceptScore W4289518618C2776151529 @default.
- W4289518618 hasConceptScore W4289518618C2780523633 @default.
- W4289518618 hasConceptScore W4289518618C2781238097 @default.
- W4289518618 hasConceptScore W4289518618C2909208804 @default.
- W4289518618 hasConceptScore W4289518618C41008148 @default.
- W4289518618 hasConceptScore W4289518618C505870484 @default.
- W4289518618 hasConceptScore W4289518618C55493867 @default.