Matches in SemOpenAlex for { <https://semopenalex.org/work/W4210927905> ?p ?o ?g. }
- W4210927905 endingPage "15" @default.
- W4210927905 startingPage "1" @default.
- W4210927905 abstract "Wild animals are essential for ecosystem structuring and stability, and thus they are important for ecological research. Since most wild animals have high athletic or concealment abilities, or both, it used to be relatively difficult to acquire evidence of animal appearances before the application of camera traps in ecological research. However, a single camera trap may produce thousands of animal images in a short period of time, and a study inevitably ends up with millions of images requiring classification. Although many methods have been developed for classifying camera trap images, almost all of them follow the pattern of a single very deep convolutional neural network processing all camera trap images. Consequently, the corresponding surveillance area may need to be delicately controlled to match the network's capability, and it may be difficult to expand the area in the future. In this study, we consider a scenario in which camera traps are grouped into independent clusters, and the images produced by a cluster are processed by an edge device running a customized network. Edge devices in this scenario may be highly heterogeneous because cluster scales differ. As a result, networks popular for classifying camera trap images may not be deployable on edge devices without modifications requiring expertise that may be hard to obtain. This motivates us to automate network design for edge devices via neural architecture search. However, the search may be costly due to the evaluation of candidate networks, and its results may be infeasible if the resource limits of edge devices are not considered. Accordingly, we propose a search method that uses regression trees to evaluate candidate networks, lowering search costs; candidate networks are built from a meta-architecture automatically adjusted according to the resource limits. 
In experiments, the search takes 6.5 hours to find a network applicable to the edge device Jetson X2. The found network is then trained on camera trap images on a workstation and tested on Jetson X2. The network achieves accuracy competitive with both automatically and manually designed networks." @default.
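The abstract's central idea — a regression-tree surrogate that scores candidate networks so the search does not have to train each one — can be sketched in miniature. Everything below is illustrative, not the paper's actual method: the (depth, width) descriptors, the accuracy values, and the depth-1 tree are all invented for the example.

```python
def fit_stump(X, y):
    """Fit a depth-1 regression tree (a stump): pick the (feature, threshold)
    split minimizing squared error; each leaf predicts the mean of its side."""
    best = None
    for j in range(len(X[0])):
        vals = sorted({x[j] for x in X})
        for t in vals[:-1]:  # every threshold leaves both sides non-empty
            left = [y[i] for i, x in enumerate(X) if x[j] <= t]
            right = [y[i] for i, x in enumerate(X) if x[j] > t]
            ml, mr = sum(left) / len(left), sum(right) / len(right)
            sse = (sum((v - ml) ** 2 for v in left)
                   + sum((v - mr) ** 2 for v in right))
            if best is None or sse < best[0]:
                best = (sse, j, t, ml, mr)
    _, j, t, ml, mr = best
    return lambda x: ml if x[j] <= t else mr

# Hypothetical history: architecture descriptors (depth, width) of networks
# already trained during the search, with their measured validation accuracies.
X = [[2, 16], [4, 32], [6, 64], [8, 128]]
y = [0.61, 0.72, 0.80, 0.83]
surrogate = fit_stump(X, y)

# New candidates are ranked by the surrogate instead of being trained,
# which is where the search-cost saving comes from.
candidates = [[2, 16], [7, 96]]
scores = [surrogate(c) for c in candidates]
```

A real surrogate-assisted search would use richer architecture encodings and deeper trees (or ensembles), and would periodically retrain the surrogate as newly evaluated networks are added to the history.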
- W4210927905 created "2022-02-09" @default.
- W4210927905 creator A5001892611 @default.
- W4210927905 creator A5040956935 @default.
- W4210927905 creator A5089221278 @default.
- W4210927905 date "2022-02-07" @default.
- W4210927905 modified "2023-09-26" @default.
- W4210927905 title "Identifying Animals in Camera Trap Images via Neural Architecture Search" @default.
- W4210927905 cites W1774267230 @default.
- W4210927905 cites W2042504207 @default.
- W4210927905 cites W2064675550 @default.
- W4210927905 cites W2115006245 @default.
- W4210927905 cites W2119717200 @default.
- W4210927905 cites W2151499205 @default.
- W4210927905 cites W2194775991 @default.
- W4210927905 cites W2413367505 @default.
- W4210927905 cites W2501604154 @default.
- W4210927905 cites W2531409750 @default.
- W4210927905 cites W2547107225 @default.
- W4210927905 cites W2606637138 @default.
- W4210927905 cites W2769210209 @default.
- W4210927905 cites W2796265726 @default.
- W4210927905 cites W2895082331 @default.
- W4210927905 cites W2920205809 @default.
- W4210927905 cites W2947469701 @default.
- W4210927905 cites W2952113774 @default.
- W4210927905 cites W2960010704 @default.
- W4210927905 cites W2963163009 @default.
- W4210927905 cites W2963446712 @default.
- W4210927905 cites W2963821229 @default.
- W4210927905 cites W2963827161 @default.
- W4210927905 cites W2963918968 @default.
- W4210927905 cites W2964081807 @default.
- W4210927905 cites W2964298670 @default.
- W4210927905 cites W2971398159 @default.
- W4210927905 cites W2974449448 @default.
- W4210927905 cites W2980270353 @default.
- W4210927905 cites W2981748264 @default.
- W4210927905 cites W2995694143 @default.
- W4210927905 cites W3030284948 @default.
- W4210927905 cites W3087570018 @default.
- W4210927905 cites W3093414036 @default.
- W4210927905 cites W3127560734 @default.
- W4210927905 cites W3143590514 @default.
- W4210927905 doi "https://doi.org/10.1155/2022/8615374" @default.
- W4210927905 hasPubMedId "https://pubmed.ncbi.nlm.nih.gov/35178083" @default.
- W4210927905 hasPublicationYear "2022" @default.
- W4210927905 type Work @default.
- W4210927905 citedByCount "2" @default.
- W4210927905 countsByYear W42109279052022 @default.
- W4210927905 countsByYear W42109279052023 @default.
- W4210927905 crossrefType "journal-article" @default.
- W4210927905 hasAuthorship W4210927905A5001892611 @default.
- W4210927905 hasAuthorship W4210927905A5040956935 @default.
- W4210927905 hasAuthorship W4210927905A5089221278 @default.
- W4210927905 hasBestOaLocation W42109279051 @default.
- W4210927905 hasConcept C10138342 @default.
- W4210927905 hasConcept C111919701 @default.
- W4210927905 hasConcept C121099081 @default.
- W4210927905 hasConcept C138236772 @default.
- W4210927905 hasConcept C153180895 @default.
- W4210927905 hasConcept C153294291 @default.
- W4210927905 hasConcept C154945302 @default.
- W4210927905 hasConcept C161334170 @default.
- W4210927905 hasConcept C162307627 @default.
- W4210927905 hasConcept C162324750 @default.
- W4210927905 hasConcept C164866538 @default.
- W4210927905 hasConcept C18903297 @default.
- W4210927905 hasConcept C199360897 @default.
- W4210927905 hasConcept C205649164 @default.
- W4210927905 hasConcept C2775945657 @default.
- W4210927905 hasConcept C2779101711 @default.
- W4210927905 hasConcept C29376679 @default.
- W4210927905 hasConcept C31972630 @default.
- W4210927905 hasConcept C41008148 @default.
- W4210927905 hasConcept C50644808 @default.
- W4210927905 hasConcept C79974875 @default.
- W4210927905 hasConcept C81363708 @default.
- W4210927905 hasConcept C86803240 @default.
- W4210927905 hasConceptScore W4210927905C10138342 @default.
- W4210927905 hasConceptScore W4210927905C111919701 @default.
- W4210927905 hasConceptScore W4210927905C121099081 @default.
- W4210927905 hasConceptScore W4210927905C138236772 @default.
- W4210927905 hasConceptScore W4210927905C153180895 @default.
- W4210927905 hasConceptScore W4210927905C153294291 @default.
- W4210927905 hasConceptScore W4210927905C154945302 @default.
- W4210927905 hasConceptScore W4210927905C161334170 @default.
- W4210927905 hasConceptScore W4210927905C162307627 @default.
- W4210927905 hasConceptScore W4210927905C162324750 @default.
- W4210927905 hasConceptScore W4210927905C164866538 @default.
- W4210927905 hasConceptScore W4210927905C18903297 @default.
- W4210927905 hasConceptScore W4210927905C199360897 @default.
- W4210927905 hasConceptScore W4210927905C205649164 @default.
- W4210927905 hasConceptScore W4210927905C2775945657 @default.
- W4210927905 hasConceptScore W4210927905C2779101711 @default.
- W4210927905 hasConceptScore W4210927905C29376679 @default.
- W4210927905 hasConceptScore W4210927905C31972630 @default.
- W4210927905 hasConceptScore W4210927905C41008148 @default.