Matches in SemOpenAlex for { <https://semopenalex.org/work/W4385804973> ?p ?o ?g. }
Showing items 1 to 80 of 80, with 100 items per page.
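The header above shows the SPARQL graph pattern used to pull every triple for this work. As a minimal sketch, the same query can be run programmatically; note that the public endpoint URL (`https://semopenalex.org/sparql`) and the use of the SPARQLWrapper package are assumptions, not something stated on this page, and the named-graph variable `?g` is dropped for brevity.

```python
# Minimal sketch: fetch all triples for W4385804973 from SemOpenAlex.
# Assumes the public endpoint at https://semopenalex.org/sparql and the
# SPARQLWrapper package (pip install sparqlwrapper); adjust if either differs.
from SPARQLWrapper import SPARQLWrapper, JSON

ENDPOINT = "https://semopenalex.org/sparql"  # assumed public endpoint
WORK = "https://semopenalex.org/work/W4385804973"

sparql = SPARQLWrapper(ENDPOINT)
sparql.setQuery(f"""
    SELECT ?p ?o
    WHERE {{ <{WORK}> ?p ?o . }}
    LIMIT 100
""")
sparql.setReturnFormat(JSON)

results = sparql.query().convert()
for row in results["results"]["bindings"]:
    # Each row carries the predicate and object of one triple.
    print(row["p"]["value"], row["o"]["value"])
```

Each printed row corresponds to one `- W4385804973 <predicate> <object> @default.` line in the listing below.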
- W4385804973 abstract "Vision-based Transformers have shown wide applicability in the perception module of autonomous driving, predicting accurate 3D bounding boxes owing to their strong capability to model long-range dependencies between visual features. However, Transformers, initially designed for language models, have mostly been optimized for accuracy rather than for an inference-time budget. For a safety-critical system like autonomous driving, real-time inference on the on-board compute is an absolute necessity, which places the object detection algorithm under a very tight run-time budget. In this paper, we evaluate a variety of strategies to optimize the inference time of vision-transformer-based object detection methods while keeping a close watch on any performance variations. Our chosen metric for these strategies is joint accuracy-runtime optimization. Moreover, for actual inference-time analysis we profile our strategies at float32 and float16 precision with the TensorRT module, the most common format used by industry to deploy machine learning networks on edge devices. We show that our strategies improve inference time by 63% at the cost of a mere 3% performance drop for the problem statement defined in Sec. 3. These strategies bring the inference time of Vision Transformer detectors [3], [15], [18], [19], [36] below even that of traditional single-image CNN detectors such as FCOS [17], [25], [33]. We recommend that practitioners use these techniques to deploy hefty Transformer-based multi-view networks on budget-constrained robotic platforms." @default. (A minimal FP16 TensorRT sketch follows this listing.)
- W4385804973 created "2023-08-15" @default.
- W4385804973 creator A5041345752 @default.
- W4385804973 date "2023-06-01" @default.
- W4385804973 modified "2023-09-26" @default.
- W4385804973 title "Training Strategies for Vision Transformers for Object Detection" @default.
- W4385804973 cites W2108598243 @default.
- W4385804973 cites W2194775991 @default.
- W4385804973 cites W2565639579 @default.
- W4385804973 cites W2752782242 @default.
- W4385804973 cites W2796108585 @default.
- W4385804973 cites W2963037989 @default.
- W4385804973 cites W2982770724 @default.
- W4385804973 cites W2989604896 @default.
- W4385804973 cites W3035172746 @default.
- W4385804973 cites W3035574168 @default.
- W4385804973 cites W3106250896 @default.
- W4385804973 cites W3138516171 @default.
- W4385804973 cites W3172942063 @default.
- W4385804973 cites W3181161645 @default.
- W4385804973 cites W3215100485 @default.
- W4385804973 cites W4225793049 @default.
- W4385804973 cites W4312894406 @default.
- W4385804973 cites W4385301252 @default.
- W4385804973 doi "https://doi.org/10.1109/cvprw59228.2023.00016" @default.
- W4385804973 hasPublicationYear "2023" @default.
- W4385804973 type Work @default.
- W4385804973 citedByCount "0" @default.
- W4385804973 crossrefType "proceedings-article" @default.
- W4385804973 hasAuthorship W4385804973A5041345752 @default.
- W4385804973 hasBestOaLocation W43858049732 @default.
- W4385804973 hasConcept C113775141 @default.
- W4385804973 hasConcept C119599485 @default.
- W4385804973 hasConcept C119857082 @default.
- W4385804973 hasConcept C127413603 @default.
- W4385804973 hasConcept C153180895 @default.
- W4385804973 hasConcept C154945302 @default.
- W4385804973 hasConcept C165801399 @default.
- W4385804973 hasConcept C2776151529 @default.
- W4385804973 hasConcept C2776214188 @default.
- W4385804973 hasConcept C31972630 @default.
- W4385804973 hasConcept C41008148 @default.
- W4385804973 hasConcept C46743427 @default.
- W4385804973 hasConcept C5339829 @default.
- W4385804973 hasConcept C63584917 @default.
- W4385804973 hasConcept C66322947 @default.
- W4385804973 hasConcept C79403827 @default.
- W4385804973 hasConceptScore W4385804973C113775141 @default.
- W4385804973 hasConceptScore W4385804973C119599485 @default.
- W4385804973 hasConceptScore W4385804973C119857082 @default.
- W4385804973 hasConceptScore W4385804973C127413603 @default.
- W4385804973 hasConceptScore W4385804973C153180895 @default.
- W4385804973 hasConceptScore W4385804973C154945302 @default.
- W4385804973 hasConceptScore W4385804973C165801399 @default.
- W4385804973 hasConceptScore W4385804973C2776151529 @default.
- W4385804973 hasConceptScore W4385804973C2776214188 @default.
- W4385804973 hasConceptScore W4385804973C31972630 @default.
- W4385804973 hasConceptScore W4385804973C41008148 @default.
- W4385804973 hasConceptScore W4385804973C46743427 @default.
- W4385804973 hasConceptScore W4385804973C5339829 @default.
- W4385804973 hasConceptScore W4385804973C63584917 @default.
- W4385804973 hasConceptScore W4385804973C66322947 @default.
- W4385804973 hasConceptScore W4385804973C79403827 @default.
- W4385804973 hasLocation W43858049731 @default.
- W4385804973 hasLocation W43858049732 @default.
- W4385804973 hasOpenAccess W4385804973 @default.
- W4385804973 hasPrimaryLocation W43858049731 @default.
- W4385804973 hasRelatedWork W2122376820 @default.
- W4385804973 hasRelatedWork W2740329524 @default.
- W4385804973 hasRelatedWork W2754428891 @default.
- W4385804973 hasRelatedWork W2782964878 @default.
- W4385804973 hasRelatedWork W2922421953 @default.
- W4385804973 hasRelatedWork W2980691083 @default.
- W4385804973 hasRelatedWork W3002270006 @default.
- W4385804973 hasRelatedWork W3033480908 @default.
- W4385804973 hasRelatedWork W4281560450 @default.
- W4385804973 hasRelatedWork W4385804973 @default.
- W4385804973 isParatext "false" @default.
- W4385804973 isRetracted "false" @default.
- W4385804973 workType "article" @default.
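The abstract above profiles detectors at float32 and float16 precision with TensorRT. As a hedged illustration of that deployment path (the paper's actual build script is not published here), the following is a minimal sketch of building an FP16 TensorRT engine from a detector exported to ONNX; the file name `detector.onnx` and the TensorRT 8.x Python API usage are assumptions.

```python
# Minimal sketch: build a TensorRT engine with FP16 enabled, assuming a
# detector already exported to ONNX ("detector.onnx" is a placeholder name)
# and the TensorRT 8.x Python API. Not the authors' actual pipeline.
import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)
)
parser = trt.OnnxParser(network, logger)

with open("detector.onnx", "rb") as f:
    if not parser.parse(f.read()):
        for i in range(parser.num_errors):
            print(parser.get_error(i))
        raise RuntimeError("ONNX parse failed")

config = builder.create_builder_config()
# Allow TensorRT to pick FP16 kernels; drop this flag for a pure FP32 build,
# then compare the two engines' latency to reproduce the float32/float16
# profiling the abstract describes.
config.set_flag(trt.BuilderFlag.FP16)

engine_bytes = builder.build_serialized_network(network, config)
with open("detector_fp16.engine", "wb") as f:
    f.write(engine_bytes)
```

Timing the FP32 and FP16 engines on the same inputs is the standard way to measure the kind of accuracy-runtime trade-off (63% faster for a 3% performance drop) that the abstract reports.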