Matches in SemOpenAlex for { <https://semopenalex.org/work/W4313201602> ?p ?o ?g. }
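The listing below can be reproduced programmatically. A minimal Python sketch using SPARQLWrapper, assuming the public SemOpenAlex SPARQL endpoint at https://semopenalex.org/sparql (the GRAPH clause mirrors the ?g in the quad pattern above):

    from SPARQLWrapper import SPARQLWrapper, JSON

    # Fetch every predicate/object pair recorded for this work, per named graph.
    sparql = SPARQLWrapper("https://semopenalex.org/sparql")  # assumed endpoint URL
    sparql.setQuery("""
        SELECT ?p ?o ?g WHERE {
          GRAPH ?g { <https://semopenalex.org/work/W4313201602> ?p ?o . }
        }
    """)
    sparql.setReturnFormat(JSON)
    results = sparql.query().convert()
    for b in results["results"]["bindings"]:
        print(b["p"]["value"], b["o"]["value"], b["g"]["value"])

Each binding corresponds to one of the triples listed below.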
- W4313201602 endingPage "325" @default.
- W4313201602 startingPage "313" @default.
- W4313201602 abstract "Humans use multimodal sensory information to understand the physical properties of their environment. Intelligent decision-making systems such as the ones used in robotic applications could also utilize the fusion of multimodal information to improve their performance and reliability. In recent years, machine learning and deep learning methods are used at the heart of such intelligent systems. Developing visuo-tactile models is a challenging task due to various problems such as performance, limited datasets, reliability, and computational efficiency. In this research, we propose four efficient models based on dynamic neural network architectures for unimodal and multimodal object recognition. For unimodal object recognition, TactileNet and VisionNet are proposed. For multimodal object recognition, the FusionNet-A and the FusionNet-B are designed to implement early and late fusion strategies, respectively. The proposed models have a flexible structure and are able to change at the train or test phase to accommodate the amount of available information. Model confidence calibration is employed to enhance the reliability and generalization of the models. The proposed models are evaluated on MIT CSAIL large-scale multimodal dataset. Our results demonstrate accurate performance in both unimodal and multimodal scenarios. It has been illustrated that by using different fusion strategies and augmenting the tactile-based models with visual information, the top-1 error rate of the single-frame tactile model was reduced by 78% and the mean average precision was increased by 2.19 times. Although the focus has been on the fusion of tactile and visual modalities, the proposed design methodology can be generalized to include more modalities." @default.
- W4313201602 created "2023-01-06" @default.
- W4313201602 creator A5029210476 @default.
- W4313201602 creator A5033570527 @default.
- W4313201602 creator A5064678610 @default.
- W4313201602 creator A5085834816 @default.
- W4313201602 date "2023-04-01" @default.
- W4313201602 modified "2023-10-18" @default.
- W4313201602 title "Fusion of tactile and visual information in deep learning models for object recognition" @default.
- W4313201602 cites W1689711448 @default.
- W4313201602 cites W1969090956 @default.
- W4313201602 cites W1993757871 @default.
- W4313201602 cites W2040996971 @default.
- W4313201602 cites W2075654868 @default.
- W4313201602 cites W2117539524 @default.
- W4313201602 cites W2134172620 @default.
- W4313201602 cites W2160499119 @default.
- W4313201602 cites W2161739581 @default.
- W4313201602 cites W2395611524 @default.
- W4313201602 cites W2481240925 @default.
- W4313201602 cites W2619383789 @default.
- W4313201602 cites W2897995899 @default.
- W4313201602 cites W2912070267 @default.
- W4313201602 cites W2919115771 @default.
- W4313201602 cites W2919201013 @default.
- W4313201602 cites W2933084981 @default.
- W4313201602 cites W2947434510 @default.
- W4313201602 cites W2963015618 @default.
- W4313201602 cites W2987325452 @default.
- W4313201602 cites W3008400075 @default.
- W4313201602 cites W3011727199 @default.
- W4313201602 cites W3160888325 @default.
- W4313201602 cites W4285252628 @default.
- W4313201602 doi "https://doi.org/10.1016/j.inffus.2022.11.032" @default.
- W4313201602 hasPublicationYear "2023" @default.
- W4313201602 type Work @default.
- W4313201602 citedByCount "1" @default.
- W4313201602 countsByYear W43132016022023 @default.
- W4313201602 crossrefType "journal-article" @default.
- W4313201602 hasAuthorship W4313201602A5029210476 @default.
- W4313201602 hasAuthorship W4313201602A5033570527 @default.
- W4313201602 hasAuthorship W4313201602A5064678610 @default.
- W4313201602 hasAuthorship W4313201602A5085834816 @default.
- W4313201602 hasConcept C108583219 @default.
- W4313201602 hasConcept C119857082 @default.
- W4313201602 hasConcept C121332964 @default.
- W4313201602 hasConcept C134306372 @default.
- W4313201602 hasConcept C144024400 @default.
- W4313201602 hasConcept C154945302 @default.
- W4313201602 hasConcept C162324750 @default.
- W4313201602 hasConcept C163258240 @default.
- W4313201602 hasConcept C177148314 @default.
- W4313201602 hasConcept C187736073 @default.
- W4313201602 hasConcept C2779903281 @default.
- W4313201602 hasConcept C2780451532 @default.
- W4313201602 hasConcept C2780660688 @default.
- W4313201602 hasConcept C2781238097 @default.
- W4313201602 hasConcept C33923547 @default.
- W4313201602 hasConcept C33954974 @default.
- W4313201602 hasConcept C36289849 @default.
- W4313201602 hasConcept C41008148 @default.
- W4313201602 hasConcept C43214815 @default.
- W4313201602 hasConcept C50644808 @default.
- W4313201602 hasConcept C62520636 @default.
- W4313201602 hasConcept C64876066 @default.
- W4313201602 hasConceptScore W4313201602C108583219 @default.
- W4313201602 hasConceptScore W4313201602C119857082 @default.
- W4313201602 hasConceptScore W4313201602C121332964 @default.
- W4313201602 hasConceptScore W4313201602C134306372 @default.
- W4313201602 hasConceptScore W4313201602C144024400 @default.
- W4313201602 hasConceptScore W4313201602C154945302 @default.
- W4313201602 hasConceptScore W4313201602C162324750 @default.
- W4313201602 hasConceptScore W4313201602C163258240 @default.
- W4313201602 hasConceptScore W4313201602C177148314 @default.
- W4313201602 hasConceptScore W4313201602C187736073 @default.
- W4313201602 hasConceptScore W4313201602C2779903281 @default.
- W4313201602 hasConceptScore W4313201602C2780451532 @default.
- W4313201602 hasConceptScore W4313201602C2780660688 @default.
- W4313201602 hasConceptScore W4313201602C2781238097 @default.
- W4313201602 hasConceptScore W4313201602C33923547 @default.
- W4313201602 hasConceptScore W4313201602C33954974 @default.
- W4313201602 hasConceptScore W4313201602C36289849 @default.
- W4313201602 hasConceptScore W4313201602C41008148 @default.
- W4313201602 hasConceptScore W4313201602C43214815 @default.
- W4313201602 hasConceptScore W4313201602C50644808 @default.
- W4313201602 hasConceptScore W4313201602C62520636 @default.
- W4313201602 hasConceptScore W4313201602C64876066 @default.
- W4313201602 hasFunder F4320323227 @default.
- W4313201602 hasFunder F4320335254 @default.
- W4313201602 hasLocation W43132016021 @default.
- W4313201602 hasOpenAccess W4313201602 @default.
- W4313201602 hasPrimaryLocation W43132016021 @default.
- W4313201602 hasRelatedWork W2556013083 @default.
- W4313201602 hasRelatedWork W2904518532 @default.
- W4313201602 hasRelatedWork W2962931510 @default.
- W4313201602 hasRelatedWork W2963650472 @default.
- W4313201602 hasRelatedWork W3199271201 @default.
- W4313201602 hasRelatedWork W4280529741 @default.
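The abstract above contrasts early fusion (FusionNet-A) and late fusion (FusionNet-B) of tactile and visual inputs. A minimal PyTorch-style sketch of the two strategies; all layer sizes, channel counts, and class counts here are hypothetical placeholders, not the paper's actual architectures:

    import torch
    import torch.nn as nn

    class EarlyFusionNet(nn.Module):
        """FusionNet-A-style sketch: modalities are concatenated before a shared trunk."""
        def __init__(self, num_classes=10):
            super().__init__()
            self.trunk = nn.Sequential(
                nn.Conv2d(6, 32, 3, padding=1), nn.ReLU(),  # 6 = 3 tactile + 3 visual channels (assumed)
                nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                nn.Linear(32, num_classes),
            )

        def forward(self, tactile, vision):
            # Early fusion: stack the two modalities along the channel axis.
            return self.trunk(torch.cat([tactile, vision], dim=1))

    class LateFusionNet(nn.Module):
        """FusionNet-B-style sketch: each modality is classified separately, then fused."""
        def __init__(self, num_classes=10):
            super().__init__()
            def branch(in_ch):
                return nn.Sequential(
                    nn.Conv2d(in_ch, 32, 3, padding=1), nn.ReLU(),
                    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                    nn.Linear(32, num_classes),
                )
            self.tactile_branch = branch(3)
            self.vision_branch = branch(3)

        def forward(self, tactile, vision):
            # Late fusion: average the per-modality logits at the decision level.
            return (self.tactile_branch(tactile) + self.vision_branch(vision)) / 2

The abstract also mentions model confidence calibration; a common post-hoc option is temperature scaling (dividing logits by a learned scalar before the softmax), though this record does not name the specific method the authors used.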