Matches in SemOpenAlex for { <https://semopenalex.org/work/W3102049645> ?p ?o ?g. }
- W3102049645 endingPage "800" @default.
- W3102049645 startingPage "779" @default.
- W3102049645 abstract "Personal authentication systems based on biometrics have seen strong demand, mainly due to increasing concern over various privacy and security applications. Although the use of each biometric trait is problem dependent, the human ear has been found to have enough discriminating characteristics to serve as a strong biometric measure. Locating an ear in a face image is a strenuous task; numerous existing approaches have achieved significant performance, but the majority of studies are based on constrained environments. Ear biometrics, however, pose a great level of difficulty in unconstrained environments, where pose, scale, occlusion, illumination, background clutter, etc., vary to a great extent. To address the problem of ear detection in the wild, we propose two high-performance ear detection models, CED-Net-1 and CED-Net-2, which are fundamentally based on deep convolutional neural networks and primarily use contextual information to detect ears in unconstrained environments. To compare the performance of the proposed models, we implemented state-of-the-art deep learning models, viz. FRCNN (faster region-based convolutional neural network) and SSD (single shot multibox detector), for the ear detection task. To test the models' generalization, they are evaluated on six different benchmark datasets, viz. IITD, IITK, USTB-DB3, UND-E, UND-J2 and UBEAR, each of which contains different challenging images. The models are compared on performance measures such as IOU (intersection over union), accuracy, precision, recall and F1-score. It is observed that our proposed models CED-Net-1 and CED-Net-2 outperform FRCNN and SSD at higher IOU values. An accuracy of 99% is achieved at IOU 0.5 on the majority of the databases. This performance signifies the importance and effectiveness of the models and indicates that they are resilient to environmental conditions." @default.
- W3102049645 created "2020-11-23" @default.
- W3102049645 creator A5030978363 @default.
- W3102049645 creator A5064646117 @default.
- W3102049645 creator A5065781653 @default.
- W3102049645 creator A5085641811 @default.
- W3102049645 date "2020-11-09" @default.
- W3102049645 modified "2023-10-11" @default.
- W3102049645 title "CED-Net: context-aware ear detection network for unconstrained images" @default.
- W3102049645 cites W1958328135 @default.
- W3102049645 cites W1976642447 @default.
- W3102049645 cites W1993208982 @default.
- W3102049645 cites W2003307199 @default.
- W3102049645 cites W2018713546 @default.
- W3102049645 cites W2029021864 @default.
- W3102049645 cites W2047844716 @default.
- W3102049645 cites W2050708033 @default.
- W3102049645 cites W2079247604 @default.
- W3102049645 cites W2097117768 @default.
- W3102049645 cites W2101799416 @default.
- W3102049645 cites W2131848189 @default.
- W3102049645 cites W2154559504 @default.
- W3102049645 cites W2290929958 @default.
- W3102049645 cites W2519284461 @default.
- W3102049645 cites W2549139847 @default.
- W3102049645 cites W2549484948 @default.
- W3102049645 cites W2551082533 @default.
- W3102049645 cites W2559791628 @default.
- W3102049645 cites W2559813380 @default.
- W3102049645 cites W2566864799 @default.
- W3102049645 cites W2587948115 @default.
- W3102049645 cites W2590229671 @default.
- W3102049645 cites W2591630331 @default.
- W3102049645 cites W2607432142 @default.
- W3102049645 cites W2615293002 @default.
- W3102049645 cites W2735318119 @default.
- W3102049645 cites W2737730861 @default.
- W3102049645 cites W2747666224 @default.
- W3102049645 cites W2755425156 @default.
- W3102049645 cites W2781855851 @default.
- W3102049645 cites W2792583777 @default.
- W3102049645 cites W2797607609 @default.
- W3102049645 cites W2915175583 @default.
- W3102049645 cites W2925480326 @default.
- W3102049645 cites W2948080074 @default.
- W3102049645 cites W2962752334 @default.
- W3102049645 cites W2963011882 @default.
- W3102049645 cites W2963878474 @default.
- W3102049645 cites W2963881378 @default.
- W3102049645 cites W2966523470 @default.
- W3102049645 cites W2979860801 @default.
- W3102049645 cites W2981595954 @default.
- W3102049645 cites W2982026851 @default.
- W3102049645 cites W2999934079 @default.
- W3102049645 cites W3098090606 @default.
- W3102049645 cites W3098218837 @default.
- W3102049645 cites W3105744362 @default.
- W3102049645 cites W3106250896 @default.
- W3102049645 cites W3106450905 @default.
- W3102049645 cites W41316734 @default.
- W3102049645 doi "https://doi.org/10.1007/s10044-020-00914-4" @default.
- W3102049645 hasPublicationYear "2020" @default.
- W3102049645 type Work @default.
- W3102049645 sameAs 3102049645 @default.
- W3102049645 citedByCount "11" @default.
- W3102049645 countsByYear W31020496452021 @default.
- W3102049645 countsByYear W31020496452022 @default.
- W3102049645 countsByYear W31020496452023 @default.
- W3102049645 crossrefType "journal-article" @default.
- W3102049645 hasAuthorship W3102049645A5030978363 @default.
- W3102049645 hasAuthorship W3102049645A5064646117 @default.
- W3102049645 hasAuthorship W3102049645A5065781653 @default.
- W3102049645 hasAuthorship W3102049645A5085641811 @default.
- W3102049645 hasBestOaLocation W31020496452 @default.
- W3102049645 hasConcept C108583219 @default.
- W3102049645 hasConcept C119857082 @default.
- W3102049645 hasConcept C13280743 @default.
- W3102049645 hasConcept C134306372 @default.
- W3102049645 hasConcept C144024400 @default.
- W3102049645 hasConcept C151730666 @default.
- W3102049645 hasConcept C153180895 @default.
- W3102049645 hasConcept C154945302 @default.
- W3102049645 hasConcept C177148314 @default.
- W3102049645 hasConcept C184297639 @default.
- W3102049645 hasConcept C185798385 @default.
- W3102049645 hasConcept C205649164 @default.
- W3102049645 hasConcept C2779304628 @default.
- W3102049645 hasConcept C2779343474 @default.
- W3102049645 hasConcept C33923547 @default.
- W3102049645 hasConcept C36289849 @default.
- W3102049645 hasConcept C41008148 @default.
- W3102049645 hasConcept C81363708 @default.
- W3102049645 hasConcept C86803240 @default.
- W3102049645 hasConceptScore W3102049645C108583219 @default.
- W3102049645 hasConceptScore W3102049645C119857082 @default.
- W3102049645 hasConceptScore W3102049645C13280743 @default.
- W3102049645 hasConceptScore W3102049645C134306372 @default.
- W3102049645 hasConceptScore W3102049645C144024400 @default.