Matches in SemOpenAlex for { <https://semopenalex.org/work/W2138544107> ?p ?o ?g. }
Showing items 1 to 70 of 70, with 100 items per page.
- W2138544107 abstract "Three-dimensional face recognition is a challenging task with a large number of proposed solutions [1, 2]. With variations in pose and expression, identifying a face scan based on its 3D geometry is difficult. To improve on this task and to evaluate existing face matching methods, large sets of 3D face scans were constructed, such as the FRGC [3], BU-3DFE [4], and GavabDB [5] databases. When used in the same experimental way, these publicly available sets allow for a fair comparison of different methods. Usually, researchers compare the recognition rates (or identification rates) of different methods. To identify a person, his or her 3D face scan is enrolled as a query in the database, and if the most similar scan (other than the query itself) in the database belongs to the same person, he or she is identified correctly. For a set of queries, the recognition rate is computed as the average of zeros (no identification) and ones (correct identification). However, the recognition rate is a limited evaluation measure, because it considers merely the closest match of each query. If a database contains two scans per expression per subject and each scan is used as a query once, the matching scan is almost certain to appear at the top of the ranked list. Such an experiment boosts the recognition rate, but gives no insight into the expression invariance of different methods. For that, an evaluation measure is required that takes a larger part of the ranked list into account. In this contest we compare different face matching methods using a large number of performance measures. As a test set we have used a processed subset of the GavabDB [5], which contains several expressions and pose variations per subject. 2 DATABASE For the retrieval contest of 3D faces we have used a subset of the GavabDB [5]. The GavabDB consists of Minolta Vi-700 laser range scans of 61 different subjects. The subjects, of which 45 are male and 16 are female, are all Caucasian. Each subject was scanned nine times for different poses and expressions, namely six neutral-expression scans and three scans with an expression. The neutral scans include two different frontal scans, one scan while looking up (+35°), one scan while looking down (-35°), one scan from the right side (+90°), and one from the left side (-90°). The expression scans include one with a smile, one with a pronounced laugh, and an “arbitrary expression” freely chosen by the subject." @default.
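The abstract's rank-1 recognition rate (the average of zeros and ones over all queries, considering only each query's closest match) can be sketched as follows. This is an illustrative implementation, not the contest's actual evaluation code; the function name, the distance-matrix input convention, and the toy data are assumptions for the example.

```python
import numpy as np

def recognition_rate(dist, labels):
    """Rank-1 recognition rate: the fraction of queries whose closest
    match in the database (excluding the query itself) belongs to the
    same subject. `dist` is a symmetric n x n dissimilarity matrix,
    `labels` gives the subject identity of each of the n scans."""
    d = np.asarray(dist, dtype=float).copy()
    labels = np.asarray(labels)
    np.fill_diagonal(d, np.inf)        # a scan may not match itself
    nearest = d.argmin(axis=1)         # index of each query's top match
    # average of ones (correct identification) and zeros (no identification)
    return float((labels[nearest] == labels).mean())

# Toy example: 4 scans of 2 subjects, two scans each. Scans of the same
# subject are closer to each other than to scans of the other subject,
# so every query retrieves its twin and the rate is 1.0.
labels = [0, 0, 1, 1]
dist = [[0.0, 0.2, 0.9, 0.8],
        [0.2, 0.0, 0.7, 0.9],
        [0.9, 0.7, 0.0, 0.1],
        [0.8, 0.9, 0.1, 0.0]]
print(recognition_rate(dist, labels))  # → 1.0
```

As the abstract notes, with two scans per expression per subject this number saturates easily, which is why the contest also reports measures that look deeper into the ranked list.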
- W2138544107 created "2016-06-24" @default.
- W2138544107 creator A5022137808 @default.
- W2138544107 creator A5028096958 @default.
- W2138544107 creator A5084414354 @default.
- W2138544107 date "2008-06-01" @default.
- W2138544107 modified "2023-10-18" @default.
- W2138544107 title "SHape REtrieval contest 2008: 3D face scans" @default.
- W2138544107 cites W2103464684 @default.
- W2138544107 cites W2109992201 @default.
- W2138544107 cites W2116313104 @default.
- W2138544107 cites W2119445003 @default.
- W2138544107 cites W2131131256 @default.
- W2138544107 cites W2161308290 @default.
- W2138544107 doi "https://doi.org/10.1109/smi.2008.4547979" @default.
- W2138544107 hasPublicationYear "2008" @default.
- W2138544107 type Work @default.
- W2138544107 sameAs 2138544107 @default.
- W2138544107 citedByCount "24" @default.
- W2138544107 countsByYear W21385441072012 @default.
- W2138544107 countsByYear W21385441072013 @default.
- W2138544107 countsByYear W21385441072014 @default.
- W2138544107 countsByYear W21385441072015 @default.
- W2138544107 countsByYear W21385441072016 @default.
- W2138544107 countsByYear W21385441072017 @default.
- W2138544107 countsByYear W21385441072018 @default.
- W2138544107 countsByYear W21385441072019 @default.
- W2138544107 countsByYear W21385441072021 @default.
- W2138544107 countsByYear W21385441072023 @default.
- W2138544107 crossrefType "proceedings-article" @default.
- W2138544107 hasAuthorship W2138544107A5022137808 @default.
- W2138544107 hasAuthorship W2138544107A5028096958 @default.
- W2138544107 hasAuthorship W2138544107A5084414354 @default.
- W2138544107 hasConcept C121684516 @default.
- W2138544107 hasConcept C144024400 @default.
- W2138544107 hasConcept C154945302 @default.
- W2138544107 hasConcept C17744445 @default.
- W2138544107 hasConcept C199539241 @default.
- W2138544107 hasConcept C2777582232 @default.
- W2138544107 hasConcept C2779304628 @default.
- W2138544107 hasConcept C31972630 @default.
- W2138544107 hasConcept C36289849 @default.
- W2138544107 hasConcept C41008148 @default.
- W2138544107 hasConceptScore W2138544107C121684516 @default.
- W2138544107 hasConceptScore W2138544107C144024400 @default.
- W2138544107 hasConceptScore W2138544107C154945302 @default.
- W2138544107 hasConceptScore W2138544107C17744445 @default.
- W2138544107 hasConceptScore W2138544107C199539241 @default.
- W2138544107 hasConceptScore W2138544107C2777582232 @default.
- W2138544107 hasConceptScore W2138544107C2779304628 @default.
- W2138544107 hasConceptScore W2138544107C31972630 @default.
- W2138544107 hasConceptScore W2138544107C36289849 @default.
- W2138544107 hasConceptScore W2138544107C41008148 @default.
- W2138544107 hasLocation W21385441071 @default.
- W2138544107 hasOpenAccess W2138544107 @default.
- W2138544107 hasPrimaryLocation W21385441071 @default.
- W2138544107 hasRelatedWork W1891287906 @default.
- W2138544107 hasRelatedWork W1969923398 @default.
- W2138544107 hasRelatedWork W2036807459 @default.
- W2138544107 hasRelatedWork W2058170566 @default.
- W2138544107 hasRelatedWork W2166024367 @default.
- W2138544107 hasRelatedWork W2229312674 @default.
- W2138544107 hasRelatedWork W2755342338 @default.
- W2138544107 hasRelatedWork W2772917594 @default.
- W2138544107 hasRelatedWork W2775347418 @default.
- W2138544107 hasRelatedWork W3116076068 @default.
- W2138544107 isParatext "false" @default.
- W2138544107 isRetracted "false" @default.
- W2138544107 magId "2138544107" @default.
- W2138544107 workType "article" @default.