Matches in SemOpenAlex for { <https://semopenalex.org/work/W1605856042> ?p ?o ?g. }
Showing items 1 to 88 of 88, with 100 items per page.
- W1605856042 abstract "For the last decade, researchers in many fields have pursued the creation of systems capable of human abilities. One of the most admired human qualities is the sense of vision, something that seems so easy to us but has not yet been fully understood. In the scientific literature, face recognition has been extensively studied and, in some cases, successfully simulated. According to the Biometric International Group, biometrics today represent not only a major security application but also an expanding business, as shown in Fig. 1a. Moreover, as can be seen in Fig. 1b, facial identification has been identified as one of the most important biometric modalities (Biometric International Group 2010). However, face recognition is not an easy task. Systems can be trained to recognize subjects in a given setting, but over time the characteristics of the scenario (lighting, face perspective, image quality) can change and mislead the system. In fact, the subject's own face varies over time (glasses, hats, stubble). These are major problems that face recognition systems must handle using different techniques. Since a face can appear in any position within a picture, the first step is to locate it. However, this is far from the end of the problem, since within that location a face can present a number of orientations. One approach to these problems is to normalize spatial position (variation of translation) and rotation degree (variation of rotation) by analyzing specific face reference points (Liun & He, 2008). There are plenty of publications on gender classification, combining different techniques and models in an attempt to improve on state-of-the-art performance. For example, (Chennamma et al., 2010) addressed the problem of face or person identification from heavily altered facial images and manipulated faces generated by face transformation software tools available online. They proposed SIFT features for efficient face identification. Their dataset consisted of 100 face images downloaded from http://www.thesmokinggun.com/mugshots, reaching an identification rate of up to 92%. In (Chen-Chung & Shiuan-You Chin, 2010), the RGB images are transformed into the YIQ domain. As a first step, (Chen-Chung & Shiuan-You Chin, 2010) took the Y component and applied a wavelet transformation. Then, the binary two-dimensional principal components (B2DPC) were extracted. Finally, an SVM was used as the classifier. On a database of 24 subjects, with 6 samples per user, (Chen-Chung & Shiuan-You Chin, 2010) achieved an average identification rate between 96.37% and 100%." @default.
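The abstract's first preprocessing step, transforming RGB images into the YIQ domain before extracting the Y (luminance) component, can be sketched as follows. This is a minimal illustration using the standard NTSC conversion matrix, not the cited authors' code; the subsequent wavelet, B2DPC, and SVM stages are not reproduced here.

```python
# Sketch of the RGB -> YIQ color-space conversion described in the abstract.
# Coefficients are the standard NTSC transform matrix.

def rgb_to_yiq(r: float, g: float, b: float) -> tuple:
    """Convert normalized RGB components (each in 0..1) to YIQ."""
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luminance: the component kept for recognition
    i = 0.596 * r - 0.274 * g - 0.322 * b   # in-phase chrominance
    q = 0.211 * r - 0.523 * g + 0.312 * b   # quadrature chrominance
    return (y, i, q)

# Pure white maps to full luminance and zero chrominance;
# pure black maps to all zeros.
white = rgb_to_yiq(1.0, 1.0, 1.0)
black = rgb_to_yiq(0.0, 0.0, 0.0)
```

Working on Y alone discards color information, which makes the later wavelet decomposition operate on a single grayscale channel.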
- W1605856042 created "2016-06-24" @default.
- W1605856042 creator A5080693200 @default.
- W1605856042 creator A5085214984 @default.
- W1605856042 creator A5090939243 @default.
- W1605856042 date "2011-10-21" @default.
- W1605856042 modified "2023-09-30" @default.
- W1605856042 title "Facial Identification Based on Transform Domains for Images and Videos" @default.
- W1605856042 cites W1489793438 @default.
- W1605856042 cites W1521222528 @default.
- W1605856042 cites W1548802052 @default.
- W1605856042 cites W1554663460 @default.
- W1605856042 cites W1555148682 @default.
- W1605856042 cites W1576520375 @default.
- W1605856042 cites W1599542129 @default.
- W1605856042 cites W1601740268 @default.
- W1605856042 cites W1603777779 @default.
- W1605856042 cites W1604938182 @default.
- W1605856042 cites W1980658651 @default.
- W1605856042 cites W1992137287 @default.
- W1605856042 cites W2014643786 @default.
- W1605856042 cites W2027585167 @default.
- W1605856042 cites W2040621208 @default.
- W1605856042 cites W2045621754 @default.
- W1605856042 cites W2047858664 @default.
- W1605856042 cites W2072538254 @default.
- W1605856042 cites W2080484987 @default.
- W1605856042 cites W2089035607 @default.
- W1605856042 cites W2099230815 @default.
- W1605856042 cites W2100743446 @default.
- W1605856042 cites W2103042600 @default.
- W1605856042 cites W2109975807 @default.
- W1605856042 cites W2110594266 @default.
- W1605856042 cites W2111294859 @default.
- W1605856042 cites W2114461480 @default.
- W1605856042 cites W2116551122 @default.
- W1605856042 cites W2117152910 @default.
- W1605856042 cites W2142681617 @default.
- W1605856042 cites W2150735546 @default.
- W1605856042 cites W2155190225 @default.
- W1605856042 cites W2159372481 @default.
- W1605856042 cites W262970172 @default.
- W1605856042 cites W580594664 @default.
- W1605856042 doi "https://doi.org/10.5772/17072" @default.
- W1605856042 hasPublicationYear "2011" @default.
- W1605856042 type Work @default.
- W1605856042 sameAs 1605856042 @default.
- W1605856042 citedByCount "2" @default.
- W1605856042 countsByYear W16058560422014 @default.
- W1605856042 countsByYear W16058560422016 @default.
- W1605856042 crossrefType "book-chapter" @default.
- W1605856042 hasAuthorship W1605856042A5080693200 @default.
- W1605856042 hasAuthorship W1605856042A5085214984 @default.
- W1605856042 hasAuthorship W1605856042A5090939243 @default.
- W1605856042 hasBestOaLocation W16058560421 @default.
- W1605856042 hasConcept C116834253 @default.
- W1605856042 hasConcept C153180895 @default.
- W1605856042 hasConcept C154945302 @default.
- W1605856042 hasConcept C31972630 @default.
- W1605856042 hasConcept C41008148 @default.
- W1605856042 hasConcept C59822182 @default.
- W1605856042 hasConcept C86803240 @default.
- W1605856042 hasConceptScore W1605856042C116834253 @default.
- W1605856042 hasConceptScore W1605856042C153180895 @default.
- W1605856042 hasConceptScore W1605856042C154945302 @default.
- W1605856042 hasConceptScore W1605856042C31972630 @default.
- W1605856042 hasConceptScore W1605856042C41008148 @default.
- W1605856042 hasConceptScore W1605856042C59822182 @default.
- W1605856042 hasConceptScore W1605856042C86803240 @default.
- W1605856042 hasLocation W16058560421 @default.
- W1605856042 hasLocation W16058560422 @default.
- W1605856042 hasLocation W16058560423 @default.
- W1605856042 hasOpenAccess W1605856042 @default.
- W1605856042 hasPrimaryLocation W16058560421 @default.
- W1605856042 hasRelatedWork W1891287906 @default.
- W1605856042 hasRelatedWork W1969923398 @default.
- W1605856042 hasRelatedWork W2036807459 @default.
- W1605856042 hasRelatedWork W2058170566 @default.
- W1605856042 hasRelatedWork W2170022336 @default.
- W1605856042 hasRelatedWork W2229312674 @default.
- W1605856042 hasRelatedWork W258625772 @default.
- W1605856042 hasRelatedWork W2755342338 @default.
- W1605856042 hasRelatedWork W2772917594 @default.
- W1605856042 hasRelatedWork W3116076068 @default.
- W1605856042 isParatext "false" @default.
- W1605856042 isRetracted "false" @default.
- W1605856042 magId "1605856042" @default.
- W1605856042 workType "book-chapter" @default.