Matches in SemOpenAlex for { <https://semopenalex.org/work/W2989431432> ?p ?o ?g. }
- W2989431432 abstract "Machine learning systems are being used to automate many types of laborious labeling tasks. Facial action coding is an example of such a labeling task that requires copious amounts of time and a beyond average level of human domain expertise. In recent years, the use of end-to-end deep neural networks has led to significant improvements in action unit recognition performance and many network architectures have been proposed. Do the more complex deep neural network (DNN) architectures perform sufficiently well to justify the additional complexity? We show that pre-training on a large diverse set of noisy data can result in even a simple CNN model improving over the current state-of-the-art DNN architectures. The average F1-score achieved with our proposed method on the DISFA dataset is 0.60, compared to a previous state-of-the-art of 0.57. Additionally, we show how the number of subjects and number of images used for pre-training impacts the model performance. The approach that we have outlined is open-source, highly scalable, and not dependent on the model architecture. We release the code and data: this https URL." @default.
- W2989431432 created "2019-11-22" @default.
- W2989431432 creator A5040219094 @default.
- W2989431432 creator A5064963095 @default.
- W2989431432 date "2019-11-14" @default.
- W2989431432 modified "2023-09-27" @default.
- W2989431432 title "A Scalable Approach for Facial Action Unit Classifier Training Using Noisy Data for Pre-Training." @default.
- W2989431432 cites W1566413196 @default.
- W2989431432 cites W1595126664 @default.
- W2989431432 cites W1661563386 @default.
- W2989431432 cites W1850070871 @default.
- W2989431432 cites W1967460643 @default.
- W2989431432 cites W1988298387 @default.
- W2989431432 cites W1991142926 @default.
- W2989431432 cites W2000567862 @default.
- W2989431432 cites W2008933718 @default.
- W2989431432 cites W2045472600 @default.
- W2989431432 cites W2046875449 @default.
- W2989431432 cites W2072128103 @default.
- W2989431432 cites W2084472488 @default.
- W2989431432 cites W2098615198 @default.
- W2989431432 cites W2101965618 @default.
- W2989431432 cites W2103943262 @default.
- W2989431432 cites W2138857742 @default.
- W2989431432 cites W2178237821 @default.
- W2989431432 cites W2247411543 @default.
- W2989431432 cites W2280620570 @default.
- W2989431432 cites W2345729520 @default.
- W2989431432 cites W2405287689 @default.
- W2989431432 cites W2421475762 @default.
- W2989431432 cites W2436394355 @default.
- W2989431432 cites W2513383847 @default.
- W2989431432 cites W2557283755 @default.
- W2989431432 cites W2609141003 @default.
- W2989431432 cites W2612215259 @default.
- W2989431432 cites W2613634265 @default.
- W2989431432 cites W2729069466 @default.
- W2989431432 cites W2769429431 @default.
- W2989431432 cites W2780309588 @default.
- W2989431432 cites W2790681991 @default.
- W2989431432 cites W2792605498 @default.
- W2989431432 cites W2798545058 @default.
- W2989431432 cites W2807126412 @default.
- W2989431432 cites W2915886991 @default.
- W2989431432 cites W2948303854 @default.
- W2989431432 cites W2951151581 @default.
- W2989431432 cites W2952419167 @default.
- W2989431432 cites W2953773463 @default.
- W2989431432 cites W2969059826 @default.
- W2989431432 cites W2969824923 @default.
- W2989431432 cites W3125537303 @default.
- W2989431432 cites W3146803896 @default.
- W2989431432 cites W3210232381 @default.
- W2989431432 hasPublicationYear "2019" @default.
- W2989431432 type Work @default.
- W2989431432 sameAs 2989431432 @default.
- W2989431432 citedByCount "2" @default.
- W2989431432 countsByYear W29894314322020 @default.
- W2989431432 crossrefType "posted-content" @default.
- W2989431432 hasAuthorship W2989431432A5040219094 @default.
- W2989431432 hasAuthorship W2989431432A5064963095 @default.
- W2989431432 hasConcept C108583219 @default.
- W2989431432 hasConcept C119857082 @default.
- W2989431432 hasConcept C154945302 @default.
- W2989431432 hasConcept C162324750 @default.
- W2989431432 hasConcept C177264268 @default.
- W2989431432 hasConcept C187736073 @default.
- W2989431432 hasConcept C199360897 @default.
- W2989431432 hasConcept C2776760102 @default.
- W2989431432 hasConcept C2780451532 @default.
- W2989431432 hasConcept C2984842247 @default.
- W2989431432 hasConcept C41008148 @default.
- W2989431432 hasConcept C48044578 @default.
- W2989431432 hasConcept C50644808 @default.
- W2989431432 hasConcept C51632099 @default.
- W2989431432 hasConcept C77088390 @default.
- W2989431432 hasConcept C95623464 @default.
- W2989431432 hasConceptScore W2989431432C108583219 @default.
- W2989431432 hasConceptScore W2989431432C119857082 @default.
- W2989431432 hasConceptScore W2989431432C154945302 @default.
- W2989431432 hasConceptScore W2989431432C162324750 @default.
- W2989431432 hasConceptScore W2989431432C177264268 @default.
- W2989431432 hasConceptScore W2989431432C187736073 @default.
- W2989431432 hasConceptScore W2989431432C199360897 @default.
- W2989431432 hasConceptScore W2989431432C2776760102 @default.
- W2989431432 hasConceptScore W2989431432C2780451532 @default.
- W2989431432 hasConceptScore W2989431432C2984842247 @default.
- W2989431432 hasConceptScore W2989431432C41008148 @default.
- W2989431432 hasConceptScore W2989431432C48044578 @default.
- W2989431432 hasConceptScore W2989431432C50644808 @default.
- W2989431432 hasConceptScore W2989431432C51632099 @default.
- W2989431432 hasConceptScore W2989431432C77088390 @default.
- W2989431432 hasConceptScore W2989431432C95623464 @default.
- W2989431432 hasLocation W29894314321 @default.
- W2989431432 hasOpenAccess W2989431432 @default.
- W2989431432 hasPrimaryLocation W29894314321 @default.
- W2989431432 hasRelatedWork W2401239971 @default.
- W2989431432 hasRelatedWork W2510182739 @default.
- W2989431432 hasRelatedWork W2769937543 @default.
- W2989431432 hasRelatedWork W2786085341 @default.
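The triples above were matched with the pattern `{ <https://semopenalex.org/work/W2989431432> ?p ?o ?g. }`. As a sketch, a complete query for this pattern against the public SemOpenAlex SPARQL endpoint (assumed to be at https://semopenalex.org/sparql) might look like:

```sparql
# Sketch: list all predicate/object pairs for this work.
# Endpoint and graph handling are assumptions, not confirmed by this record.
SELECT ?p ?o ?g
WHERE {
  GRAPH ?g {
    <https://semopenalex.org/work/W2989431432> ?p ?o .
  }
}
```

Restricting `?p` (e.g. to `<https://semopenalex.org/ontology/cites>`) would return only the citation triples listed above.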