Matches in SemOpenAlex for { <https://semopenalex.org/work/W2024662007> ?p ?o ?g. }
Showing items 1 to 48 of 48, with 100 items per page.
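The quad pattern shown above could, in principle, be run directly against a SPARQL endpoint. A minimal sketch in Python with SPARQLWrapper, assuming the public SemOpenAlex endpoint at https://semopenalex.org/sparql (the endpoint address is an assumption, not confirmed by this page):

    # Sketch: the pattern { <...W2024662007> ?p ?o ?g . } as a named-graph query.
    from SPARQLWrapper import SPARQLWrapper, JSON

    ENDPOINT = "https://semopenalex.org/sparql"  # assumed endpoint address

    QUERY = """
    SELECT ?p ?o ?g WHERE {
      GRAPH ?g { <https://semopenalex.org/work/W2024662007> ?p ?o . }
    }
    LIMIT 100
    """

    sparql = SPARQLWrapper(ENDPOINT)
    sparql.setQuery(QUERY)
    sparql.setReturnFormat(JSON)
    results = sparql.query().convert()

    # Each binding corresponds to one row of the listing below.
    for b in results["results"]["bindings"]:
        print(b["p"]["value"], b["o"]["value"])

LIMIT 100 mirrors the page size reported above.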
- W2024662007 abstract "Astudyonman-machinecollab orationbyusingfacialexpressionsY. DaiaS.Katahera andD. CaiaFaculty of softwareandinformation science,IwatePref.Univ,JapanABSTRACTForrealizingtheexibleman-machinecollab oration,understandingoffacialexpressionsandgesturesisnotnegligible.Inour metho d,weprop osed a hierarchical recognition approach,for theunderstanding of humanemotions.Accordingtothismetho d,thefacialAFs(actionfeatures)were rstlyextractedandrecognizedby using histograms of optical ow.Then, based on the facial AFs, facial expressions were classi ed into twocalsses, one of which presents the p ositive emotions, and the other of which do es the negativeones.Accordingly,the facial expressions b elonged to the p ositive class, or the ones b elonged to the negative class, were classi edinto more complex emotions, whichwere revealed by the corresp onding facial expressions.Finally, the system architecture how to co ordinate in recognizing facil action features and facial expressionsfor man-machine collab oration was prop osed.Keywords:understanding of facial expressions, facial action features, hierarchical recognition1. INTRODUCTIONOn-line communication, collab oration and communities give rise to exciting new application areas for computers.Forrealizingtheexibleman-machinecollab oration,understandingoffacialexpressionsandgesturesisnotnegligible.Although nonverbal cues and clues to the underlying structure of communication were studied bysome researchers,7what they mainly used as cues are head no ds and shakes, eye-brow p osition and eye gaze.It is a little rough for understanding of human emotions to utilize only these nonverbal cues and clues.On the other hand, manyofcurrent researchs fo cus on the study of recognizing the happiness, sadness, sur-prise, fear, anger and disgust, which are considered to b e universally asso ciated with distinct facial expressions.In these researches, the metho dsof expression classi cations prop osed(,1,2346)almost employ a represen-tationof facial action units,whichisbased onthedescriptions ofepicof facialexpressions suggested byEkman.10In,so called 44 AUs(ActionUnits)which can describ e all the facial expressions was prop osed.But, by observing the exp eriment results of our analyzing the facial action features,5except 44 Aus, there wereother facial action features extracted, such face sideways or forehead motion, whichreveal the human emotionsmore distinctly.Furthermore,practically,exceptthesix principle expressions ab ove, itis necessary to recognize the otherexpressions insome cases,suchasthesystem ofmonitoring patients onb ed,for whichtheexpressions whatdo ctors are concerned with are happiness, easiness, uneasiness, disgust, su ering, and surprise.In,5a systemintegration metho d of monitoring patients on b ed by utilizing the facial expression recognition was prop osed.Because in the case of nursing patients, what the do ctor is concerned with are the emotions of happiness, ease,unease, disgust, su ering, and surprise, these expressions were recognized.But, this metho d seems to b e shortof generality.Inourmetho d,basedonthefacialactionfeatureanalysisab outemotioninexp erimentswhichwerementioned in,5we prop ose a hierarchical recognition approachwhich is more close to the biology characteristicsofemotion,9fortheunderstandinghumanemotions.Accordingtothismetho d,facialAFs(actionfeatures) are rstly extracted and recognized.Then,based on the facial AFs,facial expressions are classi edFurther author information:(Send corresp ondence to Y. 
D.)Y.D.:E-mail:dai@soft.iwate-pu.ac.jp" @default.
- W2024662007 created "2016-06-24" @default.
- W2024662007 creator A5037518844 @default.
- W2024662007 creator A5037942269 @default.
- W2024662007 creator A5086880652 @default.
- W2024662007 date "2002-09-04" @default.
- W2024662007 modified "2023-09-23" @default.
- W2024662007 title "Man-machine collaboration using facial expressions" @default.
- W2024662007 doi "https://doi.org/10.1117/12.481593" @default.
- W2024662007 hasPublicationYear "2002" @default.
- W2024662007 type Work @default.
- W2024662007 sameAs 2024662007 @default.
- W2024662007 citedByCount "0" @default.
- W2024662007 crossrefType "proceedings-article" @default.
- W2024662007 hasAuthorship W2024662007A5037518844 @default.
- W2024662007 hasAuthorship W2024662007A5037942269 @default.
- W2024662007 hasAuthorship W2024662007A5086880652 @default.
- W2024662007 hasConcept C107457646 @default.
- W2024662007 hasConcept C154945302 @default.
- W2024662007 hasConcept C195704467 @default.
- W2024662007 hasConcept C204321447 @default.
- W2024662007 hasConcept C23123220 @default.
- W2024662007 hasConcept C28490314 @default.
- W2024662007 hasConcept C41008148 @default.
- W2024662007 hasConceptScore W2024662007C107457646 @default.
- W2024662007 hasConceptScore W2024662007C154945302 @default.
- W2024662007 hasConceptScore W2024662007C195704467 @default.
- W2024662007 hasConceptScore W2024662007C204321447 @default.
- W2024662007 hasConceptScore W2024662007C23123220 @default.
- W2024662007 hasConceptScore W2024662007C28490314 @default.
- W2024662007 hasConceptScore W2024662007C41008148 @default.
- W2024662007 hasLocation W20246620071 @default.
- W2024662007 hasOpenAccess W2024662007 @default.
- W2024662007 hasPrimaryLocation W20246620071 @default.
- W2024662007 hasRelatedWork W2101955803 @default.
- W2024662007 hasRelatedWork W2119214692 @default.
- W2024662007 hasRelatedWork W2144190808 @default.
- W2024662007 hasRelatedWork W2151447942 @default.
- W2024662007 hasRelatedWork W2357241418 @default.
- W2024662007 hasRelatedWork W2366644548 @default.
- W2024662007 hasRelatedWork W2376314740 @default.
- W2024662007 hasRelatedWork W2384888906 @default.
- W2024662007 hasRelatedWork W2469626427 @default.
- W2024662007 hasRelatedWork W2611614995 @default.
- W2024662007 isParatext "false" @default.
- W2024662007 isRetracted "false" @default.
- W2024662007 magId "2024662007" @default.
- W2024662007 workType "article" @default.
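The abstract literal above outlines a three-step hierarchy: facial AFs are extracted from optical-flow histograms, expressions are split into a positive and a negative class, and each class is then refined into finer emotions. A purely illustrative sketch of that control flow, in which every function, threshold and label grouping is a hypothetical stand-in (the paper's actual features and classifiers are not given on this page):

    # Illustrative sketch of the hierarchical recognition pipeline from the
    # abstract. All names, thresholds and label groupings are assumptions.
    from typing import Sequence

    POSITIVE = ["ease", "happiness"]                      # assumed grouping
    NEGATIVE = ["disgust", "suffering", "surprise", "unease"]

    def extract_afs(flow_histograms: Sequence[Sequence[float]]) -> list:
        # Step 1: derive action features (AFs) from optical-flow histograms.
        # Placeholder: one scalar per histogram (its mean bin value).
        return [sum(h) / len(h) for h in flow_histograms]

    def classify_valence(afs: list) -> str:
        # Step 2: coarse positive/negative split; placeholder threshold rule.
        return "positive" if sum(afs) >= 0.0 else "negative"

    def classify_emotion(afs: list, valence: str) -> str:
        # Step 3: refine within the coarse class. A real system would apply
        # a classifier trained per class; here we key the label to the
        # strongest AF purely for illustration.
        labels = POSITIVE if valence == "positive" else NEGATIVE
        strongest = max(range(len(afs)), key=lambda i: afs[i]) if afs else 0
        return labels[strongest % len(labels)]

    def recognize(flow_histograms: Sequence[Sequence[float]]) -> str:
        afs = extract_afs(flow_histograms)
        return classify_emotion(afs, classify_valence(afs))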
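Separately, entity URIs such as the related works listed above are typically dereferenceable as Linked Data. A sketch, assuming the server honors content negotiation for Turtle (common for such services, but not verified here):

    # Sketch: fetch the RDF description of one listed related work.
    import requests

    uri = "https://semopenalex.org/work/W2101955803"
    resp = requests.get(uri, headers={"Accept": "text/turtle"}, timeout=30)
    resp.raise_for_status()
    print(resp.text[:500])  # first few hundred characters of Turtle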