Matches in SemOpenAlex for { <https://semopenalex.org/work/W2084263387> ?p ?o ?g. }
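The pattern above can be issued as a SPARQL query against the SemOpenAlex endpoint to reproduce this listing; a minimal sketch is shown below. The endpoint URL (`https://semopenalex.org/sparql`) is an assumption and not part of this listing, and the `?g` variable is bound via a `GRAPH` clause, since named graphs cannot appear in a plain basic graph pattern.

```sparql
# Sketch: retrieve all predicate/object pairs (and their named graph)
# for the work W2084263387. Endpoint assumed: https://semopenalex.org/sparql
SELECT ?p ?o ?g
WHERE {
  GRAPH ?g {
    <https://semopenalex.org/work/W2084263387> ?p ?o .
  }
}
```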
- W2084263387 abstract "Human-Robot Interaction (HRI) studies have recently received increasing attention in various fields, from academic communities to engineering firms and the media. Many researchers have been focusing on the development of tools to evaluate the performance of robotic systems and studying how to extend the range of robot interaction modalities and contexts. Because people are emotionally engaged when interacting with computers and robots, researchers have been focusing attention on the study of affective human-robot interaction. This new field of study requires the integration of various approaches typical of different research backgrounds, such as psychology and engineering, to gain more insight into the human-robot affective interaction. In this paper, we report the development of a multimodal acquisition platform called HIPOP (Human Interaction Pervasive Observation Platform). HIPOP is a modular data-gathering platform based on various hardware and software units that can be easily used to create a custom acquisition setup for HRI studies. The platform uses modules for physiological signals, eye gaze, video and audio acquisition to perform an integrated affective and behavioral analysis. It is also possible to include new hardware devices into the platform. The open-source hardware and software revolution has made many high-quality commercial and open-source products freely available for HRI and HCI research. These devices are currently most often used for data acquisition and robot control, and they can be easily included in HIPOP. Technical tests demonstrated the ability of HIPOP to reliably acquire a large set of data in terms of failure management and data synchronization. The platform was able to automatically recover from errors and faults without affecting the entire system, and the misalignment observed in the acquired data was not significant and did not affect the multimodal analysis. HIPOP was also tested in the context of the FACET (FACE Therapy) project, in which a humanoid robot called FACE (Facial Automaton for Conveying Emotions) was used to convey affective stimuli to children with autism. In the FACET project, psychologists without technical skills were able to use HIPOP to collect the data needed for their experiments without dealing with hardware issues, data integration challenges, or synchronization problems. The FACET case study highlighted the real core feature of the HIPOP platform (i.e., multimodal data integration and fusion). This analytical approach allowed psychologists to study both behavioral and psychophysiological reactions to obtain a more complete view of the subjects' state during interaction with the robot. These results indicate that HIPOP could become an innovative tool for HRI affective studies aimed at inferring a more detailed view of a subject's feelings and behavior during interaction with affective and empathic robots." @default.
- W2084263387 created "2016-06-24" @default.
- W2084263387 creator A5018186881 @default.
- W2084263387 creator A5024349725 @default.
- W2084263387 creator A5041499996 @default.
- W2084263387 date "2014-06-01" @default.
- W2084263387 modified "2023-09-23" @default.
- W2084263387 title "Development and Testing of a Multimodal Acquisition Platform for Human-Robot Interaction Affective Studies" @default.
- W2084263387 cites W14468699 @default.
- W2084263387 cites W1581387623 @default.
- W2084263387 cites W1594278800 @default.
- W2084263387 cites W1727904711 @default.
- W2084263387 cites W1748703215 @default.
- W2084263387 cites W1966706482 @default.
- W2084263387 cites W1972978214 @default.
- W2084263387 cites W1978980572 @default.
- W2084263387 cites W1980882881 @default.
- W2084263387 cites W1992275931 @default.
- W2084263387 cites W1992739204 @default.
- W2084263387 cites W1994405094 @default.
- W2084263387 cites W2009857749 @default.
- W2084263387 cites W2011210641 @default.
- W2084263387 cites W2015403432 @default.
- W2084263387 cites W2017264794 @default.
- W2084263387 cites W2041282815 @default.
- W2084263387 cites W2041649634 @default.
- W2084263387 cites W2043180902 @default.
- W2084263387 cites W2044659637 @default.
- W2084263387 cites W2047792798 @default.
- W2084263387 cites W2047810479 @default.
- W2084263387 cites W2053101950 @default.
- W2084263387 cites W2060349841 @default.
- W2084263387 cites W2061230281 @default.
- W2084263387 cites W2066403355 @default.
- W2084263387 cites W2076313098 @default.
- W2084263387 cites W2077741723 @default.
- W2084263387 cites W2078671978 @default.
- W2084263387 cites W2095844678 @default.
- W2084263387 cites W2109923073 @default.
- W2084263387 cites W2121318724 @default.
- W2084263387 cites W2125109199 @default.
- W2084263387 cites W2134415008 @default.
- W2084263387 cites W2142034317 @default.
- W2084263387 cites W2143580450 @default.
- W2084263387 cites W2144408839 @default.
- W2084263387 cites W2152193807 @default.
- W2084263387 cites W2153738822 @default.
- W2084263387 cites W2162951543 @default.
- W2084263387 cites W2163583870 @default.
- W2084263387 cites W2164368909 @default.
- W2084263387 cites W2165010300 @default.
- W2084263387 cites W2167557160 @default.
- W2084263387 cites W2605232330 @default.
- W2084263387 cites W63886951 @default.
- W2084263387 doi "https://doi.org/10.5898/jhri.3.2.lazzeri" @default.
- W2084263387 hasPublicationYear "2014" @default.
- W2084263387 type Work @default.
- W2084263387 sameAs 2084263387 @default.
- W2084263387 citedByCount "15" @default.
- W2084263387 countsByYear W20842633872014 @default.
- W2084263387 countsByYear W20842633872015 @default.
- W2084263387 countsByYear W20842633872016 @default.
- W2084263387 countsByYear W20842633872018 @default.
- W2084263387 countsByYear W20842633872019 @default.
- W2084263387 countsByYear W20842633872020 @default.
- W2084263387 countsByYear W20842633872021 @default.
- W2084263387 countsByYear W20842633872022 @default.
- W2084263387 crossrefType "journal-article" @default.
- W2084263387 hasAuthorship W2084263387A5018186881 @default.
- W2084263387 hasAuthorship W2084263387A5024349725 @default.
- W2084263387 hasAuthorship W2084263387A5041499996 @default.
- W2084263387 hasBestOaLocation W20842633871 @default.
- W2084263387 hasConcept C101468663 @default.
- W2084263387 hasConcept C107457646 @default.
- W2084263387 hasConcept C111919701 @default.
- W2084263387 hasConcept C127162648 @default.
- W2084263387 hasConcept C144024400 @default.
- W2084263387 hasConcept C145460709 @default.
- W2084263387 hasConcept C154945302 @default.
- W2084263387 hasConcept C163985040 @default.
- W2084263387 hasConcept C199360897 @default.
- W2084263387 hasConcept C2777904410 @default.
- W2084263387 hasConcept C2778562939 @default.
- W2084263387 hasConcept C2779903281 @default.
- W2084263387 hasConcept C31258907 @default.
- W2084263387 hasConcept C36289849 @default.
- W2084263387 hasConcept C41008148 @default.
- W2084263387 hasConcept C90509273 @default.
- W2084263387 hasConceptScore W2084263387C101468663 @default.
- W2084263387 hasConceptScore W2084263387C107457646 @default.
- W2084263387 hasConceptScore W2084263387C111919701 @default.
- W2084263387 hasConceptScore W2084263387C127162648 @default.
- W2084263387 hasConceptScore W2084263387C144024400 @default.
- W2084263387 hasConceptScore W2084263387C145460709 @default.
- W2084263387 hasConceptScore W2084263387C154945302 @default.
- W2084263387 hasConceptScore W2084263387C163985040 @default.
- W2084263387 hasConceptScore W2084263387C199360897 @default.
- W2084263387 hasConceptScore W2084263387C2777904410 @default.
- W2084263387 hasConceptScore W2084263387C2778562939 @default.
- W2084263387 hasConceptScore W2084263387C2779903281 @default.