Matches in SemOpenAlex for { <https://semopenalex.org/work/W2904341128> ?p ?o ?g. }
Showing items 1 to 96 of 96, with 100 items per page.
- W2904341128 abstract "The influence of fatigue on accidents has been demonstrated over time by several research papers, whose conclusions confirm the actual situations faced by drivers every day. Globally, between 10% and 25% of road accidents are caused by fatigue, and studies have shown that professional drivers are more prone to the risk of being involved in an accident than casual drivers. Nowadays, a high percentage of automobile accidents are caused by human error: drivers lacking the necessary level of vigilance or reacting improperly, resulting in inadequate control of their vehicle. The critical role of the human factor in accidents has motivated the development of a new series of countermeasures aimed at preventing or mitigating human error while driving [1]. Concept systems developed and implemented to monitor driver performance in real time help achieve the goal of reducing the overall rate of accidents. Current systems that use image processing and computer vision techniques are able to measure driver fatigue and perceive distraction or engagement levels [2], [3]. These driver assistance systems analyze the driver's head, face, and eye movements to evaluate the driver's engagement level while driving [4]. The work presented in this paper describes a system developed to monitor the driver's face and eye movements with the purpose of identifying incipient cues of driver drowsiness and distraction from the driving task. A video camera captures image frames of the driver's face, which are supplied as input to the system. The proposed image processing system analyzes the video frame by frame, locates the driver's face, detects the eye regions, measures the movement of the eyes and eyelids, and then evaluates drowsiness and distraction levels. The performance characteristics of the proposed driver assistance system have been tested using video sequences acquired during a test drive with a car in a real-world environment. The first part of our work addresses the implementation of a face detection procedure. Although this problem is common and has been tackled by various approaches [5], it remains highly complex, imposing interesting challenges at the system level. One of the challenges is to build a robust implementation that is able to detect or recognize a human face in environments characterized by a high dynamic range of luminosity, occlusion, and scale variation. The complex problem of such a driver assistance system is partially tackled in our work by highlighting the main issues and their impact, and by proposing dedicated solutions that address these challenges. Our implementation proposes solutions, tested in real-world scenarios, to the design challenges that influence the outcome when deciding on the concept of a face detection or recognition system for a real-world application. The solutions proposed to support the design decision process when conceptualizing such a system involve feature extraction methods, holistic matching approaches, and other hybrid methods, described in more detail in the following sections. The second part of our work addresses challenges encountered when developing a concept for driver drowsiness or driver vigilance monitoring. In the more general use case of such a system, it would be of high interest to correlate the environment with what the driver perceives or is aware of while driving. This high-level functionality would allow the system to warn the driver when he or she is not aware of, or does not perceive, something in the surrounding environment.
Our work evaluates the performance of a driver vigilance method that uses data obtained by measuring face and eye region movements. The method employs a face-matching step and a histogram projection approach to extract driver vigilance data from the face and eye regions. Head rotation is used by our method to gather data about the driver's visual focus point, correlated with the orientation of the driver's face. This approach exploits the following advantages of the method: low complexity, efficient detection of head rotation using face template matching, and robust feature extraction from the eye region. The extracted features are then used by our method to evaluate the driver's drowsiness and distraction levels." @default.
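The per-frame pipeline described in the abstract (locate the driver's face, detect the eye regions, derive an eye-openness measure, and accumulate it into a drowsiness cue) can be illustrated with a minimal sketch. The paper does not specify its implementation, so the OpenCV Haar cascades, the histogram-projection openness heuristic, the video file name, and all thresholds below are illustrative assumptions, not the authors' actual method.

```python
import cv2
import numpy as np

# Haar cascade models shipped with opencv-python; assumed here for illustration.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def eye_openness(eye_gray):
    """Rough openness score via a horizontal histogram projection:
    dark rows (pupil/eyelashes) stand out against the mean when the eye is open."""
    inverted = 255 - eye_gray
    row_profile = inverted.sum(axis=1).astype(np.float64)
    return row_profile.max() / (row_profile.mean() + 1e-6)

cap = cv2.VideoCapture("drive_test.mp4")   # hypothetical test-drive recording
closed_frames = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        roi = gray[y:y + h, x:x + w]
        eyes = eye_cascade.detectMultiScale(roi, scaleFactor=1.1, minNeighbors=5)
        if len(eyes) == 0:
            closed_frames += 1             # no eyes found: possibly closed or head turned
        else:
            ex, ey, ew, eh = eyes[0]
            score = eye_openness(roi[ey:ey + eh, ex:ex + ew])
            closed_frames = 0 if score > 2.0 else closed_frames + 1
    if closed_frames > 15:                 # ~0.5 s at 30 fps: illustrative drowsiness cue
        print("drowsiness warning")
cap.release()
```

The head-rotation estimate mentioned in the abstract (face template matching against a stored frontal view, e.g. with cv2.matchTemplate) would sit on top of this loop; it is omitted from the sketch because the paper does not give enough detail to reproduce it faithfully.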
- W2904341128 created "2018-12-22" @default.
- W2904341128 creator A5011676963 @default.
- W2904341128 creator A5012704162 @default.
- W2904341128 creator A5061571227 @default.
- W2904341128 creator A5077812730 @default.
- W2904341128 date "2018-10-01" @default.
- W2904341128 modified "2023-10-18" @default.
- W2904341128 title "Driver Monitoring Using Face Detection and Facial Landmarks" @default.
- W2904341128 cites W1975469340 @default.
- W2904341128 cites W1988790447 @default.
- W2904341128 cites W201421746 @default.
- W2904341128 cites W2050394165 @default.
- W2904341128 cites W2115785119 @default.
- W2904341128 cites W2128221159 @default.
- W2904341128 cites W2134738818 @default.
- W2904341128 cites W2164598857 @default.
- W2904341128 cites W2164752783 @default.
- W2904341128 cites W2218357776 @default.
- W2904341128 cites W2279055374 @default.
- W2904341128 cites W2295580156 @default.
- W2904341128 cites W2405491235 @default.
- W2904341128 cites W2137304373 @default.
- W2904341128 doi "https://doi.org/10.1109/icepe.2018.8559898" @default.
- W2904341128 hasPublicationYear "2018" @default.
- W2904341128 type Work @default.
- W2904341128 sameAs 2904341128 @default.
- W2904341128 citedByCount "3" @default.
- W2904341128 countsByYear W29043411282019 @default.
- W2904341128 countsByYear W29043411282020 @default.
- W2904341128 crossrefType "proceedings-article" @default.
- W2904341128 hasAuthorship W2904341128A5011676963 @default.
- W2904341128 hasAuthorship W2904341128A5012704162 @default.
- W2904341128 hasAuthorship W2904341128A5061571227 @default.
- W2904341128 hasAuthorship W2904341128A5077812730 @default.
- W2904341128 hasConcept C115961682 @default.
- W2904341128 hasConcept C127413603 @default.
- W2904341128 hasConcept C144024400 @default.
- W2904341128 hasConcept C154945302 @default.
- W2904341128 hasConcept C169760540 @default.
- W2904341128 hasConcept C169806903 @default.
- W2904341128 hasConcept C200601418 @default.
- W2904341128 hasConcept C2776378700 @default.
- W2904341128 hasConcept C2779304628 @default.
- W2904341128 hasConcept C2780689630 @default.
- W2904341128 hasConcept C29825287 @default.
- W2904341128 hasConcept C31510193 @default.
- W2904341128 hasConcept C31972630 @default.
- W2904341128 hasConcept C36289849 @default.
- W2904341128 hasConcept C41008148 @default.
- W2904341128 hasConcept C44154836 @default.
- W2904341128 hasConcept C4641261 @default.
- W2904341128 hasConcept C52622490 @default.
- W2904341128 hasConcept C76155785 @default.
- W2904341128 hasConcept C86803240 @default.
- W2904341128 hasConcept C87833898 @default.
- W2904341128 hasConcept C9417928 @default.
- W2904341128 hasConceptScore W2904341128C115961682 @default.
- W2904341128 hasConceptScore W2904341128C127413603 @default.
- W2904341128 hasConceptScore W2904341128C144024400 @default.
- W2904341128 hasConceptScore W2904341128C154945302 @default.
- W2904341128 hasConceptScore W2904341128C169760540 @default.
- W2904341128 hasConceptScore W2904341128C169806903 @default.
- W2904341128 hasConceptScore W2904341128C200601418 @default.
- W2904341128 hasConceptScore W2904341128C2776378700 @default.
- W2904341128 hasConceptScore W2904341128C2779304628 @default.
- W2904341128 hasConceptScore W2904341128C2780689630 @default.
- W2904341128 hasConceptScore W2904341128C29825287 @default.
- W2904341128 hasConceptScore W2904341128C31510193 @default.
- W2904341128 hasConceptScore W2904341128C31972630 @default.
- W2904341128 hasConceptScore W2904341128C36289849 @default.
- W2904341128 hasConceptScore W2904341128C41008148 @default.
- W2904341128 hasConceptScore W2904341128C44154836 @default.
- W2904341128 hasConceptScore W2904341128C4641261 @default.
- W2904341128 hasConceptScore W2904341128C52622490 @default.
- W2904341128 hasConceptScore W2904341128C76155785 @default.
- W2904341128 hasConceptScore W2904341128C86803240 @default.
- W2904341128 hasConceptScore W2904341128C87833898 @default.
- W2904341128 hasConceptScore W2904341128C9417928 @default.
- W2904341128 hasLocation W29043411281 @default.
- W2904341128 hasOpenAccess W2904341128 @default.
- W2904341128 hasPrimaryLocation W29043411281 @default.
- W2904341128 hasRelatedWork W1993261271 @default.
- W2904341128 hasRelatedWork W2106309274 @default.
- W2904341128 hasRelatedWork W2110159536 @default.
- W2904341128 hasRelatedWork W2127135061 @default.
- W2904341128 hasRelatedWork W2128405921 @default.
- W2904341128 hasRelatedWork W2382074608 @default.
- W2904341128 hasRelatedWork W2544717100 @default.
- W2904341128 hasRelatedWork W2624789303 @default.
- W2904341128 hasRelatedWork W2904341128 @default.
- W2904341128 hasRelatedWork W2556027189 @default.
- W2904341128 isParatext "false" @default.
- W2904341128 isRetracted "false" @default.
- W2904341128 magId "2904341128" @default.
- W2904341128 workType "article" @default.