Matches in SemOpenAlex for { <https://semopenalex.org/work/W4285207520> ?p ?o ?g. }
- W4285207520 endingPage "134" @default.
- W4285207520 startingPage "117" @default.
- W4285207520 abstract "Construction practitioners make decisions about safety risks that can be subjective and prone to error. Trained computer object detection provides a standardised way to address this issue, but it typically requires many photos to train a single object class, which is costly and time-consuming. This project proposes a new algorithm to train a computer to identify construction safety risks with fewer photos. In addition, holographic hybrid reality will be developed for safety training in the construction industry; Mercedes-Benz has used a similar approach to inform employees about collision zones. We will use the trained images to build a HoloLens hybrid-reality application that shares construction site safety knowledge and supports on-site safety risk detection via the wearable HoloLens. Lastly, although decision-making in various areas has been studied using neuroscience, how an individual’s brain makes decisions when different construction safety risks are perceived, and the impact of holographic safety training on brain reactions and activity, remain unknown. These issues will be studied via haemodynamic response and neuroimaging. In this research agenda, we plan to construct a photo library of 10,000 high-quality photos of various construction risks with different shading, sizes, and orientations by collecting Creative Commons construction photos and converting existing online Creative Commons videos into photos. To achieve this goal, we will input the online Creative Commons photos into our image-based CAPTCHA system (similar to Google’s reCAPTCHA). Each group will include 16 photos, and construction practitioners will identify and click on those that include a safety risk. The identified photos with safety risks will be saved, and specific categories of safety risks will be uploaded to social media and sent via email. The trained images will be deployed to the HoloLens in our laboratory. 
About 20 safety experts, including safety managers and trainers, will be invited to use the HoloLens to detect hazards on-site and provide comments for improvement. The holographic hybrid reality will be built with Unity C# and HoloToolkit. The object detection results will also be used so that research participants see the real scene with not only hazards labelled by AI, but also some high-risk elements that cannot be included in ordinary safety training, such as open taps with water running into the ground and blasting. Four holographic hybrid-reality training scenarios will be generated: a general construction site, and three specially designed scenarios for refurbishment, new building, and road construction settings. In the last stage, we will use functional near-infrared spectroscopy (fNIRS) to study construction practitioners’ brain responses when they see and identify various kinds of hazards. Participants will be divided into two groups: the first group will be exposed to holographic hybrid reality containing some safety risks on the first day and will be asked to identify the safety risks on the tenth day, while the control group will receive no safety training but will also be asked to identify risks. All research participants will be monitored by fNIRS while they attempt to identify the safety risks. Haemodynamic response and neuroimaging tests will be used to study the effectiveness of the safety training." @default.
- W4285207520 created "2022-07-14" @default.
- W4285207520 creator A5011379262 @default.
- W4285207520 creator A5023569287 @default.
- W4285207520 creator A5025881868 @default.
- W4285207520 date "2022-01-01" @default.
- W4285207520 modified "2023-10-01" @default.
- W4285207520 title "AI Object Detection, Holographic Hybrid Reality and Haemodynamic Response to Construction Site Safety Risks" @default.
- W4285207520 cites W1975960737 @default.
- W4285207520 cites W1987052439 @default.
- W4285207520 cites W2002143429 @default.
- W4285207520 cites W2059473123 @default.
- W4285207520 cites W2143161757 @default.
- W4285207520 cites W2157587268 @default.
- W4285207520 cites W2301606162 @default.
- W4285207520 cites W2326192082 @default.
- W4285207520 cites W2470072196 @default.
- W4285207520 cites W2611799017 @default.
- W4285207520 cites W2625912276 @default.
- W4285207520 cites W2629208786 @default.
- W4285207520 cites W2740232185 @default.
- W4285207520 cites W2744595380 @default.
- W4285207520 cites W2746335686 @default.
- W4285207520 cites W2748891445 @default.
- W4285207520 cites W2761891891 @default.
- W4285207520 cites W2765275511 @default.
- W4285207520 cites W2767250411 @default.
- W4285207520 cites W2767299945 @default.
- W4285207520 cites W2786078834 @default.
- W4285207520 cites W2789436332 @default.
- W4285207520 cites W2792224473 @default.
- W4285207520 cites W2793602173 @default.
- W4285207520 cites W2797392161 @default.
- W4285207520 cites W2799347691 @default.
- W4285207520 cites W2807982056 @default.
- W4285207520 cites W2810160006 @default.
- W4285207520 cites W2883173902 @default.
- W4285207520 cites W2883701876 @default.
- W4285207520 cites W2888858922 @default.
- W4285207520 cites W2889114093 @default.
- W4285207520 cites W2890964789 @default.
- W4285207520 cites W2891116979 @default.
- W4285207520 cites W2891534250 @default.
- W4285207520 cites W2892389796 @default.
- W4285207520 cites W2892879472 @default.
- W4285207520 cites W2898584722 @default.
- W4285207520 cites W2899324831 @default.
- W4285207520 cites W2900479853 @default.
- W4285207520 cites W2905256258 @default.
- W4285207520 cites W2905386317 @default.
- W4285207520 cites W2905468702 @default.
- W4285207520 cites W2911109269 @default.
- W4285207520 cites W2913628112 @default.
- W4285207520 cites W2921030598 @default.
- W4285207520 cites W2998552507 @default.
- W4285207520 cites W4255785389 @default.
- W4285207520 doi "https://doi.org/10.1007/978-981-19-0737-1_8" @default.
- W4285207520 hasPublicationYear "2022" @default.
- W4285207520 type Work @default.
- W4285207520 citedByCount "0" @default.
- W4285207520 crossrefType "book-chapter" @default.
- W4285207520 hasAuthorship W4285207520A5011379262 @default.
- W4285207520 hasAuthorship W4285207520A5023569287 @default.
- W4285207520 hasAuthorship W4285207520A5025881868 @default.
- W4285207520 hasConcept C107457646 @default.
- W4285207520 hasConcept C127413603 @default.
- W4285207520 hasConcept C154945302 @default.
- W4285207520 hasConcept C16345878 @default.
- W4285207520 hasConcept C199360897 @default.
- W4285207520 hasConcept C2524010 @default.
- W4285207520 hasConcept C2780801425 @default.
- W4285207520 hasConcept C2781238097 @default.
- W4285207520 hasConcept C33923547 @default.
- W4285207520 hasConcept C41008148 @default.
- W4285207520 hasConcept C66938386 @default.
- W4285207520 hasConcept C86804380 @default.
- W4285207520 hasConceptScore W4285207520C107457646 @default.
- W4285207520 hasConceptScore W4285207520C127413603 @default.
- W4285207520 hasConceptScore W4285207520C154945302 @default.
- W4285207520 hasConceptScore W4285207520C16345878 @default.
- W4285207520 hasConceptScore W4285207520C199360897 @default.
- W4285207520 hasConceptScore W4285207520C2524010 @default.
- W4285207520 hasConceptScore W4285207520C2780801425 @default.
- W4285207520 hasConceptScore W4285207520C2781238097 @default.
- W4285207520 hasConceptScore W4285207520C33923547 @default.
- W4285207520 hasConceptScore W4285207520C41008148 @default.
- W4285207520 hasConceptScore W4285207520C66938386 @default.
- W4285207520 hasConceptScore W4285207520C86804380 @default.
- W4285207520 hasLocation W42852075201 @default.
- W4285207520 hasOpenAccess W4285207520 @default.
- W4285207520 hasPrimaryLocation W42852075201 @default.
- W4285207520 hasRelatedWork W1569815043 @default.
- W4285207520 hasRelatedWork W1969324738 @default.
- W4285207520 hasRelatedWork W2159508116 @default.
- W4285207520 hasRelatedWork W2351571780 @default.
- W4285207520 hasRelatedWork W2355833191 @default.
- W4285207520 hasRelatedWork W2724221648 @default.
- W4285207520 hasRelatedWork W2899084033 @default.