Matches in SemOpenAlex for { <https://semopenalex.org/work/W4317509311> ?p ?o ?g. }
- W4317509311 endingPage "1866" @default.
- W4317509311 startingPage "1866" @default.
- W4317509311 abstract "Artificial Intelligence (AI) and allied disruptive technologies have revolutionized the scientific world. However, civil engineering in general, and infrastructure management in particular, lag behind the technology adoption curve. Crack identification and assessment are key indicators for evaluating the structural health of critical city infrastructure such as bridges. Historically, such critical infrastructure has been monitored through manual visual inspection. This process is costly, time-consuming, and prone to errors, as it relies on the inspector’s knowledge and the instruments’ precision. To save time and cost, automatic crack and damage detection in bridges and similar infrastructure is required to ensure their efficacy and reliability. However, an automated and reliable system does not exist, particularly in developing countries, presenting a gap targeted in this study. Accordingly, we proposed a two-phased deep learning-based framework for smart infrastructure management to assess the condition of bridges in developing countries. In the first phase of the study, we detected cracks in bridges using a dataset from Pakistan and the publicly available SDNET2018 dataset. You Only Look Once version 5 (YOLOv5) was used to locate and classify cracks in the dataset images. To determine the main performance indicators (precision, recall, and mean average precision at an IoU threshold of 0.5, mAP@0.5), we applied each of the YOLOv5s, YOLOv5m, and YOLOv5l models to the dataset using a 7:2:1 ratio for training, validation, and testing, respectively. The mAP values of all the models were compared to evaluate their performance. The results show test-set mAP values for YOLOv5s, YOLOv5m, and YOLOv5l of 97.8%, 99.3%, and 99.1%, respectively, indicating the superior performance of the YOLOv5m model over its two counterparts. In the second phase of the study, cracks were segmented using the U-Net model to obtain their exact pixel locations. 
Using the segmentation mask passed to the attribute extractor, each crack’s width, height, and area in pixels were measured and visualized on scatter plots and box plots to distinguish different cracks. Furthermore, the segmentation phase validated the output of the proposed YOLOv5 models. This study not only located and classified cracks based on their severity level, but also segmented the crack pixels and measured their width, height, and area per pixel under different lighting conditions. It is one of the few studies targeting low-cost health assessment and damage detection in bridges of developing countries that otherwise struggle with regular maintenance and rehabilitation of such critical infrastructure. The proposed model can be used by local infrastructure monitoring and rehabilitation authorities for regular condition and health assessment of bridges and similar infrastructure, moving towards a smarter and more automated damage assessment system." @default.
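The abstract’s second phase measures a crack’s width, height, and area in pixels from a U-Net segmentation mask. The paper’s own attribute extractor is not reproduced in this record, but a minimal sketch of such a measurement, assuming a binary mask and simple bounding-box metrics, could look like:

```python
import numpy as np

def crack_metrics(mask: np.ndarray) -> dict:
    """Measure a crack's width, height, and area (in pixels) from a
    binary segmentation mask. Illustrative sketch only; the study's
    actual attribute extractor may differ."""
    ys, xs = np.nonzero(mask)  # coordinates of crack (foreground) pixels
    if ys.size == 0:
        return {"width": 0, "height": 0, "area": 0}
    return {
        "width": int(xs.max() - xs.min() + 1),   # bounding-box width
        "height": int(ys.max() - ys.min() + 1),  # bounding-box height
        "area": int(ys.size),                    # total crack pixels
    }

# Toy 5x5 mask with a diagonal crack
mask = np.eye(5, dtype=np.uint8)
print(crack_metrics(mask))  # {'width': 5, 'height': 5, 'area': 5}
```

Per-crack metrics like these could then be collected across images and plotted (scatter plots, box plots) to separate cracks by severity, as the abstract describes.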
- W4317509311 created "2023-01-20" @default.
- W4317509311 creator A5017346745 @default.
- W4317509311 creator A5046529981 @default.
- W4317509311 creator A5054148817 @default.
- W4317509311 creator A5083637324 @default.
- W4317509311 date "2023-01-18" @default.
- W4317509311 modified "2023-10-01" @default.
- W4317509311 title "Smart and Automated Infrastructure Management: A Deep Learning Approach for Crack Detection in Bridge Images" @default.
- W4317509311 cites W1513670756 @default.
- W4317509311 cites W1960097882 @default.
- W4317509311 cites W1993097837 @default.
- W4317509311 cites W1995130521 @default.
- W4317509311 cites W2003731406 @default.
- W4317509311 cites W2009040215 @default.
- W4317509311 cites W2019496031 @default.
- W4317509311 cites W2040665675 @default.
- W4317509311 cites W2056863247 @default.
- W4317509311 cites W2079054397 @default.
- W4317509311 cites W2109255472 @default.
- W4317509311 cites W2126052502 @default.
- W4317509311 cites W2154805199 @default.
- W4317509311 cites W2213612645 @default.
- W4317509311 cites W2271158160 @default.
- W4317509311 cites W2407692387 @default.
- W4317509311 cites W2549261554 @default.
- W4317509311 cites W2598457882 @default.
- W4317509311 cites W2748643398 @default.
- W4317509311 cites W2791957585 @default.
- W4317509311 cites W2793513544 @default.
- W4317509311 cites W2814406141 @default.
- W4317509311 cites W2889494142 @default.
- W4317509311 cites W2899144041 @default.
- W4317509311 cites W2905163589 @default.
- W4317509311 cites W2908128586 @default.
- W4317509311 cites W2912530595 @default.
- W4317509311 cites W2916229809 @default.
- W4317509311 cites W2920633487 @default.
- W4317509311 cites W2941356554 @default.
- W4317509311 cites W2943586321 @default.
- W4317509311 cites W2949817882 @default.
- W4317509311 cites W2963908722 @default.
- W4317509311 cites W2971843891 @default.
- W4317509311 cites W2976255582 @default.
- W4317509311 cites W2989663977 @default.
- W4317509311 cites W3001011940 @default.
- W4317509311 cites W3005719996 @default.
- W4317509311 cites W3009227557 @default.
- W4317509311 cites W3024770686 @default.
- W4317509311 cites W3029797938 @default.
- W4317509311 cites W3036935957 @default.
- W4317509311 cites W3083035629 @default.
- W4317509311 cites W3092785682 @default.
- W4317509311 cites W3105421998 @default.
- W4317509311 cites W3106250896 @default.
- W4317509311 cites W3112798201 @default.
- W4317509311 cites W3116534533 @default.
- W4317509311 cites W3121933194 @default.
- W4317509311 cites W3124942917 @default.
- W4317509311 cites W3130591054 @default.
- W4317509311 cites W3184231723 @default.
- W4317509311 cites W3204197025 @default.
- W4317509311 cites W3209586559 @default.
- W4317509311 cites W3216721547 @default.
- W4317509311 cites W4200025715 @default.
- W4317509311 cites W4206368078 @default.
- W4317509311 cites W4210720597 @default.
- W4317509311 cites W4210748864 @default.
- W4317509311 cites W4213171605 @default.
- W4317509311 cites W4225270898 @default.
- W4317509311 cites W4281696066 @default.
- W4317509311 cites W4281984836 @default.
- W4317509311 cites W4286212714 @default.
- W4317509311 cites W4290085836 @default.
- W4317509311 cites W4292291633 @default.
- W4317509311 cites W4293770025 @default.
- W4317509311 cites W4296292820 @default.
- W4317509311 cites W4297149181 @default.
- W4317509311 cites W4309102516 @default.
- W4317509311 cites W4310691231 @default.
- W4317509311 cites W4312269673 @default.
- W4317509311 doi "https://doi.org/10.3390/su15031866" @default.
- W4317509311 hasPublicationYear "2023" @default.
- W4317509311 type Work @default.
- W4317509311 citedByCount "6" @default.
- W4317509311 countsByYear W43175093112023 @default.
- W4317509311 crossrefType "journal-article" @default.
- W4317509311 hasAuthorship W4317509311A5017346745 @default.
- W4317509311 hasAuthorship W4317509311A5046529981 @default.
- W4317509311 hasAuthorship W4317509311A5054148817 @default.
- W4317509311 hasAuthorship W4317509311A5083637324 @default.
- W4317509311 hasBestOaLocation W43175093111 @default.
- W4317509311 hasConcept C100776233 @default.
- W4317509311 hasConcept C108583219 @default.
- W4317509311 hasConcept C111919701 @default.
- W4317509311 hasConcept C116834253 @default.
- W4317509311 hasConcept C119857082 @default.
- W4317509311 hasConcept C121332964 @default.