Matches in SemOpenAlex for { <https://semopenalex.org/work/W2986095360> ?p ?o ?g. }
- W2986095360 endingPage "224" @default.
- W2986095360 startingPage "212" @default.
- W2986095360 abstract "As an important component of trains, the rolling bearing is prone to shed-oil defects, which inevitably threaten train safety. It is therefore of great significance to inspect bearings for shed-oil defects. Owing to the complex structure of rolling bearings, traditional signal-analysis approaches cannot detect shed-oil defects efficiently and at low cost. In recent years, deep learning has achieved remarkable growth and has been successfully applied to various computer-vision tasks. Motivated by this fact, we propose a two-stage attention-aware method to recognize shed-oil defects on bearings. The proposed method is based on convolutional neural networks; it learns bearing defect features automatically and does not require the manual feature design and extraction of traditional methods. The two-stage method cascades a bearing-localization stage and a defect-segmentation stage to recognize defect areas in a coarse-to-fine manner. The localization stage extracts the foremost bearing region and discards the useless parts of the images, so that the attention of the segmentation stage is focused only on the target region. In the segmentation stage, we propose a novel attention-aware network, APP-UNet16, to segment defect areas from the extracted bearing region. APP-UNet16 stacks attention gates so that attention-aware features adapt automatically, learning to focus on target defect areas. We also utilize transfer learning in constructing the encoder of APP-UNet16, and introduce spatial pyramid pooling to connect the encoder and decoder, improving on the traditional UNet. A series of comparative experiments compares our two-stage method with a one-stage method that performs segmentation directly on the original train images. The results indicate that the proposed two-stage inspection method achieves higher robustness and accuracy in recognizing defect areas with small oil spots. The experimental results for the proposed APP-UNet16 also demonstrate better segmentation performance than traditional UNet and related state-of-the-art approaches. We will release the source code as well as the trained models to facilitate further research." @default.
- W2986095360 created "2019-11-22" @default.
- W2986095360 creator A5010092389 @default.
- W2986095360 creator A5011482738 @default.
- W2986095360 creator A5030348054 @default.
- W2986095360 creator A5035282947 @default.
- W2986095360 creator A5078793726 @default.
- W2986095360 creator A5087894632 @default.
- W2986095360 date "2020-03-01" @default.
- W2986095360 modified "2023-09-29" @default.
- W2986095360 title "A two-stage attention aware method for train bearing shed oil inspection based on convolutional neural networks" @default.
- W2986095360 cites W1970574904 @default.
- W2986095360 cites W2075479283 @default.
- W2986095360 cites W2088049833 @default.
- W2986095360 cites W2109255472 @default.
- W2986095360 cites W2163922914 @default.
- W2986095360 cites W2165698076 @default.
- W2986095360 cites W2344725271 @default.
- W2986095360 cites W2395611524 @default.
- W2986095360 cites W2503931548 @default.
- W2986095360 cites W2522624197 @default.
- W2986095360 cites W2568669383 @default.
- W2986095360 cites W2622612680 @default.
- W2986095360 cites W2750692136 @default.
- W2986095360 cites W2783772867 @default.
- W2986095360 cites W2803500965 @default.
- W2986095360 cites W2887057358 @default.
- W2986095360 cites W2891516347 @default.
- W2986095360 cites W2963433607 @default.
- W2986095360 cites W2963881378 @default.
- W2986095360 cites W639708223 @default.
- W2986095360 doi "https://doi.org/10.1016/j.neucom.2019.11.002" @default.
- W2986095360 hasPublicationYear "2020" @default.
- W2986095360 type Work @default.
- W2986095360 sameAs 2986095360 @default.
- W2986095360 citedByCount "21" @default.
- W2986095360 countsByYear W29860953602020 @default.
- W2986095360 countsByYear W29860953602021 @default.
- W2986095360 countsByYear W29860953602022 @default.
- W2986095360 countsByYear W29860953602023 @default.
- W2986095360 crossrefType "journal-article" @default.
- W2986095360 hasAuthorship W2986095360A5010092389 @default.
- W2986095360 hasAuthorship W2986095360A5011482738 @default.
- W2986095360 hasAuthorship W2986095360A5030348054 @default.
- W2986095360 hasAuthorship W2986095360A5035282947 @default.
- W2986095360 hasAuthorship W2986095360A5078793726 @default.
- W2986095360 hasAuthorship W2986095360A5087894632 @default.
- W2986095360 hasConcept C108583219 @default.
- W2986095360 hasConcept C111919701 @default.
- W2986095360 hasConcept C118505674 @default.
- W2986095360 hasConcept C119857082 @default.
- W2986095360 hasConcept C120665830 @default.
- W2986095360 hasConcept C121332964 @default.
- W2986095360 hasConcept C138885662 @default.
- W2986095360 hasConcept C142575187 @default.
- W2986095360 hasConcept C153180895 @default.
- W2986095360 hasConcept C154945302 @default.
- W2986095360 hasConcept C192209626 @default.
- W2986095360 hasConcept C199978012 @default.
- W2986095360 hasConcept C2776401178 @default.
- W2986095360 hasConcept C31972630 @default.
- W2986095360 hasConcept C41008148 @default.
- W2986095360 hasConcept C41895202 @default.
- W2986095360 hasConcept C70437156 @default.
- W2986095360 hasConcept C81363708 @default.
- W2986095360 hasConcept C89600930 @default.
- W2986095360 hasConceptScore W2986095360C108583219 @default.
- W2986095360 hasConceptScore W2986095360C111919701 @default.
- W2986095360 hasConceptScore W2986095360C118505674 @default.
- W2986095360 hasConceptScore W2986095360C119857082 @default.
- W2986095360 hasConceptScore W2986095360C120665830 @default.
- W2986095360 hasConceptScore W2986095360C121332964 @default.
- W2986095360 hasConceptScore W2986095360C138885662 @default.
- W2986095360 hasConceptScore W2986095360C142575187 @default.
- W2986095360 hasConceptScore W2986095360C153180895 @default.
- W2986095360 hasConceptScore W2986095360C154945302 @default.
- W2986095360 hasConceptScore W2986095360C192209626 @default.
- W2986095360 hasConceptScore W2986095360C199978012 @default.
- W2986095360 hasConceptScore W2986095360C2776401178 @default.
- W2986095360 hasConceptScore W2986095360C31972630 @default.
- W2986095360 hasConceptScore W2986095360C41008148 @default.
- W2986095360 hasConceptScore W2986095360C41895202 @default.
- W2986095360 hasConceptScore W2986095360C70437156 @default.
- W2986095360 hasConceptScore W2986095360C81363708 @default.
- W2986095360 hasConceptScore W2986095360C89600930 @default.
- W2986095360 hasFunder F4320321001 @default.
- W2986095360 hasFunder F4320321471 @default.
- W2986095360 hasLocation W29860953601 @default.
- W2986095360 hasOpenAccess W2986095360 @default.
- W2986095360 hasPrimaryLocation W29860953601 @default.
- W2986095360 hasRelatedWork W2517027266 @default.
- W2986095360 hasRelatedWork W2731899572 @default.
- W2986095360 hasRelatedWork W2756241593 @default.
- W2986095360 hasRelatedWork W3002446410 @default.
- W2986095360 hasRelatedWork W3116150086 @default.
- W2986095360 hasRelatedWork W3133861977 @default.
- W2986095360 hasRelatedWork W4200173597 @default.
- W2986095360 hasRelatedWork W4312417841 @default.
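
The abstract above mentions spatial pyramid pooling as the bridge between the encoder and decoder of APP-UNet16. As a rough illustration of the general technique (not the paper's actual implementation, whose grid levels and pooling operator are not specified here), the sketch below max-pools a feature map over a pyramid of grids, producing a fixed-length vector regardless of the input's spatial size. The function name and the `(1, 2, 4)` pyramid levels are assumptions for this example.

```python
def spatial_pyramid_pool(fmap, levels=(1, 2, 4)):
    """Max-pool each channel of a feature map over a pyramid of grids,
    yielding a fixed-length vector regardless of spatial size.

    fmap: list of channels; each channel is a list of rows of floats.
    The (1, 2, 4) levels are illustrative, not taken from the paper.
    """
    h, w = len(fmap[0]), len(fmap[0][0])
    out = []
    for n in levels:                  # e.g. 1x1, 2x2, and 4x4 grids
        for i in range(n):
            for j in range(n):
                # integer bounds of grid cell (i, j) in an n x n partition
                hs, he = i * h // n, (i + 1) * h // n
                ws, we = j * w // n, (j + 1) * w // n
                for ch in fmap:       # max over the cell, per channel
                    out.append(max(v for row in ch[hs:he]
                                     for v in row[ws:we]))
    return out  # length = channels * sum(n * n for n in levels)
```

Because the output length depends only on the channel count and the pyramid levels, the decoder can receive a fixed-size representation even when bearing crops of different resolutions are fed to the encoder.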