Matches in SemOpenAlex for { <https://semopenalex.org/work/W2968304071> ?p ?o ?g. }
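The triple pattern above can be issued programmatically against a SPARQL endpoint. A minimal sketch follows, using only the Python standard library; the endpoint URL is an assumption (check the service's documentation for the actual SPARQL endpoint), and the network call is kept in a separate function so the query construction can be inspected on its own.

```python
# Sketch: retrieving the triples listed below via SPARQL.
# The endpoint URL is an assumption, not confirmed by this document.
import urllib.parse
import urllib.request

ENDPOINT = "https://semopenalex.org/sparql"  # assumed endpoint


def build_query(work_uri: str) -> str:
    """Return a SPARQL SELECT matching all triples with the work as subject."""
    return f"SELECT ?p ?o WHERE {{ <{work_uri}> ?p ?o . }}"


def fetch_triples(work_uri: str) -> bytes:
    """POST the query and return the raw JSON results (requires network access)."""
    data = urllib.parse.urlencode({"query": build_query(work_uri)}).encode()
    req = urllib.request.Request(
        ENDPOINT,
        data=data,
        headers={"Accept": "application/sparql-results+json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()


work = "https://semopenalex.org/work/W2968304071"
print(build_query(work))
```

The `?g` variable in the original pattern (the named graph) is dropped here for simplicity; a quad-aware endpoint would use `SELECT ?p ?o ?g WHERE { GRAPH ?g { <...> ?p ?o } }` instead.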
- W2968304071 abstract "Building realistic structure models to train convolutional neural networks for seismic structural interpretation. Authors: Xinming Wu, Zhicheng Geng, Yunzhi Shi, Nam Pham, and Sergey Fomel (BEG, UT Austin). https://doi.org/10.1190/segam2019-3214282.1. Abstract: We improve automatic structural interpretation in seismic images by using convolutional neural networks (CNNs), which have recently shown the best performance in detecting and extracting useful image features and objects. The main limitation of applying CNNs in seismic interpretation is the preparation of many training datasets and, especially, the corresponding geologic labels. To solve this problem, we propose a workflow to automatically build diverse structure models with realistic folding and faulting features. In this workflow, with some assumptions about typical folding and faulting patterns, we simulate structural features in a 3D model by using a set of parameters. By randomly choosing the parameters from predefined ranges, we can automatically generate numerous structure models with realistic and diverse structural features. Based on these structure models with known structural information, we further automatically create numerous synthetic seismic images and the corresponding ground-truth structural labels to train CNNs for structural interpretation in field seismic images. Accurate structural-interpretation results in multiple field seismic images show that the proposed workflow simulates realistic and generalized structure models from which the CNNs effectively learn to recognize real structures in field images. Presentation Date: Monday, September 16, 2019. Session Start Time: 1:50 PM. Presentation Start Time: 2:40 PM. Location: 301B. Presentation Type: Oral. Keywords: interpretation, machine learning, neural networks, artificial intelligence, seismic attributes. Citation: Xinming Wu, Zhicheng Geng, Yunzhi Shi, Nam Pham, and Sergey Fomel, (2019), Building realistic structure models to train convolutional neural networks for seismic structural interpretation, SEG Technical Program Expanded Abstracts: 4745-4750, https://doi.org/10.1190/segam2019-3214282.1. Published online: 10 Aug 2019. © 2019 Society of Exploration Geophysicists." @default.
- W2968304071 created "2019-08-22" @default.
- W2968304071 creator A5000470385 @default.
- W2968304071 creator A5005237932 @default.
- W2968304071 creator A5047852500 @default.
- W2968304071 creator A5058640298 @default.
- W2968304071 creator A5091058254 @default.
- W2968304071 date "2019-08-10" @default.
- W2968304071 modified "2023-09-28" @default.
- W2968304071 title "Building realistic structure models to train convolutional neural networks for seismic structural interpretation" @default.
- W2968304071 cites W1901129140 @default.
- W2968304071 cites W1970167563 @default.
- W2968304071 cites W1971153498 @default.
- W2968304071 cites W1973093491 @default.
- W2968304071 cites W1973812374 @default.
- W2968304071 cites W2039814260 @default.
- W2968304071 cites W2067911588 @default.
- W2968304071 cites W2079277635 @default.
- W2968304071 cites W2082304646 @default.
- W2968304071 cites W2106921281 @default.
- W2968304071 cites W2107308393 @default.
- W2968304071 cites W2118701387 @default.
- W2968304071 cites W2126097500 @default.
- W2968304071 cites W2129793631 @default.
- W2968304071 cites W2135069867 @default.
- W2968304071 cites W2161006083 @default.
- W2968304071 cites W2194775991 @default.
- W2968304071 cites W2287473451 @default.
- W2968304071 cites W2314555564 @default.
- W2968304071 cites W2324573032 @default.
- W2968304071 cites W2501698054 @default.
- W2968304071 cites W2588053692 @default.
- W2968304071 cites W2592266704 @default.
- W2968304071 cites W2592517375 @default.
- W2968304071 cites W2708728779 @default.
- W2968304071 cites W2774783909 @default.
- W2968304071 cites W2802007045 @default.
- W2968304071 cites W2805155846 @default.
- W2968304071 cites W2886991536 @default.
- W2968304071 cites W2889638205 @default.
- W2968304071 cites W2890209207 @default.
- W2968304071 cites W2890568727 @default.
- W2968304071 cites W2891111066 @default.
- W2968304071 cites W2892287369 @default.
- W2968304071 cites W2895663205 @default.
- W2968304071 cites W2923222994 @default.
- W2968304071 cites W2963150697 @default.
- W2968304071 cites W2963881378 @default.
- W2968304071 cites W2968154985 @default.
- W2968304071 cites W4244567635 @default.
- W2968304071 cites W4248461486 @default.
- W2968304071 doi "https://doi.org/10.1190/segam2019-3214282.1" @default.
- W2968304071 hasPublicationYear "2019" @default.
- W2968304071 type Work @default.
- W2968304071 sameAs 2968304071 @default.
- W2968304071 citedByCount "1" @default.
- W2968304071 countsByYear W29683040712019 @default.
- W2968304071 crossrefType "proceedings-article" @default.
- W2968304071 hasAuthorship W2968304071A5000470385 @default.
- W2968304071 hasAuthorship W2968304071A5005237932 @default.
- W2968304071 hasAuthorship W2968304071A5047852500 @default.
- W2968304071 hasAuthorship W2968304071A5058640298 @default.
- W2968304071 hasAuthorship W2968304071A5091058254 @default.
- W2968304071 hasBestOaLocation W29683040712 @default.
- W2968304071 hasConcept C153180895 @default.
- W2968304071 hasConcept C154945302 @default.
- W2968304071 hasConcept C177212765 @default.
- W2968304071 hasConcept C177264268 @default.
- W2968304071 hasConcept C199360897 @default.
- W2968304071 hasConcept C202444582 @default.
- W2968304071 hasConcept C33923547 @default.
- W2968304071 hasConcept C41008148 @default.
- W2968304071 hasConcept C527412718 @default.
- W2968304071 hasConcept C77088390 @default.
- W2968304071 hasConcept C81363708 @default.
- W2968304071 hasConcept C9652623 @default.
- W2968304071 hasConceptScore W2968304071C153180895 @default.
- W2968304071 hasConceptScore W2968304071C154945302 @default.
- W2968304071 hasConceptScore W2968304071C177212765 @default.
- W2968304071 hasConceptScore W2968304071C177264268 @default.
- W2968304071 hasConceptScore W2968304071C199360897 @default.
- W2968304071 hasConceptScore W2968304071C202444582 @default.
- W2968304071 hasConceptScore W2968304071C33923547 @default.
- W2968304071 hasConceptScore W2968304071C41008148 @default.
- W2968304071 hasConceptScore W2968304071C527412718 @default.
- W2968304071 hasConceptScore W2968304071C77088390 @default.
- W2968304071 hasConceptScore W2968304071C81363708 @default.
- W2968304071 hasConceptScore W2968304071C9652623 @default.
- W2968304071 hasLocation W29683040711 @default.
- W2968304071 hasLocation W29683040712 @default.
- W2968304071 hasLocation W29683040713 @default.
- W2968304071 hasOpenAccess W2968304071 @default.
- W2968304071 hasPrimaryLocation W29683040711 @default.
- W2968304071 hasRelatedWork W2175746458 @default.
- W2968304071 hasRelatedWork W2732542196 @default.
- W2968304071 hasRelatedWork W2738221750 @default.
- W2968304071 hasRelatedWork W2758063741 @default.
- W2968304071 hasRelatedWork W2760085659 @default.
- W2968304071 hasRelatedWork W2883200793 @default.
- W2968304071 hasRelatedWork W2912288872 @default.