Matches in SemOpenAlex for { <https://semopenalex.org/work/W3185295980> ?p ?o ?g. }
- W3185295980 abstract "Deep Learning applications are becoming increasingly popular worldwide. Developers of deep learning systems, like developers in every other context of software development, strive to write more efficient code in terms of performance, complexity, and maintenance. The continuous evolution of deep learning systems, which imposes tighter development timelines, and their increasing complexity may result in bad design decisions by developers. Besides, due to the use of common frameworks and the repetitive implementation of similar tasks, deep learning developers are likely to use the copy-paste practice, leading to clones in deep learning code. Code cloning is considered a bad software development practice, since developers can inadvertently fail to properly propagate changes to all clone fragments during a maintenance activity. However, to the best of our knowledge, no study has investigated code cloning practices in deep learning development. The majority of research on deep learning systems has focused on improving the dependability of the models. Given the negative impacts of clones on software quality reported in studies on traditional systems and the inherent complexity of maintaining deep learning systems (e.g., bug fixing), it is very important to understand the characteristics and potential impacts of code clones on deep learning systems. This paper examines the frequency, distribution, and impacts of code clones and the code cloning practices in deep learning systems. To accomplish this, we use the NiCad clone detection tool to detect clones from 59 Python-, 14 C#-, and 6 Java-based deep learning systems and an equal number of traditional software systems. We then analyze the comparative frequency and distribution of code clones in deep learning systems and the traditional ones. Further, we study the distribution of the detected code clones by applying a location-based taxonomy. In addition, we study the correlation between bugs and code clones to assess the impacts of clones on the quality of the studied systems. Finally, we introduce a code clone taxonomy related to deep learning programs based on 6 DL software systems (from the 59 DL systems) and identify the deep learning system development phases in which cloning has the highest risk of faults. Our results show that code cloning is a frequent practice in deep learning systems and that deep learning developers often clone code from files contained in distant repositories in the system. In addition, we found that code cloning occurs more frequently during DL model construction, model training, and data pre-processing, and that hyperparameter setting is the phase of deep learning model construction during which cloning is riskiest, since it often leads to faults." @default.
- W3185295980 created "2021-08-02" @default.
- W3185295980 creator A5009290843 @default.
- W3185295980 creator A5045535375 @default.
- W3185295980 creator A5065453567 @default.
- W3185295980 creator A5071052367 @default.
- W3185295980 date "2022-04-08" @default.
- W3185295980 modified "2023-10-03" @default.
- W3185295980 title "Clones in deep learning code: what, where, and why?" @default.
- W3185295980 cites W1035198403 @default.
- W3185295980 cites W1508590353 @default.
- W3185295980 cites W1566773348 @default.
- W3185295980 cites W1894439495 @default.
- W3185295980 cites W1973650376 @default.
- W3185295980 cites W1985500401 @default.
- W3185295980 cites W2008085811 @default.
- W3185295980 cites W2010208284 @default.
- W3185295980 cites W2018890516 @default.
- W3185295980 cites W2022508996 @default.
- W3185295980 cites W2025962632 @default.
- W3185295980 cites W2028713366 @default.
- W3185295980 cites W2039429335 @default.
- W3185295980 cites W2043169794 @default.
- W3185295980 cites W2057826716 @default.
- W3185295980 cites W2064035568 @default.
- W3185295980 cites W2065314038 @default.
- W3185295980 cites W2071342967 @default.
- W3185295980 cites W2074529754 @default.
- W3185295980 cites W2090432523 @default.
- W3185295980 cites W2097433746 @default.
- W3185295980 cites W2100060170 @default.
- W3185295980 cites W2101398182 @default.
- W3185295980 cites W2111305209 @default.
- W3185295980 cites W2112739286 @default.
- W3185295980 cites W2112796928 @default.
- W3185295980 cites W2114056383 @default.
- W3185295980 cites W2123928002 @default.
- W3185295980 cites W2131477050 @default.
- W3185295980 cites W2156778594 @default.
- W3185295980 cites W2160815625 @default.
- W3185295980 cites W2162436321 @default.
- W3185295980 cites W2165739648 @default.
- W3185295980 cites W2171368158 @default.
- W3185295980 cites W2171868993 @default.
- W3185295980 cites W2183341477 @default.
- W3185295980 cites W2261527505 @default.
- W3185295980 cites W2277774738 @default.
- W3185295980 cites W2294903866 @default.
- W3185295980 cites W2427333829 @default.
- W3185295980 cites W2511803001 @default.
- W3185295980 cites W2580729925 @default.
- W3185295980 cites W2586702902 @default.
- W3185295980 cites W2598761292 @default.
- W3185295980 cites W2605202003 @default.
- W3185295980 cites W2610332124 @default.
- W3185295980 cites W2626932320 @default.
- W3185295980 cites W2727832342 @default.
- W3185295980 cites W2743745531 @default.
- W3185295980 cites W2759794714 @default.
- W3185295980 cites W2766557196 @default.
- W3185295980 cites W2775209903 @default.
- W3185295980 cites W2782864149 @default.
- W3185295980 cites W2796040126 @default.
- W3185295980 cites W2850992922 @default.
- W3185295980 cites W2883626290 @default.
- W3185295980 cites W2885453527 @default.
- W3185295980 cites W2899407111 @default.
- W3185295980 cites W2922234936 @default.
- W3185295980 cites W2954370766 @default.
- W3185295980 cites W2963037989 @default.
- W3185295980 cites W2968594320 @default.
- W3185295980 cites W2970433196 @default.
- W3185295980 cites W2990954032 @default.
- W3185295980 cites W2991332382 @default.
- W3185295980 cites W2993280551 @default.
- W3185295980 cites W3004493192 @default.
- W3185295980 cites W3018447383 @default.
- W3185295980 cites W3021836553 @default.
- W3185295980 cites W3043652335 @default.
- W3185295980 cites W3091074642 @default.
- W3185295980 cites W3100925971 @default.
- W3185295980 cites W3104103145 @default.
- W3185295980 cites W3105290170 @default.
- W3185295980 cites W3147107715 @default.
- W3185295980 doi "https://doi.org/10.1007/s10664-021-10099-x" @default.
- W3185295980 hasPublicationYear "2022" @default.
- W3185295980 type Work @default.
- W3185295980 sameAs 3185295980 @default.
- W3185295980 citedByCount "2" @default.
- W3185295980 countsByYear W31852959802023 @default.
- W3185295980 crossrefType "journal-article" @default.
- W3185295980 hasAuthorship W3185295980A5009290843 @default.
- W3185295980 hasAuthorship W3185295980A5045535375 @default.
- W3185295980 hasAuthorship W3185295980A5065453567 @default.
- W3185295980 hasAuthorship W3185295980A5071052367 @default.
- W3185295980 hasBestOaLocation W31852959802 @default.
- W3185295980 hasConcept C108583219 @default.
- W3185295980 hasConcept C115903868 @default.
- W3185295980 hasConcept C119857082 @default.
- W3185295980 hasConcept C149091818 @default.
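A listing like the one above can be retrieved programmatically by issuing the match pattern from the header as a SPARQL query. The following is a minimal sketch that only constructs the query string; the endpoint URL and the GRAPH-based interpretation of the `?g` variable are assumptions and should be checked against the SemOpenAlex documentation before use.

```python
# Build a SPARQL query for all (?p, ?o, ?g) matches on the work shown above.
# ENDPOINT is an assumed public endpoint, not confirmed here.
ENDPOINT = "https://semopenalex.org/sparql"
WORK = "https://semopenalex.org/work/W3185295980"

# The header's pattern { <work> ?p ?o ?g . } is read here as a quad pattern,
# expressed with a GRAPH clause in standard SPARQL 1.1 syntax.
query = f"""
SELECT ?p ?o ?g
WHERE {{
  GRAPH ?g {{ <{WORK}> ?p ?o . }}
}}
"""

# To execute, the query could be POSTed with urllib (network call left out):
#   import urllib.request, urllib.parse
#   data = urllib.parse.urlencode({"query": query}).encode()
#   req = urllib.request.Request(
#       ENDPOINT, data=data,
#       headers={"Accept": "application/sparql-results+json"})
#   with urllib.request.urlopen(req) as resp:
#       print(resp.read().decode())

print(query.strip())
```

Each row of the result would correspond to one line of the triple listing above: the predicate (`abstract`, `creator`, `cites`, …), the object value, and the named graph (`@default` in this dump).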