Matches in SemOpenAlex for { <https://semopenalex.org/work/W4285026368> ?p ?o ?g. }
- W4285026368 endingPage "101689" @default.
- W4285026368 startingPage "101689" @default.
- W4285026368 abstract "Deep Neural Network (DNN) is widely used in engineering applications for its ability to handle problems with almost any nonlinearities. However, it is generally difficult to obtain sufficient high-fidelity (HF) sample points for expensive optimization tasks, which may affect the generalization performance of DNN and result in inaccurate predictions. To solve this problem and improve the prediction accuracy of DNN, this paper proposes an on-line transfer learning based multi-fidelity data fusion (OTL-MFDF) method including two parts. In the first part, the ensemble of DNNs is established. Firstly, a large number of low-fidelity sample points and a few HF sample points are generated, which are used as the source dataset and target dataset, respectively. Then, the Bayesian Optimization (BO) is utilized to obtain several groups of hyperparameters, based on which DNNs are pre-trained using the source dataset. Next, these pre-trained DNNs are re-trained by fine-tuning on the target dataset, and the ensemble of DNNs is established by assigning different weights to each pre-trained DNN. In the second part, the on-line learning system is developed for adaptive updating of the ensemble of DNNs. To evaluate the uncertainty error of the predicted values of DNN and determine the location of the updated HF sample point, the query-by-committee strategy based on the ensemble of DNNs is developed. The Covariance Matrix Adaptation Evolutionary Strategies is employed as the optimizer to find out the location where the maximal disagreement is achieved by the ensemble of DNNs. The design space is partitioned by the Voronoi diagram method, and then the selected point is moved to its nearest Voronoi cell boundary to avoid clustering between the updated point and the existing sample points. Three different types of test problems and an engineering example are adopted to illustrate the effectiveness of the OTL-MFDF method. Results verify the outstanding efficiency, global prediction accuracy and applicability of the OTL-MFDF method." @default.
- W4285026368 created "2022-07-12" @default.
- W4285026368 creator A5006893011 @default.
- W4285026368 creator A5027766247 @default.
- W4285026368 creator A5037458498 @default.
- W4285026368 creator A5043055937 @default.
- W4285026368 creator A5056168495 @default.
- W4285026368 creator A5084939652 @default.
- W4285026368 creator A5089171536 @default.
- W4285026368 date "2022-08-01" @default.
- W4285026368 modified "2023-10-17" @default.
- W4285026368 title "On-line transfer learning for multi-fidelity data fusion with ensemble of deep neural networks" @default.
- W4285026368 cites W102487131 @default.
- W4285026368 cites W1964550805 @default.
- W4285026368 cites W1967005434 @default.
- W4285026368 cites W1972777769 @default.
- W4285026368 cites W1975046772 @default.
- W4285026368 cites W1986614398 @default.
- W4285026368 cites W2016043834 @default.
- W4285026368 cites W2018342164 @default.
- W4285026368 cites W2065576828 @default.
- W4285026368 cites W2076647894 @default.
- W4285026368 cites W2080021732 @default.
- W4285026368 cites W2083450550 @default.
- W4285026368 cites W2085384144 @default.
- W4285026368 cites W2088990166 @default.
- W4285026368 cites W2092101161 @default.
- W4285026368 cites W2114459240 @default.
- W4285026368 cites W2114668845 @default.
- W4285026368 cites W2154504007 @default.
- W4285026368 cites W2165698076 @default.
- W4285026368 cites W2165949883 @default.
- W4285026368 cites W2180941696 @default.
- W4285026368 cites W2237036887 @default.
- W4285026368 cites W2382210627 @default.
- W4285026368 cites W2412616130 @default.
- W4285026368 cites W2562542164 @default.
- W4285026368 cites W2618015877 @default.
- W4285026368 cites W2729558485 @default.
- W4285026368 cites W2742038006 @default.
- W4285026368 cites W2757101479 @default.
- W4285026368 cites W2757276162 @default.
- W4285026368 cites W2774341240 @default.
- W4285026368 cites W2894750600 @default.
- W4285026368 cites W2903759548 @default.
- W4285026368 cites W2909147580 @default.
- W4285026368 cites W2912853883 @default.
- W4285026368 cites W2916360883 @default.
- W4285026368 cites W2919958648 @default.
- W4285026368 cites W2937151579 @default.
- W4285026368 cites W2944410534 @default.
- W4285026368 cites W2947465899 @default.
- W4285026368 cites W2963273475 @default.
- W4285026368 cites W2965490185 @default.
- W4285026368 cites W2992792013 @default.
- W4285026368 cites W3008504170 @default.
- W4285026368 cites W3012253434 @default.
- W4285026368 cites W3027712952 @default.
- W4285026368 cites W3046502127 @default.
- W4285026368 cites W3089933773 @default.
- W4285026368 cites W3093259966 @default.
- W4285026368 cites W3098407580 @default.
- W4285026368 cites W3113371787 @default.
- W4285026368 cites W3120310365 @default.
- W4285026368 cites W3120684150 @default.
- W4285026368 cites W3121478259 @default.
- W4285026368 cites W3125250686 @default.
- W4285026368 cites W3133657579 @default.
- W4285026368 cites W3148202045 @default.
- W4285026368 cites W3157271444 @default.
- W4285026368 cites W3183039042 @default.
- W4285026368 cites W3185331043 @default.
- W4285026368 cites W3188225742 @default.
- W4285026368 cites W3196706461 @default.
- W4285026368 cites W3198839319 @default.
- W4285026368 cites W3205940321 @default.
- W4285026368 cites W4200542463 @default.
- W4285026368 cites W4205985902 @default.
- W4285026368 cites W4220774199 @default.
- W4285026368 doi "https://doi.org/10.1016/j.aei.2022.101689" @default.
- W4285026368 hasPublicationYear "2022" @default.
- W4285026368 type Work @default.
- W4285026368 citedByCount "11" @default.
- W4285026368 countsByYear W42850263682023 @default.
- W4285026368 crossrefType "journal-article" @default.
- W4285026368 hasAuthorship W4285026368A5006893011 @default.
- W4285026368 hasAuthorship W4285026368A5027766247 @default.
- W4285026368 hasAuthorship W4285026368A5037458498 @default.
- W4285026368 hasAuthorship W4285026368A5043055937 @default.
- W4285026368 hasAuthorship W4285026368A5056168495 @default.
- W4285026368 hasAuthorship W4285026368A5084939652 @default.
- W4285026368 hasAuthorship W4285026368A5089171536 @default.
- W4285026368 hasConcept C119857082 @default.
- W4285026368 hasConcept C134306372 @default.
- W4285026368 hasConcept C150899416 @default.
- W4285026368 hasConcept C153180895 @default.
- W4285026368 hasConcept C154945302 @default.
- W4285026368 hasConcept C177148314 @default.
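The abstract above walks through a concrete pipeline: pre-train DNNs on plentiful low-fidelity data, fine-tune them on the few high-fidelity points, combine the members into a weighted ensemble, and then add new HF points where the committee disagrees most. The following is a minimal, hedged sketch of that idea, not the authors' implementation: scikit-learn's `MLPRegressor` with `warm_start` stands in for the pre-trained/fine-tuned DNNs, the member weights and hyperparameters are hard-coded rather than produced by Bayesian optimization, plain random search replaces CMA-ES, and the Voronoi-boundary adjustment is only noted in a comment. All function and variable names (`build_committee`, `qbc_disagreement`, `lf_model`, ...) are illustrative assumptions.

```python
# Hedged sketch of the multi-fidelity ensemble + query-by-committee idea described
# in the abstract. Names and models here are illustrative, not the authors' code.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def lf_model(x):   # cheap low-fidelity response (toy 1-D example)
    return np.sin(8.0 * x).ravel()

def hf_model(x):   # expensive high-fidelity response
    return (np.sin(8.0 * x) + 0.3 * x ** 2).ravel()

# Source (LF) and target (HF) datasets: many cheap points, few expensive ones.
X_lf = rng.uniform(0.0, 1.0, size=(200, 1))
y_lf = lf_model(X_lf)
X_hf = rng.uniform(0.0, 1.0, size=(8, 1))
y_hf = hf_model(X_hf)

def build_committee(hyperparams):
    """Pre-train each member on LF data, then fine-tune it on HF data."""
    committee = []
    for hidden, lr in hyperparams:             # in the paper these come from BO
        net = MLPRegressor(hidden_layer_sizes=hidden, learning_rate_init=lr,
                           max_iter=2000, warm_start=True, random_state=0)
        net.fit(X_lf, y_lf)                    # pre-training on the source dataset
        net.fit(X_hf, y_hf)                    # fine-tuning on the target dataset
        committee.append(net)
    return committee

def qbc_disagreement(committee, weights, X):
    """Weighted prediction variance as the query-by-committee disagreement score."""
    preds = np.stack([net.predict(X) for net in committee])   # (members, n)
    mean = np.average(preds, axis=0, weights=weights)
    return np.average((preds - mean) ** 2, axis=0, weights=weights)

committee = build_committee([((32, 32), 1e-3), ((64,), 3e-3), ((16, 16, 16), 1e-3)])
weights = np.array([0.4, 0.3, 0.3])            # e.g. from validation error on HF data

for it in range(5):
    # Random search stands in for CMA-ES when locating maximal disagreement.
    cand = rng.uniform(0.0, 1.0, size=(2000, 1))
    x_new = cand[np.argmax(qbc_disagreement(committee, weights, cand))]
    # (The paper additionally moves x_new to the nearest Voronoi cell boundary of the
    #  existing samples to avoid clustering; that geometric step is omitted here.)
    X_hf = np.vstack([X_hf, x_new])
    y_hf = np.append(y_hf, hf_model(x_new.reshape(1, -1)))
    for net in committee:                      # adaptive update: fine-tune again
        net.fit(X_hf, y_hf)
    print(f"iteration {it}: added HF point at x = {x_new[0]:.3f}")
```

Under these assumptions, each adaptive iteration spends one HF evaluation at the location where the weighted committee variance is largest, which is the role CMA-ES and the Voronoi adjustment play in the paper's on-line learning system.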