Matches in SemOpenAlex for the quad pattern { <https://semopenalex.org/work/W3216519085> ?p ?o ?g . }
- W3216519085 abstract "Generalizability of machine-learning (ML) based turbulence closures to accurately predict unseen practical flows remains an important challenge. At the Reynolds-averaged Navier-Stokes (RANS) level, neural-network (NN) based turbulence closure modeling is rendered difficult for two main reasons: the inherent complexity of the constitutive relation, arising from flow-dependent non-linearity and bifurcations; and the inordinate difficulty of obtaining high-fidelity data covering the entire parameter space of interest. In this context, the objective of this work is to investigate the approximation capabilities of standard moderate-sized fully-connected NNs. We systematically investigate the effects of (i) the intrinsic complexity of the solution manifold, (ii) the sampling procedure (interpolation vs. extrapolation), and (iii) the optimization procedure. To overcome the data-acquisition challenges, three proxy-physics turbulence surrogates of different degrees of complexity (yet significantly simpler than turbulence physics) are employed to generate the parameter-to-solution maps. Even for these simple proxy-physics systems, it is demonstrated that feed-forward NNs require more degrees of freedom than the original proxy-physics model to approximate the true model accurately, even when trained with data covering the entire parameter space (interpolation). Additionally, if deep fully-connected NNs are trained with data from only part of the parameter space (extrapolation), their approximation capability degrades considerably and it is not straightforward to find an optimal architecture. Overall, the findings provide a realistic perspective on the utility of ML turbulence closures for practical applications and identify areas for improvement." @default.
- W3216519085 created "2021-12-06" @default.
- W3216519085 creator A5022858244 @default.
- W3216519085 creator A5029911175 @default.
- W3216519085 creator A5067736188 @default.
- W3216519085 creator A5084855116 @default.
- W3216519085 date "2021-11-01" @default.
- W3216519085 modified "2023-10-01" @default.
- W3216519085 title "Turbulence closure modeling with data-driven techniques: Investigation of generalizable deep neural networks" @default.
- W3216519085 cites W1965957586 @default.
- W3216519085 cites W1967483301 @default.
- W3216519085 cites W1991871575 @default.
- W3216519085 cites W2000926867 @default.
- W3216519085 cites W2006240266 @default.
- W3216519085 cites W2016234921 @default.
- W3216519085 cites W2016644205 @default.
- W3216519085 cites W2054101073 @default.
- W3216519085 cites W2056245155 @default.
- W3216519085 cites W2062474936 @default.
- W3216519085 cites W2079224763 @default.
- W3216519085 cites W2093300654 @default.
- W3216519085 cites W2099706800 @default.
- W3216519085 cites W2103496339 @default.
- W3216519085 cites W2105355027 @default.
- W3216519085 cites W2106607398 @default.
- W3216519085 cites W2107853203 @default.
- W3216519085 cites W2137983211 @default.
- W3216519085 cites W2152593035 @default.
- W3216519085 cites W2166116275 @default.
- W3216519085 cites W2166380466 @default.
- W3216519085 cites W2528305538 @default.
- W3216519085 cites W2534240011 @default.
- W3216519085 cites W2560112327 @default.
- W3216519085 cites W2585298970 @default.
- W3216519085 cites W2625995436 @default.
- W3216519085 cites W2749028154 @default.
- W3216519085 cites W2766298346 @default.
- W3216519085 cites W2766872946 @default.
- W3216519085 cites W2795982117 @default.
- W3216519085 cites W2803629276 @default.
- W3216519085 cites W2807826281 @default.
- W3216519085 cites W2864723431 @default.
- W3216519085 cites W2885872291 @default.
- W3216519085 cites W2899283552 @default.
- W3216519085 cites W2902987217 @default.
- W3216519085 cites W2903546100 @default.
- W3216519085 cites W2905258134 @default.
- W3216519085 cites W2907955265 @default.
- W3216519085 cites W2911525369 @default.
- W3216519085 cites W2911964244 @default.
- W3216519085 cites W2930017973 @default.
- W3216519085 cites W2959998927 @default.
- W3216519085 cites W2962777873 @default.
- W3216519085 cites W2963146412 @default.
- W3216519085 cites W2965947939 @default.
- W3216519085 cites W2971645765 @default.
- W3216519085 cites W2980247813 @default.
- W3216519085 cites W2980984515 @default.
- W3216519085 cites W2986795381 @default.
- W3216519085 cites W2994070579 @default.
- W3216519085 cites W2995408993 @default.
- W3216519085 cites W2997171342 @default.
- W3216519085 cites W3010839048 @default.
- W3216519085 cites W3012045275 @default.
- W3216519085 cites W3013108861 @default.
- W3216519085 cites W3013803313 @default.
- W3216519085 cites W3015596437 @default.
- W3216519085 cites W3027801118 @default.
- W3216519085 cites W3036965708 @default.
- W3216519085 cites W3048444177 @default.
- W3216519085 cites W3088916089 @default.
- W3216519085 cites W3091186176 @default.
- W3216519085 cites W3093998008 @default.
- W3216519085 cites W3095393108 @default.
- W3216519085 cites W3096571028 @default.
- W3216519085 cites W3098093095 @default.
- W3216519085 cites W3098175809 @default.
- W3216519085 cites W3100625809 @default.
- W3216519085 cites W3101316902 @default.
- W3216519085 cites W3102140816 @default.
- W3216519085 cites W3102436560 @default.
- W3216519085 cites W3104183394 @default.
- W3216519085 cites W3105245152 @default.
- W3216519085 cites W3125537303 @default.
- W3216519085 cites W3128803576 @default.
- W3216519085 cites W3134546754 @default.
- W3216519085 cites W3134626437 @default.
- W3216519085 cites W3135859784 @default.
- W3216519085 cites W3137240924 @default.
- W3216519085 cites W3159480617 @default.
- W3216519085 cites W3161054362 @default.
- W3216519085 cites W3162533428 @default.
- W3216519085 cites W3166586430 @default.
- W3216519085 cites W3166995168 @default.
- W3216519085 cites W3167031531 @default.
- W3216519085 cites W3201508484 @default.
- W3216519085 doi "https://doi.org/10.1063/5.0070890" @default.
- W3216519085 hasPublicationYear "2021" @default.
- W3216519085 type Work @default.
- W3216519085 sameAs 3216519085 @default.
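The record above can be reproduced programmatically against the SemOpenAlex SPARQL endpoint. The sketch below is a minimal example, assuming the public endpoint at https://semopenalex.org/sparql and the Python SPARQLWrapper package; it queries the default graph (the `@default` column above), so the `?g` position of the quad pattern is omitted.

```python
# Minimal sketch (not an official SemOpenAlex client): fetch all predicate/object
# pairs for W3216519085 from the default graph of the SemOpenAlex triplestore.
# Assumptions: the public SPARQL endpoint is at https://semopenalex.org/sparql
# and the SPARQLWrapper package is installed (pip install SPARQLWrapper).
from SPARQLWrapper import SPARQLWrapper, JSON

ENDPOINT = "https://semopenalex.org/sparql"  # assumed endpoint URL

QUERY = """
SELECT ?p ?o WHERE {
  <https://semopenalex.org/work/W3216519085> ?p ?o .
}
"""

sparql = SPARQLWrapper(ENDPOINT)
sparql.setQuery(QUERY)
sparql.setReturnFormat(JSON)  # request application/sparql-results+json

results = sparql.query().convert()
for binding in results["results"]["bindings"]:
    # Each binding holds the predicate (?p) and object (?o) of one match.
    print(binding["p"]["value"], binding["o"]["value"])
```

Run as-is, this should return one row per triple listed above (abstract, creator, cites, doi, and so on); replacing `?p` with a specific predicate IRI narrows the result to that relation only, e.g. one row per cited work.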