Matches in SemOpenAlex for { <https://semopenalex.org/work/W3205769708> ?p ?o ?g. }
- W3205769708 abstract "Reservoir engineering constitutes a major part of the studies regarding oil and gas exploration and production. Reservoir engineering has various duties, including conducting experiments, constructing appropriate models, characterization, and forecasting reservoir dynamics. However, traditional engineering approaches have started to face challenges as the volume of raw field data increases. This has pushed researchers to use more powerful tools for classifying, cleaning, and preparing data for use in models, which enables better data evaluation and thus better decisions. In addition, simultaneous simulations are sometimes performed, aiming at optimization and sensitivity analysis during the history matching process. Multi-functional workflows are required to address all these deficiencies. Upgrading conventional reservoir engineering approaches with more CPUs or more powerful computers is insufficient, since it increases computational cost and is time-consuming. Machine learning techniques have been proposed as the best solution, owing to their strong learning capability and computational efficiency. Recently developed algorithms make it possible to handle very large amounts of data with high accuracy. The most widely used machine learning approaches are Artificial Neural Networks (ANN), Support Vector Machines, and Adaptive Neuro-Fuzzy Inference Systems. In this study, these approaches are introduced together with their capabilities and limitations. After that, the study focuses on using machine learning techniques in unconventional reservoir engineering calculations: reservoir characterization, PVT calculations, and optimization of well completion. These processes are repeated until all the values reach the output layer. Normally, one hidden layer is sufficient for most problems; additional hidden layers usually do not improve model performance and may instead create the risk of converging to a local minimum while making the model more complex. The most typical neural network is the feed-forward network, often used for data classification. The MLP has a learning function that minimizes a global error function by the least-squares method. It uses the backpropagation algorithm to update the weights, searching for local minima by performing gradient descent (Figure 1). The learning rate is usually selected to be less than one." @default.
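  The abstract above describes an MLP with one hidden layer that minimizes a least-squares error function via backpropagation and gradient descent, with a learning rate below one. The following is a minimal sketch of that idea only; the toy data, layer sizes, and variable names are illustrative assumptions, not the paper's implementation.

  ```python
  # Minimal one-hidden-layer MLP trained with backpropagation (illustrative sketch).
  import numpy as np

  rng = np.random.default_rng(0)

  # Hypothetical toy regression data: noisy samples of y = sin(x).
  X = rng.uniform(-np.pi, np.pi, size=(200, 1))
  y = np.sin(X) + 0.05 * rng.normal(size=X.shape)

  # One hidden layer, as the abstract notes is usually sufficient.
  n_hidden = 16
  W1 = rng.normal(scale=0.5, size=(1, n_hidden))
  b1 = np.zeros((1, n_hidden))
  W2 = rng.normal(scale=0.5, size=(n_hidden, 1))
  b2 = np.zeros((1, 1))

  learning_rate = 0.05  # selected below one, as the abstract suggests
  n_epochs = 2000

  for epoch in range(n_epochs):
      # Forward pass: values propagate layer by layer to the output layer.
      h = np.tanh(X @ W1 + b1)      # hidden activations
      y_pred = h @ W2 + b2          # linear output layer

      # Global error function: mean squared error (least squares).
      error = y_pred - y
      mse = np.mean(error ** 2)

      # Backpropagation: gradients of the MSE with respect to each weight.
      grad_out = 2.0 * error / len(X)            # dMSE/dy_pred
      grad_W2 = h.T @ grad_out
      grad_b2 = grad_out.sum(axis=0, keepdims=True)
      grad_h = (grad_out @ W2.T) * (1.0 - h ** 2)  # back through tanh
      grad_W1 = X.T @ grad_h
      grad_b1 = grad_h.sum(axis=0, keepdims=True)

      # Gradient-descent update of the weights.
      W1 -= learning_rate * grad_W1
      b1 -= learning_rate * grad_b1
      W2 -= learning_rate * grad_W2
      b2 -= learning_rate * grad_b2

      if epoch % 500 == 0:
          print(f"epoch {epoch:4d}  MSE = {mse:.4f}")
  ```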
- W3205769708 created "2021-10-25" @default.
- W3205769708 creator A5011287935 @default.
- W3205769708 creator A5031812393 @default.
- W3205769708 creator A5045920426 @default.
- W3205769708 creator A5046103184 @default.
- W3205769708 creator A5057631731 @default.
- W3205769708 creator A5074504215 @default.
- W3205769708 creator A5079832257 @default.
- W3205769708 creator A5090696831 @default.
- W3205769708 date "2021-10-04" @default.
- W3205769708 modified "2023-10-11" @default.
- W3205769708 title "A Thorough Review of Machine Learning Applications in Oil and Gas Industry" @default.
- W3205769708 cites W147273414 @default.
- W3205769708 cites W1489973231 @default.
- W3205769708 cites W1971764756 @default.
- W3205769708 cites W1988494513 @default.
- W3205769708 cites W1996377693 @default.
- W3205769708 cites W2011562040 @default.
- W3205769708 cites W2015093737 @default.
- W3205769708 cites W2021586754 @default.
- W3205769708 cites W2052790603 @default.
- W3205769708 cites W2054979538 @default.
- W3205769708 cites W2063015500 @default.
- W3205769708 cites W2066180822 @default.
- W3205769708 cites W2067607655 @default.
- W3205769708 cites W2079493731 @default.
- W3205769708 cites W2080202977 @default.
- W3205769708 cites W2086115842 @default.
- W3205769708 cites W2087814009 @default.
- W3205769708 cites W2126658806 @default.
- W3205769708 cites W2128357172 @default.
- W3205769708 cites W2130149913 @default.
- W3205769708 cites W2139073224 @default.
- W3205769708 cites W2149723649 @default.
- W3205769708 cites W2283733733 @default.
- W3205769708 cites W2313631232 @default.
- W3205769708 cites W2323469010 @default.
- W3205769708 cites W2326755141 @default.
- W3205769708 cites W2329187271 @default.
- W3205769708 cites W2332740891 @default.
- W3205769708 cites W2487200295 @default.
- W3205769708 cites W2492814369 @default.
- W3205769708 cites W2515352124 @default.
- W3205769708 cites W2524585699 @default.
- W3205769708 cites W2529108047 @default.
- W3205769708 cites W2534240011 @default.
- W3205769708 cites W2589806773 @default.
- W3205769708 cites W2592024144 @default.
- W3205769708 cites W2596971675 @default.
- W3205769708 cites W2604203235 @default.
- W3205769708 cites W2615591457 @default.
- W3205769708 cites W2746931442 @default.
- W3205769708 cites W2782975644 @default.
- W3205769708 cites W2783009195 @default.
- W3205769708 cites W2789488193 @default.
- W3205769708 cites W2795139205 @default.
- W3205769708 cites W2797091367 @default.
- W3205769708 cites W2801454134 @default.
- W3205769708 cites W2883547640 @default.
- W3205769708 cites W2886132040 @default.
- W3205769708 cites W2887616078 @default.
- W3205769708 cites W2895179101 @default.
- W3205769708 cites W2896542983 @default.
- W3205769708 cites W2896840749 @default.
- W3205769708 cites W2897685323 @default.
- W3205769708 cites W2898802455 @default.
- W3205769708 cites W2904778773 @default.
- W3205769708 cites W2905373526 @default.
- W3205769708 cites W2922187008 @default.
- W3205769708 cites W2926728301 @default.
- W3205769708 cites W2932706296 @default.
- W3205769708 cites W2936964251 @default.
- W3205769708 cites W2938142867 @default.
- W3205769708 cites W2941892128 @default.
- W3205769708 cites W2946160554 @default.
- W3205769708 cites W2947768295 @default.
- W3205769708 cites W2954404092 @default.
- W3205769708 cites W2961342225 @default.
- W3205769708 cites W2963261548 @default.
- W3205769708 cites W2963276185 @default.
- W3205769708 cites W2965966792 @default.
- W3205769708 cites W2966332045 @default.
- W3205769708 cites W2966660221 @default.
- W3205769708 cites W2966908144 @default.
- W3205769708 cites W2967315555 @default.
- W3205769708 cites W2967385452 @default.
- W3205769708 cites W2967502595 @default.
- W3205769708 cites W2967550035 @default.
- W3205769708 cites W2972467624 @default.
- W3205769708 cites W2972539026 @default.
- W3205769708 cites W2974143794 @default.
- W3205769708 cites W2974384244 @default.
- W3205769708 cites W2974434987 @default.
- W3205769708 cites W2974444031 @default.
- W3205769708 cites W2974464356 @default.
- W3205769708 cites W2979169256 @default.
- W3205769708 cites W2979841333 @default.
- W3205769708 cites W2980173076 @default.
- W3205769708 cites W2980318039 @default.