Matches in SemOpenAlex for { <https://semopenalex.org/work/W3136018460> ?p ?o ?g. }
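The listing below matches the quad pattern shown above. As a minimal sketch, the same triples could be retrieved with a query like the following, assuming SemOpenAlex's public SPARQL endpoint at https://semopenalex.org/sparql and that the statements are stored in named graphs (hence the GRAPH clause); both details are assumptions, not confirmed by this listing.

```sparql
# Sketch: fetch every predicate/object pair (and its named graph)
# for work W3136018460, mirroring the pattern { <...> ?p ?o ?g . } above.
SELECT ?p ?o ?g
WHERE {
  GRAPH ?g {
    <https://semopenalex.org/work/W3136018460> ?p ?o .
  }
}
ORDER BY ?p
```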
- W3136018460 endingPage "1596" @default.
- W3136018460 startingPage "1596" @default.
- W3136018460 abstract "Short-term electrical load forecasting plays an important role in the safety, stability, and sustainability of the power production and scheduling process. An accurate prediction of power load can provide reliable decision support for power system management. To address the limitations of existing load forecasting methods in handling time-series data, which lead to poor stability and suboptimal forecasting accuracy, this paper proposes an attention-based encoder-decoder network with Bayesian optimization for accurate short-term power load forecasting. The proposed model is built on an encoder-decoder architecture with a gated recurrent unit (GRU) recurrent neural network, which is highly robust in time-series modeling. A temporal attention layer focuses on the key features of the input data that play a vital role in improving prediction accuracy. Finally, Bayesian optimization is used to determine the model’s hyperparameters and achieve optimal predictions. Verification experiments on 24 h load forecasting with real power load data from American Electric Power (AEP) show that the proposed model outperforms other models in prediction accuracy and algorithm stability, providing an effective approach for time-series power load prediction with deep-learning technology." @default.
- W3136018460 created "2021-03-29" @default.
- W3136018460 creator A5008574593 @default.
- W3136018460 creator A5025536472 @default.
- W3136018460 creator A5031227420 @default.
- W3136018460 creator A5067818846 @default.
- W3136018460 creator A5069176169 @default.
- W3136018460 creator A5077399002 @default.
- W3136018460 creator A5088907463 @default.
- W3136018460 date "2021-03-13" @default.
- W3136018460 modified "2023-10-01" @default.
- W3136018460 title "Deep-Learning Forecasting Method for Electric Power Load via Attention-Based Encoder-Decoder with Bayesian Optimization" @default.
- W3136018460 cites W1793209788 @default.
- W3136018460 cites W1991277158 @default.
- W3136018460 cites W2000061362 @default.
- W3136018460 cites W2003235296 @default.
- W3136018460 cites W2059804518 @default.
- W3136018460 cites W2064675550 @default.
- W3136018460 cites W2077188078 @default.
- W3136018460 cites W2085866051 @default.
- W3136018460 cites W2090322886 @default.
- W3136018460 cites W2147568880 @default.
- W3136018460 cites W2192203593 @default.
- W3136018460 cites W2255201377 @default.
- W3136018460 cites W2613328025 @default.
- W3136018460 cites W2739322982 @default.
- W3136018460 cites W2776741657 @default.
- W3136018460 cites W2809317444 @default.
- W3136018460 cites W2810658548 @default.
- W3136018460 cites W2888165363 @default.
- W3136018460 cites W2888547842 @default.
- W3136018460 cites W2890330768 @default.
- W3136018460 cites W2899494475 @default.
- W3136018460 cites W2915373746 @default.
- W3136018460 cites W2915594101 @default.
- W3136018460 cites W2942392852 @default.
- W3136018460 cites W2951215179 @default.
- W3136018460 cites W2963832956 @default.
- W3136018460 cites W2966793957 @default.
- W3136018460 cites W2973001045 @default.
- W3136018460 cites W2981704113 @default.
- W3136018460 cites W2988068710 @default.
- W3136018460 cites W2994074976 @default.
- W3136018460 cites W2994209110 @default.
- W3136018460 cites W2996604628 @default.
- W3136018460 cites W3000499162 @default.
- W3136018460 cites W3004437308 @default.
- W3136018460 cites W3005177200 @default.
- W3136018460 cites W3010421437 @default.
- W3136018460 cites W3013592816 @default.
- W3136018460 cites W3020259708 @default.
- W3136018460 cites W3048102275 @default.
- W3136018460 cites W3084888580 @default.
- W3136018460 cites W3087179847 @default.
- W3136018460 cites W3093308276 @default.
- W3136018460 cites W3107979244 @default.
- W3136018460 cites W3113024520 @default.
- W3136018460 cites W3114571338 @default.
- W3136018460 cites W3114921025 @default.
- W3136018460 cites W3116475140 @default.
- W3136018460 cites W3128596107 @default.
- W3136018460 cites W3168712301 @default.
- W3136018460 cites W3210587668 @default.
- W3136018460 doi "https://doi.org/10.3390/en14061596" @default.
- W3136018460 hasPublicationYear "2021" @default.
- W3136018460 type Work @default.
- W3136018460 sameAs 3136018460 @default.
- W3136018460 citedByCount "84" @default.
- W3136018460 countsByYear W31360184602021 @default.
- W3136018460 countsByYear W31360184602022 @default.
- W3136018460 countsByYear W31360184602023 @default.
- W3136018460 crossrefType "journal-article" @default.
- W3136018460 hasAuthorship W3136018460A5008574593 @default.
- W3136018460 hasAuthorship W3136018460A5025536472 @default.
- W3136018460 hasAuthorship W3136018460A5031227420 @default.
- W3136018460 hasAuthorship W3136018460A5067818846 @default.
- W3136018460 hasAuthorship W3136018460A5069176169 @default.
- W3136018460 hasAuthorship W3136018460A5077399002 @default.
- W3136018460 hasAuthorship W3136018460A5088907463 @default.
- W3136018460 hasBestOaLocation W31360184601 @default.
- W3136018460 hasConcept C104317684 @default.
- W3136018460 hasConcept C119857082 @default.
- W3136018460 hasConcept C121332964 @default.
- W3136018460 hasConcept C124101348 @default.
- W3136018460 hasConcept C154945302 @default.
- W3136018460 hasConcept C163258240 @default.
- W3136018460 hasConcept C185592680 @default.
- W3136018460 hasConcept C41008148 @default.
- W3136018460 hasConcept C50644808 @default.
- W3136018460 hasConcept C55493867 @default.
- W3136018460 hasConcept C62520636 @default.
- W3136018460 hasConcept C63479239 @default.
- W3136018460 hasConcept C77715397 @default.
- W3136018460 hasConcept C8642999 @default.
- W3136018460 hasConcept C89227174 @default.
- W3136018460 hasConceptScore W3136018460C104317684 @default.
- W3136018460 hasConceptScore W3136018460C119857082 @default.
- W3136018460 hasConceptScore W3136018460C121332964 @default.