Matches in SemOpenAlex for { <https://semopenalex.org/work/W3183164146> ?p ?o ?g. }
Showing items 1 to 80 of 80, with 100 items per page.
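The listing below is the result of a basic triple-pattern query against the SemOpenAlex graph. A minimal sketch of how to reproduce it programmatically is shown here, assuming the public SPARQL endpoint at https://semopenalex.org/sparql and the SPARQLWrapper Python library; the endpoint URL and the choice of library are assumptions, not part of the listing itself.

```python
# Sketch: fetch all (?p, ?o) pairs for W3183164146 from SemOpenAlex.
# Assumption: the public SPARQL endpoint lives at https://semopenalex.org/sparql.
from SPARQLWrapper import SPARQLWrapper, JSON

sparql = SPARQLWrapper("https://semopenalex.org/sparql")
sparql.setQuery("""
    SELECT ?p ?o WHERE {
        <https://semopenalex.org/work/W3183164146> ?p ?o .
    }
""")
sparql.setReturnFormat(JSON)

results = sparql.query().convert()
for binding in results["results"]["bindings"]:
    # Each binding corresponds to one line of the listing below.
    print(binding["p"]["value"], binding["o"]["value"])
```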
- W3183164146 endingPage "907" @default.
- W3183164146 startingPage "891" @default.
- W3183164146 abstract "Until now, the reigning computing paradigm has been Cloud Computing, whose facilities are concentrated in large and remote areas. Novel data-intensive services with critical latency and bandwidth constraints, such as autonomous driving and remote health, will suffer under an increasingly saturated network. On the contrary, Edge Computing brings computing facilities closer to end-users to offload workloads in Edge Data Centers (EDCs). Nevertheless, Edge Computing raises other concerns like EDC size, energy consumption, price, and user-centered design. This research addresses these challenges by optimizing Edge Computing scenarios in two ways: two-phase immersion cooling systems and smart resource allocation via Deep Reinforcement Learning. To this end, several Edge Computing scenarios have been modeled, simulated, and optimized with energy-aware strategies using real traces of user demand and hardware behavior. These scenarios include air-cooled and two-phase immersion-cooled EDCs devised using hardware prototypes and a resource allocation manager based on an Advantage Actor–Critic (A2C) agent. Our immersion-cooled EDC’s IT energy model achieved an NRMSD of 3.15% and an R² of 97.97%. These EDCs yielded an average energy saving of 22.8% compared to air-cooled EDCs. Our DRL-based allocation manager further reduced energy consumption by up to 23.8% compared to the baseline. • Edge Computing’s sustainable deployment has limitations like area, energy, and price. • We address them via Two-Phase Immersion Cooling and Deep Reinforcement Learning. • We devise and optimize realistic Edge Computing deployments for an ADAS application. • We build and model an immersion cooling system energy-wise, obtaining an R² of 97.97%. • The DRL agent obtains an energy reduction of up to 23.8% compared with the baseline." @default.
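The abstract reports two fit metrics for the IT energy model: an NRMSD of 3.15% and an R² of 97.97%. For reference, a minimal sketch of how such metrics are typically computed is given below; normalizing the RMSD by the observed range is an assumption, since the abstract does not state which normalization the authors used.

```python
# Sketch of the reported goodness-of-fit metrics (NRMSD and R²).
# Assumption: NRMSD normalizes the RMSD by the range of the measured values;
# the abstract does not specify the normalization actually used in the paper.
import numpy as np

def nrmsd(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Root-mean-square deviation normalized by the observed range."""
    rmsd = np.sqrt(np.mean((y_true - y_pred) ** 2))
    return rmsd / (y_true.max() - y_true.min())

def r_squared(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot
```

An NRMSD near 0% and an R² near 100% (when expressed as a percentage) indicate a close fit, consistent with the figures quoted in the abstract.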
- W3183164146 created "2021-08-02" @default.
- W3183164146 creator A5048400925 @default.
- W3183164146 creator A5051332139 @default.
- W3183164146 creator A5085179161 @default.
- W3183164146 date "2021-12-01" @default.
- W3183164146 modified "2023-10-14" @default.
- W3183164146 title "Energy-conscious optimization of Edge Computing through Deep Reinforcement Learning and two-phase immersion cooling" @default.
- W3183164146 cites W2532132370 @default.
- W3183164146 cites W2595744084 @default.
- W3183164146 cites W2599106590 @default.
- W3183164146 cites W2766447205 @default.
- W3183164146 cites W2797367353 @default.
- W3183164146 cites W2809494244 @default.
- W3183164146 cites W2891795289 @default.
- W3183164146 cites W2895922401 @default.
- W3183164146 cites W2898035736 @default.
- W3183164146 cites W2947829929 @default.
- W3183164146 cites W2952957939 @default.
- W3183164146 cites W2958991960 @default.
- W3183164146 cites W2959368156 @default.
- W3183164146 cites W2960833983 @default.
- W3183164146 cites W2963302706 @default.
- W3183164146 cites W2963334314 @default.
- W3183164146 cites W2964229770 @default.
- W3183164146 cites W2965698433 @default.
- W3183164146 cites W2972192263 @default.
- W3183164146 cites W2980360843 @default.
- W3183164146 cites W2995182484 @default.
- W3183164146 cites W2999561085 @default.
- W3183164146 cites W3018623270 @default.
- W3183164146 cites W3023421339 @default.
- W3183164146 cites W3030402187 @default.
- W3183164146 cites W3120551842 @default.
- W3183164146 cites W3124099125 @default.
- W3183164146 doi "https://doi.org/10.1016/j.future.2021.07.031" @default.
- W3183164146 hasPublicationYear "2021" @default.
- W3183164146 type Work @default.
- W3183164146 sameAs 3183164146 @default.
- W3183164146 citedByCount "14" @default.
- W3183164146 countsByYear W31831641462021 @default.
- W3183164146 countsByYear W31831641462022 @default.
- W3183164146 countsByYear W31831641462023 @default.
- W3183164146 crossrefType "journal-article" @default.
- W3183164146 hasAuthorship W3183164146A5048400925 @default.
- W3183164146 hasAuthorship W3183164146A5051332139 @default.
- W3183164146 hasAuthorship W3183164146A5085179161 @default.
- W3183164146 hasBestOaLocation W31831641461 @default.
- W3183164146 hasConcept C154945302 @default.
- W3183164146 hasConcept C199068039 @default.
- W3183164146 hasConcept C202444582 @default.
- W3183164146 hasConcept C33923547 @default.
- W3183164146 hasConcept C41008148 @default.
- W3183164146 hasConcept C97541855 @default.
- W3183164146 hasConceptScore W3183164146C154945302 @default.
- W3183164146 hasConceptScore W3183164146C199068039 @default.
- W3183164146 hasConceptScore W3183164146C202444582 @default.
- W3183164146 hasConceptScore W3183164146C33923547 @default.
- W3183164146 hasConceptScore W3183164146C41008148 @default.
- W3183164146 hasConceptScore W3183164146C97541855 @default.
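The hasConcept triples above reference opaque concept identifiers (e.g., C41008148) without labels. A hedged sketch for resolving them to human-readable labels via the same SPARQL endpoint is shown below; the soa:hasConcept predicate IRI and the use of skos:prefLabel for concept labels are assumptions based on the property names in this listing and common SKOS modeling, not verified against the SemOpenAlex ontology.

```python
# Sketch: resolve the work's concept IRIs to human-readable labels.
# Assumptions: predicate <https://semopenalex.org/ontology/hasConcept> and
# skos:prefLabel on concepts; check the SemOpenAlex ontology before relying on these.
from SPARQLWrapper import SPARQLWrapper, JSON

sparql = SPARQLWrapper("https://semopenalex.org/sparql")
sparql.setQuery("""
    PREFIX soa:  <https://semopenalex.org/ontology/>
    PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
    SELECT ?concept ?label WHERE {
        <https://semopenalex.org/work/W3183164146> soa:hasConcept ?concept .
        ?concept skos:prefLabel ?label .
    }
""")
sparql.setReturnFormat(JSON)

for b in sparql.query().convert()["results"]["bindings"]:
    print(b["concept"]["value"], b["label"]["value"])
```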
- W3183164146 hasLocation W31831641461 @default.
- W3183164146 hasOpenAccess W3183164146 @default.
- W3183164146 hasPrimaryLocation W31831641461 @default.
- W3183164146 hasRelatedWork W1562959674 @default.
- W3183164146 hasRelatedWork W2923653485 @default.
- W3183164146 hasRelatedWork W2952472710 @default.
- W3183164146 hasRelatedWork W2957776456 @default.
- W3183164146 hasRelatedWork W3005560120 @default.
- W3183164146 hasRelatedWork W3037422413 @default.
- W3183164146 hasRelatedWork W4206669594 @default.
- W3183164146 hasRelatedWork W4210912933 @default.
- W3183164146 hasRelatedWork W4224287422 @default.
- W3183164146 hasRelatedWork W4255994452 @default.
- W3183164146 hasVolume "125" @default.
- W3183164146 isParatext "false" @default.
- W3183164146 isRetracted "false" @default.
- W3183164146 magId "3183164146" @default.
- W3183164146 workType "article" @default.