Matches in SemOpenAlex for { <https://semopenalex.org/work/W3214374352> ?p ?o ?g. }
- W3214374352 endingPage "3030" @default.
- W3214374352 startingPage "3017" @default.
- W3214374352 abstract "Mobile Edge Computing (MEC) has emerged as a promising paradigm catering to overwhelming explosions of mobile applications, by offloading compute-intensive tasks to MEC networks for processing. The surging of deep learning brings new vigor and vitality to shape the prospect of intelligent Internet of Things (IoT), and edge intelligence arises to provision real-time deep neural network (DNN) inference services for users. To accelerate the processing of the DNN inference of a user request in an MEC network, the DNN inference model usually can be partitioned into two connected parts: one part is processed in the local IoT device of the request, and another part is processed in a cloudlet (edge server) in the MEC network. Also, the DNN inference can be further accelerated by allocating multiple threads of the cloudlet to which the request is assigned. In this paper, we study a novel delay-aware DNN inference throughput maximization problem with the aim to maximize the number of delay-aware DNN service requests admitted, by accelerating each DNN inference through jointly exploring DNN partitioning and multi-thread execution parallelism. Specifically, we consider the problem under both offline and online request arrival settings: a set of DNN inference requests is given in advance, and a sequence of DNN inference requests arrives one by one without the knowledge of future arrivals, respectively. We first show that the defined problems are NP-hard. We then devise a novel constant approximation algorithm for the problem under the offline setting. We also propose an online algorithm with a provable competitive ratio for the problem under the online setting. We finally evaluate the performance of the proposed algorithms through experimental simulations. Experimental results demonstrate that the proposed algorithms are promising." @default.
- W3214374352 created "2021-11-22" @default.
- W3214374352 creator A5012677271 @default.
- W3214374352 creator A5013643572 @default.
- W3214374352 creator A5019506252 @default.
- W3214374352 creator A5031773278 @default.
- W3214374352 creator A5043464306 @default.
- W3214374352 creator A5063489530 @default.
- W3214374352 date "2023-05-01" @default.
- W3214374352 modified "2023-10-16" @default.
- W3214374352 title "Throughput Maximization of Delay-Aware DNN Inference in Edge Computing by Exploring DNN Model Partitioning and Inference Parallelism" @default.
- W3214374352 cites W2113137767 @default.
- W3214374352 cites W2158760370 @default.
- W3214374352 cites W2194775991 @default.
- W3214374352 cites W2195423816 @default.
- W3214374352 cites W2920031528 @default.
- W3214374352 cites W2950865323 @default.
- W3214374352 cites W2963376050 @default.
- W3214374352 cites W3003427453 @default.
- W3214374352 cites W3014810041 @default.
- W3214374352 cites W3024547383 @default.
- W3214374352 cites W3035564946 @default.
- W3214374352 cites W3040288082 @default.
- W3214374352 cites W3044591330 @default.
- W3214374352 cites W3047565185 @default.
- W3214374352 cites W3094144275 @default.
- W3214374352 cites W3100023286 @default.
- W3214374352 cites W3104849992 @default.
- W3214374352 cites W3107609010 @default.
- W3214374352 cites W3110777925 @default.
- W3214374352 cites W3112782147 @default.
- W3214374352 cites W3195447391 @default.
- W3214374352 cites W3197778677 @default.
- W3214374352 cites W4235435541 @default.
- W3214374352 cites W595252221 @default.
- W3214374352 doi "https://doi.org/10.1109/tmc.2021.3125949" @default.
- W3214374352 hasPublicationYear "2023" @default.
- W3214374352 type Work @default.
- W3214374352 sameAs 3214374352 @default.
- W3214374352 citedByCount "8" @default.
- W3214374352 countsByYear W32143743522022 @default.
- W3214374352 countsByYear W32143743522023 @default.
- W3214374352 crossrefType "journal-article" @default.
- W3214374352 hasAuthorship W3214374352A5012677271 @default.
- W3214374352 hasAuthorship W3214374352A5013643572 @default.
- W3214374352 hasAuthorship W3214374352A5019506252 @default.
- W3214374352 hasAuthorship W3214374352A5031773278 @default.
- W3214374352 hasAuthorship W3214374352A5043464306 @default.
- W3214374352 hasAuthorship W3214374352A5063489530 @default.
- W3214374352 hasConcept C111919701 @default.
- W3214374352 hasConcept C120314980 @default.
- W3214374352 hasConcept C138236772 @default.
- W3214374352 hasConcept C154945302 @default.
- W3214374352 hasConcept C157764524 @default.
- W3214374352 hasConcept C162307627 @default.
- W3214374352 hasConcept C2776061582 @default.
- W3214374352 hasConcept C2776214188 @default.
- W3214374352 hasConcept C2778456923 @default.
- W3214374352 hasConcept C2781055072 @default.
- W3214374352 hasConcept C31258907 @default.
- W3214374352 hasConcept C41008148 @default.
- W3214374352 hasConcept C555944384 @default.
- W3214374352 hasConcept C76155785 @default.
- W3214374352 hasConcept C79974875 @default.
- W3214374352 hasConceptScore W3214374352C111919701 @default.
- W3214374352 hasConceptScore W3214374352C120314980 @default.
- W3214374352 hasConceptScore W3214374352C138236772 @default.
- W3214374352 hasConceptScore W3214374352C154945302 @default.
- W3214374352 hasConceptScore W3214374352C157764524 @default.
- W3214374352 hasConceptScore W3214374352C162307627 @default.
- W3214374352 hasConceptScore W3214374352C2776061582 @default.
- W3214374352 hasConceptScore W3214374352C2776214188 @default.
- W3214374352 hasConceptScore W3214374352C2778456923 @default.
- W3214374352 hasConceptScore W3214374352C2781055072 @default.
- W3214374352 hasConceptScore W3214374352C31258907 @default.
- W3214374352 hasConceptScore W3214374352C41008148 @default.
- W3214374352 hasConceptScore W3214374352C555944384 @default.
- W3214374352 hasConceptScore W3214374352C76155785 @default.
- W3214374352 hasConceptScore W3214374352C79974875 @default.
- W3214374352 hasFunder F4320309893 @default.
- W3214374352 hasFunder F4320321001 @default.
- W3214374352 hasFunder F4320334704 @default.
- W3214374352 hasIssue "5" @default.
- W3214374352 hasLocation W32143743521 @default.
- W3214374352 hasOpenAccess W3214374352 @default.
- W3214374352 hasPrimaryLocation W32143743521 @default.
- W3214374352 hasRelatedWork W2804912624 @default.
- W3214374352 hasRelatedWork W2945616868 @default.
- W3214374352 hasRelatedWork W3009018976 @default.
- W3214374352 hasRelatedWork W3131458535 @default.
- W3214374352 hasRelatedWork W3159722749 @default.
- W3214374352 hasRelatedWork W4226427977 @default.
- W3214374352 hasRelatedWork W4297093186 @default.
- W3214374352 hasRelatedWork W4307482744 @default.
- W3214374352 hasRelatedWork W4316660948 @default.
- W3214374352 hasRelatedWork W4318705081 @default.
- W3214374352 hasVolume "22" @default.
- W3214374352 isParatext "false" @default.