Matches in SemOpenAlex for { <https://semopenalex.org/work/W4384080248> ?p ?o ?g. }
Showing items 1 to 77 of 77, with 100 items per page.
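The listing below can be reproduced programmatically. The following is a minimal Python sketch using SPARQLWrapper; the triple pattern is taken from the query shown above, while the endpoint URL (assumed to be SemOpenAlex's public SPARQL endpoint) and the result handling are illustrative and should be verified.

```python
# Minimal sketch: list every predicate/object pair for the work shown above.
# The endpoint URL is an assumption (SemOpenAlex's public SPARQL endpoint);
# verify it before relying on this.
from SPARQLWrapper import SPARQLWrapper, JSON

ENDPOINT = "https://semopenalex.org/sparql"   # assumed endpoint URL
WORK_IRI = "https://semopenalex.org/work/W4384080248"

sparql = SPARQLWrapper(ENDPOINT)
sparql.setQuery(f"SELECT ?p ?o WHERE {{ <{WORK_IRI}> ?p ?o . }}")
sparql.setReturnFormat(JSON)

# Print each predicate/object pair, mirroring the listing that follows.
for row in sparql.query().convert()["results"]["bindings"]:
    print(row["p"]["value"], row["o"]["value"])
```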
- W4384080248 endingPage "2637" @default.
- W4384080248 startingPage "2622" @default.
- W4384080248 abstract "Introducing adaptiveness to federated learning has recently ushered in a new way to optimize its convergence performance. However, adaptive learning strategies originally designed for centralized machine learning are naïvely extended to federated learning in existing works, which does not necessarily improve convergence performance and further reduce communication overhead as expected. In this paper, we fully investigate those centralized learning-based adaptive learning strategies, and propose an adaptive Federated learning algorithm targeting the model parameter Update Rule, called FedUR. Convergence upper bounds under FedUR are derived from the aspect of both local iterations and global aggregations. Through comparison with the convergence upper bounds of original federated learning, we theoretically analyze how those strategies should be tuned to help federated learning effectively optimize convergence performance and reduce overall communication overhead. Extensive experiments are conducted based on several real datasets and machine learning models, which show that FedUR can effectively increase final convergence accuracy with an even lower communication overhead requirement." @default.
- W4384080248 created "2023-07-13" @default.
- W4384080248 creator A5037378130 @default.
- W4384080248 creator A5074294282 @default.
- W4384080248 creator A5078480632 @default.
- W4384080248 date "2023-01-01" @default.
- W4384080248 modified "2023-09-26" @default.
- W4384080248 title "FedUR: Federated Learning Optimization Through Adaptive Centralized Learning Optimizers" @default.
- W4384080248 cites W1982003698 @default.
- W4384080248 cites W2612026221 @default.
- W4384080248 cites W2744829609 @default.
- W4384080248 cites W2963318081 @default.
- W4384080248 cites W2989289980 @default.
- W4384080248 cites W2991236681 @default.
- W4384080248 cites W2998045710 @default.
- W4384080248 cites W3007279825 @default.
- W4384080248 cites W3013860853 @default.
- W4384080248 cites W3033664100 @default.
- W4384080248 cites W3047304572 @default.
- W4384080248 cites W3090615085 @default.
- W4384080248 cites W3103802018 @default.
- W4384080248 cites W3105122387 @default.
- W4384080248 cites W3109847748 @default.
- W4384080248 cites W3113075536 @default.
- W4384080248 cites W3148526481 @default.
- W4384080248 cites W3164271417 @default.
- W4384080248 cites W3184838508 @default.
- W4384080248 cites W3187356235 @default.
- W4384080248 cites W3194243671 @default.
- W4384080248 cites W3197055488 @default.
- W4384080248 cites W3204091362 @default.
- W4384080248 cites W3204468048 @default.
- W4384080248 doi "https://doi.org/10.1109/tsp.2023.3292497" @default.
- W4384080248 hasPublicationYear "2023" @default.
- W4384080248 type Work @default.
- W4384080248 citedByCount "0" @default.
- W4384080248 crossrefType "journal-article" @default.
- W4384080248 hasAuthorship W4384080248A5037378130 @default.
- W4384080248 hasAuthorship W4384080248A5074294282 @default.
- W4384080248 hasAuthorship W4384080248A5078480632 @default.
- W4384080248 hasConcept C11413529 @default.
- W4384080248 hasConcept C119857082 @default.
- W4384080248 hasConcept C154945302 @default.
- W4384080248 hasConcept C162324750 @default.
- W4384080248 hasConcept C199360897 @default.
- W4384080248 hasConcept C2777303404 @default.
- W4384080248 hasConcept C2779960059 @default.
- W4384080248 hasConcept C41008148 @default.
- W4384080248 hasConcept C50522688 @default.
- W4384080248 hasConceptScore W4384080248C11413529 @default.
- W4384080248 hasConceptScore W4384080248C119857082 @default.
- W4384080248 hasConceptScore W4384080248C154945302 @default.
- W4384080248 hasConceptScore W4384080248C162324750 @default.
- W4384080248 hasConceptScore W4384080248C199360897 @default.
- W4384080248 hasConceptScore W4384080248C2777303404 @default.
- W4384080248 hasConceptScore W4384080248C2779960059 @default.
- W4384080248 hasConceptScore W4384080248C41008148 @default.
- W4384080248 hasConceptScore W4384080248C50522688 @default.
- W4384080248 hasLocation W43840802481 @default.
- W4384080248 hasOpenAccess W4384080248 @default.
- W4384080248 hasPrimaryLocation W43840802481 @default.
- W4384080248 hasRelatedWork W2028024605 @default.
- W4384080248 hasRelatedWork W2961085424 @default.
- W4384080248 hasRelatedWork W3046775127 @default.
- W4384080248 hasRelatedWork W3170094116 @default.
- W4384080248 hasRelatedWork W4205958290 @default.
- W4384080248 hasRelatedWork W4285260836 @default.
- W4384080248 hasRelatedWork W4286629047 @default.
- W4384080248 hasRelatedWork W4306321456 @default.
- W4384080248 hasRelatedWork W4306674287 @default.
- W4384080248 hasRelatedWork W4224009465 @default.
- W4384080248 hasVolume "71" @default.
- W4384080248 isParatext "false" @default.
- W4384080248 isRetracted "false" @default.
- W4384080248 workType "article" @default.
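To work with this description locally rather than through the endpoint, one option is to dereference the work's IRI with rdflib. This is a sketch under the assumption that the SemOpenAlex entity IRI serves RDF (e.g. Turtle) via content negotiation, which is common for Linked Data services but not confirmed here; if it does not, the data can be fetched from the SPARQL endpoint as in the earlier example.

```python
# Sketch: load the work's RDF description into an rdflib graph and list its triples.
# Assumption: https://semopenalex.org/work/W4384080248 returns an RDF serialization
# via content negotiation; verify before relying on this.
from rdflib import Graph, URIRef

WORK_IRI = URIRef("https://semopenalex.org/work/W4384080248")

g = Graph()
g.parse(str(WORK_IRI))  # rdflib negotiates an RDF serialization when given a URL

print(f"{len(g)} triples loaded")
for p, o in g.predicate_objects(subject=WORK_IRI):
    # Full predicate IRIs appear here instead of the abbreviated
    # property names used in the listing above.
    print(p, o)
```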