p2pfl.learning.aggregators.fedadagrad module
FedAdagrad Aggregator - Adaptive Federated Optimization using Adagrad.
- class p2pfl.learning.aggregators.fedadagrad.FedAdagrad(eta=0.1, eta_l=0.1, tau=1e-09, disable_partial_aggregation=False)[source]
Bases: Aggregator
FedAdagrad - Adaptive Federated Optimization using Adagrad [Reddi et al., 2020].
FedAdagrad adapts the Adagrad optimizer to federated settings, maintaining adaptive learning rates on the server side based on accumulated squared gradients.
Paper: https://arxiv.org/abs/2003.00295
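The server-side rule from the paper can be illustrated with a minimal NumPy sketch: the server averages the client deltas, accumulates their squares Adagrad-style, and applies an adaptively scaled step. Function and variable names here are illustrative, not part of the p2pfl API; `eta` and `tau` mirror the constructor defaults above.

```python
import numpy as np

def fedadagrad_server_update(x, v, client_updates, eta=0.1, tau=1e-9):
    """One server-side FedAdagrad step (sketch; names are illustrative).

    x: current global model parameters
    v: accumulated squared deltas (server optimizer state)
    client_updates: list of per-client model deltas for this round
    """
    delta = np.mean(client_updates, axis=0)   # average the client deltas
    v = v + delta ** 2                        # Adagrad accumulator (monotonically growing)
    x = x + eta * delta / (np.sqrt(v) + tau)  # adaptive server step; tau avoids division by zero
    return x, v
```

Because `v` only grows, the effective server learning rate shrinks over rounds, which is the Adagrad behavior the aggregator transfers to the federated setting.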
- SUPPORTS_PARTIAL_AGGREGATION: bool = False
- aggregate(*args: Any, **kwargs: Any) → Any
Return type: Any