p2pfl.learning.aggregators.fedadam module

FedAdam Aggregator - Adaptive Federated Optimization using Adam.

class p2pfl.learning.aggregators.fedadam.FedAdam(eta=0.1, eta_l=0.1, beta_1=0.9, beta_2=0.99, tau=1e-09, disable_partial_aggregation=False)[source]

Bases: Aggregator

FedAdam - Adaptive Federated Optimization using Adam [Reddi et al., 2020].

FedAdam adapts the Adam optimizer to federated settings, maintaining both momentum and adaptive learning rates on the server side.

Paper: https://arxiv.org/abs/2003.00295
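To make the server-side update concrete, here is a minimal NumPy sketch of the FedAdam rule from the paper, assuming clients send model deltas that the server averages; the function name `fedadam_step` and its flat-array state are illustrative, not the p2pfl API:

```python
import numpy as np

def fedadam_step(x, m, v, client_deltas, eta=0.1, beta_1=0.9, beta_2=0.99, tau=1e-9):
    """One FedAdam server step [Reddi et al., 2020] on flat parameter arrays.

    Illustrative sketch only -- not the p2pfl implementation.
    """
    delta = np.mean(client_deltas, axis=0)      # average the client updates
    m = beta_1 * m + (1 - beta_1) * delta       # server-side momentum (first moment)
    v = beta_2 * v + (1 - beta_2) * delta**2    # adaptive term (second moment)
    x = x + eta * m / (np.sqrt(v) + tau)        # tau keeps the step numerically stable
    return x, m, v
```

Because the momentum and second-moment buffers live on the server, clients run plain local SGD (with learning rate eta_l) and only the aggregation step changes relative to FedAvg.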

SUPPORTS_PARTIAL_AGGREGATION: bool = False
aggregate(*args: Any, **kwargs: Any) → Any
Return type:

Any