p2pfl.learning.aggregators.fedopt.fedadam module
FedAdam Aggregator - Adaptive Federated Optimization using Adam.
- class p2pfl.learning.aggregators.fedopt.fedadam.FedAdam(eta=0.1, beta_1=0.9, beta_2=0.99, tau=1e-09, disable_partial_aggregation=False)
Bases: FedOptBase
FedAdam - Adaptive Federated Optimization using Adam [Reddi et al., 2020].
FedAdam adapts the Adam optimizer to the federated setting: the server treats the aggregated client updates as a pseudo-gradient and maintains both a first-moment (momentum) estimate and a second-moment estimate for adaptive learning rates on the server side.
Paper: https://arxiv.org/abs/2003.00295
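For reference, a minimal NumPy sketch of the server-side update described in the paper (Reddi et al., 2020). This is an illustration of the algorithm, not p2pfl's internal implementation; the function and variable names are ours:

```python
import numpy as np

def fedadam_step(x, pseudo_grad, m, v, eta=0.1, beta_1=0.9, beta_2=0.99, tau=1e-9):
    """One server-side FedAdam round (illustrative sketch).

    x: current global model parameters
    pseudo_grad: average of the client updates for this round
    m, v: first- and second-moment estimates carried across rounds
    """
    m = beta_1 * m + (1 - beta_1) * pseudo_grad       # momentum estimate
    v = beta_2 * v + (1 - beta_2) * pseudo_grad ** 2  # adaptive (second-moment) term
    x = x + eta * m / (np.sqrt(v) + tau)              # tau guards against division by zero
    return x, m, v
```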
- Parameters:
  - eta (float) – Server-side learning rate. Defaults to 0.1.
  - beta_1 (float) – Exponential decay rate for the first-moment (momentum) estimate. Defaults to 0.9.
  - beta_2 (float) – Exponential decay rate for the second-moment estimate. Defaults to 0.99.
  - tau (float) – Adaptivity constant added to the denominator for numerical stability. Defaults to 1e-09.
  - disable_partial_aggregation (bool) – If True, disables partial aggregation. Defaults to False.
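
A minimal usage sketch based on the constructor signature above; the values are illustrative, and how the aggregator is wired into a node depends on the rest of the p2pfl API:

```python
from p2pfl.learning.aggregators.fedopt.fedadam import FedAdam

# Constructor arguments as documented above; values are illustrative.
aggregator = FedAdam(
    eta=0.01,     # server learning rate
    beta_1=0.9,   # first-moment decay
    beta_2=0.99,  # second-moment decay
    tau=1e-9,     # numerical-stability constant
)
```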