p2pfl.learning.aggregators.fedopt.fedadagrad module
FedAdagrad Aggregator - Adaptive Federated Optimization using Adagrad.
- class p2pfl.learning.aggregators.fedopt.fedadagrad.FedAdagrad(eta=0.1, beta_1=0.9, tau=1e-09, disable_partial_aggregation=False)
Bases: FedOptBase
FedAdagrad - Adaptive Federated Optimization using Adagrad [Reddi et al., 2020].
FedAdagrad adapts the Adagrad optimizer to federated settings, maintaining adaptive learning rates on the server side based on accumulated squared gradients.
Paper: https://arxiv.org/abs/2003.00295
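For reference, the server-side update performed each round is sketched below in the paper's notation (this follows Reddi et al., 2020, not the p2pfl source, so bookkeeping details may differ). Here Δ_t is the averaged client update, and the constructor arguments eta, beta_1, and tau correspond to η, β₁, and τ:

```latex
\begin{align*}
  \Delta_t &= \frac{1}{|S|} \sum_{i \in S} \left( x_t^i - x_t \right)
    && \text{averaged client update (pseudo-gradient)} \\
  m_t &= \beta_1\, m_{t-1} + (1 - \beta_1)\, \Delta_t
    && \text{server momentum} \\
  v_t &= v_{t-1} + \Delta_t^2
    && \text{Adagrad accumulator (element-wise square)} \\
  x_{t+1} &= x_t + \eta\, \frac{m_t}{\sqrt{v_t} + \tau}
    && \text{adaptive server step}
\end{align*}
```

Because v_t only accumulates, coordinates that receive large updates early see their effective step size shrink over time, which is the Adagrad behavior the class name refers to.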
- Parameters:
  - eta (float) – Server-side learning rate η (default 0.1).
  - beta_1 (float) – Momentum parameter β₁ for the server update (default 0.9).
  - tau (float) – Adaptivity parameter τ; a small constant that also keeps the update denominator away from zero (default 1e-09).
  - disable_partial_aggregation (bool) – If True, disable the aggregator's partial aggregation support (default False).
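A minimal usage sketch, grounded only in the constructor signature above; how the aggregator is attached to a node varies by p2pfl version, so the node wiring is left as a comment rather than asserted:

```python
from p2pfl.learning.aggregators.fedopt.fedadagrad import FedAdagrad

# Instantiate with the defaults from the signature above. A small tau keeps
# the denominator sqrt(v_t) + tau well-conditioned while the squared-update
# accumulator v_t is still near zero.
aggregator = FedAdagrad(eta=0.1, beta_1=0.9, tau=1e-9)

# The aggregator is then handed to a p2pfl node at construction time; the
# exact Node constructor and argument name depend on the installed p2pfl
# version (assumption - consult the library's quickstart for the wiring).
```

Raising eta makes the server take larger steps per round, while raising tau dampens the adaptivity, pushing the behavior closer to plain FedAvg with momentum.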