p2pfl.learning.aggregators.scaffold module

SCAFFOLD Aggregator.

class p2pfl.learning.aggregators.scaffold.Scaffold(global_lr=0.1, disable_partial_aggregation=False)[source]

Bases: WeightAggregator

SCAFFOLD Aggregator [Karimireddy et al., 2019].

Inherits from WeightAggregator as SCAFFOLD works with neural network weight tensors.

Paper: https://arxiv.org/pdf/1910.06378

The aggregator acts like the server in centralized learning, handling both model and control variate updates.

Because the environment is fully decentralized, the aggregator also maintains a copy of the global model, which consumes additional bandwidth.
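The server-side step this aggregator performs can be sketched as follows. This is an illustrative NumPy implementation of the SCAFFOLD update from the paper (assuming full client participation, so the |S|/N factor is 1), not p2pfl's actual code:

```python
import numpy as np

def scaffold_server_step(global_model, c, deltas_y, deltas_c, global_lr=0.1):
    """Illustrative SCAFFOLD server update (Karimireddy et al., 2019, Alg. 1).

    global_model: list of layer weight arrays (the maintained global model).
    c: server control variate, same shapes as global_model.
    deltas_y / deltas_c: per-client lists of delta_y_i / delta_c_i.
    Assumes full participation, so the |S|/N factor from the paper is 1.
    """
    n = len(deltas_y)
    # x <- x + global_lr * mean_i(delta_y_i)
    new_model = [x + global_lr * sum(dy[k] for dy in deltas_y) / n
                 for k, x in enumerate(global_model)]
    # c <- c + mean_i(delta_c_i)
    new_c = [ck + sum(dc[k] for dc in deltas_c) / n
             for k, ck in enumerate(c)]
    return new_model, new_c
```

Note how `global_lr` only scales the model update, not the control variate, matching the role of the `global_lr` constructor parameter.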

Note

This aggregator requires the scaffold callback to be registered. The callback computes delta_y_i (model update) and delta_c_i (control variate update) which are passed via additional_info. See ScaffoldCallback for PyTorch/TensorFlow implementations.
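As a rough sketch of what such a callback computes, the paper's Option II control-variate update can be written as below. The function name, argument layout, and return shape are illustrative, not the ScaffoldCallback API; only the `delta_y_i`/`delta_c_i` keys match the aggregator's required info keys:

```python
import numpy as np

def compute_client_deltas(x, y_i, c, c_i, local_steps, local_lr):
    """Illustrative delta_y_i / delta_c_i computation (paper Alg. 1, Option II).

    x: global weights before local training; y_i: weights after local training.
    c / c_i: server and client control variates (same layer shapes as x).
    """
    # Option II: c_i+ = c_i - c + (x - y_i) / (K * local_lr)
    c_i_new = [ci - cg + (xg - yg) / (local_steps * local_lr)
               for xg, yg, cg, ci in zip(x, y_i, c, c_i)]
    delta_y_i = [yg - xg for xg, yg in zip(x, y_i)]        # model update
    delta_c_i = [cn - ci for cn, ci in zip(c_i_new, c_i)]  # control variate update
    return {"delta_y_i": delta_y_i, "delta_c_i": delta_c_i}
```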

Todo

Improve efficiency by sharing the global model only every n rounds.

Parameters:
  • global_lr (float) – Global learning rate applied to the aggregated model updates.

  • disable_partial_aggregation (bool) – Whether to disable partial aggregation.

REQUIRED_INFO_KEYS = ['delta_y_i', 'delta_c_i']
SUPPORTS_PARTIAL_AGGREGATION: bool = False
addr: str
c: list[ndarray]
get_required_callbacks()[source]

Retrieve the list of required callback keys for this aggregator.

Return type:

list[str]
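A node would typically use the returned keys for a pre-flight check before training starts. A minimal sketch, assuming a hypothetical helper (the exact key names depend on the registered callbacks):

```python
def assert_callbacks_registered(required, registered):
    """Hypothetical pre-flight check: fail fast if a required callback is missing.

    `required` would come from aggregator.get_required_callbacks();
    `registered` is whatever callback keys the node has installed.
    """
    missing = sorted(set(required) - set(registered))
    if missing:
        raise RuntimeError(f"Aggregator requires unregistered callbacks: {missing}")
```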

global_model_params: list[ndarray]
partial_aggregation: bool