p2pfl.learning.aggregators.scaffold module
SCAFFOLD Aggregator.
- class p2pfl.learning.aggregators.scaffold.Scaffold(global_lr=0.1, disable_partial_aggregation=False)
Bases: WeightAggregator

SCAFFOLD Aggregator [Karimireddy et al., 2019]. Inherits from WeightAggregator, as SCAFFOLD works with neural network weight tensors.

Paper: https://arxiv.org/pdf/1910.06378
The aggregator acts like the server in centralized learning, handling both model and control variate updates.
Because the environment is fully decentralized, the aggregator also maintains a global model; this consumes additional bandwidth.
Note

This aggregator requires the scaffold callback to be registered. The callback computes delta_y_i (the model update) and delta_c_i (the control-variate update), which are passed via additional_info. See ScaffoldCallback for the PyTorch/TensorFlow implementations.

Todo

Improve efficiency by sharing the global model only every n rounds.
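The server-side role described above can be sketched with plain NumPy. This is a hedged illustration of the SCAFFOLD aggregation rule from the paper, not p2pfl's actual implementation: the function name scaffold_server_update and its argument layout are assumptions for the example, and full client participation is assumed (so the |S|/N sampling factor is 1).

```python
import numpy as np


def scaffold_server_update(x, c, delta_y, delta_c, global_lr=0.1):
    """One SCAFFOLD server-side round (sketch, full participation).

    x        -- global model: list of weight tensors
    c        -- server control variates: tensors with the same shapes as x
    delta_y  -- per-client model updates (delta_y_i), one list of tensors per client
    delta_c  -- per-client control-variate updates (delta_c_i), same structure
    """
    n_clients = len(delta_y)
    new_x, new_c = [], []
    for layer in range(len(x)):
        mean_dy = sum(dy[layer] for dy in delta_y) / n_clients
        mean_dc = sum(dc[layer] for dc in delta_c) / n_clients
        # x <- x + global_lr * mean(delta_y_i)
        new_x.append(x[layer] + global_lr * mean_dy)
        # c <- c + mean(delta_c_i)  (the |S|/N factor is 1 here)
        new_c.append(c[layer] + mean_dc)
    return new_x, new_c
```

The global_lr factor corresponds to the constructor's global_lr parameter: it scales how far the global model moves along the averaged client updates each round.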
- Parameters:
  - global_lr (float)
  - disable_partial_aggregation (bool)
- REQUIRED_INFO_KEYS = ['delta_y_i', 'delta_c_i']
- SUPPORTS_PARTIAL_AGGREGATION: bool = False
- addr: str
- c: list[ndarray]
- get_required_callbacks()
Retrieve the list of required callback keys for this aggregator.
- Return type: list[str]
- global_model_params: list[ndarray]
- partial_aggregation: bool