New paper out

We present a new technique, Basis Learning, that makes Newton-type methods applicable to Federated Learning. The idea is to choose an appropriate basis for representing local Hessians, which can lead to a much better compression mechanism. In particular, choosing the standard basis in the space of square matrices recovers the FedNL method.
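As a rough illustration of the idea (a minimal sketch, not the paper's actual algorithm: the basis, the top-k compressor, and all function names here are assumptions for demonstration), one can expand a local Hessian in an orthonormal basis of symmetric matrices and transmit only the largest coefficients. With the standard basis of matrix units, this reduces to coordinate-wise compression of Hessian entries, in the spirit of FedNL:

```python
import numpy as np

def symmetric_standard_basis(d):
    """Orthonormal (Frobenius) basis for d x d symmetric matrices.

    This is the 'standard' choice; the paper's point is that other
    bases can compress local Hessians better.
    """
    basis = []
    for i in range(d):
        for j in range(i, d):
            B = np.zeros((d, d))
            if i == j:
                B[i, i] = 1.0
            else:
                B[i, j] = B[j, i] = 1.0 / np.sqrt(2.0)
            basis.append(B)
    return basis

def topk_basis_compress(H, basis, k):
    """Expand H in the basis, keep the k largest coefficients, reconstruct."""
    coeffs = np.array([np.sum(H * B) for B in basis])  # Frobenius inner products <H, B>
    keep = np.argsort(np.abs(coeffs))[-k:]             # indices of the top-k coefficients
    return sum(coeffs[i] * basis[i] for i in keep)     # low-communication approximation

d = 4
rng = np.random.default_rng(0)
A = rng.standard_normal((d, d))
H = A @ A.T                      # symmetric PSD stand-in for a local Hessian
basis = symmetric_standard_basis(d)
H_hat = topk_basis_compress(H, basis, k=5)  # send 5 of the 10 coefficients
```

The reconstruction `H_hat` stays symmetric by construction, and keeping all coefficients recovers `H` exactly; a well-chosen basis concentrates the Hessian's mass in few coefficients, so fewer need to be communicated per round.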

Rustem Islamov
Master's student