In our new paper we prove that a class of so-called three-point compressors can be applied to Newton's method. This result opens up a wide variety of communication strategies, such as combining contractive compression with lazy aggregation, for compressing prohibitively costly curvature information.
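As a toy illustration (not code from the paper), a standard example of a contractive compressor is Top-K, which keeps only the largest-magnitude entries of its input. The hedged sketch below, with names of my own choosing, applies such a compressor to a flattened Hessian-like matrix and checks the contraction property empirically.

```python
import numpy as np

def top_k(x: np.ndarray, k: int) -> np.ndarray:
    """Top-K compressor: keep the k largest-magnitude entries, zero the rest.

    It is contractive: ||top_k(x) - x||^2 <= (1 - k/d) ||x||^2 for x in R^d.
    """
    out = np.zeros_like(x)
    idx = np.argpartition(np.abs(x), -k)[-k:]  # indices of the k largest |entries|
    out[idx] = x[idx]
    return out

# Compress a symmetric "Hessian-like" matrix by flattening it.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
H = (A + A.T) / 2
d = H.size
k = 4
H_compressed = top_k(H.ravel(), k).reshape(H.shape)

# Verify the contraction bound for this instance.
err = np.linalg.norm(H_compressed - H) ** 2
bound = (1 - k / d) * np.linalg.norm(H) ** 2
assert err <= bound + 1e-12
```

The contraction bound holds deterministically: the discarded entries are the `d - k` smallest in magnitude, so their squared mass is at most a `(d - k)/d` fraction of the total.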

I am very proud to announce that our FedNL paper was accepted to the 39th International Conference on Machine Learning (ICML 2022).

Currently I am in Lausanne, where I have started an internship at the Machine Learning and Optimization Lab at EPFL, run by Prof. Martin Jaggi. I will work under the supervision of postdoctoral fellow Hadrien Hendrikx and Prof. Martin Jaggi.

Our paper Basis Matters: Better Communication-Efficient Second Order Methods for Federated Learning, joint work with Xun Qian, Mher Safaryan, and Peter Richtárik, was accepted to AISTATS 2022.

We present a new technique, Basis Learning, for making Newton-type methods applicable to Federated Learning. The idea is to choose an appropriate basis for representing the local Hessians, which can lead to much better compression. In particular, choosing the standard basis in the space of square matrices recovers the FedNL method.
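To make the basis idea concrete, here is a hedged toy sketch (the function names and details are my own, not the paper's): represent the local Hessian by its coefficients in an orthonormal basis, apply Top-K compression to those coefficients, and map back. With the identity as the basis, this reduces to entrywise compression of the Hessian, and a basis aligned with the Hessian's eigenvectors makes the coefficients exactly sparse.

```python
import numpy as np

def compress_in_basis(H: np.ndarray, Q: np.ndarray, k: int) -> np.ndarray:
    """Compress a symmetric matrix H via Top-K on its coefficients in basis Q.

    Q is orthogonal; the coefficient matrix is C = Q^T H Q. With Q = I
    (the standard basis) this is plain entrywise Top-K compression of H.
    """
    C = (Q.T @ H @ Q).ravel()            # coefficients of H in the basis Q
    out = np.zeros_like(C)
    idx = np.argpartition(np.abs(C), -k)[-k:]
    out[idx] = C[idx]
    return Q @ out.reshape(H.shape) @ Q.T  # map compressed coefficients back

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 5))
H = (A + A.T) / 2

# If Q diagonalizes H, the coefficients are (numerically) diagonal, so a
# budget of k = 5 coefficients recovers the 5x5 matrix H almost exactly.
_, Q = np.linalg.eigh(H)
exact = np.allclose(compress_in_basis(H, Q, k=5), H)

# In the standard basis the same budget keeps only 5 of 25 entries.
H_std = compress_in_basis(H, np.eye(5), k=5)
```

The point of the sketch is only that the compression error depends heavily on the basis; how to learn a good basis efficiently is the subject of the paper.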

I have finally arrived in Paris, where I am going to study at Institut Polytechnique de Paris under the supervision of Olivier Fercoq. My first year will follow the Applied Mathematics and Statistics program.

Our FedNL paper has been accepted to two ICML workshops: the International Workshop on Federated Learning for User Privacy and Data Confidentiality, and Beyond First-Order Methods in ML Systems. The paper will be presented as a poster at the first workshop and as a prerecorded video at the second.

I am attending the PRAIRIE/MIAI Artificial Intelligence Summer School (PAISS 2021), held virtually this year. The summer school is organized by PRAIRIE, MIAI, NAVER LABS Europe, and INRIA, and comprises lectures by renowned experts in different areas of artificial intelligence. As part of the summer school, I presented a poster on our FedNL paper.

I have given a talk based on our paper Distributed Second Order Methods with Fast Rates and Compressed Communication at the Maths&AI: MIPT-UGA young researchers workshop. The recorded video is available on YouTube. Here you can find the slides that I prepared for the talk.

I am very happy to announce that I defended my Bachelor's thesis yesterday, completing my undergraduate studies at MIPT. I am now planning to continue my education at Institut Polytechnique de Paris.

We propose a family of Federated Newton Learn (FedNL) methods, which we believe mark a step toward making second-order methods applicable to FL. A variety of numerical experiments show that our FedNL methods achieve state-of-the-art communication complexity compared to key baselines.

Our paper Distributed Second Order Methods with Fast Rates and Compressed Communication has been accepted to ICML 2021, which will be run virtually during July 18-24, 2021.

I have presented our recent work Distributed Second Order Methods with Fast Rates and Compressed Communication at the KAUST Conference on Artificial Intelligence. The recorded video is available on YouTube.

Peter Richtárik has given a talk on our recent paper, Distributed Second Order Methods with Fast Rates and Compressed Communication, at the All-Russian Optimization Seminar. The recorded video will be available soon on the seminar's YouTube channel. In addition, I presented a poster on the paper at the Communication Efficient Distributed Optimization workshop.

I have received one more internship at the Machine Learning and Optimization Laboratory run by Prof. Peter Richtárik. We will continue our work on second-order methods for distributed optimization.

We developed new communication-efficient second-order methods with fast convergence rates. Our theory is supported by experiments on real datasets, which show several orders of magnitude improvement over baseline and state-of-the-art methods in terms of communication complexity. You can find more information about the paper by clicking on the link.