Education

PhD Candidate in Computer Science

UniBas, Department of Mathematics and Computer Science

Oct. 2023 – Present | Basel, Switzerland
Supervisor: Aurelien Lucchi

Master of Science in Data Science

IP Paris, Department of Applied Mathematics

Sept. 2021 – Aug. 2023 | GPA: 17.65/20 | Palaiseau, France
Thesis: Unified Analysis of Asynchronous Algorithms
Thesis Supervisors: Mher Safaryan, Dan Alistarh

Bachelor of Science in Applied Mathematics and Physics

MIPT, Phystech School of Applied Mathematics and Informatics

Sept. 2017 – Jul. 2021 | GPA: 4.95/5 (9.27/10) | Dolgoprudny, Russia
Thesis: Distributed Second Order Methods with Fast Rates and Compressed Communication
Thesis Supervisor: Peter Richtárik

Recent Posts

I am happy to announce that I have a paper accepted to NeurIPS 2024 on a novel class of functions that neural networks represent. In contrast to previous works, this class contains functions with saddles and local minima. This is joint work with Niccolò Ajroldi, Antonio Orvieto, and Aurelien Lucchi. It will soon be available on arXiv.

New paper on decentralized training with contractive compressed communication: we achieve near-optimal convergence guarantees using Error Feedback and momentum tracking.

I attended the KAUST Rising Stars in AI Symposium organized by the KAUST AI Initiative. As part of the symposium, I gave a talk on the AsGrad project.

I am thrilled to announce that the AsGrad paper has been accepted to AISTATS 2024. In this paper, we propose a unified framework for analyzing asynchronous SGD-type algorithms that covers many existing variants.

I am happy to announce that the EControl paper has been accepted to ICLR 2024. In this paper, we develop a new Error Feedback mechanism that properly handles the bias coming from compression and the noise of stochastic gradients.

Contact