17 Articles by Researchers of HSE Faculty of Computer Science Accepted at NeurIPS
In 2022, 17 articles by researchers of the HSE Faculty of Computer Science were accepted at NeurIPS (Conference and Workshop on Neural Information Processing Systems), one of the world’s most prestigious events in the field of machine learning and artificial intelligence. The 36th conference will be held in a hybrid format from November 28th to December 9th in New Orleans (USA).
Every year, the organizers of NeurIPS receive thousands of submissions: in 2021, more than 9,000 papers were submitted, and fewer than a quarter were accepted. Submission statistics for 2022 have not yet been released, but it is reasonable to expect that the number of papers has continued to grow.
Researchers from leading AI companies and research centres are among the organizers and speakers of NeurIPS. The conference traditionally attracts considerable attention and shapes the future direction of AI technology.
Ivan Arzhantsev
‘In 2021, 12 papers by researchers of the HSE Faculty of Computer Science were presented at the conference; in 2022, this number increased to 17. For comparison, in 2021 the organisers accepted 20 articles from Nvidia and 10 from Apple. Among higher education institutions, the success of the HSE Faculty of Computer Science is comparable to that of the following participants: in 2020, the organisers accepted 19 articles from Imperial College London, 13 from the Max Planck Institute for Intelligent Systems, and 9 from the Technical University of Munich.
Such a high assessment of our faculty’s scientific work confirms the exceptional level of research conducted here and the competence of our specialists,’ says Ivan Arzhantsev, Dean of the HSE Faculty of Computer Science.
Dmitry Vetrov
‘Our group is very proud that we have managed to maintain the high quality of our work: this year, as last year, two papers from our laboratory have been accepted. In “Training Scale-Invariant Neural Networks on the Sphere Can Happen in Three Regimes”, we investigate the properties of neural network training, while “HyperDomainNet: Universal Domain Adaptation for Generative Adversarial Networks” covers the training of generative models,’ says Dmitry Vetrov, Head of the Centre of Deep Learning and Bayesian Methods and Academic Supervisor of the AI Centre.
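The first of these papers concerns networks whose loss does not change when the weights are rescaled. As a purely illustrative sketch (not code from the paper), the snippet below checks this property for a linear layer followed by batch normalization: multiplying the weights by a positive constant leaves the output essentially unchanged, which is what allows such networks to be trained on the unit sphere.

```python
# Minimal sketch, not from the paper: a linear layer followed by
# BatchNorm is scale-invariant in its weights, f(alpha * W) == f(W).
import torch
import torch.nn as nn

torch.manual_seed(0)
linear = nn.Linear(16, 8, bias=False)
bn = nn.BatchNorm1d(8, affine=False)  # training mode: normalizes with batch statistics
x = torch.randn(32, 16)

def forward(weight_scale: float) -> torch.Tensor:
    # Multiplying the weights by a constant is the same as multiplying
    # the pre-activations; BatchNorm then divides the constant back out.
    with torch.no_grad():
        return bn(linear(x) * weight_scale)

# Equal up to BatchNorm's eps (1e-5 added to the variance).
print(torch.allclose(forward(1.0), forward(10.0), atol=1e-4))  # True
```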
Alexey Naumov
‘This year, 14 articles were accepted from the staff of our laboratory. Alexander Gasnikov, Sergey Samsonov, Leonid Iosipoi, Daniil Tyapkin, Alexander Beznosikov, Evgeny Lagutin, Maxim Rakhuba, Alexandra Senderovich, Ekaterina Bulatova, and Ekaterina Borodich all played an active role in this. Alexander Gasnikov co-authored 9 articles, placing him at the top of the world ranking by number of accepted papers, alongside such famous scientists as Michael Jordan (10 articles) and Sergey Levine (12 articles). Even a single article at NeurIPS is a significant result that many research teams around the world strive for, and we can undoubtedly be proud of our fourteen accepted papers. These papers are dedicated to reinforcement learning, optimization, efficient data generation methods, and computational algorithms,’ says Alexey Naumov, Head of the International Laboratory of Stochastic Algorithms and High-Dimensional Inference.
Full list of publications by faculty researchers at NeurIPS 2022:
- Towards Practical Computation of Singular Values of Convolutional Layers — Alexandra Senderovich, Ekaterina Bulatova, Maxim Rakhuba
- On Embeddings for Numerical Features in Tabular Deep Learning — Ivan Rubachev, Artem Babenko
- SketchBoost: Fast Gradient Boosted Decision Tree for Multioutput Problems — Leonid Iosipoi
- Local-Global MCMC kernels: the best of both worlds — Sergey Samsonov, Alexey Naumov
- BR-SNIS: Bias Reduced Self-Normalized Importance Sampling — Sergey Samsonov
- Optimistic Posterior Sampling for Reinforcement Learning with Few Samples and Tight Guarantees — Daniil Tyapkin, Alexey Naumov
- Training Scale-Invariant Neural Networks on the Sphere Can Happen in Three Regimes — Maxim Kodryan, Ekaterina Lobacheva, Dmitry Vetrov
- HyperDomainNet: Universal Domain Adaptation for Generative Adversarial Networks — Dmitry Vetrov
- Accelerated Primal-Dual Gradient Method for Smooth and Convex-Concave Saddle-Point Problems with Bilinear Coupling — Alexander Gasnikov
- The First Optimal Acceleration of High-Order Methods in Smooth Convex Optimization — Alexander Gasnikov
- The First Optimal Algorithm for Smooth and Strongly-Convex-Strongly-Concave Minimax Optimization — Alexander Gasnikov
- Clipped Stochastic Methods for Variational Inequalities with Heavy-Tailed Noise — Alexander Gasnikov
- A Damped Newton Method Achieves Global O(1/k^2) and Local Quadratic Convergence Rate — Alexander Gasnikov
- Distributed Methods with Compressed Communication for Solving Variational Inequalities, with Theoretical Guarantees — Michael Diskin, Maxim Ryabinin, Alexander Gasnikov
- Optimal Algorithms for Decentralized Stochastic Variational Inequalities — Alexander Gasnikov
- Optimal Gradient Sliding and its Application to Optimal Distributed Optimization Under Similarity — Alexander Gasnikov
- Decentralized Local Stochastic Extra-Gradient for Variational Inequalities — Alexander Gasnikov