Peter Súkeník

Institute of Science and Technology Austria (ISTA)


Hello! I am Peter Súkeník, a Ph.D. student at ISTA, proud to be co-supervised by Prof. Marco Mondelli and Prof. Christoph Lampert. I am currently most interested in selected topics in the theory of deep learning. In particular, I am intrigued by structure in learned representations, such as low-rank matrices, neural collapse, or extreme tokens and the related attention sinks. My focus is on understanding how these phenomena are influenced by weight regularization and other key training parameters.

During my PhD, I have worked extensively on the neural collapse phenomenon, producing three NeurIPS papers (1, 2, 3; one of them a spotlight), one ICLR paper (4, an oral presentation), and one preprint.

Before joining ISTA, I completed a master's degree in mathematics at the Technical University of Munich (TUM), focusing strongly on machine learning and probability-and-statistics topics with a bit of a financial flavor. Before that, I earned my bachelor's degree in financial and economic mathematics at Comenius University Bratislava (UK). Way before that, I was born in the beautiful town of Žilina, Slovakia.

Besides research, I have many interests, in particular sports and game design. My favorite sports are competitive math and running, but I also enjoy the gym, ultimate frisbee, cycling, dancing, climbing, hiking, floorball, and many others. In competitive math, I successfully represented Slovakia at the IMO, as well as myself at many other events. Currently, I help organize the Slovak Mathematical Olympiad and math camps, and for many years I organized another math competition. My running PBs are:

  • 5K: 18:43
  • 10K: 38:34
  • Half-marathon: 1:21:56
  • Marathon: TBD!

news

Jun 15, 2025 Will join G-Research as a summer ML research intern.
May 15, 2025 New preprint on end-to-end optimality of neural collapse in regularized ResNets and transformers.
Apr 15, 2025 ICLR oral with Arthur Jacot, Zihan Wang, and Marco Mondelli on end-to-end emergence of neural collapse in regularized MLPs.

selected publications

  1. Neural collapse vs. low-rank bias: Is deep neural collapse really optimal?
    Peter Súkeník, Christoph H. Lampert, and Marco Mondelli
    In Advances in Neural Information Processing Systems, 2024
  2. Deep neural collapse is provably optimal for the deep unconstrained features model
    Peter Súkeník, Marco Mondelli, and Christoph H. Lampert
    In Advances in Neural Information Processing Systems, 2023
  3. Neural Collapse is Globally Optimal in Deep Regularized ResNets and Transformers
    Peter Súkeník, Christoph H. Lampert, and Marco Mondelli
    arXiv preprint arXiv:2505.15239, 2025