Publications

2025

  1. Wide neural networks trained with weight decay provably exhibit neural collapse
    Arthur Jacot, Peter Súkeník, Zihan Wang, and Marco Mondelli
    In International Conference on Learning Representations, 2025
  2. Neural Collapse is Globally Optimal in Deep Regularized ResNets and Transformers
    Peter Súkeník, Christoph H. Lampert, and Marco Mondelli
    arXiv preprint arXiv:2505.15239, 2025

2024

  1. Neural collapse vs. low-rank bias: Is deep neural collapse really optimal?
    Peter Súkeník, Christoph H. Lampert, and Marco Mondelli
    In Advances in Neural Information Processing Systems, 2024
  2. Average gradient outer product as a mechanism for deep neural collapse
    Daniel Beaglehole, Peter Súkeník, Marco Mondelli, and Misha Belkin
    In Advances in Neural Information Processing Systems, 2024
  3. Generalization in multi-objective machine learning
    Peter Súkeník and Christoph H. Lampert
    Neural Computing and Applications, 2024

2023

  1. Deep neural collapse is provably optimal for the deep unconstrained features model
    Peter Súkeník, Marco Mondelli, and Christoph H. Lampert
    In Advances in Neural Information Processing Systems, 2023

2022

  1. Intriguing Properties of Input-Dependent Randomized Smoothing
    Peter Súkeník, Aleksei Kuvshinov, and Stephan Günnemann
    In Proceedings of the 39th International Conference on Machine Learning, 2022
  2. The unreasonable effectiveness of fully-connected layers for low-data regimes
    Peter Kocsis, Peter Súkeník, Guillem Brasó, Matthias Nießner, et al.
    In Advances in Neural Information Processing Systems, 2022