Title details
Geuchen, Paul; Stöger, Dominik; Telaar, Thomas; Voigtlaender, Felix:
Upper and lower bounds for the Lipschitz constant of random neural networks.
In: Information and Inference: A Journal of the IMA 14 (2025) 2: iaaf009.
ISSN 2049-8772; 2049-8764
Full text
Link to full text (external URL): https://doi.org/10.1093/imaiai/iaaf009
Abstract
Empirical studies have widely demonstrated that neural networks are highly sensitive to small, adversarial perturbations of the input. The worst-case robustness against these so-called adversarial examples can be quantified by the Lipschitz constant of the neural network. In this paper, we study upper and lower bounds for the Lipschitz constant of random ReLU neural networks. Specifically, we assume that the weights and biases follow a generalization of the He initialization, where general symmetric distributions for the biases are permitted. For deep networks of fixed depth and sufficiently large width, our established upper bound is larger than the lower bound by a factor that is logarithmic in the width. In contrast, for shallow neural networks we characterize the Lipschitz constant up to an absolute numerical constant that is independent of all parameters.
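The Lipschitz constant studied in the paper can be probed numerically: for a ReLU network, the Euclidean norm of the gradient at any input lower-bounds the global Lipschitz constant, so sampling inputs and taking the maximum gradient norm yields an empirical lower bound. The following sketch (an illustration, not the paper's method; network sizes and the zero-bias choice are assumptions) initializes a shallow ReLU network with He-style weights and estimates such a bound:

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 50, 200  # input dimension and hidden width (illustrative values)

# He-style initialization: Gaussian weights with variance 2 / fan_in;
# biases are drawn from a symmetric distribution (here simply zero).
W1 = rng.normal(0.0, np.sqrt(2.0 / d), size=(n, d))
b1 = np.zeros(n)
w2 = rng.normal(0.0, np.sqrt(2.0 / n), size=n)

def grad_norm(x):
    """Norm of the network gradient at x.

    For f(x) = w2 @ relu(W1 x + b1), the gradient (where defined) is
    w2 @ diag(1[W1 x + b1 > 0]) @ W1; its norm lower-bounds Lip(f).
    """
    active = (W1 @ x + b1 > 0).astype(float)
    return np.linalg.norm((w2 * active) @ W1)

# Maximum over random inputs gives an empirical lower bound on Lip(f).
samples = rng.normal(size=(1000, d))
lower_bound = max(grad_norm(x) for x in samples)
print(f"empirical Lipschitz lower bound: {lower_bound:.3f}")
```

Because a ReLU network is piecewise linear, its exact Lipschitz constant is the maximum gradient norm over all activation patterns; random sampling only explores a subset of patterns, which is why this simple estimate is a lower bound rather than the exact value.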
Further details
| Publication type: | Article |
|---|---|
| Language of entry: | English |
| University institutions: | Mathematisch-Geographische Fakultät > Mathematik > Lehrstuhl für Mathematik - Reliable Machine Learning; Mathematisch-Geographische Fakultät > Mathematik > Juniorprofessur für Data Science; Mathematisch-Geographische Fakultät > Mathematik > Mathematisches Institut für Maschinelles Lernen und Data Science (MIDS) |
| DOI / URN / ID: | 10.1093/imaiai/iaaf009 |
| Peer-reviewed journal: | Yes |
| Publisher: | Oxford Univ. Press |
| The journal is indexed in: | |
| Title produced at the KU: | Yes |
| KU.edoc ID: | 35001 |
Last modified: 22 Apr 2025 14:15
URL of this record: https://edoc.ku.de/id/eprint/35001/