
Optimal approximation using complex-valued neural networks

Title details

Geuchen, Paul ; Voigtlaender, Felix:
Optimal approximation using complex-valued neural networks.
2023
Event: NeurIPS 2023.
(Conference contribution: congress/conference/symposium/meeting, moderation/chair)

Full text

Link to full text (external URL):
https://arxiv.org/abs/2303.16813

Abstract

Complex-valued neural networks (CVNNs) have recently shown promising empirical success, for instance for increasing the stability of recurrent neural networks and for improving the performance in tasks with complex-valued inputs, such as MRI fingerprinting. While the overwhelming success of Deep Learning in the real-valued case is supported by a growing mathematical foundation, such a foundation is still largely lacking in the complex-valued case. We thus analyze the expressivity of CVNNs by studying their approximation properties. Our results yield the first quantitative approximation bounds for CVNNs that apply to a wide class of activation functions, including the popular modReLU and complex cardioid activation functions. Precisely, our results apply to any activation function that is smooth but not polyharmonic on some non-empty open set; this is the natural generalization of the class of smooth and non-polynomial activation functions to the complex setting. Our main result shows that the error for the approximation of C^k-functions scales as m^{-k/(2n)} for m → ∞, where m is the number of neurons, k the smoothness of the target function, and n the (complex) input dimension. Under a natural continuity assumption, we show that this rate is optimal; we further discuss the optimality when dropping this assumption. Moreover, we prove that the problem of approximating C^k-functions using continuous approximation methods unavoidably suffers from the curse of dimensionality.
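The abstract names the modReLU and complex cardioid activation functions. As a minimal NumPy sketch of their standard published definitions (the function names and the bias value b below are illustrative, not taken from the paper): modReLU applies a ReLU to the modulus of z shifted by a bias while preserving the phase, and the complex cardioid scales z by a phase-dependent factor.

```python
import numpy as np

def modrelu(z, b):
    # modReLU: ReLU on the modulus |z| shifted by bias b, phase preserved.
    m = np.abs(z)
    scale = np.maximum(m + b, 0.0)
    # guard against division by zero at the origin
    return np.where(m > 0, scale * z / np.where(m > 0, m, 1.0), 0.0)

def cardioid(z):
    # complex cardioid: scales z by (1 + cos(arg z)) / 2, a factor in [0, 1];
    # on the real line this reduces to the ordinary ReLU.
    return 0.5 * (1.0 + np.cos(np.angle(z))) * z
```

Note that both functions are smooth but not polyharmonic away from isolated points, which is why they fall under the class of activation functions covered by the paper's bounds.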

Further information

Publication type: Conference contribution (unpublished): congress/conference/symposium/meeting, moderation/chair
Language of entry: English
Institutions of the university: Mathematisch-Geographische Fakultät > Mathematik > Lehrstuhl für Mathematik - Reliable Machine Learning
Mathematisch-Geographische Fakultät > Mathematik > Mathematisches Institut für Maschinelles Lernen und Data Science (MIDS)
DOI / URN / ID: 10.48550/arXiv.2303.16813
Title created at the KU: Yes
KU.edoc-ID: 34360
Deposited on: 22 Jan 2025 11:39
Last modified: 23 Jan 2025 12:07
URL for this record: https://edoc.ku.de/id/eprint/34360/