Weighted universal approximation of differentiable maps on infinite-dimensional manifolds
We generalize the universal approximation theorem (UAT) for functional input neural networks (FNNs) to differentiable maps by including the approximation of the derivatives. An FNN maps the input from a possibly infinite-dimensional weighted manifold to a real-valued hidden layer, on which a non-linear scalar activation function is applied, and then returns the output into a Banach space via linear readouts. By lifting Wiener's Tauberian theorem to infinite-dimensional spaces via the (bounded) approximation property, we establish a UAT for differentiable maps that goes beyond the usual formulation on compact sets and also covers the approximation of the derivatives. This yields approximation results for non-anticipative functionals, including their horizontal and vertical derivatives. Furthermore, we use our weighted UAT to extend the Nachbin theorem to weighted infinite-dimensional manifolds. As an application, we show that linear functions of the signature can approximate path-space functionals together with their directional derivatives.
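To make the signature application concrete, here is a minimal numpy sketch (an illustration, not the talk's construction): it fits a linear function of the level-2 truncated signature to a simple path-space functional by least squares. The helper name sig_level2, the path family, and the target functional F(x) = ∫ x² dt are illustrative assumptions; the result itself concerns the full signature on weighted path spaces and also covers directional derivatives.

```python
import numpy as np

def sig_level2(path):
    """Level-2 truncated signature of a piecewise-linear path via Chen's relation.

    path: array of shape (n_points, d); returns the flat feature vector
    (1, S^1, S^2) of length 1 + d + d**2.
    """
    inc = np.diff(path, axis=0)        # segment increments Delta_k
    d = path.shape[1]
    s2 = np.zeros((d, d))
    run = np.zeros(d)                  # level-1 signature accumulated so far
    for dk in inc:
        # Chen: cross terms with the past plus the within-segment integral
        s2 += np.outer(run, dk) + 0.5 * np.outer(dk, dk)
        run += dk
    return np.concatenate(([1.0], run, s2.ravel()))

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 50)

# A one-parameter family of 2-d paths x_t = (t, sin(a t))
paths = [np.stack([t, np.sin(a * t)], axis=1) for a in rng.uniform(1.0, 5.0, 200)]

def F(p):
    # Nonlinear target functional F(x) = int_0^1 x_t^2 dt (trapezoid rule)
    y = p[:, 1] ** 2
    return float(np.sum(0.5 * (y[:-1] + y[1:]) * np.diff(t)))

X = np.stack([sig_level2(p) for p in paths])   # signature features
y = np.array([F(p) for p in paths])
w, *_ = np.linalg.lstsq(X, y, rcond=None)      # fit the linear readout
print("max training error:", np.abs(X @ w - y).max())
```

Raising the truncation level enlarges the feature space and, in this hedged toy setting, shrinks the fitting error, reflecting the density of linear signature functionals asserted in the talk.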
Bio: Philipp Schmocker is an applied mathematician from Switzerland. His research lies at the intersection of machine learning, stochastic analysis, and mathematical finance, with a particular focus on the theoretical foundations and applications of neural networks. Philipp received his Ph.D. in Mathematics (2025) from Nanyang Technological University (NTU), Singapore, and also holds a Ph.D. in Economics and Finance (2022) from the University of St. Gallen.

