Tensor-formatted numerical (multi)linear algebra has emerged over the past five years as a major computational technique in nearly all areas of scientific computing; examples include data mining, computational chemistry, high-dimensional statistics, and the numerical solution of PDEs on high-dimensional state and parameter spaces.
Deep Learning, after fundamental developments in the 1990s, is currently experiencing a phenomenal revival, with major impact on pattern recognition, data classification, autonomous driving, knowledge discovery, the simulation of complex physical phenomena, the game of Go, and more.
The seminar addresses recently discovered mathematical connections between Deep Learning and tensor-formatted numerical analysis, with particular attention to the numerical solution of partial differential equations with random input data. Recent publications on high-dimensional approximation and on tensor products in Banach spaces of random fields will be covered.
The aim of the seminar is to review recent (2015 onward) research work and results, together with recently published software such as the TT-Toolbox and Google's TensorFlow. The focus is on the mathematical analysis and interpretation of current learning approaches and related mathematical and technical fields, e.g. high-dimensional approximation and tensor-structured numerical methods for the numerical solution of high-dimensional PDEs, with applications in computational Uncertainty Quantification (UQ).
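To give a flavour of the tensor-train (TT) format underlying the TT-Toolbox, the following is a minimal Python/NumPy sketch of TT compression by sequential truncated SVDs (the TT-SVD algorithm of Oseledets, 2011). The function name tt_svd, the tolerance parameter eps, and the usage at the bottom are illustrative assumptions for this sketch and are not the TT-Toolbox API.

    import numpy as np

    def tt_svd(tensor, eps=1e-12):
        """Illustrative sketch: decompose a full tensor into tensor-train
        (TT) cores by sequential truncated SVDs."""
        shape = tensor.shape
        d = len(shape)
        cores = []
        rank = 1
        mat = tensor.reshape(shape[0], -1)
        for k in range(d - 1):
            U, s, Vt = np.linalg.svd(mat, full_matrices=False)
            # Keep singular values above a relative tolerance; this
            # controls the TT ranks, i.e. storage versus accuracy.
            r_new = max(1, int(np.sum(s > eps * s[0])))
            cores.append(U[:, :r_new].reshape(rank, shape[k], r_new))
            mat = (s[:r_new, None] * Vt[:r_new]).reshape(r_new * shape[k + 1], -1)
            rank = r_new
        cores.append(mat.reshape(rank, shape[-1], 1))
        return cores

    # Usage: compress a small random tensor, then reconstruct it by
    # contracting successive cores along their shared rank indices.
    T = np.random.rand(4, 5, 6)
    cores = tt_svd(T)
    approx = cores[0]
    for core in cores[1:]:
        approx = np.tensordot(approx, core, axes=([-1], [0]))
    print(np.allclose(T, approx.squeeze()))  # True up to the tolerance eps

The point of the format is that a d-dimensional tensor with n entries per mode is stored as d third-order cores of total size O(d n r^2), where r is the maximal TT rank, instead of n^d entries; this is the mechanism exploited by tensor-structured PDE solvers discussed in the seminar.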
Further resources:
To earn seminar credits, the student must: