What deep learning can learn from the classical Volterra-Wiener theories of nonlinear systems

January 15, 2020, 14:30 - 15:00
State of AI and ML - January 2020

Speaker

David Wei


Sr. Principal Engineer, ASML US, Inc.
  • January 15, 2020
  • 02:30pm - 03:00pm

Abstract

In this talk, we will review the structural and mathematical similarities between deep learning’s convolutional neural network (CNN) and the classical Volterra-Wiener theories of nonlinear systems. The Volterra and Wiener theories were established more than a century ago and more than half a century ago, respectively, and have been widely applied in various fields of physical science, engineering, and medicine, especially in system identification. As such, there is a wealth of theoretical results and practical methods in Volterra-Wiener modeling that is directly relevant to present-day research and development involving CNNs. In particular, the celebrated Stone-Weierstrass theorem explains why and how a Volterra or Wiener series can be truncated to a finite order, corresponding to a CNN with a finite number of layers, while still approximating the input-output characteristics of the modeled system sufficiently well. The mathematical analysis is also useful for proving lower and upper bounds on the depth and width required for a CNN to model a system well.
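
As a point of reference (not drawn from the talk itself), the following minimal sketch implements a discrete-time Volterra series truncated at second order in NumPy; the kernel values and memory length are hypothetical. The first-order term is an ordinary finite-memory convolution, and the second-order kernel captures pairwise interactions between past inputs, illustrating the kind of finite-order truncation the abstract relates to a CNN with a fixed number of layers.

    import numpy as np

    def volterra_2nd_order(x, h1, h2):
        """Discrete-time Volterra series truncated at second order.

        y[n] = sum_k h1[k] x[n-k]
             + sum_{k1,k2} h2[k1,k2] x[n-k1] x[n-k2]

        x  : 1-D input signal
        h1 : first-order kernel (length M), analogous to a linear convolution filter
        h2 : second-order kernel (M x M), capturing pairwise input interactions
        """
        M = len(h1)
        N = len(x)
        y = np.zeros(N)
        for n in range(N):
            # Window of past samples x[n], x[n-1], ..., x[n-M+1], zero-padded at the start
            window = np.array([x[n - k] if n - k >= 0 else 0.0 for k in range(M)])
            y[n] = h1 @ window + window @ h2 @ window
        return y

    # Example usage with hypothetical (randomly chosen) kernels
    rng = np.random.default_rng(0)
    x = rng.standard_normal(128)
    h1 = rng.standard_normal(8) * 0.1        # first-order (linear) memory of 8 samples
    h2 = rng.standard_normal((8, 8)) * 0.01  # second-order (quadratic) kernel
    y = volterra_2nd_order(x, h1, h2)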