The City College of New York (CCNY)
Department of Mathematics
Division of Science

Function approximation with one-bit Bernstein and neural networks

Mathematics Colloquium

Time and place

12:30–1:30 PM on Thursday, December 1st, 2022; NAC 6/114

Prof. Weilin Li (CCNY)

Note

There will be a gathering beforehand in the Mathematics Department with light refreshments and drinks.

Abstract

The celebrated universal approximation theorems for neural networks typically state that every sufficiently nice function can be approximated arbitrarily well by a neural network with carefully chosen parameters. Motivated by applications where compression is necessary, we ask whether it is possible to represent any reasonable function with a neural network whose parameters are restricted to a small set of values, the extreme case being one-bit {+1,-1} neural networks. We answer this question in the affirmative for both quadratic and ReLU networks. Our main contributions include a novel approximation result for {+1,-1} sums of multivariate Bernstein polynomials and an implementation of this approximation strategy via one-bit neural networks with either quadratic or ReLU activation functions. Joint work with Sinan Gunturk.
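
Background sketch (not part of the announcement): the classical Bernstein approximation of a function f on [0,1] is B_n f(x) = sum_{k=0}^{n} f(k/n) C(n,k) x^k (1-x)^{n-k}. The toy Python sketch below computes this operator and, purely for illustration, a naive one-bit variant in which each coefficient f(k/n) in [-1,1] is replaced by a sign in {+1,-1} via greedy error diffusion. The target function, the degree, and the error-diffusion rule are illustrative assumptions; this is not the construction presented in the talk.

```python
import numpy as np
from math import comb

def bernstein_basis(n, k, x):
    """Degree-n Bernstein basis polynomial B_{k,n}(x) = C(n,k) x^k (1-x)^(n-k)."""
    return comb(n, k) * x**k * (1 - x)**(n - k)

def bernstein_approx(f, n, x):
    """Classical Bernstein approximation B_n f(x) = sum_k f(k/n) B_{k,n}(x)."""
    return sum(f(k / n) * bernstein_basis(n, k, x) for k in range(n + 1))

def one_bit_bernstein_toy(f, n, x):
    """Toy one-bit variant (illustration only, not the talk's construction):
    each coefficient f(k/n), assumed to lie in [-1, 1], is replaced by a sign
    in {+1, -1} chosen greedily, carrying the quantization error forward in
    the spirit of first-order Sigma-Delta quantization."""
    carry = 0.0
    total = np.zeros_like(x, dtype=float)
    for k in range(n + 1):
        c = f(k / n)
        q = 1.0 if c + carry >= 0.0 else -1.0  # one-bit {+1, -1} coefficient
        carry += c - q                          # push the error onto later terms
        total += q * bernstein_basis(n, k, x)
    return total

if __name__ == "__main__":
    target = lambda t: 0.8 * np.sin(2 * np.pi * t)  # range inside [-1, 1]
    x = np.linspace(0.0, 1.0, 201)
    n = 60
    classical = bernstein_approx(target, n, x)
    one_bit = one_bit_bernstein_toy(target, n, x)
    print("classical sup-error:", np.max(np.abs(classical - target(x))))
    print("toy one-bit sup-error:", np.max(np.abs(one_bit - target(x))))
```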
