Grigorios Chrysos - Learning high-dimensional signals with polynomial networks

Despite the impressive performance of Neural Networks (NNs), there are alternative classes of functions that can obtain similar approximation performance. In this talk, we will focus on Polynomial Networks (PNs), which use high-degree polynomial expansions to approximate the target function. The unknown parameters of PNs can be naturally represented as high-order tensors. We will show how tensor decompositions can both reduce the number of learnable parameters and transform PNs into simple recursive formulations. In the second part of the talk, we will extend PNs to conditional tasks where we have multiple (possibly diverse) inputs. We will show how PNs have been used for learning both generative and discriminative models on image, audio, and non-Euclidean signals. Lastly, we will showcase how conditional PNs can be used for recovering attribute combinations missing from the training set, e.g., in image generation.
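To make the recursive formulation mentioned above concrete, here is a minimal sketch of a degree-N polynomial network in the spirit of such recursions (each step multiplies a linear transform of the input with the running representation and adds a skip term). The class name, layer sizes, and framework choice (PyTorch) are illustrative assumptions, not code from the talk.

```python
# Minimal sketch of a polynomial network via a recursive formulation.
# Names and dimensions are illustrative assumptions.
import torch
import torch.nn as nn


class PolyNet(nn.Module):
    def __init__(self, in_dim=128, hidden=256, out_dim=10, degree=4):
        super().__init__()
        # One linear map per degree step; the input z is shared across all of them.
        self.U = nn.ModuleList(
            [nn.Linear(in_dim, hidden, bias=False) for _ in range(degree)]
        )
        self.C = nn.Linear(hidden, out_dim)  # final linear read-out

    def forward(self, z):
        # x_1 = U_1 z ;  x_n = (U_n z) * x_{n-1} + x_{n-1}  (Hadamard product)
        x = self.U[0](z)
        for U_n in self.U[1:]:
            x = U_n(z) * x + x
        return self.C(x)


# Usage: the output is a polynomial of the input z whose degree grows with `degree`,
# while the parameter count stays linear in the degree.
net = PolyNet()
y = net(torch.randn(8, 128))  # -> shape (8, 10)
```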