Our guest speaker is Grigorios Chrysos from EPFL, and you are all cordially invited to the CVG Seminar on Dec 3rd at 2:30 p.m., on Zoom (passcode: 825054) or in person (room 302 at the Institute of Informatics).
Despite the impressive performance of Neural Networks (NNs), alternative classes of functions can achieve similar approximation performance. In this talk, we will focus on Polynomial Networks (PNs), which use high-degree polynomial expansions to approximate the target function. The unknown parameters of PNs can be naturally represented as high-order tensors. We will show how tensor decompositions can both reduce the number of learnable parameters and transform PNs into simple recursive formulations. In the second part of the talk, we will extend PNs to conditional tasks with multiple (possibly diverse) inputs. We will show how PNs have been used to learn generative models on image, audio, and non-Euclidean signals. Lastly, we will showcase how conditional PNs can recover attribute combinations missing from the training set, e.g., in image generation.
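For a rough flavour of the idea (this is an illustrative sketch, not the speaker's exact formulation), here is a minimal NumPy example of a polynomial network in which the high-order interaction tensor is factorized CP-style, turning the expansion into a simple recursion: each step multiplies in one more linear view of the input, raising the polynomial degree by one. The factor matrices `Us`, the rank `k`, and the final projection `C, b` are all hypothetical names chosen for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def polynomial_network(z, Us, C, b):
    """Degree-N polynomial expansion of input z, N = len(Us).

    A CP-style factorization replaces the dense degree-N coefficient
    tensor with the factor matrices Us, giving the recursion
        x_n = (U_n^T z) * x_{n-1} + x_{n-1},
    so each step raises the polynomial degree in z by one.
    """
    x = Us[0].T @ z                    # degree-1 term
    for U in Us[1:]:
        x = (U.T @ z) * x + x          # elementwise product adds one degree
    return C @ x + b                   # final linear projection

d, k, out = 8, 16, 4                   # input dim, factorization rank, output dim
Us = [rng.standard_normal((d, k)) for _ in range(3)]   # a degree-3 network
C = rng.standard_normal((out, k))
b = rng.standard_normal(out)

z = rng.standard_normal(d)
y = polynomial_network(z, Us, C, b)    # a degree-3 polynomial function of z
```

Note how the recursion needs only `N` matrices of shape `(d, k)` instead of the full degree-N coefficient tensor with `O(d^N)` entries, which is the parameter reduction the abstract alludes to.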
Grigorios Chrysos is a postdoctoral researcher at École Polytechnique Fédérale de Lausanne (EPFL), following the completion of his PhD at Imperial College London (2020). Previously, he graduated from the National Technical University of Athens with a Diploma/MEng in Electrical and Computer Engineering (2014). He has co-organised workshops at top conference venues, e.g., CVPR and ICCV, on deformable models. His current research interests lie in machine learning and its interface with computer vision. In particular, he works on generative models, tensor decompositions, and modelling high-dimensional distributions with polynomial expansions. His recent work has been published in top-tier conferences (CVPR, ICML, ICLR, NeurIPS) and prestigious journals (T-PAMI, IJCV, T-IP, Proceedings of the IEEE). He also serves as a reviewer for the aforementioned conferences and journals.