
Doctorat de l'Université Grenoble Alpes


Alberto BIETTI, winner of the 2020 Academic Thesis Award

Alberto BIETTI is one of the eight winners of the 2020 Academic Thesis Award, for his thesis defended in 2019 and entitled "Foundations of deep convolutional models through kernel methods". The academic thesis prizes were awarded to eight doctors according to criteria of excellence specific to each discipline, represented by the site's 13 doctoral schools.
Winner of the 2020 Academic Thesis Award: Alberto BIETTI

Thesis title: Foundations of deep convolutional models through kernel methods

Doctoral school: ED MSTII - Mathematics, Information Science and Technology, Computer Science

Host laboratory: Jean Kuntzmann Laboratory (LJK - CNRS / Grenoble INP-UGA / Inria / UGA)

Thesis Supervisor: Julien MAIRAL

Keywords: machine learning, deep learning, kernels, optimization

The thesis studies the mathematical properties of deep convolutional neural networks through the formalism of kernel methods, which provides a more tractable mathematical framework for their analysis.

The increased availability of large amounts of data, from images in social networks, speech waveforms from mobile devices, and large text corpora, to genomic and medical data, has led to a surge of machine learning techniques. Such methods exploit statistical patterns in these large datasets to make accurate predictions on new data.

In recent years, deep learning systems have emerged as a remarkably successful class of machine learning algorithms, which rely on gradient-based methods for training multi-layer models that process data in a hierarchical manner. These methods have been particularly successful in tasks where the data consists of natural signals such as images or audio; this includes visual recognition, object detection or segmentation, and speech recognition. For such tasks, deep learning methods often yield the best known empirical performance; yet, the high dimensionality of the data and the large number of parameters of these models make them challenging to understand theoretically. Their success is often attributed in part to their ability to exploit useful structure in natural signals, such as local stationarity or invariance, for instance through choices of network architectures with convolution and pooling operations. However, such properties are still poorly understood from a theoretical standpoint, leading to a growing gap between the theory and practice of machine learning.

This thesis aims to bridge this gap by studying spaces of functions which arise from given network architectures, with a focus on the convolutional case. Our study relies on kernel methods, by considering reproducing kernel Hilbert spaces (RKHSs) associated to certain kernels that are constructed hierarchically based on a given architecture. This allows us to precisely study smoothness, invariance, stability to deformations, and approximation properties of functions in the RKHS.
These representation properties are also linked with optimization questions when training deep networks with gradient methods in some over-parameterized regimes where such kernels arise. They also suggest new practical regularization strategies for obtaining better generalization performance on small datasets, and state-of-the-art performance for adversarial robustness on image tasks.

> Discover all the winners of the 2020 Thesis Awards

Updated on June 3, 2020

