CHEVALIER Marion

PhD student at Sorbonne University
Team : MLIA
https://lip6.fr/Marion.Chevalier

Supervision : Matthieu CORD

Co-supervision : Gilles HÉNAFF, Nicolas THOME

Detection and recognition of targets in optronic imagery (Détection et reconnaissance de cibles en imagerie optronique)

Image classification is of prominent interest in numerous visual recognition tasks, particularly for vehicle recognition in airborne systems, where images have a low resolution because of the large distance between the system and the observed scene. During the training phase, complementary data such as knowledge of the system's position or high-resolution images may be available. In our work, we focus on low-resolution image classification while taking supplementary information into account during the training phase.
We first show the interest of deep convolutional networks for low-resolution image recognition, in particular by proposing an architecture learned on the targeted data. We then rely on the framework of learning using privileged information to benefit from the complementary training data, here the high-resolution versions of the images. We propose two novel methods for integrating privileged information into the learning phase of neural networks. Our first model relies on these complementary data to compute an absolute difficulty level, assigning a large weight to the most easily recognized images. Our second model introduces a similarity constraint between the networks learned on each type of data. We experimentally validate our models on several application cases, in particular in a fine-grained classification context and on a dataset containing annotation noise.
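The second idea above (a similarity constraint between the low-resolution and high-resolution networks) can be pictured with a minimal PyTorch sketch. This is not the thesis's actual architecture or loss: the toy network, the MSE similarity term, and the weight lambda_sim are illustrative assumptions, only meant to show how a privileged high-resolution branch can constrain the low-resolution branch during training.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallCNN(nn.Module):
    """Toy convolutional network returning both logits and an intermediate feature vector."""
    def __init__(self, num_classes=10, feat_dim=64):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),          # makes the net input-size agnostic (LR or HR)
            nn.Flatten(),
            nn.Linear(32 * 4 * 4, feat_dim), nn.ReLU(),
        )
        self.classifier = nn.Linear(feat_dim, num_classes)

    def forward(self, x):
        feats = self.features(x)
        return self.classifier(feats), feats

lr_net = SmallCNN()   # network deployed at test time on low-resolution images
hr_net = SmallCNN()   # auxiliary network with access to high-resolution (privileged) images

lambda_sim = 0.1      # hypothetical weight of the similarity term

def privileged_loss(x_lr, x_hr, y):
    """Cross-entropy on LR images plus a penalty pulling the LR features
    towards the HR network's features (the privileged information)."""
    logits_lr, feat_lr = lr_net(x_lr)
    with torch.no_grad():                     # HR network used as a fixed reference here
        _, feat_hr = hr_net(x_hr)
    return F.cross_entropy(logits_lr, y) + lambda_sim * F.mse_loss(feat_lr, feat_hr)

# Toy batch: 8 paired LR (16x16) / HR (64x64) images with random labels.
x_lr, x_hr = torch.randn(8, 3, 16, 16), torch.randn(8, 3, 64, 64)
y = torch.randint(0, 10, (8,))
privileged_loss(x_lr, x_hr, y).backward()
```

At test time only lr_net is used, since the high-resolution images are available solely during training.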

Defence : 12/02/2016

Jury members :

M. François Brémond, INRIA Sophia Antipolis, Rapporteur
M. Patrick Pérez, Technicolor, Rapporteur
Mme Catherine Achard, Université Pierre et Marie Curie
M. Stéphane Canu, INSA Rouen
M. Matthieu Cord, Université Pierre et Marie Curie
M. Gilles Hénaff, Thales Optronique S.A.S.
M. Nicolas Thome, Université Pierre et Marie Curie

Departure date : 12/31/2016

2015-2018 Publications