Seminar

The OBELIX team seminar is currently held on Thursdays at 11:30 am, every two weeks, at the IRISA lab, Tohannic campus (ENSIBS building). The presentation usually lasts 30 minutes and is followed by a discussion with the team.

The seminar is coordinated by Chloé Friguet.

Previous seminars

2019 / 2019-20 / 2020-21 / 2021-22

Upcoming seminars (2021-2022)


  • Date: December 9th
  • Time: 11:30
  • Room: D-001 (ENSIBS building)
  • Speaker: Thibault Séjourné (PhD candidate, ENS Paris)
  • Title: Unbalanced Gromov-Wasserstein distance
  • Abstract: Optimal transport, a theory that defines distances between distributions, is a tool of choice in machine learning and statistical estimation because it takes into account the geometry of the underlying space. These distances nevertheless suffer from three limitations that can be problematic: (i) they are costly to compute, (ii) they are restricted to comparing probability measures, and (iii) they compare measures defined on the same space. These constraints can be an obstacle when scaling up computations, when seeking insensitivity to geometric outliers (due to noisy data), or when comparing graphs (such as molecules with different structures). To overcome these limitations, entropic regularization, unbalanced transport and the Gromov-Wasserstein distance have been proposed. In this talk, I will first introduce the unbalanced formulation of optimal transport, as well as its entropic variant. I will detail a variant of the Sinkhorn algorithm that computes the dual of the problem through a minor modification of the balanced algorithm, with linear convergence. In a second part, I will present the Gromov-Wasserstein distance, a non-convex quadratic optimization problem that compares spaces equipped with a metric and a positive measure. I will define two unbalanced generalizations of this distance, one being an upper bound of the other. I will show that the first defines a distance between metric measure spaces, and that the second can be computed, thanks to entropic regularization, as a sequence of unbalanced transport problems.
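    The "minor modification" of the balanced Sinkhorn algorithm mentioned in the abstract can be made concrete with a short sketch: for entropy-regularized unbalanced transport with KL marginal penalties of strength rho, each scaling update is simply raised to the power rho/(rho+eps), the balanced case being recovered as rho tends to infinity. The NumPy code below is only an illustrative sketch under these assumptions, not the speaker's implementation; the function name and parameter values are ours.

    import numpy as np

    def unbalanced_sinkhorn(a, b, C, eps=0.05, rho=1.0, n_iter=500):
        """Entropy-regularized unbalanced OT between nonnegative measures a and b.

        Marginal constraints are relaxed with KL penalties of strength rho;
        the balanced Sinkhorn update corresponds to the limit rho -> infinity.
        """
        K = np.exp(-C / eps)              # Gibbs kernel
        tau = rho / (rho + eps)           # exponent: the "minor modification"
        u = np.ones_like(a)
        v = np.ones_like(b)
        for _ in range(n_iter):
            u = (a / (K @ v)) ** tau      # balanced Sinkhorn would use exponent 1
            v = (b / (K.T @ u)) ** tau
        return u[:, None] * K * v[None, :]   # transport plan

    # Toy example: two 1-D point clouds carrying different total masses
    x = np.linspace(0, 1, 50)
    y = np.linspace(0, 1, 60)
    a = np.full(50, 1.0 / 50)
    b = np.full(60, 1.5 / 60)             # total mass 1.5, so not a probability
    C = (x[:, None] - y[None, :]) ** 2    # squared Euclidean cost
    P = unbalanced_sinkhorn(a, b, C)
    print(P.sum())                        # mass creation/destruction is penalized, not forbidden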

  • Date: December 14th
  • Time: 9:00
  • Room: IRISA Rennes Salle Métivier (and visio)
  • Speaker: Heng Zhang (PhD defense)
  • Title: Multispectral Object Detection
  • Abstract: Scene analysis with only visible cameras is challenging when facing insufficient illumination or adverse weather. To improve recognition reliability, multispectral systems add additional cameras (e.g. infra-red) and perform object detection from multispectral data. Although the concept of multispectral scene analysis with deep learning has great potential, there are still many open research questions and it has not been widely deployed in industrial contexts. In this thesis, we investigated three main challenges of multispectral object detection: (1) the fast and accurate detection of objects of interest from images; (2) the dynamic and adaptive fusion of information from different modalities; (3) low-cost and low-energy multispectral object detection and the reduction of its manual annotation efforts. For the first challenge, we first optimize the label assignment of object detection training with a mutual guidance strategy between the classification and localization tasks; we then realize an efficient compression of object detection models by including the teacher-student prediction disagreements in a feature-based knowledge distillation framework (see the illustrative sketch after this entry). With regard to the second challenge, three different multispectral feature fusion schemes are proposed to deal with the most difficult fusion cases, where different cameras provide contradictory information. For the third challenge, a novel modality distillation framework is first presented to tackle the hardware and software constraints of current multispectral systems; then a multi-sensor-based active learning strategy is designed to reduce the labelling costs when constructing multispectral datasets.
    Jury: Vincent LePetit / Tinne Tuytelaars / Patrick Bouthemy / Patrick Perez / Jakob Verbeek / Elisa Fromont / Sébastien Lefèvre
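    As a rough illustration of one ingredient mentioned in the abstract, the sketch below shows how a feature-based knowledge distillation loss can be weighted by the teacher-student prediction disagreement. It is a generic PyTorch sketch under assumed tensor shapes (feature maps already channel-aligned, per-location class logits), not the formulation defended in the thesis.

    import torch
    import torch.nn.functional as F

    def disagreement_weighted_kd_loss(f_student, f_teacher, p_student, p_teacher):
        """Feature imitation loss emphasized where teacher and student predictions disagree.

        f_student, f_teacher: feature maps of shape (B, C, H, W), assumed channel-aligned.
        p_student, p_teacher: per-location class logits of shape (B, K, H, W).
        """
        # Per-location disagreement: symmetric KL between the class distributions
        log_ps = F.log_softmax(p_student, dim=1)
        log_pt = F.log_softmax(p_teacher, dim=1)
        kl = (log_pt.exp() * (log_pt - log_ps)).sum(dim=1)      # KL(teacher || student)
        kl_rev = (log_ps.exp() * (log_ps - log_pt)).sum(dim=1)  # KL(student || teacher)
        weight = (kl + kl_rev).detach()                         # (B, H, W), no gradient through the weight

        # Feature imitation error, averaged over channels and scaled by the disagreement
        feat_err = ((f_student - f_teacher.detach()) ** 2).mean(dim=1)   # (B, H, W)
        return (weight * feat_err).mean()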

  • Date: January 13th
  • Time: 11:30
  • Room:
  • Speaker:
  • Title:
  • Abstract:

  • Date: January 27th
  • Time: 11:30
  • Room:
  • Speaker:
  • Title:
  • Abstract:

  • Date: February 24th
  • Time: 11:30
  • Room:
  • Speaker:
  • Title:
  • Abstract:

  • Date: March 10th
  • Time: 11:30
  • Room:
  • Speaker:
  • Title:
  • Abstract:

  • Date: March 24th
  • Time: 11:30
  • Room:
  • Speaker:
  • Title:
  • Abstract:

  • Date: POSTPONED
  • Time: 11:30
  • Room:
  • Speaker: Yann Soulard (MCF, LETG Rennes)
  • Title:
  • Abstract:

  • Date: POSTPONED
  • Time: 11:30
  • Room:
  • Speaker: Jeremy Cohen (CR PANAMA, IRISA Rennes)
  • Title: Learning with Low Rank Approximations
  • Abstract: Matrix and tensor factorizations are widespread techniques to extract structure out of data in a potentially blind manner. However, several issues may be raised: (i) these models have often been designed for blind scenarios, whereas many applications now feature extensive training databases, (ii) their output may not be interpretable because of the lack of identifiability of the parameters, and (iii) computing a good solution can be difficult. In this talk, after describing the link between separable functions and tensor/matrix factorizations, we will show that separability and low rank approximations are actually already at the core of many machine learning problems such as dictionary learning or simultaneous factorizations.
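    To make the notion of low-rank approximation concrete, the NumPy sketch below computes the best rank-r approximation of a data matrix via the truncated SVD and checks that it recovers the structure of a noisy low-rank matrix. It is an illustrative example only, not the speaker's code; the data and the chosen rank are arbitrary.

    import numpy as np

    def low_rank_approx(X, r):
        """Best rank-r approximation of X in the least-squares sense (truncated SVD)."""
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        return (U[:, :r] * s[:r]) @ Vt[:r, :]

    # Toy data: an exactly rank-3 matrix plus a small amount of noise
    rng = np.random.default_rng(0)
    W = rng.standard_normal((100, 3))
    H = rng.standard_normal((3, 80))
    X = W @ H + 0.01 * rng.standard_normal((100, 80))

    X3 = low_rank_approx(X, 3)
    print(np.linalg.norm(X - X3) / np.linalg.norm(X))   # small relative residual: structure recovered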
