News

What makes statistics robust?

  • Faculté des Sciences, des Technologies et de Médecine (FSTM)
    18 October 2021
  • Category
    Research
  • Theme
    Mathematics

Robustness has become a major issue in statistical learning and deep learning today. Mathematician Yannick Baraud from the University of Luxembourg has recently published a paper in the prestigious journal Probability Theory and Related Fields to tackle the problem of robustness in statistics.

It is well known that classical statistical procedures do not offer the reliability scientists are looking for when they are trained on a dataset containing a small fraction of “bad data”. This problem affects the ability of new algorithms to classify images and make good predictions from real datasets that may be partly of poor quality.
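The fragility of classical procedures can be seen with a toy example (illustrative only, not the estimation strategy developed in the paper): the sample mean, a classical estimator, is dragged arbitrarily far from the truth by a 5% fraction of bad data, while a more robust summary such as the median barely moves.

```python
import random

random.seed(0)

# 95 clean observations drawn around a true value of 0
clean = [random.gauss(0, 1) for _ in range(95)]
# 5 corrupted observations ("bad data") with gross errors
outliers = [1000.0] * 5
data = clean + outliers

# Classical estimator: the sample mean is pulled far from 0
mean = sum(data) / len(data)
# A robust alternative: the median stays close to 0
median = sorted(data)[len(data) // 2]

print(f"mean   = {mean:.2f}")
print(f"median = {median:.2f}")
```

With only 5% contamination, the mean lands near 50 while the median remains within a fraction of a unit of the true value 0, which is exactly the kind of reliability gap that motivates robust estimation procedures.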

The paper “Tests and estimation strategies associated to some loss functions”, written by Yannick Baraud, Professor of Statistics in the Department of Mathematics at the University of Luxembourg, provides a general mathematical solution to this problem and shows how to design a statistical procedure that estimates correctly from data, even when the dataset is contaminated by undesirable data.

Probability Theory and Related Fields is one of the most prestigious journals in the field of probability and mathematical statistics.