Experiments in Operations Research are Hardly Reproducible: A Bike-Sharing Case-Study.
Thomas Barzola 1, Van-Dat Cung 2, Nicolas Gast 3, Vincent Jost 1
1 : Laboratoire des sciences pour la conception, l'optimisation et la production (G-SCOP)
Institut polytechnique de Grenoble - Grenoble Institute of Technology, Université Grenoble Alpes, Centre National de la Recherche Scientifique : UMR5272
2 : Laboratoire des sciences pour la conception, l'optimisation et la production (G-SCOP)
Institut polytechnique de Grenoble (Grenoble INP), Université Joseph Fourier - Grenoble I, CNRS : UMR5272
46 avenue Félix Viallet, 38031 Grenoble - France
3 : Inria
L'Institut National de Recherche en Informatique et en Automatique (INRIA)

Many researchers believe that providing a detailed description of the experiments and the algorithms used is sufficient to guarantee reproducibility. In this paper, we argue that this is largely false. To support this claim, we tried to reproduce the experiments of another work. The authors of that paper were aware of the need for their work to be reproducible: they made their data available on one of the authors' websites, and they provided a detailed description of their methodology (only the code is missing). Nevertheless, despite all the care they took, we were not able to reproduce their work, and our numerical findings differ significantly from theirs. Without their code, we cannot tell whether there is a bug (in their implementation or in ours) or a difference in the interpretation of the model. This raises a number of ethical questions for the community: what is the validity of science if numerical results cannot be trusted? Instead of developing new methodologies, should we not spend more time reimplementing existing methods and making them available to all? This may lead to a less productive, yet more trustworthy and reliable, science.

