Monday, 1 April 2019, 16:55

Technical Report on "Reproducible Performance Evaluation in Cloud Computing" Published

The SPEC RG Cloud Working Group has published a Technical Report on "Methodological Principles for Reproducible Performance Evaluation in Cloud Computing".

The rapid adoption and the diversification of cloud computing technology heighten the importance of a sound experimental methodology for this domain.


This work investigates how to measure and report performance in the cloud, and how well the cloud research community is already doing it.


We propose a set of eight important methodological principles that combine best practices from adjacent fields with concepts applicable only to clouds, and with new ideas about the time-accuracy trade-off.


We show how these principles are applicable using a practical use-case experiment. To this end, we analyze the ability of the newly released SPEC Cloud IaaS 2018 benchmark to follow the principles, and showcase real-world experimental studies in common cloud environments that meet the principles. Last, we report on a systematic literature review covering top conferences and journals in the field from 2012 to 2017, analyzing whether the practice of reporting cloud performance measurements follows the proposed eight principles. Worryingly, this systematic survey and the subsequent two-round human reviews reveal that few of the published studies follow the eight experimental principles.


We conclude that, although these important principles are simple and basic, the cloud community has yet to adopt them broadly to deliver sound measurement of cloud environments.

Alessandro Vittorio Papadopoulos, Laurens Versluis, André Bauer, Nikolas Herbst, Jóakim von Kistowski, Ahmed Ali-Eldin, Cristina Abad, J. Nelson Amaral, Petr Tuma, and Alexandru Iosup. Methodological Principles for Reproducible Performance Evaluation in Cloud Computing - A SPEC Research Technical Report. April 2019.

The technical report can be found here.