Size does matter. Comparing the results of a lab and a crowdsourcing file download QoE study
Andreas Sackl; Bruno Gardlo; Raimund Schatz
Over the last couple of years, crowdsourcing has become a widely used method for conducting subjective QoE experiments over the Internet. However, the scope of crowdsourced QoE experiments so far has been mostly limited to video and image quality testing, despite the existence of many other relevant application categories. In this paper we demonstrate the applicability of crowdsourced QoE testing to the case of file downloads. We conducted several campaigns in which participants had to download large (10-50 MB) media files (with defined waiting times) and subsequently rate their QoE. The results are compared with those of a lab-based file download QoE study featuring an equivalent design. Our results show that crowdsourced QoE testing can also be applied to downloads of files with a size of 10 MB, as the rating results are very similar to those obtained in the lab. However, beyond user reliability checks and filtering, we found the study design to be a highly critical element, as it exerted strong influence on overall participant behavior. For this reason we also present a discussion of valuable lessons learned in terms of test design and participant behavior.
The paper can be downloaded from the ISCA archive.