Title

One Shot Crowdtesting: Approaching the Extremes of Crowdsourced Subjective Quality Testing

Authors

Michael Seufert; Tobias Hoßfeld

Abstract

Crowdsourcing studies for subjective quality testing have become a particularly useful tool for Quality of Experience researchers. Typically, crowdsourcing studies are conducted by many unsupervised workers, who rate the perceived quality of several test conditions during one session (mixed within-subject test design). However, such studies often prove to be very sensitive, for example, to test instructions, test design, and the filtering of unreliable participants. Moreover, exposing a single worker to several test conditions potentially leads to an implicit training and anchoring of ratings. Therefore, this work investigates the extreme case of presenting only a single test condition to each worker (completely between-subjects test design). The results are compared to a typical crowdsourcing study design with multiple test conditions in order to discuss training effects in crowdsourcing studies. Thus, this work investigates whether it is possible to use a simple “one shot” design, with only a single rating from each of a large number of workers, instead of sophisticated (mixed or within-subject) test designs in crowdsourcing.

The paper can be downloaded from the ISCA archive.