An online approach for joint task assignment and worker evaluation in crowd-sourcing

Abstract

This paper tackles the problem of finding the correct answers to a set of binary-choice questions or labeling tasks by adaptively assigning them to workers in a crowdsourcing system. The problem becomes quite challenging when we initially know neither the workers’ abilities, nor the questions’ difficulties (beyond common a priori statistics), nor, of course, the correct answers. Indeed, it requires jointly learning the workers’ abilities and the questions’ difficulties while adaptively assigning questions to the most appropriate workers, so as to maximize the chances of identifying the correct answers. To address this problem, we first cast it into a suitably constructed Bayesian framework that yields an analytically tractable (closed-form) single-question inference step, and we then handle the more general setting via the Expectation Propagation algorithm, an approximate iterative message-passing technique. Finally, we exploit the information gathered by the inference framework as adaptive weights for a maximum-weight-matching task assignment policy, proposing a computationally efficient algorithm that maximizes the entropy reduction for the questions assigned at each step. © 2017 IEEE.
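
To make the assignment step concrete, the following is a minimal illustrative sketch (not the authors' actual algorithm) of how entropy-reduction weights could drive a maximum weight matching of questions to workers. The reliability model (a logistic link between worker ability and question difficulty), the function names, and the parameters are assumptions introduced here for illustration only; the matching itself is solved with SciPy's linear_sum_assignment.

```python
# Illustrative sketch, not the paper's algorithm: assign one batch of questions
# to workers by maximizing the expected entropy reduction of each question's
# answer posterior, cast as a maximum weight bipartite matching.
import numpy as np
from scipy.optimize import linear_sum_assignment


def binary_entropy(p):
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))


def expected_entropy_reduction(prior, p_correct):
    """Expected drop in answer entropy when a worker who answers correctly
    with probability p_correct labels a question whose current posterior
    of being '1' is `prior`."""
    h_before = binary_entropy(prior)
    # Probability the worker reports label 1 under the current posterior.
    p_report1 = prior * p_correct + (1 - prior) * (1 - p_correct)
    # Posterior on the true label after each possible report (Bayes rule).
    post1 = prior * p_correct / p_report1
    post0 = prior * (1 - p_correct) / (1 - p_report1)
    h_after = p_report1 * binary_entropy(post1) + (1 - p_report1) * binary_entropy(post0)
    return h_before - h_after


def assign(question_priors, worker_ability, question_difficulty):
    # Hypothetical reliability model: harder questions or weaker workers
    # push the answer probability toward a coin flip.
    p_correct = 1.0 / (1.0 + np.exp(-(worker_ability[None, :] - question_difficulty[:, None])))
    weights = expected_entropy_reduction(question_priors[:, None], p_correct)
    # Maximum weight matching == minimum cost assignment on negated weights.
    rows, cols = linear_sum_assignment(-weights)
    return list(zip(rows, cols))  # (question index, worker index) pairs


# Example usage with posterior means standing in for the learned quantities.
priors = np.array([0.5, 0.8, 0.6])
abilities = np.array([1.5, 0.2, 2.0])
difficulties = np.array([0.5, 1.0, 0.1])
print(assign(priors, abilities, difficulties))
```

In the paper's setting, the priors, abilities, and difficulties would instead come from the Expectation Propagation posterior and be updated after each batch of collected answers.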

Publication
2017 International Symposium on Networks, Computers and Communications, ISNCC 2017