Mining Crowdsourcing Problems from Discussion Forums of Workers
- Authors
- Zahra Nouri, Henning Wachsmuth, Gregor Engels
- Abstract
Crowdsourcing is used in academia and industry to solve tasks that are easy for humans but hard for computers; in natural language processing, it is mostly used to annotate data. The quality of annotations is affected by problems in the task design, task operation, and task evaluation that workers face when dealing with requesters in crowdsourcing processes. To learn about the major problems, we provide a short but comprehensive survey based on two complementary studies: (1) a literature review in which we collect and organize problems known from interviews with workers, and (2) an empirical data analysis in which we use topic modeling to mine workers’ complaints from a new English corpus of workers’ forum discussions. While the literature covers all process phases, problems in the task evaluation are prevalent, including unfair rejections, late payments, and unjustified blockings of workers. According to the data, however, poor task design in terms of malfunctioning environments, bad workload estimation, and privacy violations seems to bother workers most. Our findings form the basis for future research on how to improve crowdsourcing processes.
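The abstract mentions mining complaint topics from forum posts via topic modeling but does not name a toolkit or model. Below is a minimal sketch of that kind of analysis, assuming LDA as implemented in scikit-learn; the toy posts are invented placeholders, not data from the paper's corpus.

```python
# Hedged sketch: surface complaint themes from forum posts with LDA.
# Assumes scikit-learn; the paper does not specify its topic-modeling setup.
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

posts = [  # hypothetical worker-forum posts, for illustration only
    "My HIT was rejected without any reason given by the requester.",
    "The task interface kept crashing and I lost all my work.",
    "Payment arrived three weeks late and support never answered.",
    "The survey asked for my phone number, which felt like a privacy violation.",
    "Estimated time was 5 minutes but the task took almost an hour.",
]

# Bag-of-words features; English stop words removed so content terms dominate.
vectorizer = CountVectorizer(stop_words="english")
counts = vectorizer.fit_transform(posts)

# Fit LDA with a small topic count suited to this toy corpus.
lda = LatentDirichletAllocation(n_components=3, random_state=0)
lda.fit(counts)

# Print the highest-weighted terms per topic, i.e., candidate complaint themes.
terms = vectorizer.get_feature_names_out()
for i, weights in enumerate(lda.components_):
    top = [terms[j] for j in weights.argsort()[::-1][:5]]
    print(f"Topic {i}: {', '.join(top)}")
```

On real forum data, the top terms per topic would then be inspected manually and mapped to problem categories such as task design, operation, and evaluation, in the spirit of the analysis the abstract describes.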
- External organization(s)
- Universität Paderborn
- Type
- Article in conference proceedings
- Pages
- 6264-6276
- Number of pages
- 13
- Publication date
- December 2020
- Publication status
- Published
- ASJC Scopus subject areas
- Theoretical Computer Science, Computer Science Applications, Computational Theory and Mathematics
- Electronic version(s)
- https://doi.org/10.18653/v1/2020.coling-main.551 (Access: Closed)