Volunteers vs. Turkers
Volunteer accuracy in crowdsourcing
Many workers stop contributing because they are concerned about the quality of their work, not just for monetary reasons.
Has some good background on how workers complete more tasks for higher payment, although quality does not improve.
Mason and Watts (2009) compared piece-rate schemes (paid per image) with quota-based schemes (paid after completing a bundle of tasks). Piece-rate pay led to fewer completed tasks.
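Rough sketch of the difference between the two schemes; the rates and quota size here are made up for illustration, not values from the paper:

```python
# Hypothetical illustration of piece-rate vs. quota-based pay
# (rates and quota size are assumptions, not from Mason and Watts 2009).

def piece_rate_pay(tasks_done: int, rate_per_task: float = 0.01) -> float:
    """Worker is paid a fixed amount for every task (e.g. image) completed."""
    return tasks_done * rate_per_task

def quota_pay(tasks_done: int, quota: int = 10, rate_per_quota: float = 0.10) -> float:
    """Worker is paid only for each full bundle (quota) of tasks completed."""
    return (tasks_done // quota) * rate_per_quota

if __name__ == "__main__":
    for n in (7, 10, 23):
        print(n, piece_rate_pay(n), quota_pay(n))
```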
Results
Pay per task vs. pay per time vs. pay per annotation
Found comparable quality between volunteers and Turkers.
Pay per task is fast but has low recall (workers didn't produce many of the correct annotations), and precision drops sharply with high task difficulty (see the precision/recall sketch below).
Pay per time / pay per annotation led to slower but higher-quality work.
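For reference, a minimal sketch of how annotation precision and recall are computed against a gold-standard set; the example labels and worker below are hypothetical, not data from the paper:

```python
# Minimal sketch: precision and recall of a worker's annotations vs. a gold set.
# The example annotations are hypothetical.

def precision_recall(predicted: set[str], gold: set[str]) -> tuple[float, float]:
    """Precision = fraction of predicted annotations that are correct;
    recall = fraction of gold annotations that were found."""
    true_positives = len(predicted & gold)
    precision = true_positives / len(predicted) if predicted else 0.0
    recall = true_positives / len(gold) if gold else 0.0
    return precision, recall

# Example: a fast pay-per-task worker labels few items, so recall suffers
# even though most of what they do label is correct.
gold = {"cat", "dog", "bird", "horse", "fish"}
fast_worker = {"cat", "dog"}
print(precision_recall(fast_worker, gold))  # high precision, low recall
```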