

Title:
Optimizing Worker Performance in Crowdsourcing Platforms
Author:
Wu, Ting, author.
ISBN:
9780438131378
Physical Description:
1 electronic resource (104 pages)
General Note:
Source: Masters Abstracts International, Volume: 57-06M(E).
Abstract:
Recently, the popularity of crowdsourcing has created new opportunities for engaging human intelligence in the process of data analysis. Crowdsourcing provides a fundamental mechanism that enables online workers to participate in tasks that are either too difficult for computers to solve alone or too expensive to assign to experts. Although humans are intelligent, they are also error-prone and self-interested, which makes the quality of crowdsourcing results questionable. In this thesis, we discuss three novel approaches to optimizing worker performance in crowdsourcing platforms: Diversity-Based Worker Selection, the Pay-As-You-Go Scheme, and Panel Training.
In the field of social science, four elements are required to form a wise crowd: diversity of opinion, independence, decentralization, and aggregation. Diversity-Based Worker Selection addresses algorithmic optimizations toward the "diversity of opinion" of crowdsourcing marketplaces. We propose a Similarity-driven Model (S-Model) and a Task-driven Model (T-Model) for the two basic paradigms of worker selection. The Pay-As-You-Go Scheme is a new crowdsourcing paradigm for Object Identification tasks. In this paradigm, the requester interactively evaluates each object detected by the crowd, and a worker is paid a unit of reward for each detected object that the requester verifies. Such a paradigm not only resolves the difficulty the requester faces in evaluating worker performance, but also prevents the same objects from being detected by many workers, which would be wasted effort. Panel Training focuses on one of the most common and natural practices in crowdsourcing: collecting ratings of items. We design a sample-driven rubric to train workers so that they form a standardized understanding of the rating criteria.
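The payment loop of the Pay-As-You-Go Scheme, as summarized above, can be sketched as follows. This is a minimal illustration only: the function name, data shapes, and verification callback are assumptions for this sketch, not taken from the thesis.

```python
# Illustrative sketch of the Pay-As-You-Go payment loop: the requester
# verifies each submitted object, and a worker earns one unit of reward
# per verified object that has not already been credited to anyone.

def pay_as_you_go(submissions, verify, reward_per_object=1):
    """Return per-worker payouts for verified, first-seen objects.

    submissions: list of (worker_id, object_id) pairs in arrival order.
    verify: callable(object_id) -> bool, the requester's interactive check.
    """
    seen = set()      # objects already credited; duplicates earn nothing
    payouts = {}
    for worker, obj in submissions:
        if obj in seen:
            continue  # a re-detected object is wasted effort, not rewarded
        if verify(obj):
            seen.add(obj)
            payouts[worker] = payouts.get(worker, 0) + reward_per_object
    return payouts
```

Because duplicates are filtered before verification, the scheme pays only for distinct verified objects, which matches the abstract's claim that redundant detections end up unrewarded.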
Local Note:
School code: 1223
Available:
| Shelf Number | Item Barcode | Shelf Location | Status |
|---|---|---|---|
| XX(696836.1) | 696836-1001 | Proquest E-Thesis Collection | Searching... |


