Optimizing Worker Performance in Crowdsourcing Platforms
Title:
Optimizing Worker Performance in Crowdsourcing Platforms
Author:
Wu, Ting, author.
ISBN:
9780438131378
Added Author:
Physical Description:
1 electronic resource (104 pages)
General Note:
Source: Masters Abstracts International, Volume: 57-06M(E).
Abstract:
Recently, the popularity of crowdsourcing has created a new opportunity for engaging human intelligence in the process of data analysis. Crowdsourcing provides a fundamental mechanism that enables online workers to participate in tasks that are either too difficult to be solved solely by computers or too expensive to assign to experts. Although humans are intelligent, they are also error-prone and self-interested, which makes the quality of crowdsourcing results questionable. In this thesis, we discuss three novel approaches to optimizing worker performance in crowdsourcing platforms: Diversity-Based Worker Selection, the Pay-As-You-Go Scheme, and Panel Training.
In the field of social science, four elements are required to form a wise crowd: diversity of opinion, independence, decentralization, and aggregation. Diversity-Based Worker Selection addresses algorithmic optimizations toward the "diversity of opinion" of crowdsourcing marketplaces. We propose a Similarity-driven Model (S-Model) and a Task-driven Model (T-Model) for two basic paradigms of worker selection. The Pay-As-You-Go Scheme is a new crowdsourcing paradigm for object identification tasks. In this paradigm, the requester interactively evaluates each object detected by the crowd, and a worker is paid a unit of reward for each detected object that the requester verifies. Such a paradigm not only resolves the difficulty the requester faces in evaluating worker performance, but also prevents the same objects from being detected by many workers and ending up as meaningless workload. Panel Training focuses on one of the most common and natural practices of crowdsourcing: collecting ratings of items. We design a sample-driven rubric to train workers so that they standardize their understanding of the rating criteria.
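The Pay-As-You-Go payment rule described above can be sketched as follows. This is a minimal illustrative model, not code from the thesis: the class and method names (`Requester`, `submit`) and the duplicate-rejection detail are assumptions based on the abstract's description of unit rewards for requester-verified, non-duplicate objects.

```python
class Requester:
    """Hypothetical sketch of the Pay-As-You-Go scheme: the requester
    evaluates each detected object and pays a unit reward only for
    objects that are verified and not already found by another worker."""

    def __init__(self, reward_per_object=1):
        self.reward = reward_per_object
        self.verified = set()   # objects already confirmed by the requester
        self.payments = {}      # worker id -> total reward paid

    def submit(self, worker_id, obj, is_correct):
        """Evaluate one detected object; return the reward paid for it."""
        if obj in self.verified:   # duplicate detection earns nothing
            return 0
        if not is_correct:         # requester rejects the object
            return 0
        self.verified.add(obj)
        self.payments[worker_id] = self.payments.get(worker_id, 0) + self.reward
        return self.reward


r = Requester()
r.submit("w1", "obj_a", True)   # new verified object: paid 1
r.submit("w2", "obj_a", True)   # duplicate of w1's find: paid 0
r.submit("w2", "obj_b", True)   # new verified object: paid 1
```

Because duplicates earn nothing, workers have no incentive to re-report objects that the requester has already verified, which is the "meaningless workload" the abstract says the scheme avoids.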
Notes:
School code: 1223
Subject Heading:
Holdings:
Call Number | Accession Number | Shelf Location | Location / Status / Due Date |
---|---|---|---|
XX(696836.1) | 696836-1001 | ProQuest E-Thesis Collection | On Order |