


Human Computation, Volume 6 (2019)
- Themistoklis Karavellas, Anggarda Prameswari, Oana Inel, Victor de Boer:
  Local Crowdsourcing for Annotating Audio: the Elevator Annotator platform. 1-11
- Chris Dijkshoorn, Victor de Boer, Lora Aroyo, Guus Schreiber:
  Accurator: Nichesourcing for Cultural Heritage. 12-41
- Xipei Liu, James P. Bagrow:
  Autocompletion interfaces make crowd workers slower, but their use promotes response diversity. 42-55
- Ginger Tsueng, Arun Kumar, Max Nanis, Andrew I. Su:
  Aligning Needs: Integrating Citizen Science Efforts into Schools Through Service Requirements. 56-82
- Anoush Margaryan:
  Comparing crowdworkers' and conventional knowledge workers' self-regulated learning strategies in the workplace. 83-97
- Amrapali Zaveri, Wei Hu, Michel Dumontier:
  MetaCrowd: Crowdsourcing Biomedical Metadata Quality Assessment. 98-112
- Ting-Hao K. Huang, Amos Azaria, Oscar J. Romero, Jeffrey P. Bigham:
  InstructableCrowd: Creating IF-THEN Rules for Smartphones via Conversations with the Crowd. 113-146
- Nai-Ching Wang, David Hicks, Paul Quigley, Kurt Luther:
  Read-Agree-Predict: A Crowdsourced Approach to Discovering Relevant Primary Sources for Historians. 147-175
- David Gil de Gómez Pérez, Roman Bednarik:
  POnline: An Online Pupil Annotation Tool Employing Crowd-sourcing and Engagement Mechanisms. 176-191
- Susumu Saito, Chun-Wei Chiang, Saiph Savage, Teppei Nakano, Tetsunori Kobayashi, Jeffrey P. Bigham:
  Predicting the Working Time of Microtasks Based on Workers' Perception of Prediction Errors. 192-219
