Evaluating crowdsourcing websites
DESCRIPTION
NDF Barcamp, Hamilton City Library, Hamilton, New Zealand

TRANSCRIPT
How effectively are crowdsourcing websites supporting volunteer participation and quality contribution?
Donelle McKinley
PhD candidate, School of Information Management
Victoria University of Wellington
NDF Barcamp, Hamilton, 21 June 2013
www.digitalglam.org @donellemckinley
Crowdsourcing: a short version
Crowdsourcing outsources tasks traditionally performed by specific individuals to a group of people or community through an open call (Howe, 2009).
Crowdsourcing: a long version
“Crowdsourcing is a type of participative online activity in which an individual, an institution, a non-profit organization, or company proposes to a group of individuals of varying knowledge, heterogeneity, and number, via a flexible open call, the voluntary undertaking of a task. The undertaking of the task, of variable complexity and modularity, and in which the crowd should participate bringing their work, money, knowledge and/or experience, always entails mutual benefit. The user will receive the satisfaction of a given type of need, be it economic, social recognition, self-esteem, or the development of individual skills, while the crowdsourcer will obtain and utilize to their advantage that what the user has brought to the venture, whose form will depend on the type of activity undertaken.”
(Estellés-Arolas & González-Ladrón-de-Guevara, 2012)
Potential benefits of crowdsourcing for the institution
- Continue a tradition of volunteerism
- Market and stimulate interest in collections
- Signal the institution's openness and approachability
- Achieve goals otherwise too costly and labour-intensive
- Better reflect the diversity of visitors
- Tap into expertise outside the institution
- Engage visitors in new ways
- Raise the profile of research
- Demonstrate relevance
- Enable new research questions to be explored
Common motivations for volunteer participation
- The size of the challenge
- The necessity for volunteer contribution
- Collaboration with prestigious institutions
- Contribution to research
- Education
- Mental stimulation
- Being part of a community
- Personal research interests
- Enhancing a resource from which they will benefit
The potential of the crowd
Digitalkoot (National Library of Finland)
- First 51 days of the project
- 31,816 visitors to the site
- 15% participated

Transcribe Bentham (University College London)
- First 6 months of the project
- 1,207 visitors registered to participate
- 21% participated
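As a rough sketch of what those percentages mean in absolute terms (the helper function and the rounding are my own illustration, not from the talk):

```python
# Illustrative arithmetic only: the visitor counts and rates come from the
# slide; applying the percentage and rounding is my own simplification.
def estimated_participants(visitors: int, participation_rate: float) -> int:
    """Approximate number of active participants from a visitor count."""
    return round(visitors * participation_rate)

digitalkoot = estimated_participants(31_816, 0.15)  # Digitalkoot, first 51 days
bentham = estimated_participants(1_207, 0.21)       # Transcribe Bentham, first 6 months

print(digitalkoot)  # ≈ 4,772 visitors became participants
print(bentham)      # ≈ 253 registered users became participants
```

In other words, even a mid-teens participation rate on a well-visited site can yield thousands of contributors.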
Three common scenarios
The website either:

- Follows the structure of the underlying technology or the organisation
- Adheres to familiar conventions
- Is the product of personal preference
Garrett, J. J. (2011). The Elements of User Experience: User-Centered Design for the Web and Beyond (2nd ed.). Berkeley, CA: New Riders.
How effectively does your website…?
- Define its objectives
- Reflect visitor motivations
- Align visitor motivations with relevant incentives
- Minimize sources of frustration and concern
How are these factors impacting on the effectiveness of your website?

- Content
- Language
- Readability
- Website navigation
- Arrangement of page elements
- Consistency
- Visual appearance
- Page load speed
- Number and complexity of processes to complete the desired action
- Prototype: an early sample or model built to test a concept or process
- Pilot: a small-scale preliminary experiment conducted to evaluate feasibility, time, cost, and adverse events, and to improve design prior to the launch of a full-scale project
- Soft launch: the release of a website to a limited audience, in order to (beta) test and tweak a design before launching to a wider audience
- Beta testing: user testing by a limited audience to ensure the website/software has few faults or bugs, and to gather user feedback
- Optimization: increasing the percentage of visitors that fulfill the objective of a webpage or website
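The last definition can be made concrete: optimization of this kind is usually tracked as a conversion rate. A minimal sketch, where the helper and the example figures are hypothetical rather than from the talk:

```python
# Hypothetical helper: "optimization" as defined above means raising the
# percentage of visitors who fulfil the page's objective, i.e. its conversion rate.
def conversion_rate(completions: int, visitors: int) -> float:
    """Percentage of visitors who completed the desired action."""
    if visitors == 0:
        return 0.0
    return 100 * completions / visitors

# e.g. 150 completed transcriptions from 2,000 visitors
print(f"{conversion_rate(150, 2_000):.1f}%")  # prints "7.5%"
```

Comparing this figure before and after a design change is the simplest way to tell whether an "optimization" actually optimized anything.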
User evaluation: A different kind of user engagement
- Focus groups
- Surveys
- Usability testing
- Ongoing feedback channels
Analytics: How is website design impacting on…?
- The number of online visitors
- Time spent on site
- The number of online visitors who register to participate
- The number of online visitors who actually participate
- The number of abandoned and completed tasks
- The number of return visitors
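Taken together, these metrics form a participation funnel: visitors, registrants, active participants, completed tasks. A minimal sketch, with invented stage names and counts (any analytics tool would supply the real numbers):

```python
# Hypothetical funnel sketch: stage names and counts are invented for
# illustration; a real analytics package would supply the actual figures.
def funnel_report(stages):
    """Return (stage, count, % of initial visitors) for each funnel stage."""
    baseline = stages[0][1]
    return [(name, count, round(100 * count / baseline, 1))
            for name, count in stages]

report = funnel_report([
    ("visited the site", 10_000),
    ("registered", 1_200),
    ("participated", 400),
    ("completed a task", 250),
])

for name, count, pct in report:
    print(f"{name:>18}: {count:>6} ({pct}% of visitors)")
```

The point of the funnel view is to show *where* design is losing people: a steep drop between registering and participating points at the task interface, not the homepage.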
OCR text correction: Trove, National Library of Australia
Evaluation of the Trove crowdsourcing user interface involved:
- Asking potential volunteers to comment on the prototype
- Sitting potential volunteers in front of the computer screen and asking them how they would complete the task
- A soft launch and beta testing
- Gathering feedback on the beta version from over 600 users over the course of five months, via a survey, an online contact form, direct observation of user activity, analytics, online comments, and direct contact with users via email and phone
Evaluation of the Transcribe Bentham task interface involved:
- Beta testing
- A user survey
- Website analytics
- Analysis of user interaction statistics
- Comparisons with studies on crowdsourcing volunteer behaviour
UK Reading Experience Database (UK-RED)
Evaluation of the UK-RED task interface involved:
- Heuristic evaluation
- A survey of current and potential contributors
- Comparison with other crowdsourcing task interfaces
- Comparison with recommended practice as outlined in crowdsourcing and HCI literature
Usability and functionality requirements for a NZ-RED task interface
- Minimize user effort
- Support integration of the task with research processes
- Enable new visitors and contributors to understand what the task involves quickly and easily
- Support accurate and controlled data entry
- Be easy to use for people reasonably confident with the Web
- Support flexible, structured data entry
- Support bilingual data entry
Challenges impacting on evaluation
- Funding
- Expertise
- Organisational culture
What could a more effective website mean for the project?
- More online visitors participate
- Tasks are completed more efficiently
- Tasks are completed with greater accuracy
- The task is more enjoyable
- Volunteers participate more often
- The project is more cost-effective
- More volunteers are willing to participate in future projects
Thanks!
For references and other great reads visit http://www.digitalglam.org/crowdsourcing/books/
Presentation and slides will be available at http://www.digitalglam.org/crowdsourcing/talks/
For crowdsourcing research updates follow www.digitalglam.org @donellemckinley