
Faculty Resource Guide

This LibGuide provides information on a wide range of issues relevant to faculty, including how to link to library resources, what you can and cannot put into Moodle, open content sites, and new scholarly communication hubs.

What is Crowdsourcing?

Crowdsourcing "is a process that involves outsourcing tasks to a distributed group of people." It can take many forms. Wikipedia and the Encyclopedia of Life ask people to contribute content to an encyclopedia. Other forms include crowdsourcing creative works, collective labeling of items (e.g., Google's ESP Game/Image Labeler), public problem solving (e.g., for urban or transit planning), and data gathering (e.g., INRIX, which provides GPS routing and real-time traffic updates).

Another application is collective problem solving for complex issues that still require the human brain. David Baker at the University of Washington combined his lifelong love of games with his interest in biochemistry to produce a video game, Foldit, that invites everyday people to work out how proteins are structured. One of the first problems posed to gamers was the structure of an enzyme from the Mason-Pfizer monkey virus (M-PMV), which they solved in three weeks. The site currently has over 240,000 registered users helping to solve protein structures; most are not biochemists but volunteers eager to take on a brainteaser, and their results are then analyzed by biochemists.

Another example of using crowdsourcing to provide human supercomputing is Galaxy Zoo. It asks people to help astronomers classify, by shape, the vast number of galaxies seen by the Sloan Digital Sky Survey and the Hubble Space Telescope. The project received more than 50 million classifications in its first year, contributed by more than 150,000 people, and their contributions proved as good as those of professional astronomers and useful to a large number of researchers.

The power of this type of research is leading many scientists to recognize the value of crowdsourcing and to integrate it into many different projects, as discussed here and here. Some are even looking to crowdfund a project. But it should also raise two important questions: (1) Who owns the resulting intellectual property? and (2) When the answer came from multiple contributors, will the collaborative effort be used to promote the common good?

Carmen Kazakoff-Lane
Contact:
Carmen Kazakoff-Lane, Scholarly Communications Librarian
John E. Robbins Library - ( LB 2-19 )
270-18th Street
Brandon, Manitoba
R7A 6A9

Ph: (204) 727-7483

VIDEOCONFERENCING:
*Microsoft Meetings: Kazakoff@brandonu.ca
*Zoom Invite to Kazakoff@brandonu.ca


Website Skype Contact: Kazakofflane