Michael Bernstein spoke about his research in crowd-powered systems Aug. 20 to students and faculty at Drexel’s ExCITe Center.
Following a brief introduction, Bernstein announced the subject of his lecture, crowd-powered systems, which he defined as “a way that we can reach out to the intelligence distributed on the Internet and pull it into the kinds of things we use every day.”
Bernstein took Microsoft Word as an example. Although it is one of the most highly developed pieces of consumer software, it helps the user only with layout, spelling and grammar.
“It’s not actually helping with the core task it sets out to do — that is, writing,” Bernstein said.
More complex tasks, such as composing or editing a document, are left to the user. Bernstein explained that the process of peer review can be integrated into software like Microsoft Word by using crowd-powered systems. Through a number of programs, the software takes a written piece and puts it online for editing.
“I want to picture a world in which I can push my text out and get an entire crowd of people to work with me to improve it,” he said. This way of using human intelligence to support an interactive system is the essence of the crowd-powered system.
Soylent is one such system that uses a crowd built into a word processor. One of the features, called Shortn, can help users decrease the word count of their pieces. This is an issue that often arises when academics want to submit a scholarly article for review, but the draft is over the designated word limit.
Users can submit sections of their papers that they want shortened by selecting the Shortn option, prompting the tool to submit that section online for review. Members of the Internet community can then make the appropriate edits, and the system would compile the results into a shortened, finished product.
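The loop Bernstein described can be sketched roughly as follows. This is a toy illustration, not Soylent's actual pipeline: the function names are invented for this sketch, and the "crowd" is a local stub standing in for the round trip to online editors.

```python
# Hypothetical sketch of a Shortn-style loop; shortn and crowd_rewrites
# are illustrative names, and the "crowd" is a local stub rather than
# real online workers.

def word_count(text):
    return len(text.split())

def shortn(paragraphs, crowd_rewrites, limit):
    """Shorten paragraphs one at a time, keeping the shortest crowd
    rewrite of each, until the whole draft fits under the word limit."""
    draft = list(paragraphs)
    for i, para in enumerate(draft):
        if word_count(" ".join(draft)) <= limit:
            break  # already under the limit; stop asking the crowd
        candidates = crowd_rewrites(para) + [para]
        draft[i] = min(candidates, key=word_count)
    return draft

# Stub crowd: canned rewrites instead of a round trip to paid editors.
rewrites = {"a b c d e": ["a b c"], "f g h": ["f g"]}
result = shortn(["a b c d e", "f g h"], lambda p: rewrites[p], limit=5)
```

In a real deployment each call to the crowd would take minutes, which is exactly the latency problem the retainer model described later is meant to address.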
This setup typically runs by paid crowdsourcing, in which the online editors are incentivized by small payments to complete an assigned task. By editing a paragraph, for example, a member of the crowd could earn 10 cents, a payment that could accumulate as the crowd member completes more tasks.
The difficulty with this approach to crowd-powered systems is that responses may be of poor quality or take too long. Researchers in the field have come up with a 30 percent rule: “Thirty percent of the results in open-ended questions come back with poor results,” Bernstein said.
Two personas complicate the functioning of programs like Soylent. The “lazy worker” does the bare minimum to complete the task, whereas the “eager beaver” does more than what is asked. The responses from these two categories of crowd members produce a suboptimal result. Bernstein cut to the heart of the problem by saying, “We lack design patterns.”
He suggested applying a find-fix-verify design pattern to the tasks submitted to the crowd.
“The idea is we take these open-ended tasks and decompose it into this set of three stages to produce far, far better results,” Bernstein said. This gives the tasks assigned to the crowd a specific structure, one that reins in both the lazy worker and the eager beaver personas.
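The three stages can be sketched as follows, with local stub functions standing in for paid workers; the stage signatures are assumptions made for this sketch, and Soylent's real implementation differs in its details.

```python
from collections import Counter

# Rough sketch of find-fix-verify with stubbed workers. Requiring
# independent agreement in the find stage reins in the eager beaver
# (no one worker can rewrite everything), and majority voting in the
# verify stage filters out lazy workers' low-effort fixes.

def find_stage(paragraph, finders, min_agreement=2):
    """Each finder marks one (start, end) span needing work; keep only
    spans that several finders independently agree on."""
    votes = Counter(f(paragraph) for f in finders)
    return [span for span, n in votes.items() if n >= min_agreement]

def fix_stage(span_text, fixers):
    """Each fixer proposes a rewrite of the agreed-upon span only."""
    return [f(span_text) for f in fixers]

def verify_stage(candidates, voters):
    """Voters pick among the candidate rewrites; majority choice wins."""
    votes = Counter(v(candidates) for v in voters)
    return votes.most_common(1)[0][0]

paragraph = "The results were very extremely good overall."
finders = [lambda p: (17, 31), lambda p: (17, 31), lambda p: (0, 3)]
spans = find_stage(paragraph, finders)   # only the agreed span survives
candidates = fix_stage("very extremely good",
                       [lambda s: "very good", lambda s: "quite good"])
voters = [lambda c: c[0], lambda c: c[0], lambda c: c[1]]
best = verify_stage(candidates, voters)
```

The key design choice is that no single worker controls the whole pipeline: finding, fixing and verifying are done by disjoint groups, so one bad response cannot ruin the output.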
The second complication with the Soylent approach is response time. Although the time between accepting a task and completing it averages about two minutes, the time between when a task is posted and when it is accepted is much longer, about 20 minutes. Bernstein proposed resolving the latter lag with the retainer model, which keeps crowd workers on call so they can be alerted and respond immediately when a task is posted.
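The retainer model can be sketched with threads blocked on a shared queue. The class and method names below are invented for this sketch, not Bernstein's API, and real retained workers would be people paid a small fee to stay on call rather than threads.

```python
import queue
import threading

# Toy sketch of the retainer model: "workers" (threads here, people in
# practice) are recruited ahead of time and block on a shared queue, so
# a freshly posted task is picked up immediately instead of waiting the
# roughly 20 minutes it takes to recruit a worker from scratch.

class RetainerPool:
    def __init__(self, n_workers):
        self.tasks = queue.Queue()
        self.results = queue.Queue()
        for worker_id in range(n_workers):
            threading.Thread(target=self._wait_on_retainer,
                             args=(worker_id,), daemon=True).start()

    def _wait_on_retainer(self, worker_id):
        while True:
            task = self.tasks.get()  # "on call": blocks until alerted
            self.results.put((worker_id, task.upper()))  # stand-in for work
            self.tasks.task_done()

    def post_task(self, task):
        self.tasks.put(task)

pool = RetainerPool(n_workers=3)
pool.post_task("shorten this paragraph")
worker_id, result = pool.results.get(timeout=5)
```

Because the pool is recruited before any task exists, the only remaining latency is the work itself, which is what makes the sub-10-second results mentioned below plausible.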
Rapid refinement also helps streamline the crowdsourcing process. This strategy employs a synchronous crowd, one in which all the members are present at once, to encourage social loafing. Although social loafing is generally considered a negative phenomenon, here it relieves the pressure a crowd member may feel to produce the best result, and it therefore shortens the time between accepting and completing a task.
Bernstein said he has high hopes for this method. “I’m going to make a claim that we can actually get these synchronous crowds to act faster in aggregate than even the single, fastest individual member,” he said.
Studies show that rapid refinement is not only the fastest strategy but also the most precise, and that combining rapid refinement with the retainer model cuts the time it takes to get results to less than 10 seconds.
According to Bernstein, software such as Soylent has helped bring crowd-powered systems to where they are today; however, there is still more to come. The future of crowdsourcing includes the possibility of full-time crowd professionals, raising questions of contract ethics and education as tasks gradually grow more complex.
“I hope I have convinced you that we can create these crowd-powered systems that [do] things that neither crowds can do right now nor machine intelligence can do [by] itself,” Bernstein said.
Audience members brought up issues of authorship in a document that has been crowdsourced and whether or not crowdsourced documents can be fairly compared to those produced by more traditional methods.
Anna Lu, a sophomore biomedical engineering major, attended the lecture after receiving an invitation by email, drawn by an interest in both the human and technological aspects of the topic. “It seems that crowdsourcing can be applied in many different areas, and I’m really excited to see some medical applications of it come forth as a biomedical engineer,” she said afterward.
Youngmoo Kim, director of the ExCITe Center, invited Bernstein to deliver the lecture because he felt the message was in line with the center’s goal of multidisciplinary collaboration.
“Better understanding [of] that relationship between computers and humans, I think, is to everyone’s benefit,” Kim said.