What is crowdsourced public opinion?
For decades, researchers have gathered data on public opinion using expensive, time-intensive approaches such as in-person interviews, survey mailings, and phone calls. Crowdsourcing platforms like Amazon’s Mechanical Turk and ClickWorker dramatically change this dynamic, making the gathering of public opinion far cheaper and easier.
We call this approach “crowdsourced public opinion,” and several studies have shown that the technique provides high-quality research results at far lower costs than traditional methods.
In many cases, results from crowdsourced public opinion methods are of higher quality than those from more traditional approaches. One University of Oxford study, for example, found that MTurk produced better research data than traditional sources.
More and more organizations, from testing companies like ETS to consumer giants like Frito-Lay, rely on online tools to crowdsource public opinion. Lego, for example, now uses crowdsourced focus groups to collect consumer ideas for new products.
Why should I rely on opinion crowdsourcing?
Traditional methods of collecting opinion data take a great deal of time and money. A high-quality, nationally representative poll of 1,000 people typically costs at least $30,000. Focus groups of ten individuals run around $1,000 to $1,200 per person in a major city.
What’s more, traditional approaches to surveys make it expensive to target subpopulations like groups of educators or medical professionals. The RAND Corporation, for example, has a panel of teachers and school leaders. But the cost is as much as $60,000 per survey.
Opinion crowdsourcing offers a low-cost alternative. Using crowdsourcing tools like Amazon Mechanical Turk, experts can gather rigorous, high-quality data in a far more cost-effective way.
Are crowdsourced opinions a reliable research method?
Yes. Last year alone, thousands of experiments or research studies were run on MTurk, and many researchers believe that the approach is often more effective than traditional approaches.
Within academia, tools like MTurk have dramatically changed research. In May 2018, the online magazine JSTOR Daily published an article with this bold headline: “Amazon’s Mechanical Turk has Reinvented Research.” The author, Alexandra Samuel, went so far as to say that Mechanical Turk and other similar services have ushered in a “golden age in survey research.”
Several studies have shown that MTurk produces high-quality research results. A study led by Tara Behrend of George Washington University raised the specific question of whether or not survey data collected from an online pool would be as reliable as more traditional university panel survey pools.
She concluded that the “reliability of the data from the crowdsourcing sample was as good as or better than the corresponding university sample,” and noted that “crowdsourcing respondents were older, were more ethnically diverse, and had more work experience.”
Another study found similar results. Led by Michael Buhrmester of the University of Texas at Austin, the team found MTurk data to be “at least as reliable as those obtained via traditional methods” and concluded that MTurk could be used to achieve “high-quality data inexpensively and rapidly.”
Many large companies have used these tools as well. In one review, a research team at Yahoo compared MTurk to large market research studies and to another online survey platform, concluding that online platforms, including MTurk, could deliver reliable results in just hours and at a fraction of the cost of more formal survey settings.
What are crowdsourced public opinion panels?
Panels are groups of people who render their opinion or perspective on products, services, or research data. Experts have used panels for decades to get focused research from specific sub-populations.
At the Learning Agency, we have created or partnered to create different types of panels.
Educator Panel. The Learning Agency has gathered more than 800 educators who are actively teaching in the field. These panel participants are involved in surveys, interviews, and focus groups.
Most of these educators are teachers. Some are leaders and support staff. They are somewhat younger and whiter than the population as a whole.
Breakdown of Roles in Learning Agency Educator Panel
National Representative Panel. The Learning Agency has partnered with Matthew Sisco, a behavioral researcher at Columbia University, who has created a nationally representative panel of online participants.
This system generates highly representative estimates from online samples using two main features. First, it maintains a pool of adult participants that mirrors the population of the United States on key demographics, including age, gender, and location. This allows any survey to be completed by a sample that closely represents the national population.
Second, the system applies state-of-the-art statistical corrections to the survey responses to account for any remaining misalignment. This correction, known as “Multilevel Regression and Poststratification,” or MRP, enables accurate predictions of public opinion in fine-grained geographic areas such as states or counties. Put simply, the system distributes and analyzes a survey in a way that a) generates estimates highly representative of national public opinion and b) estimates how opinions vary across different states and/or counties.
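To make the idea concrete, here is a minimal sketch of the poststratification step only, using entirely hypothetical numbers (the cell estimates, demographic cells, and population shares below are invented for illustration; a real MRP workflow would first fit a multilevel regression model to produce the cell-level estimates):

```python
# Hypothetical demographic cells: (age group, gender).
# In real MRP, these estimates come from a multilevel regression model;
# here they are made-up numbers for illustration only.
cell_estimates = {          # modeled share agreeing with some statement
    ("18-34", "F"): 0.62,
    ("18-34", "M"): 0.58,
    ("35+",   "F"): 0.45,
    ("35+",   "M"): 0.41,
}
population_shares = {       # census proportions for the same cells
    ("18-34", "F"): 0.15,
    ("18-34", "M"): 0.15,
    ("35+",   "F"): 0.36,
    ("35+",   "M"): 0.34,
}

def poststratify(estimates, shares):
    """Population-weighted average of cell-level estimates."""
    assert abs(sum(shares.values()) - 1.0) < 1e-9, "shares must sum to 1"
    return sum(estimates[cell] * shares[cell] for cell in estimates)

adjusted = poststratify(cell_estimates, population_shares)
print(round(adjusted, 4))
```

The point of the reweighting is visible in the numbers: the younger cells hold high opinions but are a small share of the population, so the adjusted estimate is pulled toward the larger, older cells rather than toward whoever happened to answer the survey.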
What are the advantages of crowdsourcing public opinion?
Cost. A traditional survey using a nationally representative sample typically costs around $30,000. With a platform like MTurk, costs can be significantly reduced: a 15-minute, 15-question survey of 1,500 participants costs around $5,000.
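Using the figures cited above, a quick back-of-the-envelope calculation shows the per-respondent difference:

```python
# Per-respondent cost comparison, using the costs cited in the text.
traditional_total, traditional_n = 30_000, 1_000  # nationally representative poll
mturk_total, mturk_n = 5_000, 1_500               # 15-minute MTurk survey

cost_traditional = traditional_total / traditional_n  # dollars per respondent
cost_mturk = mturk_total / mturk_n

print(round(cost_traditional, 2))  # 30.0
print(round(cost_mturk, 2))        # 3.33
```

Roughly $30 per respondent for the traditional poll versus about $3.33 on the crowdsourcing platform, nearly a tenfold difference.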
Speed. Crowdsourced public opinion results can be collected within hours, days, or weeks, compared with many weeks, and sometimes months, for more traditional means. Results from online platforms can also be collected and tabulated very quickly, making evaluation much easier as well.
Innovation. In a recent study published in the journal Psychological Science, researchers examined collective narcissism, in which individuals “(demonstrate) excessively high regard for their own group.” The data were gathered using Amazon’s MTurk from more than 2,800 Americans across all 50 states.
The study had an innovative design, asking Americans from each state what they thought their state’s contribution was to the overall history of the United States. The results received widespread press attention including in the Washington Post.
Anonymity. There may also be a “hidden” advantage to online research platforms like MTurk: people are more likely to give authentic answers online than in a less anonymous setting.
In one study in the journal Clinical Psychological Science, researchers examined the use of MTurk to study clinical populations, looking at the prevalence of severe psychiatric disorders among crowdsourcing participants.
Their findings suggested that crowdsourcing participants may be more comfortable revealing sensitive personal information than they would be in more traditional interviews and surveys. The anonymity of online forums can carry a distinct advantage here, and in this way, MTurk can offer a higher quality result.
What are the downsides to using MTurk and other tools?
There are some concerns about MTurk as a research tool.
Fairness. Some, understandably, argue that many MTurk requesters don’t pay people enough. We strongly believe that people should be paid fairly, and we don’t publish any projects that pay less than $15/hr, New York City’s minimum wage and above Washington, D.C.’s $14/hr minimum wage. For our teacher and other panels, we typically pay closer to $20 an hour.
Focus. Another concern about online crowdsourcing tools is that participants are not as focused as more traditional participants would be; an online survey can be completed while watching television or talking with someone else.
However, researchers have tested this idea, and people do, in fact, pay attention. In one study in the Journal of Experimental Political Science, for example, few differences were found in the quality of results between laboratory-administered questionnaires and those administered online. In other words, surveys taken at home, or in another uncontrolled setting, were just as reliable as those taken in more traditional settings.
Indeed, several studies have shown that MTurk respondents paid better attention than their counterparts; one study concluded that, of three different types of tasks, “MTurkers were more attentive to the instructions than were college students” when it came to the appropriate completion of attention-check questions.
At the Learning Agency, we also include “attention checks”: questions or instructions designed to ensure that the person taking the survey is paying attention. These might consist of an instruction to write “I am paying attention” as a particular survey answer, or a pair of similarly worded questions with reversed wording, so that a positive response to one implies a negative response to the other.
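A screening step for these two kinds of attention checks can be sketched as follows. The response records, field names, and tolerance are all hypothetical; the logic simply mirrors the two checks just described:

```python
# Hypothetical survey responses on a 1-5 Likert scale. "check_text" is an
# instructed-response item ("write 'I am paying attention'"); q_pos and
# q_rev are a reverse-worded pair that should yield opposite answers.
responses = [
    {"id": 1, "check_text": "I am paying attention", "q_pos": 5, "q_rev": 1},
    {"id": 2, "check_text": "ok",                    "q_pos": 4, "q_rev": 2},
    {"id": 3, "check_text": "I am paying attention", "q_pos": 5, "q_rev": 5},
]

SCALE_MAX = 5  # top of the assumed 1-5 scale

def passes_checks(r, tolerance=1):
    # Check 1: the instructed-response item must contain the exact phrase.
    if r["check_text"].strip().lower() != "i am paying attention":
        return False
    # Check 2: reversing the reverse-worded item should roughly match
    # the positively worded item; a large gap signals inattention.
    reversed_score = SCALE_MAX + 1 - r["q_rev"]
    return abs(r["q_pos"] - reversed_score) <= tolerance

attentive = [r["id"] for r in responses if passes_checks(r)]
print(attentive)  # [1]
```

Respondent 2 fails the instructed-response check, and respondent 3 answers both reverse-worded items identically, so only respondent 1 is kept for analysis.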
Demographics. MTurk presents some issues regarding participant demographics. Americans on the platform are generally whiter and more highly educated than the general population. MTurk workers also have higher rates of unemployment, retirement, or homemaker status, and more than 10 percent are college students, according to a University of Washington study.
Most differences in demographics can be factored in when final results are evaluated.
“A fast and cost-effective way to collect nonprobability samples that are more diverse than those typically used by psychologists.” — Mathematica
What’s Mechanical Turk?
Mechanical Turk, or MTurk, is Amazon’s crowdsourcing marketplace. It’s named after an 18th-century chess-playing “machine” that defeated several human challengers; hidden within its workings was a human chess master who controlled the machine’s moves.
Amazon’s MTurk was launched in November of 2005, and the site soon became home to tens of thousands of “jobs” on the platform. By the spring of 2007, 100,000 workers had registered, from more than 100 countries. Just a few years later, there were half a million from almost 200 countries.
Most estimates indicate that 80% of “Turkers,” as MTurk workers are colloquially known, are located in the United States. At any given time, 2,000 to 5,000 workers are online, the estimated equivalent of 10,000 to 25,000 active full-time workers.
At the Learning Agency, we rely on MTurk because it is the largest crowdsourcing platform and has proven to be an effective way to conduct experiments online, offering a researcher’s trifecta: a wide range of respondents, at relatively low cost, within a short time. Nearly every traditional opinion-collection method can be replicated, or at least adapted, on MTurk’s platform.
“The data obtained are at least as reliable as those obtained via traditional methods. Overall, MTurk can be used to obtain high-quality data inexpensively and rapidly.” — Buhrmester et al., University of Texas at Austin