On the crowdsourcing website MTurk, we've been building a panel of teachers in order to better understand how educators view teaching and learning. We've also been conducting large-scale surveys and experimental research to better understand how people learn. Several studies have shown that MTurk produces high-quality research results, and in many cases, the platform produces research of higher quality than more traditional methods.
MTurk has been shown, for instance, to be a powerful tool for running experiments, and to produce results comparable to nationally representative samples. For example, a Northwestern study published in 2016 compared the results of 20 large political and sociological surveys, each of which was administered to both an MTurk sample and a nationally representative, population-based sample.
After reviewing the data, the Northwestern researchers found that the MTurk results closely matched the other data sources for nearly all surveys, including overall demographic patterns, the direction of results, and statistical significance.
In some fields like education, MTurk is actually a better source of research data than many other data sets. A University of Oxford study from 2011 argued that MTurk was a better source of data than many sources then in common use. Among other things, the researchers argued that MTurk participants were far more diverse than the American college samples typically used in psychological research.
According to researchers, substantial psychological testing of MTurk participants has shown that MTurkers respond similarly in experiments to other participants. More precisely, MTurkers exhibit classic psychological effects much as in-lab participants do, including the heuristics, reasoning errors, and decision biases common in everyday life. This research also shows that they have similar rates of attention to detail and of following directions as participants from traditional recruitment sources.
Even the highest-ranking social science journals regularly publish studies based on MTurk data. Over the past few years, for instance, dozens, if not hundreds, of studies relying on MTurk data have been published in high-ranking journals like American Sociological Review, Law & Society Review, and Psychological Science.
Our panel of teachers can be used to answer pressing education questions. We've also been using MTurk to run low-cost surveys.
To be sure, MTurk has its issues. In the geeky parlance of researchers, it provides a convenience sample, so to extrapolate findings, researchers should weight the results to make them more representative. There are also some biases in the population: people using the site tend to be whiter and younger than the general public, according to researchers.
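One common way to do that weighting is post-stratification: each respondent is weighted by the ratio of their demographic group's share of the target population to its share of the sample, so over-represented groups count for less and under-represented groups count for more. Here is a minimal sketch of the idea; the age groups and all the shares and responses below are made-up numbers for illustration, not real MTurk or census figures.

```python
# Post-stratification weighting sketch. All numbers are hypothetical.

# Share of each age group in the survey sample (hypothetical).
sample_shares = {"18-29": 0.45, "30-49": 0.35, "50+": 0.20}

# Share of each age group in the target population (hypothetical).
population_shares = {"18-29": 0.20, "30-49": 0.35, "50+": 0.45}

# Weight for each group: population share divided by sample share.
# Over-represented groups get a weight below 1, under-represented above 1.
weights = {g: population_shares[g] / sample_shares[g] for g in sample_shares}

def weighted_mean(responses):
    """Weighted mean of responses, each given as a (group, value) pair."""
    total_w = sum(weights[g] for g, _ in responses)
    return sum(weights[g] * v for g, v in responses) / total_w

# Example: younger respondents (over-represented here) answered higher,
# so the weighted mean lands below the unweighted mean of 3.75.
responses = [("18-29", 4.0), ("18-29", 4.5), ("30-49", 3.5), ("50+", 3.0)]
print(round(weighted_mean(responses), 2))  # prints 3.39
```

Real survey work typically weights on several demographics at once (age, race, education, and so on), but the principle is the same ratio shown here.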
But in the end, it's clear that sites like MTurk are the future of insight research because they're fast, inexpensive, and high-quality. Or as one recent paper concluded after a long review of the evidence, "MTurk is a fast and cost-effective way to collect nonprobability samples that are more diverse than those typically used by psychologists."
Given our interest in learning, we're hoping to explore MTurk as it relates to education. Over at the Center for American Progress, one of us, Ulrich, has already used MTurk in a number of studies, including a diary study on testing and another study on knowledge of learning.
If you're interested in asking questions of educators, please reach out. We're happy to put our panel to work for you, and we will be sure to update these pages as our work continues.
--By Joe McFall and Ulrich Boser
Check here for the latest news on the Learning Agency, a consultancy devoted to the science of learning.