There is a scandal unfolding in academia, and it is unraveling one of the most popular fields in social science.
Last year it was revealed that behavioral economist Dan Ariely, a professor at Duke’s business school and author of the best-selling book Predictably Irrational, had faked the data behind some of his biggest studies. The research was about (I kid you not) reducing dishonesty.
This week, the same group that exposed Ariely is releasing evidence that one of his co-authors also produced fraudulent work. Francesca Gino, a professor at Harvard Business School and best-selling author, had contributed to the dishonesty project that Ariely manipulated. Her contribution was an independent experiment investigating a similar mechanism for reducing dishonesty. Except, when you dig into the Excel data files, it is clear that she also manipulated the data to get the results she wanted.
It is not only ironic that the studies were on dishonesty; the careers these authors built on this work are a story in themselves. Both Ariely and Gino get corporate speaking gigs to discuss their research, where they command fees of $50,000 to $100,000. And remember how I said Gino was a best-selling author? Her book is titled Rebel Talent: Why It Pays to Break the Rules at Work and Life. Experts writing about what they know, am I right?
As these scandals unfold, I often reflect on what it means for the profession. Here are some of my thoughts.
Trivial work
I am amazed at how much of this fraudulent research is just trivial nonsense. The paper where Ariely and Gino independently messed with the data was about how a reminder to be honest might reduce cheating. This feels like listening to your lucky song before taking a test. It probably doesn’t help, but it might; and it’s not hurting, so why not? If a reminder to be honest had such a big effect, we probably would have discovered that a long time ago. But hey, it’s nice that there was some “scientific” evidence that it could move the needle for some people, so it’s not like we’re hurting anyone. (Well… we’ll come back to that later.)
The second study that Gino allegedly manipulated is even more trivial. The question was basically: if I asked you to write an essay defending Nazi Germany, would you have a more positive view of Clorox? The idea was that doing something that felt inauthentic or morally unclean would manifest itself as a desire to be physically clean. Which might work in a book you read in your 11th-grade English class, but it just doesn’t seem like it would matter in the real world.
The third study exposed is about cheating on a coin flip and coming up with creative ideas for how to use a newspaper. The authors argue that cheaters are more creative. I understand that you have to operationalize the effects you’re looking for, but so far I haven’t been convinced by any of these measures. Here’s an easy objection: creativity is about synthesizing ideas in your domain, and your ability to come up with uses for a newspaper is unlikely to be related to your domain of creativity. One area where I’m required to be creative is my research on Haiti. I have to creatively connect ideas from Haiti’s history with economic theories and the data that’s available. I’m producing insights into Haiti’s history that have never been discussed. That’s creative! And it has no bearing on how many uses of a newspaper I can devise in one minute (unless I’m using the newspapers for archival data).
A lot of this work feels like “I’m 14 and this is deep.” I can just imagine the teenage TikTokers who are holding their lav mics and saying things like, “Did you know that in a 2015 study…” and then concluding that modern cleaning companies are built on white supremacy. Which is fine if you want to try to go viral on a platform built for barely educated children with attention deficit issues. But maybe the professors at the world’s most prestigious universities should be focused on something less trivial.
Non-trivial consequences
And that brings us to the non-trivial consequences of this work. Let’s start at the smallest level and move out.
First, even though the work is not only trivial but fraudulent, it has had major personal benefits for these scholars. They get faculty positions at prestigious universities. They have written successful books. They speak at some of the world’s most successful companies. Here’s the list Gino provides on her website. These companies are shelling out big money to hear about this work. Apparently the fraud has led to some very nice personal payoffs.
And if it was just the personal payoffs, maybe it wouldn’t be so bad. After all, there are plenty of snake oil salesmen out there making big money through corporate speaking, YouTube channels, and the like. If companies are happy spending money on this, that’s their decision.
But it’s the university positions that are especially problematic. These professors are educating not only the students who attend their university; they teach the students who become professors at other universities. And those new professors build on the work they learned in school. Even if you weren’t educated at their school, since they are so prominent in the field, you learn about their work no matter where you go.
A whole generation of professors has been directly influenced by this work. At the very least, their own research reflects this fraudulent foundation. In the worst case, many professors have spent scarce career and social resources extending it. For example, a team of behavioral scientists tried replicating the experiment where a reminder to be honest cuts down on dishonesty. Except they did it with over 600,000 Guatemalan taxpayers. They found the prompt had no effect on honesty in paying taxes, which is now unsurprising given that the original study was based on fraud. But that lesson came only after spending thousands of dollars on the experiment. Real money that could have been spent on something other than chasing an idea popularized through dishonesty.
When fraud launches academics into high-status careers, the effects ripple through the profession.
Wrong priorities
Setting aside the fraud, if these studies are so trivial, why do they gain so much traction? Why are they being published at all?
There’s this unfortunate trend where most people like counterintuitive or surprising findings. This trend is the premise behind books like Freakonomics or Malcolm Gladwell’s work. In turn, academics see that kind of work get promoted, and so it rises in status. Journal editors, on the margin, will take something that’s surprising over something that’s consistent with conventional wisdom.
I know this from experience. I recently wrote a paper on the effects of political instability on foreign investment. It’s a nice piece with new archival data on foreign businesses in Haiti, and it comes with this excellent graph.
The result is not surprising: after the American occupation began in 1915, there was an increase in foreign businesses. Creating stability encouraged business creation. Totally intuitive. I could take a slightly counterintuitive angle and say, “Usually a military invasion is a negative signal about the economy. How many businesses were happy to enter Afghanistan in 2002? Look at how counterintuitive an increase is!!” Instead I was satisfied with the interesting finding for Haiti.
When I submitted the paper to a journal, I got the following feedback:
As far as I can judge, the analysis in the paper is accurate, the available data are judiciously and effectively used for the limited purpose at hand, and the paper reads well. My only problem with the paper – and it is a basic problem – is that it is not very interesting. All we learn from it is that when a situation of acute insecurity and threatening political environment is suddenly and drastically put right, business, and particularly foreign business, improves. It would be difficult to imagine otherwise in any case, anytime, anywhere under the sun. So in this case a massive effort at data collection and quantitative history seems to add little to our knowledge of the relationship between political stability and economic growth, somewhat like building a formidable weapon to kill a fly.
I’ll confess that I could have done a better job selling how interesting this work is. I’m still learning how to frame my research. But the subtext of this feedback is that research is only interesting when it’s counterintuitive. And that’s a dangerous path to direct scholars down, especially young scholars. Persistent exposure to this kind of feedback tells researchers the profession values shock over truth.
This will have a few effects on the profession. First, it will push good scholars and teachers out of academia because they fail to produce work that is shocking enough to get published. Fortunately, with all of the great opportunities for economists outside of academia, this just means companies get better economists. Second, it nudges the ones who stay in academia to manipulate their work towards shock value. This is why p-hacking is such a problem.
Over time, these incentives erode trust in research. I already don’t trust any reporting on diet research because every six months there’s something stupid in the news like “Ice cream is a health food!” (Note that the subheading sells the counterintuitive result.) We’re already in an era where people question the efficacy of markets. If the only studies that get published are the ones where a crappy research design produced a counterintuitive result, that will lead to policies that handicap the most effective mechanisms we have for improving our livelihoods.
We all have a personal responsibility towards improving the research environment. As Smokey the Bear said, “Only you can prevent crappy research from spreading like wildfire.”