In December 2017, a pair of Warwick University post-docs, Karsten Müller and Carlo Schwarz, published an intriguing and clever attempt to measure whether there was any correlation between hate crime and social media usage in Germany.
Their place of publication was SSRN, probably the most glorious website in the economics profession—an open platform where researchers can post and update their working papers without having to go through the time and rigor of peer review. It’s fast, it’s vibrant, it’s free, and it’s always at the cutting edge of every interesting economic debate. It’s also rather messy, with a signal-to-noise ratio much lower than that of most peer-reviewed journals. Anybody citing a paper found on SSRN does so in full knowledge that it carries a big flashing implicit “caveat lector” sign.
The media first picked up on this SSRN paper in January 2018, when the Economist published a “Daily chart” under the headline “In Germany, online hate speech has real-world consequences.” The short article, citing the paper, explained that the research had shown that “for every four additional Facebook posts critical of refugees, there was one additional anti-refugee incident,” while taking pains to note that “this correlation is of course no guarantee of causation.” Notably, the Economist article only addressed the volume of right-wing anti-refugee posts, rather than the amount of Facebook use broadly.
The New York Times, by contrast, took a very different approach to the paper. Instead of simply reporting on the economists’ findings, the newspaper sent a reporter to Germany. On Tuesday, it ran a long, complex feature article of almost 3,000 words, by Amanda Taub and Max Fisher, examining the nexus of hate and social media usage in the country. The Times used the study as a jumping-off point for old-fashioned shoe-leather reporting: There were no quotes in the article from the paper’s authors, but many from local residents. The headline, too, was suitably hedged—“Facebook Fueled Anti-Refugee Attacks in Germany, New Research Suggests.”
Still, the most explosive part of the article centered on the Warwick research. The Times referred to the paper as “a landmark study,” and spotlighted one showstopping finding in particular.
Their reams of data converged on a breathtaking statistic: Wherever per-person Facebook use rose to one standard deviation above the national average, attacks on refugees increased by about 50 percent.
Twitter, of course, jumped right onto this statistic. This tweet, for instance, racked up almost 5,000 retweets:
Immediately, journalists and economists around the world started downloading the paper to try to work out whether it was reliable, and whether it said what the Times said that it said. Neither task is easy: The paper is technically complicated, difficult to read and understand. Its results are hard to judge without replicating a lot of hard statistical work. And to make matters worse, my search for the breathtaking statistic revealed that it was never explicitly stated in the paper.
Part of the problem comes from the fact that the researchers updated the paper on the SSRN site the very day the Times article ran. But even the old version of the paper doesn’t say what the Times says it says.
To grok the statistic lauded as the centerpiece of the study, you have to get there sideways. First, you need to look at the difference between Frankfurt and Dresden. The paper then explains:
As a case study, consider the cities of Frankfurt/Main and Dresden, which are about one standard deviation apart at the AfD users over population measure. The estimated effect of a typical number of AfD refugee posts in a city like Dresden are 0.043 attacks per 10,000 asylum seekers, while it is 0.029 for a city such as Frankfurt. This shift in the share of right-wing social media users implies around 50% more attacks on refugees.
This passage is far from easy to understand, but the gist is that in Dresden, which is one standard deviation above Frankfurt in its share of AfD Facebook users, a typical volume of anti-refugee Facebook posts is associated with roughly 50 percent more attacks on refugees. It's probably worth noting that Frankfurt is Germany's financial capital, the epitome of the rich and comfortable former West Germany. Dresden was in East Germany, has much more neo-Nazi activity, and much more support for the far-right Alternative für Deutschland (AfD) party. But this example is supposed to be illustrative of any two cities with a one standard deviation difference in Facebook use.
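Where the "around 50%" comes from can at least be checked with back-of-the-envelope arithmetic, using the two effect estimates the paper does report for its Dresden/Frankfurt case study:

```python
# Estimated effect of a typical week of AfD refugee posts,
# in attacks per 10,000 asylum seekers (figures quoted from the paper).
dresden = 0.043    # city one standard deviation higher in AfD-user share
frankfurt = 0.029

# Relative difference between the two cities' estimated effects.
increase = (dresden - frankfurt) / frankfurt
print(f"{increase:.0%}")  # prints 48%, which the paper rounds to "around 50%"
```

That much is reproducible from the text; it's the leap from this two-city comparison to a statement about national-average Facebook use that is harder to trace.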
The paper then continues:
We next replace the share of AfD users with the share of people active on the Nutella Facebook page… The coefficients on the interaction term here are very similar, and still highly statistically significant.
The idea here is that by looking at how many people are active on the Nutella Facebook page, you can get a good indication of how active the broader population is on social media. (This may sound odd, but it's actually a pretty clever way of estimating general activity on the network—you don't need to be a neo-Nazi to enjoy a delicious chocolatey spread.) And areas in the top third of Nutella activity on Facebook do seem to have more attacks on refugees.
We can easily “read off” which effect local propagation of right-wing social media has: the coefficient of around 2.8 implies that, even within the same county, a municipality with many Facebook users has approximately 0.024 more refugee attacks (per 10,000 refugees) than a municipality with few users in a typical week. This corresponds to an increase of almost two-thirds of the mean of the dependent variable, a large effect.
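The "almost two-thirds of the mean" claim in that passage can be sanity-checked the same way. The paper doesn't state the mean of the dependent variable here, but if 0.024 extra attacks is almost two-thirds of it, the implied weekly mean can be backed out:

```python
effect = 0.024        # extra attacks per 10,000 refugees in a typical week (from the paper)
share_of_mean = 2/3   # "almost two-thirds of the mean", per the paper

# Implied mean of the dependent variable, inferred from the stated ratio.
implied_mean = effect / share_of_mean
print(round(implied_mean, 3))  # prints 0.036 attacks per 10,000 refugees per week
```

In other words, the "large effect" is large relative to a baseline of a few hundredths of an attack per 10,000 refugees per week—a useful reminder of how small the absolute numbers involved are.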
Finally, the paper concludes:
Overall, the findings we present in this section suggest that exposure to right-wing refugee salience on social media is a predictor of violent attacks on refugees. This is true both for municipalities with many right-wing Facebook users as well as those with high social media affinity that is unrelated to observable municipality characteristics.
This is as close as the paper comes to the New York Times' breathtaking statistic. The 50 percent figure that does appear refers to the share of AfD users in two specific cities, not to per-person Facebook use measured against the national average.
It makes intuitive sense that in areas of the country where Facebook usage is high, people will be more likely to see anti-refugee sentiment on Facebook. It arguably makes sense to then assume that these people would also be more likely to act on that sentiment by attacking refugees. What the Warwick paper suggests is that this intuition is empirically true.
But in reality, the breathtaking statistic didn’t come directly from the paper. Rather, it came from long phone conversations in which the paper’s authors walked the newspaper’s journalists through the data, and the methodology, and the results. The authors of the paper told Max Fisher that it’s possible to arrive at the statistic just by using the data in the paper; as I’m not a statistics Ph.D., I haven’t confirmed that, and it’s not clear that anybody else has either. (We’ll update this post if someone does.)
What is abundantly clear, however, is that the authors of the paper are more interested in presenting a methodology for trying to estimate these effects than they are in presenting the actual results. I can tell you, having spent a large amount of time reading two different versions of the paper, that in no way does it "converge on a breathtaking statistic." Neither, frankly, does it rise to the level of a "landmark study." As it continues to be dissected in public, it seems the idea of using the Nutella Facebook page as a proxy for Facebook usage more generally, while clever, may well fail to stand up to scrutiny.
That’s all OK. The white paper was written by a pair of post-docs without any peer review, and there’s no particular reason why it should have been ready for the social-media klieg lights that suddenly got trained on it. Besides, it’s at pains to include a very strong version of the standard cover-your-ass social-science disclaimer:
The results in this section should be interpreted as purely suggestive and do not allow for causal inference.
This means that even if there is an extremely strong correlation between anti-refugee sentiment on Facebook and attacks in the real world, this study isn’t designed to assess whether one causes the other.
The Times’ breathtaking claim, then, is not on the authors—it’s on the New York Times, which should have been much more careful and circumspect in this case. When the Times uses words like “landmark” and “breathtaking,” it starts making claims that would be very difficult for any white paper to stand up to.
When a study becomes the focus of a big article in the paper of record, it’s going to undergo an immediate and discomfiting level of scrutiny. A lot of that is going to land at the feet of the authors, like the post-docs here. But frankly, the majority should be directed at the Times—especially when it starts citing statistics and conclusions which even the authors of the paper don’t seem to be comfortable including in their work.
With hindsight, the Times should have avoided terms like “landmark” and “breathtaking,” and should probably have avoided mentioning specific results at all. The white paper is intriguing, and it was a great idea to use it as a jumping-off point for the newspaper’s shoe-leather reporting. The study was not, however, something to cite as a significant scientific advance. Facebook deliberately makes it extremely difficult for external researchers to quantify its effects on society, which means the best we can hope for is to piece together a jigsaw puzzle of suggestive evidence. (If the company would just make its data available, we’d stop being forced to estimate via imperfect Nutella-proxies.) But as things currently stand, no one piece of research is going to be the kind of smoking gun that the Times tries to turn this one into.