‘If Facebook has its way, there will be no independent research of its platform.’ Photograph: Jaap Arriens/NurPhoto/Rex/Shutterstock

Facebook is obstructing our work on disinformation. Other researchers could be next

Laura Edelson and Damon McCoy

The company’s hostility to academic scrutiny limits our ability to understand how the platform amplifies political falsehoods

Last week, Facebook disabled our personal accounts, obstructing the research we lead at New York University to study the spread of disinformation on the company’s platform. The move has already compromised our work – forcing us to suspend our investigations into Facebook’s role in amplifying vaccine misinformation, sowing distrust in our elections and fomenting the violent riots at the US Capitol on 6 January.

But even more important than the effect on our work is what Facebook’s hostility toward outside scrutiny means for the many other researchers and journalists trying to study Facebook’s effects on society. We’ve already heard from other researchers planning similar projects who are now pulling back. If Facebook has its way, there will be no independent research of its platform.

Our dispute with Facebook centers on a research tool called Ad Observer. Ad Observer is a web browser extension that Facebook users can choose to install to share with us limited and anonymous information about the ads that Facebook shows them. The data they share with us includes the categories advertisers chose when targeting them. Examples might be “married women” or “interested in dating” or “lives in Florida”.
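To make the privacy point concrete, here is a minimal, hypothetical sketch (in TypeScript) of the kind of record such an extension might pass along. The type and field names below are our illustration, not Ad Observer’s actual schema or code; the point is what the record contains and, just as importantly, what it does not.

    // Hypothetical sketch only: illustrative names, not Ad Observer's real schema.

    // What an opted-in volunteer's browser might share about a single ad.
    interface ObservedAd {
      advertiserName: string;        // e.g. "Reopen USA"
      adText: string;                // the ad's visible copy
      targetingCategories: string[]; // e.g. ["married women", "lives in Florida"]
      observedAt: string;            // ISO timestamp of when the ad was seen
      paidForBy?: string;            // the "Paid for by" disclaimer, if shown
    }

    // Note what is absent: no user ID, no name, no friend list, no browsing history.
    // The record describes the advertiser and its targeting choices, not the volunteer.
    function buildRecord(
      advertiserName: string,
      adText: string,
      targetingCategories: string[],
      paidForBy?: string,
    ): ObservedAd {
      return {
        advertiserName,
        adText,
        targetingCategories,
        observedAt: new Date().toISOString(),
        paidForBy,
      };
    }

    // Example: an ad targeted by interest category produces a record like this.
    const example = buildRecord(
      "Example Advertiser",
      "Sample ad copy",
      ["interested in dating", "lives in Florida"],
    );
    console.log(JSON.stringify(example, null, 2));

The design choice this sketch reflects is the one at issue in our dispute: the data describes advertisers and their targeting selections, not the people who volunteered to share it.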

Using data collected through Ad Observer, and also data collected using the transparency tools Facebook makes available to researchers, we’ve been able to help the public understand how the platform fails to live up to its promises, and shed light on how it sells the attention of its users to advertisers.

In a forthcoming paper, we show that Facebook has failed to include in its public archive more than 100,000 ads that meet its own criteria for political and social-issue ads. For example, it failed to include ads supporting Joe Biden ahead of the 2020 elections; Amazon ads about the minimum wage; and an anti-mask ad targeted to conservatives run by a group called Reopen USA, whose Facebook page posts anti-vaccine and anti-mask memes.

We have also shown how highly partisan and misleading news sources get far more engagement on Facebook than reliable news sources do, and we will be publishing an expanded version of this analysis in another forthcoming paper.

But we’re not the only researchers who use Ad Observer’s data. For the past three years, we’ve been making information we collect through Ad Observer and through Facebook’s tools available to other researchers and journalists, so they can conduct their own investigations.

The Markup has used our data to report on how ads with QAnon content and merchandise from extremist militia groups have slipped through Facebook’s filters, despite bans. The Markup also used the data to demonstrate how corporate advertisers such as ExxonMobil and Comcast promote seemingly contradictory messages about hot-button issues to different audiences. Reporters from Florida to Kentucky to New Mexico used it to report on trends in political advertising in their states ahead of the 2020 elections.

In disabling our accounts last week, Facebook claimed that we were violating its terms of service, that we were compromising user privacy, and that it had no choice but to shut us down because of an agreement it has with the Federal Trade Commission. All of these claims are wrong. Ad Observer collects information only about advertisers, not about our volunteers or their friends, and the FTC has stated that our research does not violate its consent decree with Facebook.

Unfortunately, Facebook’s campaign against us is part of a larger pattern of hostility toward outside scrutiny. Just last month the New York Times reported that Facebook, after internal controversy, was dismantling a team working on CrowdTangle, its marquee transparency tool for researchers who want to see how unpaid Facebook posts spread and gain engagement on the platform. The paper reported this week that the White House itself was having so much trouble getting a straight answer from Facebook on vaccine misinformation that officials asked to speak directly to the platform’s data scientists, rather than its lobbyists.

Social Science One, launched in 2018 to great fanfare, was supposed to provide researchers with access to Facebook user data in a safe way. But the data offered proved to be much less useful than anticipated, to the point that the project’s funders dropped out. In our own work we’ve shown how Facebook’s transparency tools fall short of the company’s promises.

We can’t let Facebook decide unilaterally who gets to study the company and what tools they can use. The stakes are too high. What happens on Facebook affects public trust in our elections, the course of the pandemic and the nature of social movements. We need the greater understanding that researchers, journalists and public scrutiny provide. If Facebook won’t allow this access voluntarily, then it’s time for lawmakers to require it.

  • Laura Edelson is a PhD candidate in computer science at NYU’s Tandon School of Engineering who studies online political communication and develops methods to identify inauthentic content and activity.

  • Damon McCoy is an associate professor of computer science and engineering at the New York University Tandon School of Engineering.
