There is a better way for Facebook to resolve its dispute with NYU researchers
More and more, I wonder why we have built a world in which so much civic discourse takes place inside a handful of giant digital malls.
So let’s talk about Facebook’s decision to deactivate pages and personal accounts associated with New York University’s Ad Observatory project, which took data voluntarily provided by Facebook users and analyzed it to better understand the 2020 elections and other topics in the public interest.
In one corner, you have university researchers working to understand the effects of the platform on our democracy. In the other, you have a company battered by nearly two decades of privacy scandals and regulatory fines, still terrified that a Cambridge Analytica sequel is lurking somewhere on the platform.
I first wrote about this case in October, when Facebook sent the researchers its first cease-and-desist notice. At issue is a browser extension created by a team at NYU that, once installed, collects data about the ads you see on Facebook, including information about how those ads are targeted. Facebook already makes similar data publicly available through its online ad archive, but the NYU researchers say it is incomplete and sometimes inaccurate – among other things, they say, many political ads are never labeled as such.
No one I’ve spoken to at Facebook argues that NYU’s work is not in the public interest. Few other political advertising mediums let campaigns target voters with anything approaching the precision Facebook offers, and the persistent belief that Facebook threw the 2016 election to Donald Trump drew closer scrutiny to the company’s advertising practices in 2020. It’s no wonder academics want to study the platform.
Anticipating this interest, the company created the Facebook Open Research and Transparency platform, or FORT, earlier this year. But like most of the company’s academic partnerships, FORT has come under fire for the limited view of Facebook it provides. In the case of the election, for example, it only offers data for the 90 days leading up to election day, despite the fact that the presidential campaign lasted well over a year. And according to researchers, FORT forces them to access the data on a laptop provided by Facebook, preventing them from running their own machine learning classifiers and other tools on the available data.
That’s why, when the NYU team received that cease-and-desist last fall, they said they planned to ignore it. “The only thing that would make us stop doing this would be for Facebook to do it itself, which we have asked them to do,” researcher Laura Edelson told The Wall Street Journal.
Facebook had said it wouldn’t ban NYU until well after the election, and it kept its word. But on Tuesday night, the company dropped the hammer on the NYU team. “We took these actions to stop unauthorized scraping and protect people’s privacy in line with our privacy program under the FTC Order,” said Mike Clark, a director of product management, referring to Facebook’s consent decree with the Federal Trade Commission.
Alex Abdo, an attorney for the NYU researchers, told me he was baffled by Facebook’s actions.
“On the one hand it’s not surprising; on the other hand, it’s totally shocking that Facebook’s response to the research the public most needs right now is to try to shut it down,” he said in an interview. “Privacy in research and social media is a really hard problem. But the answer cannot be that Facebook decides unilaterally. And there’s no independent research project that’s more respectful of user privacy than the Ad Observer.”
So let’s talk about privacy. The Ad Observer was designed to collect data on individual ads and their target audiences, and to anonymize that data. Mozilla, the non-profit organization behind the Firefox browser, reviewed the extension’s code and its consent flow and ultimately recommended that people use it.
“We decided to recommend Ad Observer because our reviews assured us that it respects user privacy and supports transparency,” Marshall Erwin, the company’s security chief, said in a blog post. “It does not collect personal posts or information about your friends. And it does not compile a user profile on its servers.”
You probably won’t be surprised to learn that Facebook sees it differently. Despite the researchers’ efforts here, the company told me, the Ad Observer still collects data that some users might object to. If a person pays to boost a post, such as a fundraiser, that information, including the user’s name and photo, ends up in the hands of the NYU researchers. The Ad Observer can also collect similar information from comments on ads. And Facebook says that information gleaned from an ad’s “Why am I seeing this?” feature could be used to identify other people who interacted with those ads and determine personal information about them.
In each of these cases, the actual harm to the user appears to be extremely minor, if you can call it harm at all. But Facebook says it’s against the rules, and that it must enforce those rules, not least because Cambridge Analytica was the story of a seemingly well-meaning researcher who ultimately sold the data he collected, creating arguably the biggest scandal in the company’s history.
That’s why I have at least some empathy for Facebook here. The company is continually criticized for the way it collects and uses personal data, and here you have a case where it is trying to limit that data collection – while many of the same critics who still bring up Cambridge Analytica on Twitter three years later simultaneously argue that Facebook has a moral obligation to give the Ad Observatory a pass.
But letting things slide isn’t really in the spirit of the General Data Protection Regulation, the California Consumer Privacy Act, and various other privacy regulations. (As a smart person on our Sidechannel server put it, “GDPR does not have a blanket research exemption.”)
Contrary to some earlier reports, Facebook is not arguing that the Ad Observer violates its FTC consent decree, the company told me. But it has at least some good reason to prevent large-scale data scraping of the sort the NYU researchers’ extension represents. The rise of Clearview AI, a dystopian surveillance company that built facial recognition tools in part by scraping publicly available photos from Facebook, made that case viscerally this year.
As the fight between NYU and Facebook has turned ugly this week, I think there are some obvious (albeit difficult) ways forward.
One is that Facebook could extend its existing data export tools to let us voluntarily contribute our data to projects like the Ad Observer, in an even more privacy-protective way. To hear Facebook tell it, if NYU’s browser extension had collected just a handful fewer data types, it might have been acceptable to the company.
If you think users have the right to discuss their personal experiences on Facebook, I think you should also agree that they have the right to voluntarily provide personal data that speaks to that experience. By Facebook’s nature, anyone’s personal experience will also contain plenty of other data from potentially unwilling friends. But the company already lets me export my friends’ data – when they tag me in comments, send me Facebook messages, and so on. The company is much closer to finding a way to let me share this information with researchers than it might appear.
Another option – rarely used in the United States – is for Congress to pass a law. It could draft national privacy legislation, for example, and carve out a dedicated exemption for qualified academic researchers. It could force platforms to disclose more data in general, to academics and to everyone else. It could create a federal agency dedicated to overseeing online communication platforms.
The alternative, as always, is to wait for the platforms to self-regulate – and be continually disappointed with the outcome.
The NYU-Facebook row was always going to end up where we find it today: neither side had a good incentive to back down. But we all have reason to hope that researchers and tech companies find a way to get along better. The stakes are too high for the platforms to remain black boxes forever.
“You would think they would be able to distinguish between the Cambridge Analyticas of the world and the bona fide, privacy-conscious researchers of the world,” Abdo told me. “If they can’t do it, then there’s really no hope of independent research on Facebook’s platform.”
If Facebook can’t – or won’t – make that distinction, Congress should do it for them.
This column was co-published with Platformer, a daily newsletter about Big Tech and democracy.