In this column, distributed by the Elon University Writers Syndicate and published in the News & Observer of Raleigh, N.C., Enrique Armijo, associate dean for academic affairs and an associate professor at Elon Law, examines the recent revelations about Facebook data leveraged by the analytics firm Cambridge Analytica. Views expressed in this column are the author’s own and not necessarily those of Elon University.
—
By Enrique Armijo
Facebook’s ad business is booming, but by most other metrics, it has had a very bad couple of years. The social media company, which changed its mission statement in 2017 to “giv[ing] people the power to build community and bring the world closer together,” has given some people the power to use information about the rest of us in ways we never intended.
The latest controversy concerning Facebook user data came to light on March 16 and involves a data analytics firm closely associated with White House strategist Steve Bannon that worked for the Trump campaign. According to reporting by the New York Times and London’s Guardian, in 2014 a Cambridge-based, Russian-American computer science professor gathered data from about 270,000 Facebook users who were offered three or four dollars to download an app called thisisyourdigitallife. The app included personality quizzes (not “Which Muppet Are You?,” but similar to those offered by psychological measurement centers), the results of which the app’s developers said would be used only for research purposes.
However, the app collected not only survey results from those who willingly downloaded it to share their data; it also scraped data from the survey respondents’ friend networks, reaching more than 50 million users who had never taken the app’s quizzes, never downloaded the app, and in all likelihood had never even heard of it.
That information was then shared (in violation of its terms of service, Facebook claims) with Cambridge Analytica, the data analysis firm that President Trump’s 2016 campaign hired to develop and run targeted political advertising on social media. Perhaps most worryingly, even though Facebook asked Cambridge Analytica to delete the information in 2015 when it learned of this activity and found its sharing violated the social media company’s terms and conditions, reports indicate that personal data from these Facebook users was still being held by Cambridge Analytica as late as 2017.
Facebook’s first response to the Cambridge Analytica news was to emphatically deny that “a data breach” had occurred. From a legal exposure perspective, this was the right approach, as at least a dozen states have data protection laws that make disclosing user information a crime.
But from a public relations point of view, the fact that its denial was accurate demonstrated Facebook’s problem. When the thisisyourdigitallife app was being downloaded by those 270,000 users, Facebook’s terms of use and its application programming interface, or “API,” permitted third-party apps to collect not just data on the app downloader’s identity, postings, status updates and more, but the downloader’s Facebook friends’ data as well, even if those friends never interacted with the third-party apps that were downloaded and used.
Facebook doesn’t permit this kind of scraping anymore, but there’s no debate that it did at the time. The problem with the Cambridge Analytica controversy isn’t that an app developer dissembled about what the data he collected from both willing and unwilling users would be used for, or whether he shared it with a political data research firm. It’s that the data of nonconsenting Facebook users was available for sharing at all.
Unlike with old-line media outlets such as newspapers and television stations, users’ interactions with other companies’ apps and websites have been completely integrated into the Facebook platform. Very few third-party websites or apps fail to permit (or, for those users who favor the convenience of a single password over dozens of different ones, encourage) users to log in and interact with the app or site through their Facebook account credentials. This has been true even of wildly popular gaming apps like FarmVille, which a normal user would assume is for harvesting virtual trees and crops, not the personal information of the user and her network of friends.
Sharing personal data via Facebook, in other words, isn’t a misuse of Facebook — it’s the whole point of Facebook. Without monetizing personal data, Facebook has nothing to sell.
The Cambridge Analytica story has also reminded us that it’s not just postings by people to Facebook or interactions with apps that have created this vulnerability. Much of what Facebook knows about us comes from what we like — or more specifically, what we “Like.” Compiling and analyzing user “likes” allows companies to predict not just our political preferences, gender, location, and race, but even our fears, doubts and life traumas.
Advertisers—who, unlike its users, actually pay Facebook—can then use that data to tailor their messages in micro-targeted ways, using both real and fake news, in ways that were unimaginable before social media. This has caused a sea change in not just product-based advertising (that pair of shoes you looked at on Zappos a year ago will follow you around the Internet forever), but in political advertising as well. Are you a Hillary-adoring, early-voting, extroverted Latina lesbian in a solid blue state? Algorithms can identify you as such and pick the ad that the algorithm believes is most likely to get you to volunteer at the polls.
An NRA-interested, Trump-leaning but undecided white Independent voter in a blue-collar battleground state curious to hear more about this funny business with the Clinton Foundation that other people are talking about in your feed? No need to look for the articles; they’ll find you.
Whether this targeting actually works is an open question, and many think it doesn’t. The point is that our data is being used in attempts to manipulate us without our consent, irrespective of whether those attempts are eventually successful.
Those waiting for Facebook to solve these issues will be waiting a very long time. The trove of data it collects about each of its users is exactly what makes it valuable to those who do business with it. And deleting our Facebook apps altogether and en masse is an unsatisfying solution as well. Some people need Facebook to stay connected to friends and family from far away, and in some countries, Facebook is the de facto Internet.
So a world without Facebook is very hard to imagine. The solution, if there is one, thus has to come from somewhere else. We might start with ourselves.
If we have to trust our friends or Facebook to decide how private our data is going to be, then maybe the only solution is to be more thoughtful about what we share. Both Facebook’s mission statement and its business model rely on a well-known adage: knowledge is power. By giving up so much about ourselves every day on social media, we give up a certain part of our power as well.