Commentary: Facebook's evasions are dangerous
Facebook's business model is pretty simple. It offers a free service to connect users with loved ones and others of similar interests. In return, it collects data about what its users do online (and, increasingly, offline). It then uses this data to sell targeted advertising. The whole thing works brilliantly.
Yet the company, like so many others that profit from selling access to their users' data, seems reluctant to describe so explicitly how it makes money. Instead, in response to news that data from as many as 50 million users was gathered and shared by a third party without their knowledge, Facebook issued a statement that begins: "Protecting people's information is the most important thing we do at Facebook."
There is always some distance between what a company says it does and what it actually does. For Facebook, it's greater than most. And, unlike for many companies, this disconnect is no longer an abstract concern. As the network assumes a central role in public life in the U.S. and elsewhere, it is warping politics and misleading voters. Facebook's cherished myth that it's simply "bringing the world closer together" is becoming hard to defend.
Its latest scandal offers a case in point. In 2014, a researcher got permission from both Facebook and about 270,000 users to gather personal data about them. He then shared that data, along with that of about 50 million of their Facebook friends, with a company called Cambridge Analytica, which used it to create "psychographic" profiles of voters and later worked for the presidential campaign of Donald Trump.
Sharing data without users' consent is "against our policies," says Mark Zuckerberg, Facebook's founder and CEO. But the larger question, for regulators as much as the company itself, is how much control users should have over Facebook's vast repository of personal data, which has become an immensely powerful political tool with few safeguards to speak of.
The Trump team used such tools to great effect. It tested tens of thousands of ad variations on the network. It used Facebook data to deliver provocative messages to receptive subsets of the electorate, knowing the broader public would be none the wiser, while conducting "major voter suppression operations" to dampen turnout for Trump's opponent. As it happens, Russia had the same idea.
This stuff works only because Facebook furtively collects so much intimate information. Its business is selling data about its users.
And data has no values or principles of its own. It can help elect a good candidate or a bad one. It can be used for valuable research or vile psychological experiments. It can reveal stunningly personal details, and it is now in the hands of a huge range of people, responsible and otherwise.
Facebook has every right to make money. But the company (and its billions of users, for that matter, as well as the few billion remaining humans who have yet to join) shouldn't delude themselves about how it does so. Only then will they be able to clearly measure the consequences of their choices.