Facebook Keeps Telling Us It’s Too Big. Maybe It’s Time to Listen



A few years ago, I heard a piece of advice from a New York City restaurant inspector that has stayed with me. The nature of his job–looking for, and finding, vermin and other health violations in commercial kitchens–had turned this man off restaurants altogether. He brown-bagged all his lunches. In the rare instances where he couldn’t get home in time for dinner, he would eat at a fast-food chain.

It sounded counterintuitive, he said, but the big national chains, ever fearful of bad publicity, include harsh penalties in their franchise agreements for failing health inspections. As a result, while McDonald’s or Taco Bell may not be the best place to get a nutritious meal, they’re comparatively safe bets for avoiding food-borne illness.

I thought of this tidbit after seeing Facebook’s departing chief security officer, Alex Stamos, describe his soon-to-be-former company on Twitter as a kind of bring-your-own feast of information. Making the point that everything a user sees in his or her feed is there because another user shared it, Stamos wrote that Facebook is “like a pot luck… where everybody brings their own food from the outside, and the host decides how to arrange the buffet table based upon a model of what people like to eat.”

Stamos, whose plan to exit the company later this year was reported Monday by the New York Times, was responding to reports that the shadowy data firm Cambridge Analytica gained illicit access to data collected from 50 million Facebook accounts and used it to help Donald Trump’s presidential campaign. With Facebook still reeling from accusations that its laxity enabled Russia to conduct large-scale information warfare during the election, polarizing the American populace and perhaps influencing the result, Stamos and his fellow executives were anxious to minimize the appearance of culpability here.

In particular, in a now-deleted set of tweets, Stamos pushed back against the idea that Cambridge’s acquisition of user data constituted a “breach.” Rather, it was unethical conduct: first by a researcher, who gained access to the data legitimately but violated the terms of access by transferring it to third parties, and then by Cambridge itself, which allegedly lied when, in 2015, it told Facebook it had deleted all copies of the data.

As techno-sociologist Zeynep Tufekci pointed out, this attempt to shift the blame “is a more profound & damning statement of what’s wrong with Facebook’s business model than a ‘breach.'”

Basically, it’s Facebook saying what it has been saying for years: We’ve built a network that’s too big for us to police, so we’re not going to. Instead, we’ll leave that responsibility to everyone else: to developers, to abide by data-usage policies; to users, to report harassment and phony news; to advertisers, to disclose, or not, who’s paying for a political message.

(Facebook isn’t the only big tech platform to resort to this kind of buck-passing. YouTube recently said it would help users identify conspiracy theories by linking conspiracy videos to Wikipedia pages–even though, to state the glaringly obvious, those pages, like YouTube videos, are user-generated.) 

Christopher Wylie, who helped start Cambridge Analytica before turning whistleblower, says Facebook’s response to finding out about the company’s unauthorized data access relied on the honor system. “[L]iterally all I had to do was tick a box and sign it and send it back, and that was it,” he told the Observer. It wasn’t until after the Observer and the Times published blockbuster stories on Cambridge Analytica’s actions this past weekend that Facebook finally took real action, suspending Cambridge’s account and demanding to audit its servers. (Whether the 50 million affected Facebook users will be notified that their data was used to build “psychographic profiles” for the purposes of shaping their political beliefs remains to be seen.)

Can Duruk, a veteran software engineer and blogger, says Facebook may have a point, in a backhanded sort of way: Any claim that it could protect its users from bad actors would necessarily be overstated. “[I]s there a point where such a giant bag of private data has a life of its own that no single entity can really keep it safe?” he writes.

“I personally think a single entity that governs [two billion users’] media diet and communication channels is scary, firstly due to its size. We expect just a bunch of people mostly living in a suburb in Northern California [to] safeguard US elections in 2018, not cause ethnic cleansing in Myanmar, not help Duterte kill off people he doesn’t like in Philippines, not kill off media companies by mistake in Serbia, and a few more.”

But what’s even scarier is an entity that governs 2 billion people’s media diets, yet persists in thinking of itself as a homey potluck dinner. Think about the analogy for even a few seconds, and it becomes distressing that anyone at Facebook–let alone the person responsible for securing it–could entertain it.

There’s exactly one reason Facebook embraces the everybody-bring-a-dish model: it scales. If you’re having a potluck, you can invite 10 people or 100 people or 1,000 people without doing any extra cooking. For Facebook, the marginal cost of adding a million users is practically nothing, which is the major reason it had operating margins of almost 50 percent last year.

But while the costs don’t scale, the potential harms do. A potluck with 10 guests isn’t an inviting target for a terrorist looking to poison a lot of Americans. A potluck with a thousand guests is. Particularly if the invitation says, “We won’t be checking the food in advance, so guests are encouraged to let us know if anything tastes like poison.”

For years, Facebook has been telling us it’s too big to prevent all the bad behavior it enables. What it really means is: “We’re too big to stop all this bad behavior and keep getting bigger.”

Fast-food restaurants are safe to eat at because the people who work there know they won’t have a job if they’re not. Facebook keeps proving itself unsafe for users, and the world, because the people who work there keep defining safety as someone else’s job.


