Monday, August 30, 2021

HELL IN FACEBOOK



THE STREETS OF DAVOS, SWITZERLAND, were iced over on the night of January 25, 2018, which added a slight element of danger to the prospect of trekking to the Hotel Seehof for George Soros' annual banquet. The aged financier has a tradition of hosting a dinner at the World Economic Forum, where he regales tycoons, ministers, and journalists with his thoughts about the state of the world. That night he began by warning in his quiet, shaking Hungarian accent about nuclear war and climate change. Then he shifted to his next idea of a global menace: Google and Facebook. "Mining and oil companies exploit the physical environment; social media companies exploit the social environment," he said. "The owners of the platform giants consider themselves the masters of the universe, but in fact they are slaves to preserving their dominant position ... Davos is a good place to announce that their days are numbered."

Across town, a group of senior Facebook executives, including COO Sheryl Sandberg and vice president of global communications Elliot Schrage, had set up a temporary headquarters near the base of the mountain where Thomas Mann put his fictional sanatorium. The world's biggest companies often establish receiving rooms at the world's biggest elite confab, but this year Facebook's pavilion wasn't the usual scene of airy bonhomie. It was more like a bunker, one that saw a succession of tense meetings with the same tycoons, ministers, and journalists who had nodded along to Soros' broadside.

Over the previous year Facebook's stock had gone up as usual, but its reputation was rapidly sinking toward junk bond status. The world had learned how Russian intelligence operatives used the platform to manipulate US voters. Genocidal monks in Myanmar and a despot in the Philippines had taken a liking to the platform. Mid-level employees at the company were getting both crankier and more empowered, and critics everywhere were arguing that Facebook's tools fostered tribalism and outrage. That argument gained credence with every utterance of Donald Trump, who had arrived in Davos that morning, the outrageous tribalist skunk at the globalists' garden party.

CEO Mark Zuckerberg had recently pledged to spend 2018 trying to fix Facebook. But even the company's nascent attempts to reform itself were being scrutinized as a possible declaration of war on the institutions of democracy. Earlier that month Facebook had unveiled a major change to its News Feed rankings to favor what the company called "meaningful social interactions." News Feed is the core of Facebook: the central stream through which flow baby pictures, press reports, New Age koans, and Russian-made memes showing Satan endorsing Hillary Clinton. The changes would favor interactions between friends, which meant, among other things, that they would disfavor stories published by media companies. The company promised, though, that the blow would be softened somewhat for local news and publications that scored high on a user-driven metric of "trustworthiness."
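(To make the ranking logic concrete: what follows is a minimal sketch, in Python, of how a change like this could work in principle. The weights, field names, and thresholds are invented for illustration; Facebook's actual formula is far more complex and has never been published.)

    # Hypothetical sketch: score feed posts so that friend
    # interactions are boosted, publisher stories are demoted, and
    # the demotion is softened for local or high-trust outlets.
    # Every number and field name here is invented.
    def rank_score(post):
        score = post["base_engagement"]
        if post["author_type"] == "friend":
            score *= 1.5   # favor "meaningful social interactions"
        elif post["author_type"] == "publisher":
            score *= 0.5   # general demotion of publisher content
            # soften the blow for local news and trusted publishers
            if post.get("is_local") or post.get("trust_score", 0.0) > 0.8:
                score *= 1.6
        return score

    posts = [
        {"base_engagement": 10.0, "author_type": "friend"},
        {"base_engagement": 10.0, "author_type": "publisher", "trust_score": 0.9},
        {"base_engagement": 10.0, "author_type": "publisher", "trust_score": 0.2},
    ]
    for post in sorted(posts, key=rank_score, reverse=True):
        print(post["author_type"], round(rank_score(post), 2))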

Davos provided a first chance for many media executives to confront Facebook's leaders about these changes. And so, one by one, testy publishers and editors trudged down Davos Platz to Facebook's headquarters throughout the week, ice cleats attached to their boots, seeking clarity. Facebook had become a capricious, godlike force in the lives of news organizations; it fed them about a third of their referral traffic while devouring a greater and greater share of the advertising revenue the media industry relies on. And now this. Why? Why would a company beset by fake news stick a knife into real news? And what would Facebook's algorithm deem trustworthy? Would the media executives even get to see their own scores?

Facebook didn't have ready answers to all of these questions; certainly not ones it wanted to give. The last one in particular, about trustworthiness scores, quickly inspired a heated debate among the company's executives at Davos and their colleagues in Menlo Park. Some leaders, including Schrage, wanted to tell publishers their scores. It was only fair. Also in agreement was Campbell Brown, the company's chief liaison with news publishers, whose job description includes absorbing some of the impact when Facebook and the news industry crash into one another.

But the engineers and product managers back at home in California said it was folly. Adam Mosseri, then head of News Feed, argued in emails that publishers would game the system if they knew their scores. Plus, they were too unsophisticated to understand the methodology, and the scores would constantly change anyway. To make matters worse, the company didn't yet have a reliable measure of trustworthiness at hand.

Heated emails flew back and forth between Switzerland and Menlo Park. Solutions were proposed and shot down. It was a classic Facebook dilemma. The company's algorithms embody choices so complex and interdependent that it's hard for any human to get a handle on it all. If you explain some of what is happening, people get confused. They also tend to obsess over tiny factors in huge equations. So in this case, as in so many others over the years, Facebook chose opacity. Nothing would be revealed in Davos, and nothing would be revealed afterward. The media execs would walk away unsatisfied.

After Soros' speech that Thursday night, those same editors and publishers headed back to their hotels, many to write, edit, or at least read all the news pouring out about the billionaire's tirade. The words "their days are numbered" appeared in article after article. The next day, Sandberg sent an email to Schrage asking if he knew whether Soros had shorted Facebook's stock.

Far from Davos, meanwhile, Facebook's product engineers got down to the precise, algorithmic business of implementing Zuckerberg's vision. If you want to promote trustworthy news for billions of people, you first have to specify what is trustworthy and what is news. Facebook was having a hard time with both. To define trustworthiness, the company was testing how people responded to surveys about their impressions of different publishers. To define news, the engineers pulled a classification system left over from a previous project, one that pegged the category as stories involving "politics, crime, or tragedy." That particular choice, which meant the algorithm would be less kind to all kinds of other news, from health and science to technology and sports, wasn't something Facebook execs discussed with media leaders in Davos. And though it went through reviews with senior managers, not everyone at the company knew about it either. When one Facebook executive learned about it recently in a briefing with a lower-level engineer, they say they "nearly fell on the fucking floor."
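(Purely as illustration of the two definitions at issue: the survey metric could reduce to averaging per-user ratings of each publisher, and the inherited classifier to a crude topic match. A minimal sketch in Python, with invented names, keywords, and thresholds; the real systems were certainly more elaborate.)

    # Hypothetical sketch of the two definitions described above.
    from statistics import mean

    def trust_score(ratings):
        # Average 0-to-1 trust ratings gathered from user surveys
        # about a single publisher.
        return mean(ratings) if ratings else 0.0

    # The leftover classifier reportedly pegged "news" as stories
    # about politics, crime, or tragedy; a toy keyword stand-in:
    NEWS_MARKERS = {"election", "senate", "arrest", "shooting", "disaster"}

    def is_news(headline):
        return any(marker in headline.lower() for marker in NEWS_MARKERS)

    print(round(trust_score([0.9, 0.7, 0.8]), 2))  # 0.8
    print(is_news("Senate passes election bill"))  # True
    print(is_news("New phone reviews roundup"))    # False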





The confusing rollout of meaningful social interactions, marked by internal dissent, blistering external criticism, genuine efforts at reform, and foolish mistakes, set the stage for Facebook's 2018. This is the story of that annus horribilis, based on interviews with 65 current and former employees. It's ultimately a story about the biggest shifts ever to take place inside the world's biggest social network. But it's also about a company trapped by its own pathologies and, perversely, by the inexorable logic of its own recipe for success. Facebook's powerful network effects have kept advertisers from fleeing, and overall user numbers remain healthy if you include people on Instagram, which Facebook owns. But the company's original culture and mission kept creating a set of brutal debts that came due with regularity over the past 16 months. The company floundered, dissembled, and apologized. Even when it told the truth, people didn't believe it. Critics appeared on all sides, demanding changes that ranged from the essential to the contradictory to the impossible. As crises multiplied and diverged, even the company's own solutions began to cannibalize each other.


And the most crucial episode in this story, the crisis that cut the deepest, began not long after Davos, when some reporters from The New York Times, The Guardian, and Britain's Channel 4 News came calling. They'd learned some troubling things about a shady British company called Cambridge Analytica, and they had some questions.

IT WAS, IN SOME WAYS, AN OLD STORY. Back in 2014, a young academic at Cambridge University named Aleksandr Kogan built a personality questionnaire app called thisisyourdigitallife. A few hundred thousand people signed up, giving Kogan access not only to their Facebook data but also, because of Facebook's loose privacy policies at the time, to that of up to 87 million people in their combined friend networks. Rather than simply use all of that data for research purposes, which he had permission to do, Kogan passed the trove on to Cambridge Analytica, a strategic consulting firm that talked a big game about its ability to model and manipulate human behavior for political clients. In December 2015, The Guardian reported that Cambridge Analytica had used this data to help Ted Cruz's presidential campaign, at which point Facebook demanded the data be deleted.

This much Facebook knew in the early months of 2018. The company also knew, because everyone knew, that Cambridge Analytica had gone on to work with the Trump campaign after Ted Cruz dropped out of the race. And some people at Facebook worried that the story of their company's relationship with Cambridge Analytica was not over. One former Facebook communications official remembers being warned by a manager in the summer of 2017 that unresolved elements of the Cambridge Analytica story remained a grave vulnerability. No one at Facebook, however, knew exactly when or where the unexploded ordnance would go off. "The company doesn't know yet what it doesn't know yet," the manager said. (The manager now denies saying so.)

The company first heard in late February that the Times and The Guardian had a story coming, but the department in charge of formulating a response was a house divided. In the fall, Facebook had hired a brilliant but fiery veteran of tech industry PR named Rachel Whetstone. She'd come over from Uber to run communications for Facebook's WhatsApp, Instagram, and Messenger. Soon she was traveling with Zuckerberg for public events, joining Sandberg's senior management meetings, and making decisions, like picking which outside public relations firms to cut or retain, that normally would have rested with those officially in charge of Facebook's 300-person communications shop. The staff quickly sorted into fans and haters.


And so it was that a confused and fractious communications team huddled with management to debate how to respond to the Times and Guardian reporters. The standard approach would have been to correct misinformation or errors and spin the company's side of the story. Facebook ultimately chose another tack. It would front-run the press: dump a bunch of information out in public on the eve of the stories' publication, hoping to upstage them. It's a tactic with a short-term benefit but a long-term cost. Investigative journalists are like pit bulls. Kick them once and they'll never trust you again.

Facebook's decision to take that risk, according to multiple people involved, was a close call. But on the night of Friday, March 16, the company announced it was suspending Cambridge Analytica from its platform. This was a fateful choice. "It's why the Times hates us," one senior executive says. Another communications official says, "For the last year, I've had to talk to reporters worried that we were going to front-run them. It's the worst. Whatever the calculus, it wasn't worth it."

The tactic also didn't work. The next day the story, focused on a charismatic whistleblower with pink hair named Christopher Wylie, exploded in Europe and the United States. Wylie, a former Cambridge Analytica employee, was claiming that the company had not deleted the data it had taken from Facebook and that it may have used that data to swing the American presidential election. The first sentence of The Guardian's reporting blared that this was "one of the tech giant's biggest ever data breaches" and that Cambridge Analytica had used the data "to build a powerful software program to predict and influence choices at the ballot box."

The story was a witch's brew of Russian operatives, privacy violations, confusing data, and Donald Trump. It touched on nearly all the fraught issues of the moment. Politicians called for regulation; users called for boycotts. In a day, Facebook lost $36 billion in its market cap. Because many of its employees were compensated based on the stock's performance, the drop did not go unnoticed in Menlo Park.

To this emotional story, Facebook had a programmer's rational response. Nearly every fact in The Guardian's opening paragraph was misleading, its leaders believed. The company hadn't been breached: an academic had fairly downloaded data with permission and then unfairly handed it off. And the software Cambridge Analytica built was not powerful, nor could it predict or influence choices at the ballot box.

But none of that mattered. When a Facebook executive named Alex Stamos tried on Twitter to argue that the word breach was being misused, he was swatted down. He soon deleted his tweets. His position was right, but who cares? If someone points a gun at you and holds up a sign that says HAND'S UP, you shouldn't worry about the apostrophe. The story was the first of many to illuminate one of the central ironies of Facebook's struggles. The company's algorithms helped sustain a news ecosystem that prioritizes outrage, and that news ecosystem was learning to direct outrage at Facebook.

As the story spread, the company started melting down. Former employees remember scenes of chaos, with exhausted executives slipping in and out of Zuckerberg's private conference room, known as the Aquarium, and Sandberg's conference room, whose name, Only Good News, seemed increasingly incongruous. One employee remembers cans and snack wrappers everywhere; the door to the Aquarium would crack open and you could see people with their heads in their hands and feel the warmth from all the body heat. After saying too much before the story ran, the company said too little afterward. Senior managers begged Sandberg and Zuckerberg to publicly confront the issue. Both remained publicly silent.

"We had hundreds of reporters flood- ing our inboxes, and we had nothing to tell them," says a member of the communica- tions staff at the time. "I remember walk ing to one of the cafeterias and overhearing other Facebookers say, 'Why aren't we say- ing anything? Why is nothing happening?""

According to numerous people who were involved, many factors contributed to Facebook's baffling decision to stay mute for five days. Executives didn't want a repeat of Zuckerberg's ignominious performance after the 2016 election when, mostly off the cuff, he had proclaimed it "a pretty crazy idea" to think fake news had affected the result. And they continued to believe people would figure out that Cambridge Analytica's data had been useless. According to one executive, "You can just buy all this fucking stuff, all this data, from the third-party ad networks that are tracking you all over the planet. You can get way, way, way more privacy-violating data from all these data brokers than you could by stealing it from Facebook."

"Those five days were very, very long." says Sandberg, who now acknowledges the delay was a mistake. The company became par alyzed, she says, because it didn't know all the facts; it thought Cambridge Analytica had deleted the data. And it didn't have a spe

cific problem to fix. The loose privacy poll cies that allowed Kogan to collect so much data had been tightened years before. "We didn't know how to respond in a system of imperfect information," she says.

Facebook's other problem was that it didn't understand the wealth of antipathy that had built up against it over the previous two years. Its prime decisionmakers had run the same playbook successfully for a decade and a half: Do what they thought was best for the platform's growth (often at the expense of user privacy), apologize if someone complained, and keep pushing forward. Or, as the old slogan went: Move fast and break things. Now the public thought Facebook had broken Western democracy. This privacy violation, unlike the many others before it, wasn't one that people would simply get over.

Finally, on Wednesday, the company decided Zuckerberg should give a television interview. After snubbing CBS and PBS, the company summoned a CNN reporter who the communications staff trusted to be reasonably kind. The network's camera crews were treated like potential spies, and one communications official remembers being required to monitor them even when they went to the bathroom. (Facebook now says this was not company protocol.) In the interview itself, Zuckerberg apologized. But he was also specific: There would be audits and much more restrictive rules for anyone wanting access to Facebook data. Facebook would build a tool to let users know if their data had ended up with Cambridge Analytica. And he pledged that Facebook would make sure this kind of debacle never happened again.

A flurry of other interviews followed. That Wednesday, WIRED was given a quiet heads-up that we'd get to chat with Zuckerberg in the late afternoon. At about 4:45 pm, his communications chief rang to say he would be calling at 5. In that interview, Zuckerberg apologized again. But he brightened when he turned to one of the topics that, according to people close to him, truly engaged his imagination: using AI to keep humans from polluting Facebook. This was less a response to the Cambridge Analytica scandal than to the backlog of accusations, gathering since 2016, that Facebook had become a cesspool of toxic virality, but it was a problem he actually enjoyed figuring out how to solve. He didn't think that AI could completely eliminate hate speech or nudity or spam, but it could get close. "My understanding with food safety is there's a certain amount of dust that can get into the chicken as it's going through the processing, and it's not a large amount; it needs to be a very small amount," he told WIRED.

The interviews were just the warmup for Zuckerberg's next gauntlet: a set of public, televised appearances in April before three congressional committees to answer questions about Cambridge Analytica and months of other scandals. Congresspeople had been calling on him to testify for about a year, and he'd successfully avoided them. Now it was game time, and much of Facebook was terrified about how it would go.

As it turned out, most of the lawmakers proved astonishingly uninformed, and the CEO spent most of the day ably swatting back soft pitches. Back home, some Facebook employees stood in their cubicles and cheered. When a plodding Senator Orrin Hatch asked how, exactly, Facebook made money while offering its services for free, Zuckerberg responded confidently: "Senator, we run ads," a phrase that was soon emblazoned on T-shirts in Menlo Park.

THE SATURDAY AFTER THE CAMBRIDGE Analytica scandal broke, Sandberg told Molly Cutler, a top lawyer at Facebook, to create a crisis response team. Make sure we never have a delay responding to big issues like that again, Sandberg said. She put Cutler's new desk next to hers, to guarantee Cutler would have no problem convincing division heads to work with her. "I started the role that Monday," Cutler says. "I never made it back to my old desk. After a couple of weeks someone on the legal team messaged me and said, 'You want us to pack up your things? It seems like you are not coming back.'"

Then Sandberg and Zuckerberg began making a huge show of hiring humans to keep watch over the platform. Soon you couldn't listen to a briefing or meet an executive without being told about the tens of thousands of content moderators who had joined the company. By the end of 2018, about 30,000 people were working on safety and security, which is roughly the number of newsroom employees at all the newspapers in the United States. Of those, about 15,000 are content reviewers, mostly contractors.

