
Taibbi writes: "We shouldn't be asking Facebook to fix the problem. We should be fixing Facebook."

Mark Zuckerberg. (photo: B&T)


Can We Be Saved From Facebook?

By Matt Taibbi, Rolling Stone

03 April 18


The social media giant has swallowed up the free press, become an unstoppable private spying operation and undermined democracy. Is it too late to stop it?

We shouldn't be asking Facebook to fix the problem. We should be fixing Facebook. It's our collective misfortune that this perhaps silliest-in-history supercorporation – a tossed-off hookup site turned international cat-video vault turned Orwellian surveillance megavillain – has dragged us all to the very cliff edge of modern technological capitalism.

We've reached a moment in history where many companies are more powerful than even major industrialized nations, and in some cases have essentially replaced governments as de facto regulators and overseers. But some of those companies suck just a little too badly at the governing part, leaving us staring into a paradox.

The Russians call this situation a sobaka na sene, a dog on the hay. Asleep in the manger, the dog itself won't eat the hay. But it won't let you eat it either.

We've got to get the dog off the hay.

For much of the past year and a half, the Social Network has been everywhere in the news. It's ubiquitous in a bad way for the first time in its existence. The blithely addictive social media site bathed in unthreatening baby-blue graphics that one tech columnist derided as "the place where you check to see who married Jill the cheerleader" has found itself at the center of an exploding international controversy.

A recent Wired cover story is a typical press treatment. Legions of current and former employees whispered to the mag about Facebook's toxic culture. The firm was said to have overreacted to conservative criticism some years back and gone too far the other way in an ill-fated search for "balance," inadvertently handing Trump the White House in the process.

Facebook was also rocked by recent revelations that Cambridge Analytica, a firm partly owned by the same conservative Mercer family that became a primary sponsor of Donald Trump's foundering campaign in the summer of 2016, may have used personal information from 50 million Facebook users to deliver targeted ads to likely Trump voters.

Cambridge Analytica has since been revealed to be a con's con – in 2015, it was selling Ted Cruz on "secret sauce" intelligence services it hadn't even finished designing yet. The story created instant worldwide panic, despite the fact that manipulating private information is the sort of service Facebook has long provided as a matter of routine. Any third-party app built on the site, not just those created by arch-conservatives, would be able to perform the same data-sucking trick. As former Facebook adviser Dipayan Ghosh puts it, "The problem goes far beyond the scope of the current controversies. The story here is about sheer market power."

The headlines are scary, but the pathology behind them is actually the most alarming and unreported aspect of the Facebook story. The world seems simultaneously to be denouncing the company for having meddled with an election, and demanding that it meddle more responsibly in the future. From senators to members of the media to security officials, the solution to the problem of "fake news" and foreign intervention in our elections has been absurdly simplistic: Just have Facebook fix it.

All this outside pressure is hitting home. After years of resistance, Facebook's polarizing supergeek CEO, Mark Zuckerberg, is suddenly accepting the challenge of reforming an industry he knows nothing about, i.e., the press. Ominously, he recently vowed to spend 2018 working on "these important issues."

It's a seismic change. As recently as November 2016, Zuckerberg, who exudes all the warmth of a talking parking meter, could be heard lashing out at people who "insist we call ourselves a news or media company." He later scoffed at the idea his firm played a significant role in the election, and refused to discuss the possibility that Facebook had responsibility for reversing the declining quality of news reporting.

But by the beginning of 2018, Facebook began a sharp – and subtly frightening – turnaround. No longer denying its outsize media role, the company announced one initiative designed to create a trustworthiness measurement for news, and another to increase the content you get from close friends and family, presumably as opposed to evil (and possibly foreign) strangers.

The goal, said Facebook News Feed chief Adam Mosseri, was to "make sure the news people see, while less overall, is high-quality." Mosseri, who's been with the company since its early days, tells Rolling Stone that Facebook's original developers never imagined being in the position the firm is in now. "I don't think anyone foresaw the scale that we got to," Mosseri admits.

Now, he claims, Facebook is just trying to do the right thing. "We take our responsibilities seriously," he says, explaining the thinking behind the new initiatives. "In a world where the Internet exists, how can we make the world better?"

Facebook's decision to accept "responsibilities" in the news realm, even in this rudimentary and characteristically disingenuous way, has mind-blowing implications for a country that has functioned without a true media regulator for most of its history.

That's because all of these horror-movie headlines about fake news and "meddling" gloss over the giant preceding catastrophe implicit in all of these tales. For Facebook to be both the cause of and the solution to so many informational ills, the design mechanism built into our democracy to prevent such problems – a free press – had to have been severely disabled well before we got here.

And it was. Long before 2016 had a chance to happen, the news media in the United States was effectively destroyed. For those of us in the business, the manner of conquest has been the most galling part. The CliffsNotes version? Facebook ate us.

Internet platforms like Zuck's broke the back of the working press first by gutting our distribution networks, and then by using advanced data-mining techniques to create hypertargeted advertising with which no honest media outlet could compete. This wipeout of the press left Facebook in possession of power it neither wanted nor understood.

It was all an insane accident. Facebook never wanted to be editor-in-chief of the universe, and the relatively vibrant free press that toppled the likes of McCarthy and Nixon never imagined it could be swallowed by a pet-meme distributor.

But it happened. As a result, we're now facing a problem potentially worse than either a Trump election or a Russian cyber-incursion: a world in which the informational landscape for billions of people is controlled more or less entirely by a pair of advanced private spying operations, Google and Facebook – and Facebook especially.

The Facebook mess is really the final chapter in a decades-long collision of the news media with the Internet. Many smart people expected this tale to end well. It hasn't. The creators of the Internet sold their invention as inherently democratizing. Instead, information is now so concentrated that a 1984 scenario is just a few clicks away.

***

This may sound obvious, but since even Facebook appears not to have understood this issue, here's a brief reminder: The media business has always been first and foremost about distribution.

News consumers once had direct and powerful relationships with publishers, before the technological changes that made Facebook possible. "People identified with the fact that they read the local newspaper," says Jim Moroney, former publisher of The Dallas Morning News. "They connected with being readers of The Dallas Morning News, The Boston Globe, The New York Times and so on." Newspapers developed those relationships over long periods of time via the hardcore brick-and-mortar process of building distribution networks.

"Your major advantage as a media business rested in your distribution system," says David Chavern, director of the News Media Alliance. "Everything from your printing press, to the people loading papers into trucks, to the trucks themselves, to the stores, to the kids delivering papers to subscribers' doorsteps."

The physicality of the distribution system lent credibility to both news and ads. Moreover, the difficulty and expense in building those systems meant that few people could do it, and newspapers earned for themselves built-in revenue streams from services like employment and real-estate ads, where they were usually the only game in town.

This model allowed newspapers to be remarkably free of government regulation. The same wasn't exactly true of radio and TV stations, which had to answer to the Federal Communications Commission. But TV and radio also once enjoyed enormous advantages that no longer exist.

"TV and radio, those were scarcity businesses," says Moroney. "There were only so many licenses in a market, which meant only so many stations in a market. And beyond that, there were only so many 30-second ad spots you could sell. You couldn't have a whole hour be ads. If you were good at managing your scarce inventory, you could make a lot of money."

In the early part of the 20th Century, it wasn't considered necessary for the government to meddle in news licensing. But in an ancient preview of the Internet, there was by the 1920s an explosion of new radio stations, resulting in a "cacophony of signal interference" that, much like today, made a mush of the news-following experience.

This led then-Commerce Secretary Herbert Hoover and others to explore the question of how to weigh "spectrum scarcity" with the needs of a democratic society. The result was a pair of landmark federal laws, the Radio Act of 1927 and the Communications Act of 1934. It was a trade-off. Companies that licensed airwaves had to agree to operate in the "public interest, convenience and necessity."

Of course, the federal government, with its high-minded "public interest" standard that supposedly pushed broadcasters to serve "all substantial groups," somehow managed to keep in place a brutal system of racial apartheid, among other huge misses. It also denied the viewpoints of anti-war activists, capitalist critics and a host of others. But the core idea, that a news media in the broad public interest must exist, has been in place almost from the start. Even the likes of Washington and Jefferson helped institute the practice of giving cheap or even free postage rates to newspapers.

"Abolitionist newspapers were sent to the South thanks to these policies," notes University of Illinois professor Robert McChesney. "Even back then there was this idea of subsidizing reporting."

With each new expansion of communications technology, Americans almost always came up with guidelines for how to sync up the citizenry's informational needs to the new invention.

Then the Internet came along.

***

In many ways, the Facebook controversy is a canard. It's less a real crisis about Russians, the Trump election or scamsters like Cambridge Analytica than a long-overdue reckoning. Americans who for decades have been clinging to reassuring myths about the origins and purpose of the Internet are finally beginning to ask important questions about this awesome Pentagon-designed surveillance tool they've enthusiastically welcomed into their homes, bedrooms, purses and pockets.

Conventional wisdom sees the Internet as an invention that, yes, was designed for narrow military uses, but unexpectedly blossomed into a powerful democratizer. "The Internet was viewed as a force for good, supporting inclusion and democracy," says Dr. Lawrence Landweber, a member of the Internet Hall of Fame and former president of the Internet Society. "This view was widely held in the industry as well as among political leaders," he says. "Remember Google's motto around 2000 was 'Don't be evil.' "

There are, however, less-flattering histories of the Internet, which began as a defense project in the Sixties. Some critics, like Surveillance Valley author Yasha Levine, will tell you that keeping tabs on domestic and foreign resistance movements was one of the net's original design goals, which is one reason it's no surprise most of the big Internet-based firms today – Facebook, Google, Amazon – also contract with the military and/or security services.

In his book, Levine points to the fact that from the very start, the proto-Web banked info collected by the likes of the Defense Intelligence Agency and the National Security Agency. "Surveillance was baked into the original mission of the Internet," Levine says.

No matter what the intent behind the invention, it seems that little thought was given to how the Internet would impact the existing commercial news business. Landweber, for instance, says Internet developers never conceived of a world where Internet platforms would acquire hegemonic power in this sphere. "Getting most of one's news via the Internet, as well as the notion that social media companies would manipulate one's personal data for commercial or political benefit, was not anticipated," he says. He adds, "The current situation would have shocked early Internet developers."

Which brings us back to Facebook, which to this day seems at best to dimly understand how the news business works, as is evident in its longstanding insistence that it's not a media company. Wired was even inspired to publish a sarcastic self-help quiz for Facebook execs on "How to tell if you're a media company." It included such questions as "Are you the country's largest source of news?"

The answer is a resounding yes. An astonishing 45 percent of Americans get their news from this single source. Add Google, and more than 70 percent of Americans get their news from a pair of outlets. The two firms also ate up about 89 percent of the digital-advertising growth last year, underscoring their monopolistic power in this industry.

Facebook's cluelessness on this front makes the ease with which it took over the press that much more bizarre to contemplate. Of course, the entire history of Facebook is pretty weird, even by Silicon Valley standards, beginning with the fact that the firm thinks of itself as a movement and not a giant money-sucking machine.

This is how Zuckerberg described Facebook in Initial Public Offering (IPO) documents from 2012:

Facebook was not originally created to be a company. It was built to accomplish a social mission – to make the world more open and connected.

"The great myth" about the company, says former Facebook ad manager Antonio García Martínez, "is that Zuck gives a shit about money."

Facebook was not originally created to be a company. It was built to accomplish a social mission – to make the world more open and connected.

"The great myth" about the company, says former Facebook ad manager Antonio García Martínez, "is that Zuck gives a shit about money."

García Martínez, whose absurdist memoir about his time at Facebook, called Chaos Monkeys, may be the funniest business book since Liar's Poker, laughs as he recalls his years there. "It's more like a messianic cult," he says. García Martínez is the most interesting and damaging defector ever to have left the ranks of Facebook. An iconoclastic combination of Travis McGee and Michael Lewis, he is a former physics Ph.D. candidate from Berkeley who worked at Goldman Sachs before his two years at Facebook, and now spends much of his time writing and sailing. He has lifted the curtain on ruthless profit-hoovering practices he helped design. His main gripe with Facebook seems to be its total lack of self-awareness about its own ambition.

García Martínez continually describes the company's corporate atmosphere as an oddball religion where Zuckerberg is worshipped as an infallible deity – sort of like Scientology, but without Tom Cruise or space invaders.

"You can tell your value in the company by where you're seated in relation to Zuck," he says.

The Facebook religion doesn't involve a virgin birth. It does, however, feature an asexual creation myth, glamorized by fictionalized accounts like The Social Network, in which Zuckerberg is shown one-upping God by creating the future in fewer than seven days of nerdly transcendence.

From there, Zuckerberg legendarily grew the company to fantastic dimensions. To this end, he had the help of Silicon Valley hotshots like Napster's Sean Parker and early investment from the likes of PayPal founder, libertarian icon, future Trump supporter and Gawker-smashing press critic Peter Thiel.

Facebook ballooned in size at a spectacular rate – it's gone from 100 million users in 2008 to more than 2.1 billion today, consistently adding 50 to 100 million users per quarter, steadily making itself into the town square of the world. And it boasts awesome revenues: a staggering $40.7 billion in 2017 alone.

That Facebook rose so meteorically without ever experiencing a big dip in users might have something to do with the fact that the site was consciously designed to be addictive, as founding president Parker recently noted at a conference in Philadelphia.

Facebook is full of features such as "likes" that dot your surfing experience with neuro-rushes of micro-approval – a "little dopamine hit," as Parker put it. The hits might come with getting a like when you post a picture of yourself thumbs-upping the world's third-largest cheese wheel, or flashing the "Live Long and Prosper" sign on International Star Trek day, or whatever the hell it is you do in your cyber-time. "It's a social-validation feedback loop," Parker explained. "Exactly the kind of thing that a hacker like myself would come up with, because you're exploiting a vulnerability in human psychology."

This echoes what García Martínez says about Facebook. "It isn't a media company internally," he says. "It's a hacker company internally."

Viewing Facebook through the hacker lens makes it a lot easier to understand. The firm's overwhelming dependence on free or found content is one thing. Another is its casual rerouting of taxpaying responsibility through supposed "headquarters" in tax havens like Ireland. The company, like most of the modern tech giants, seems to pay almost nothing in taxes in the countries where it is most popular, for example paying just £4,327 in British taxes in 2014.

All of this smacks of a particular brand of piracy unique to the new generation of tech firms, whose leaders tend to celebrate the "move fast and break things" libertarian ethos. The Thiels and Zuckerbergs represent a new class of CEO who, like the wealthy self-financed superheroes in comic-book movies, could get the job done by themselves if only the pesky government toe-draggers would get the fuck out of the way. Rules, like paywalls and taxes, are for suckers: We reward people who can get past them.

Zuckerberg, on his profile in the days of "thefacebook.com," even listed himself as "Enemy of the State."

In his book, García Martínez describes a scene in which a college kid named Chris Putnam developed a virus that made your Facebook profile look like MySpace, and deleted user content to boot. Instead of taking legal action, Facebook hired him. "The hacker ethos prevailed above all," García Martínez noted.

It's a misconception that Facebook sells the personal data of its users. What it sells is its hackerish expertise in snatching and analyzing your personal info from everywhere – on the site and outside it. Facebook keeps tabs on who has an anniversary coming up, who's in a long-distance relationship, who uses credit cards, who likes baseball and who likes cricket, who observes Ramadan, who's participated in a time-share, and countless other things.

That such data is collected mainly to more efficiently shove ads in your face is widely understood today. What's less well-understood is that monetizing user info was a key element of Facebook's business model going back to its first days.

"We were always using the data," says Mosseri, who runs the News Feed. "We did it to improve the user experience."

Mosseri's take – which whitewashes out the role data-powered ads played in the company's early growth – is typical of Facebook defenders. Ironically, not unlike traditional media companies, whose editorial chiefs have always pissed on their own sales reps as lower life-forms and refused to admit their influence on news-coverage decisions, Facebook from its first years had a schizoid, embarrassed attitude toward its own ad department.

In the beginning, the company featured no ads. Zuckerberg, when he talked publicly about ads back then, said only that he might offer them in the "future" for purely utilitarian reasons, i.e., to "offset the cost of the servers."

Not $40 billion or anything, just a few pennies here and there.

Facebook quickly established a pattern within the firm in which surrogates and partners developed the powerful money-making technology, while the Christ-complexing Zuckerberg focused on expanding the cloud of flatulent self-congratulation that began to hover over Facebook's ballooning global presence.

Time after time, Facebook would make a move that publicly highlighted its "social mission," while really it was just growing its economic footprint and increasingly monopolistic market share.

One of Facebook's early problems, for instance, was that the novelty of people sharing pictures of their kids' soccer trophies soon started to wear off. Without content with a little more heft, Facebook was what one snickering industry writer called "a stupid site, AOL for adults."

That changed with the introduction of the News Feed in September 2006. This move revolutionized both social networking and the news business. Back then, the feed was clearly designed to be more in tune with the site's toxic never-ending-high-school vibe than an actual news source.

"News Feed highlights what's happening in your social circles on Facebook," then-product manager Ruchi Sangvi wrote. "So you'll know when Mark adds Britney Spears to his Favorites or when your crush is single again."

In between finding out that Zuck likes Britney Spears or a prior stalking target had changed his or her relationship status, you might now also receive links to – news! Such simultaneously ridiculous and horrifying milestones litter the road to the Great Media Disaster of 2016.

***

Although it seemed frivolous on its face, the Facebook News Feed made a consumer mockery of the 24-hour cable-news channel, which was really just a repeating loop of a handful of daily reports. Facebook made it possible for users to see more than 1,000 news stories per day, and on average a user actually did see, in between all that other stuff, about 200. This was hacker culture writ large again, in that the feed was built around content grabbed for free out of the Internet ether.

"Media brands are diluted when people say things like, ‘I read this on Facebook,' " says Chavern.

This was more than a branding problem for media firms. It was a profound issue that spoke to how the decision-making processes of modern news consumers were being warped.

Once upon a time, a person had to make a conscious decision to pick up a newspaper, turn on the evening news or buy a magazine. Now, news came to you – was even offered to you, suggested in the way a magician offers a card – as part of an artificial entertainment experience that skewed consumer expectations in a highly specific way.

"I read this on Facebook" soon came to mean something like "I read this in a highly individualized intellectual masturbation session." News became a thing that only made it through if it fit into those constant, round-the-clock sorties Facebook was flying straight to your personal pleasure center. Simultaneously, the news stopped being a broadcast program designed to be digested, for good or ill, by a group, as families had once done over their nightly meatloaf.

Most problematic of all, however, was the combination of algorithmic data analysis and free news content, which accelerated junk news trends that had already begun to poison the media business. TV stations like Fox had long ago ditched what you might call "eat your vegetables" media, i.e., news, often investigative, that either requires significant mental effort to understand, some willingness to question one's own beliefs, or both.

To hear old newshounds tell it, there was allegedly a time when we media vermin didn't sling junk out of pure shame. Old-timers even tell tales, probably apocryphal, of days when ad executives weren't even allowed on the same floor as editorial staff.

But by the Eighties and Nineties, everyone in media was realizing that audiences cared more about seeing graphics, panda births and newscasters withstanding hurricane winds than they cared about news. The innovation of stations like Fox was to sell xenophobia and racism in addition to the sensationalist crap.

But even Fox couldn't compete with future titans like Facebook when it came to delivering news tailored strictly for the laziest, meanest, least intellectually tolerant version of you. Facebook knew more about you personally, what you might like and also what might tickle your hate center, than any TV, radio station or newspaper ever had.

Ben Scott, who with Ghosh co-wrote a paper on Facebook called "Digital Deceit" for the New America Foundation, says the power of Internet platforms to match people to mental junk was unprecedented.

"Forget about ever seeing eat-your-vegetables media again," says Scott. "In the new world, not only will you only see sugar media, but you'll only see your favorite brand of sugar media. Other information, you won't even know it exists."

Dr. Ofir Turel of California State University-Fullerton, who's written extensively about Facebook, says use of the site has a lot of the features of an addictive activity, like ease of use, variable rewards and feelings of anxiety when we're not engaged with it.

"All addictions operate on the variable-reward system," says Turel, who estimates that about five to 10 percent of the population could now meet the criteria of being at risk for social media addiction. Chronic users spend hours staring glassy-eyed at screens in search of the tiny rushes that come with likes or with the reading of articles validating their views. Mental horizons are narrowed. A study by the Proceedings of the National Academy of Sciences (yes, the acronym is PNAS) concluded, "Facebook users were more likely to interact with a limited number of news sources."

Additionally, they posited, "The main driver of misinformation diffusion is the polarization of users on specific narratives rather than the lack of fact-checked certifications." Translation: Lazy thinking and sheltered mental environs lead to more misinformation than fake news does.

Facebook's News Feed was a big part of the reward system designed to keep people coming back. "The interest is not to inform you," Turel says. "The interest is to get you to stay on the site."

Peter Eckersley, chief computer scientist of the Electronic Frontier Foundation, describes the News Feed in even starker terms. "It's designed to match people to information that will reinforce their existing prejudices, whatever those are," he says.
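To make that dynamic concrete, here is a minimal sketch of what engagement-driven feed ranking looks like in the abstract. It is not Facebook's actual algorithm, which is secret; the scoring model and field names are invented for illustration only.

```python
# A minimal, hypothetical sketch of engagement-optimized feed ranking.
# This is NOT Facebook's algorithm; the scoring model and field names
# are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class Story:
    story_id: str
    topic: str
    source: str

@dataclass
class User:
    user_id: str
    # Fraction of past interactions per topic, learned from clicks and likes.
    topic_affinity: dict = field(default_factory=dict)

def predicted_engagement(user: User, story: Story) -> float:
    """Toy model: a user is assumed to engage most with topics they
    have already engaged with -- the self-reinforcing loop."""
    return user.topic_affinity.get(story.topic, 0.01)

def rank_feed(user: User, candidates: list[Story], limit: int = 200) -> list[Story]:
    """Order 1,000-plus candidate stories and surface only the top slice,
    whatever maximizes the chance the user keeps scrolling."""
    ranked = sorted(candidates, key=lambda s: predicted_engagement(user, s), reverse=True)
    return ranked[:limit]

if __name__ == "__main__":
    user = User("u1", topic_affinity={"sports": 0.7, "politics": 0.25})
    candidates = [
        Story("1", "sports", "espn.com"),
        Story("2", "politics", "localnewspaper.com"),
        Story("3", "science", "journal.org"),
    ]
    # The science story, with no prior engagement, sinks to the bottom.
    print([s.story_id for s in rank_feed(user, candidates)])
```

The point of the toy model is simply this: when the only optimization target is predicted engagement, whatever a user already clicks on is what the user keeps seeing.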

Facebook advocates justify basically all of their practices on the premise that connecting people is inherently a net plus for the world. A recent memo leaked to BuzzFeed showed one company exec conceding that terrorists may eventually use the site to successfully coordinate attacks, but so what, because "we connect people. Period. That's why all the work we do in growth is justified."

Moreover, company officers say using data collection to make both the ads you see and the news you're exposed to more tailored to you personally is actually a good thing. Mosseri points out that Facebook is not a news program but an online community in which people talk about everything under the sun with their friends. And most people have so many friends that living in a bubble of endlessly automated stupidity, he says, is impossible.

"It's hard to have hundreds of like-minded friends," he says. "Broadly, it balances things out."

Another thing that balances out? Age. There's some evidence that the very young, as they often do, are rejecting a bad habit from their parents' generation. In 2018, about 100 million of Facebook's American users are between the ages of 25 and 44, but the numbers thin out sharply at the younger end, with just 6.8 million users between the ages of 13 and 17.

Tech billionaire and Dallas Mavericks owner Mark Cuban, who has been a heavy critic of Facebook, says time may reckon with the firm. "I think they are losing impact domestically, with zero influence on millennials and younger," he says. "But [they have] overwhelming influence on boomers and Gen X'ers."

***

In the run-up to Facebook's 2012 public offering, executives tried to convince Zuckerberg to own his company's basic nature and push the firm past a crucial ethical and financial Rubicon. The debate was over changing Facebook's terms of service so that users would have to agree to allow data gleaned from the famed "like" button to be used for commercial purposes.

The company had at least superficially resisted this idea, and even with the IPO approaching, Zuckerberg balked. "Don't use the like button," he reportedly told García Martínez and others in early 2012.

A lot of Facebook's value was in the like button. When users liked something, particularly in voluntary product reviews and surveys, it generated intelligence about how to effectively target those people with advertising. Moreover, users who see their friends liking a product are more likely to try that product themselves.

In any case, on May 18th, 2012, the company held its IPO, and launched with a market capitalization of $104 billion. But the IPO was considered a fiasco on Wall Street. It also caused a mild stir when the company's first 10-K report was released, showing that the firm took advantage of stock-option loopholes to make more than a billion in profits without paying a dime in state or federal taxes – in fact, for 2012, Facebook received a $429 million tax rebate.

The big public rollout was also marred by lawsuits, and the stock price began declining in the wake of disappointing revenues. The shares originally sold at $38, and dropped to a low of $17.55 later that year.

As it had done consistently in its history, the firm, when faced with financial pressure, moved ever further in the direction of monetizing users' personal data. In this case, it finally went after the like button.

About two years after the IPO, on June 12th, 2014, Facebook quietly announced a change to its terms of service. "Starting soon in the U.S., we will also include information from some of the websites and apps you use," the company wrote. "This is a type of interest-based advertising, and many companies already do this."

Facebook didn't just use its data to help advertisers place targeted ads. It also used AI-enhanced technology and tools like GPS to track users' information in order to learn more and more about them, all while constantly improving the reach and power of the company's advertising capabilities. In perhaps the creepiest example, Facebook applied for (and, last year, received) a patent for a tool called "Techniques for emotion detection and content delivery." It would use the camera in your phone to take pictures of you as you scroll through content. Facebook would then use facial analysis to measure how much you did or did not like the content in question, so as to determine what kind of stuff to send your way. Ideas like this are what make Facebook, at times, feel like a giant blood-engorged tick hanging off your frontal lobe.
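For the curious, here is a rough sketch of the kind of loop that patent describes: sample the front camera while content is on screen, score the viewer's expression, and feed that score back into what gets shown next. The function names and the scoring rule are hypothetical; nothing below is a real Facebook API.

```python
# Hypothetical sketch of the patent's idea. Function names are invented;
# this is not a real Facebook (or any vendor's) API.
from typing import Callable

def reaction_score(frame: bytes, classify_expression: Callable[[bytes], float]) -> float:
    """Return an estimated 0..1 'liking' score for one camera frame,
    using whatever facial-analysis model the platform plugs in."""
    return classify_expression(frame)

def update_topic_weights(weights: dict[str, float], topic: str, score: float,
                         learning_rate: float = 0.1) -> dict[str, float]:
    """Nudge the user's per-topic content weight toward the observed reaction."""
    old = weights.get(topic, 0.5)
    weights[topic] = (1 - learning_rate) * old + learning_rate * score
    return weights

if __name__ == "__main__":
    dummy_classifier = lambda frame: 0.9   # stands in for a facial-analysis model
    weights = {"cat_videos": 0.5}
    frame = b"(camera frame captured while the content was on screen)"
    score = reaction_score(frame, dummy_classifier)
    print(update_topic_weights(weights, "cat_videos", score))
```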

Ghosh, who worked on global privacy and public-policy issues at Facebook, says that the company's technology very quickly became effective beyond anyone's imagination, and wasn't limited to the placing of ads.

He points, for instance, to the "audience networks" program, where an advertiser might ask Facebook to not only put ads in front of the users most likely to respond to them, but to go after eyeballs on other sites.

"Maybe the advertiser is Nike and they're looking to sell the new Air Jordans to men aged 18 to 35 in the D.C. metro area," says Ghosh. "So they'll put ads in front of 100,000 Facebook users, then leverage their own audience to place the ad in front of a similarly sized audience on other networks – maybe NBA.com or a sports site or whatever."

Every time it places an ad in a campaign like this, a platform like Facebook learns more and more about how to most effectively interpret data, not just about its own users but about other sites and the users of other sites.

In Europe and in other parts of the world, these practices sometimes inspired protests and regulatory action. In 2015, Belgium demanded that Facebook stop tracking users' data once they had left the site, something the company had reportedly been doing since at least 2014.

This is what people don't understand about the "fake news" problem. This isn't a crack in the system. It is the system. The new age of targeted information distribution is designed to make campaigns of manipulation not just possible but inevitable. It is what the product was designed for.

Moreover, it's all grounded in wholly legal advertising techniques. Scott, who co-wrote "Digital Deceit," gives the example of fake-news campaigns deployed by European far-right parties.

"You'd see some fake story on some little blog somewhere, maybe about immigrants rioting in a big city," he says. "Next thing you know, some tabloid picks it up with a headline: 'Alleged Riot in Munich!' Then you'd see someone promote the hell out of that story using target marketing. Because the platforms know exactly which people to target for you, you can pay to get that promoted content to all those people. From there, the users share the story themselves, and it goes viral," Scott continues. "And every time the platforms do one of these campaigns, they learn more about who's susceptible to what messaging."

This is exactly how the "Russian troll farm" ads were supposedly used. The trolls described in the Robert Mueller indictment simply made use of standard tools that Facebook offers to advertisers. They would take a piece of content – for instance, the ludicrous image of Hillary Clinton as Satan, arm-wrestling Jesus under the headline "If I Win, Clinton Wins" – and blast it out to a targeted audience via the News Feed. The only clue that the ad has been commercially pushed to you comes via a tiny faded notation reading "sponsored" under the name of the origin page.

Despite frantic warnings from Senate Democrats about how a few dozen trolls spending a handful of dollars on these ads managed to reach 126 million people, the far more serious issue is that players with far deeper pockets were using the same tactics. "Facebook will sell to anyone if there's a pot of gold at the end," is how one political source puts it.

"That's why the whole Russia story was misunderstood," says Scott. "People are trying to understand how $100,000 worth of ads could reach 126 million people, when what they should be thinking about is the impact of the Trump campaign spending tens of millions of dollars using the same technology."

Brad Parscale, Trump's digital director on the 2016 campaign, thinks the furor over Facebook being responsible for Trump's election is misguided. The campaign used a lot of Facebook ads, he says, because of the peculiar nature of Trump's advertising needs.

Though The New York Times reported Parscale was persuaded to "try out the firm," Parscale himself has scoffed at the role Cambridge Analytica played in the campaign and insisted that Facebook was just the natural choice for his candidate.

"Elections are identical to movies when it comes to advertising," Parscale explained to me in an earlier interview. He talks about politicians with a kind of bemused detachment, like they're no different from soaps or cereal brands. "You've got Rotten Tomatoes for movies, Real Clear Politics for elections, exact same thing. If it's a completely new movie with new characters, then you go broad on TV to introduce the unknown new product. With Trump, the market knew him. It was a question of reaching a specific group of people in specific places who we needed to turn out. That happens to be exactly what Facebook is good for."

But Facebook shouldn't be blamed for being an effective advertiser. The problem is why it's effective, beginning with its monopolistic scale.

Simply by growing so large that his firm ended up essentially standing between media publishers and media consumers, constantly creating rules about who saw what, Zuckerberg and Facebook have become a thing America has never had before: an entrenched, de facto media regulator. The universe in which most Americans get their news sifted through a giant filter has multiple major consequences.

"There's the big economic effect," says Chavern of the News Media Alliance. "We never had someone in the middle before. Now we do have someone in the middle, collecting all the dollars."

The economics are the reason most newsrooms today look like post-nuclear wastelands. What sane person would buy ad space to sell cars on localnewspaper.com in the vague hope of catching the right eyeballs, when Facebook can instantly serve up 40,000 men age 18 to 54 who are likely to buy an automobile in the next six months?

Press outlets can only sell chunks of vaguely grouped audiences to advertisers. Facebook can bring merchants right to the individual buyer's doorstep at almost the exact moment his hand is reaching for his wallet. There's no comparison, which is why two companies – Google and Facebook – control 63.1 percent of all digital advertising and, as noted previously, nearly all of the growth in that business.
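In practice, the difference reads like the difference between a broadcast buy and a database query. Here is a toy sketch of the targeted version, with made-up field names and an assumed purchase-intent score standing in for whatever model the platform actually runs.

```python
# Toy sketch of the targeted ad buy described above: filter user profiles
# down to "men 18-54 likely to buy a car in the next six months."
# Field names and the intent score are invented for illustration.

def auto_intender_segment(profiles: list[dict], size: int = 40_000) -> list[dict]:
    matches = [
        p for p in profiles
        if p["gender"] == "m"
        and 18 <= p["age"] <= 54
        and p["predicted_auto_purchase_6mo"] >= 0.5   # assumed intent-model output
    ]
    return matches[:size]

if __name__ == "__main__":
    profiles = [
        {"gender": "m", "age": 34, "predicted_auto_purchase_6mo": 0.7},
        {"gender": "m", "age": 61, "predicted_auto_purchase_6mo": 0.9},
        {"gender": "f", "age": 29, "predicted_auto_purchase_6mo": 0.6},
    ]
    print(len(auto_intender_segment(profiles)))  # -> 1
```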

Market share is only one issue. The other problem – the presence of algorithms that effectively determine who gets to see what material – is much more serious.

"They've created rules about who gets to see which stuff," says Chavern. "They also change the rules all the time. And they're also secret rules."

Talk to media executives about Facebook, and they'll complain endlessly about two things: one, that they can never get a straight answer from the company about how the algorithm works ("You're fucking lucky if you can even get someone on the phone," hisses one publisher), and two, if they do get advice about how to optimize content, the advice changes constantly.

Media sites routinely shift their entire commercial strategies to try to reach more people through the Facebook News Feed – the latest mania was video content – only to have the algorithm change suddenly.

For a while, some media developers tried to build brands dedicated to gaming Facebook. But sites like Mashable and Upworthy are being sold or laying off workers after initial spikes of success. There's just no way to build a consistent strategy around a constantly changing, secret system.

Still, Facebook's recent move to re-weight the News Feed again, this time with unhelpfully euphemistic new values like "trusted sources" and "time well spent," will likely put an end to the idea that news companies are not dependent upon Facebook to survive.

The latest changes will instead "serve as the final deathblow to almost two decades of delusional thinking," as VentureBeat writer Chris O'Brien put it.

The arbitrariness of the algorithms has essentially forced media firms to lobby Facebook and Google the way other businesses would lobby government departments. A classic example is the battle over the so-called "first click free" rule.

For years, Google had a rule that gave greater visibility to media companies that offered at least some free content. Outlets complained about the rule, which they claimed shaped the industry early in the online age, forcing firms away from subscription-based models. Under pressure, Google finally scrapped the rule in October 2017, but the damage was already done.

About those subscription-based models: There are people out there who believe the media's only hope is to organize, as a union would, and collectively enforce a giant paywall, denying Facebook and its hacker ethos the oceans of free content that are its lifeblood.

But one would be hard-pressed to find a media executive who believes such a strategy has a chance of working.

"You don't call that play under normal circumstances, but it's 4th and 30 for all of us," says McChesney. "There is no commercial solution. There is no magical business model that will save the news business. It's time we all faced reality."

***

Whether Facebook is just a reflection of modern society or a key driver of it, the picture isn't pretty. The company's awesome data-mining tactics, wedded to its relentless hyping of the culture of self, have helped create a world where billions of people walk with bent heads, literally weighted down with their own bullshit, eyes glued to telescreen-style mobile devices that read us faster than we can read them.

Surveys show audiences trust the media less than ever but consume news more than ever. Those two deeply troubling data points suggest the Fourth Estate, which was designed to inform the public and provide a crucial check on power, is instead morphing into an entertainment product, which succeeds or doesn't based on how quickly our brains ratify the information offered. This is the opposite of how news is supposed to work.

"Once, a citizen had a right to an opinion," says García Martínez. "Now, they feel like they have a right to their own reality."

Awful as that all is, it's not even the most immediate emergency. Together with Google, Facebook forms a clear duopoly, one with simply too much power in the fields of media distribution and digital advertising.

The recent controversies have inspired countless proposals for how to "improve" Facebook. Some have pressed for a tax that would kick Facebook revenues back to public-interest journalism. Others have called for a simple ban on new acquisitions, to prevent the firm from snatching up properties like Instagram and WhatsApp when it clearly can't manage the ones it already has.

But when a tumor starts growing teeth and hair, you don't comb the hair. You yank the thing. And it turns out we have a mechanism for just that.

We need to break up Facebook, the same way we broke up Standard Oil, AT&T and countless other less-terrifying overgrown corporate tyrants of the past. The moral if not legal reason is obvious: A functioning free press just can't coexist with an unaccountable private regulator.

An antitrust action sounds extreme, but given the alternatives – different groups have proposed creating fact-checking star chambers either within government, Facebook or both – it may be the least-intrusive solution, one that moreover doesn't create a "legitimacy" standard that could threaten alternative or dissenting media.

The question is, can we actually break up Facebook?

"It's tough," says former New York governor and attorney general Eliot Spitzer, who policed Wall Street for nearly a decade. "Because market size alone, unless gained through improper means, is not a basis for action."

According to the stiff test the government must meet to file successful antitrust actions today, the state not only has to demonstrate the existence of monopoly, but that consumers are worse off under it, subject to "supernormal" prices. The case against Facebook is not a legal slam dunk.

But not all market harm is about raw numbers, and some of the more celebrated recent antitrust actions, like the breaking up of Ma Bell, have opened the door for the government to consider factors other than mere price.

"Under the traditional antitrust analysis, the issue is whether the consumer pays more," says Louisiana Sen. John Kennedy, a corporate lawyer by profession. "But courts are beginning to look at other types of economic harm." Kennedy, a Republican, says the "black box" nature of firms like Facebook, combined with their unprecedented influence, make it urgently necessary for the government to consider all options.

"We're in a brave new world," he says. "We're waking up and realizing some of these companies aren't companies – they're countries."

The real solution to this problem would be to dial back the use of the data-collection technologies that have turned companies like Facebook into modern-day versions of the "propaganda stations" the Federal Radio Commission was so bent on keeping off the airwaves a century ago.

The difference is Facebook doesn't push Nazism or communism or anarchism, but something far more dangerous: 2 billion individually crafted echo chambers, a kind of precision-targeted mass church of self, of impatience with others, of not giving a shit.

A generation of this kind of messaging is bound to have some pretty weird consequences, of which electing proudly ignorant bubble-thinker Donald Trump is probably just a gentle opener. Given that, we might be too late to fix Facebook – maybe we need to be saved from it instead. 

