What exactly have Silicon Valley's top social media platforms unleashed on the political world?
The political world is awash in a growing sea of social media-fed misinformation, loosely called fake news. Each week brings eyebrow-raising reports of a threat poised to upend America's already dysfunctional political landscape, or reports that those at the helm of online information ecosystems delight in distorting reality and disrupting societal norms.
Last week, the New York Times reported that unnamed programmers used open-source Google code to create an app that put Michelle Obama's face on an actress in a porn video. Earlier Times reports said faked videos could be coming to campaigns. This week, the New Yorker profiled Reddit, an anarchic site with 1 million subgroups (where Times reporter Kevin Roose discovered tips about forging political porn videos). Reddit CEO Steve Huffman even confessed he considers himself "a troll at heart... Making people bristle, being a little outrageous in order to add some spice to life -- I get that. I've done that."
There's nothing new about political distortions or rabble-rousing in American culture and politics. But just as social media is revolutionizing and accelerating aspects of the way people and campaigns communicate, these frontline dispatches heralding a disinformation dystopia are frequently missing a key element: context, or magnitude, so readers know what matters -- and doesn't -- about the purported threats or trends. This omission is significant, because as the March issue of Science noted, "about 47 percent of Americans overall report getting news from social media often or sometimes, with Facebook as, by far, the dominant source."
What is known about how social media platforms trigger the brain and stimulate behavior is changing. While campaign consultants almost always say they cannot know whether their partisan messaging on social media affects how targeted voters behave on Election Day, scholars are starting to publish research tracing how and why social media is radicalizing politics.
Last week, Science published a study that analyzed 126,000 rumors spread on Twitter and traced how propaganda spreads further and faster than facts do. People are drawn to falsehoods and like to share them, and social media super-charges that process, the authors said. In a separate article, 15 social scientists warned that this dynamic is fanning political extremism.
"Our call is to promote interdisciplinary research to reduce the spread of fake news and to address its underlying pathologies it has revealed," the co-authored article, "The Science of Fake News," concluded. "We must redesign our information ecosystem... We must answer a fundamental question: How can we create a news ecosystem and culture that values and promotes truth?"
That call is not unique, but it highlights how sophisticated the challenge is. Social media uses brain-mimicking artificial intelligence that serves up the content people see. That targeting is based on advanced computing that profiles every online user's keystrokes. This technology was developed for advertisers to provoke sales. But once it is imported into political campaigns, where agendas, candidates and smears are the products being sold, the result is an outbreak of propaganda or provocations that blur lines between perception and reality; between personal prejudices and more objective truths.
Ask political consultants about this dynamic and they'll reply that one can't predict how people receiving their messaging will react. On the other hand, social scientists, academic media analysts and former social media executives are saying that's not entirely so. They say human behavior is frequently predictable, and some academics have begun to connect the dots and produce an evidence trail.
Who Knows What's Really Going On?
Take what we know about how Donald Trump's campaign used Facebook in 2016. It identified and targeted 13.5 million persuadable voters in 16 states with 100,000 different messaging "variants" -- the equivalent of having 100,000 campaign ads at your disposal. Of course, top Trump staffers in October 2016 bragged to Bloomberg.com that using the platform to encourage and suppress voters would elect Trump. But did Trump's use of Facebook tip the race?
"We have precisely no evidence that the Russia stuff or anything that Trump's campaign did moved any votes. If they did, we don't know," said Colin Delany, the founder of Epolitics.com and a columnist specializing in online campaigns for Campaigns and Elections magazine. "We know Trump's people spent whatever he spent on Facebook. We know they ran all these variants. But campaigns do a lot of things that don't work all the time."
How can advertising experts and political consultants know so much about the voters they repeatedly target before Election Day -- by merging voter profiles compiled by political parties with the personal profiles generated by Facebook's advertising-driven supercomputers -- yet not know what will likely happen when their micro-targeted audiences actually vote?
"You're never going to be able to get metrics on it. It's not new," said John Zogby, a nationally known pollster and Forbes contributor. "Number one, there's the old cliche we always use. This is about the focus groups, where people say, 'Oh, I don't pay any attention to television advertising or jungles,' and they are humming the jingle for Crest going down the toothpaste aisle."