Zogby is saying it's all but impossible to measure what people are subconsciously processing, or what's motivating their behavior during the campaign season, especially on Election Day, when they leave their digital devices and fill out a ballot. That's different from tracking what happens online beforehand, as when Trump's campaign watched its Facebook targets like, comment on, or share its political messaging -- or take another action, such as submitting an email address, volunteering, or agreeing to attend a rally.
But Zogby also fears the technologists behind Silicon Valley's top social media platforms are in the dark in a different way about what they have unleashed on the political world. They don't realize that devices, and what's presented on screens designed to exploit human nature, are extremely powerful in political contexts. (For example, the MIT study of Twitter reported in Science found that people, not automated bots, are mostly responsible for spreading inflammatory content. Why? Because human nature is more drawn to what is unusual, conspiratorial, and in keeping with one's beliefs and biases than to plain facts.)
"The folks at Facebook and Google are amazingly brilliant people, but they have no idea what they have created," Zogby said. "And that to me is the scariest piece. They can't ultimately rein it in, because they don't even understand [how it invites abuse and the impact]. But in the final analysis, is it anywhere as big as people think it is? Or is thinking it's big itself the power that it has?"
"We're looking at some slices of this," said Garlin Gilchrist II, the center's executive director, referring to social media's positive and nefarious uses in politics. "We've done some work on what is the proportion of news that was shared online over the last couple of years that came from an unreliable source, specifically Facebook."
That soon-to-be-published report, which is expected to be striking, is one example. Another research project is tracking how members of Congress use Twitter to share information or opinions that mirror their views. In both cases, the goal is a more precise understanding of how online echo chambers affect the intersection of the personal and the political.
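The kind of measurement Gilchrist describes -- what proportion of news shared online came from an unreliable source -- can be sketched in miniature. The domain list, URLs, and function names below are hypothetical illustrations, not the center's actual methodology; real studies rely on curated ratings of outlets.

```python
# Minimal sketch (hypothetical data): estimating what share of posted
# links point to unreliable sources, in the spirit of the study described.
from urllib.parse import urlparse

# Hypothetical blocklist; actual research uses vetted outlet ratings.
UNRELIABLE_DOMAINS = {"fakenewsdaily.example", "clickbait.example"}

def domain_of(url: str) -> str:
    """Extract the bare hostname, dropping any 'www.' prefix."""
    host = urlparse(url).netloc.lower()
    return host[4:] if host.startswith("www.") else host

def unreliable_share(shared_urls: list[str]) -> float:
    """Fraction of shared links whose domain is on the unreliable list."""
    if not shared_urls:
        return 0.0
    hits = sum(domain_of(u) in UNRELIABLE_DOMAINS for u in shared_urls)
    return hits / len(shared_urls)

shares = [
    "https://www.nytimes.com/2018/03/08/science/fake-news.html",
    "http://fakenewsdaily.example/shocking-truth",
    "https://clickbait.example/you-wont-believe",
    "https://www.washingtonpost.com/politics/story",
]
print(unreliable_share(shares))  # 0.5
```

Even this toy version shows why the question is hard in practice: the hard part is not the arithmetic but deciding, defensibly, which domains belong on the unreliable list.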
"One of the reasons that we have launched this effort at the U.M. School of Information is to put in place infrastructure to be able to answer those types of questions, to get toward those answers," Gilchrist said. "The better we understand networks and characteristics of information that flows through them, and how people are engaging and using them, the better recommendations can be made -- both for users, and for the platforms, to be able to say, if you want to optimize your network for something different, here are the choice points that will have the most impact. That is where we are starting from."
Virtually every major political campaign in 2018 is going to be using social media, especially Facebook, YouTube and Twitter, even if they don't quite understand it, Zogby said, calling the trend the latest example of the mutually-assured-destruction dynamic that has long been part of campaigns. "Candidates say, 'It's out there. It's a capability. I better pay attention to it if the other guy's got it.'"
The Emerging Research Landscape
Needless to say, those who know the most about how social media platforms engage and provoke behavior, who have worked with campaigns -- such as Trump's in 2016 -- and who presumably have assessed their performance, are the companies themselves. So far, the industry's public response has generally been to help some media outlets brand their coverage as more credible, rather than to tinker with the underlying machinery. But these institutions may be heading toward a reckoning, as experts increasingly raise red flags about intentionally addictive platforms and business models that propel disinformation.
"There's a collective freakout going on regarding the effects of social media on society as a whole," Ethan Zuckerman, director of the Center for Civic Media at MIT, whose research focuses on media and social change, said in an email. "I'd classify the concerns I've heard into four general areas: Social media is addictive and bad for us; social media platforms are killing journalism; social media is being manipulated by bad actors to spread propaganda; and social media leads to ideological isolation and polarization."
"Tristan Harris, a former Google design ethicist now with the non-profit Time Well Spent, is leading the charge on the first issue, and he's got some good arguments," he said. "His critique is mostly individual: Too much screen time is bad for you and the folks who've designed slot machines are the same folks designing social media. Fair enough, and worth researching, but less interesting to me as social/civic phenomenon: screwed-up, addicted citizens make up a dysfunctional body politic, yeah... But this is really about individual impacts, and about the weird phenomenon of people who built these tools now declaring they don't want their kids using them."
"The second subject is over 10 years old now, but still inspires debate, if only because it's a real problem and one that's very hard to solve," Zuckerman continued, referring to how the journalism world has lost its impact and reach because social media often gives more credible content the same weight as uninformed opinion and propaganda. "We will lose something important if we lose local accountability and investigative media. I think this question is incredibly important, but I'm also at a loss for new ideas for solving it."
Facebook's and Google's response to this trend has been to try to grade the content on their platforms -- an approach embraced by mainstream media outlets, in part because it might help them regain audiences and bolster their standing with advertisers. Notably, one takeaway from Science's March cover story is that this strategy may make the platforms and news media feel better, but it is unlikely to work, because it does not address the underlying human psychology exploited by the platforms' algorithms.
"Fact checking might even be counterproductive under certain circumstances," Science wrote. "Research on fluency -- the ease of information recall -- and familiarity bias in politics shows that people tend to remember information, or how they feel about it, while forgetting the context within which they encountered it. Moreover, they are more likely to accept familiar information as true. There is thus a risk that repeating false information, even in a fact-checking context, may increase an individual's likelihood of accepting it as true."