OpEdNews Op Eds, 11/8/19

Social Media Is Amplifying Trump's Rants and Disinformation More Than Ever -- Can Society Protect Itself?

By Steven Rosenfeld (Page 3 of 5 pages)

In July, Trump brought right-wing media producers to the White House to laud their creation and promotion of conspiratorial and false content. "The crap you think of is unbelievable," Trump said. Afterward, some attendees began attacking reporters whose coverage is critical of the administration, personalizing Trump's war on the press. (In October, one attendee recycled a video mash-up he had made the year before, depicting a fake Trump killing his critics, including reporters, for a GOP forum at a Trump-owned Florida hotel.)

Fabricated Concern

The rampaging president video drew coverage and was seen as another sign of our times. But less transparent forms of disinformation also appeared to be resurfacing in 2019, including harder-to-trace tools that amplify narratives.

In the second 2020 Democratic presidential candidate debate, Rep. Tulsi Gabbard, D-HI, went after California Sen. Kamala Harris. Social media lit up with posts about the attack and Google searches about Gabbard. Ian Sams, Harris' spokesman, made a comment that raised a bigger issue.

Sams tweeted that Russian bots had magnified the online interest in Gabbard. Bots are pieces of software that pose as users online; their goal is to generate views and, with them, the appearance of concern or even outrage. Sams' tweet marked the first time a presidential campaign had publicly pointed to bots. Social media, especially Twitter, is known for bot activity that amplifies fake and conspiratorial posts. Estimates have put the share of Twitter activity that is automated by bots -- that is, faked -- at 15 percent.

Sams' tweet came after speculation from a new kind of source that has become a standard feature of 2020 election coverage: an "analytics company" that said it saw "bot-like characteristics," as the Wall Street Journal put it. The company's experts said they had seen similar spikes during the spring. What happened next was telling.

Harris' staff and the Journal may have been correct that something was artificially magnifying online traffic to wound her campaign. But when tech-beat reporters tried to trace the bots, the evidence trail did not confirm the allegation, and the claim backfired on her campaign.

That inconclusive finding highlights a larger point about online disinformation in 2020. Attacks in cyberspace may not be entirely traceable, eluding even the best new tools. The resulting murkiness can cause confusion, which is one goal of propagandists: to plant doubts and conspiracies that eclipse clarity and facts while confusing voters.

Sometimes, those doubts can resurface unexpectedly. In mid-October, Hillary Clinton said during a podcast that pro-Trump forces were "grooming" Gabbard to run as a third-party candidate, including "a bunch of [web]sites and bots and ways of supporting her." (In 2016, a third-party candidate hurt Clinton's campaign. Jill Stein, the Green Party candidate, received more votes than the margin separating Trump and Clinton in the closest swing states of Michigan and Wisconsin. That was not the case in Pennsylvania.) Gabbard rejected Clinton's assertion that she was poised to be a 2020 spoiler, saying that she was running only as a Democrat. Trump, predictably, used their spat to smear all Democrats.

But bot activity is real whether it can be traced overseas or not. In October, Facebook announced that it had taken down four foreign-based campaigns behind disinformation on Facebook and Instagram. One of the targets of the disinformation campaigns was Black Lives Matter, which told CNN that it had found "tens of thousands of robotic accounts trying to sway the conversation" about the group and racial justice issues.

Three days after Facebook's announcement, Black Lives Matter posted instructions for activists to defend "against disinformation going into 2020." The instructions ask activists to "report suspicious sites, stories, ads, social accounts, and posts," so the group's consultants can trace what's going on rather than relying on Facebook.

Dirty campaigning is nothing new. Deceptive political ads have long been used to dupe impressionable voters. But online propaganda differs from door flyers, mailers, and campaign ads on radio and TV. Online advertising does not aim at wide general audiences; instead, it targets individuals who are grouped by their values and priorities. The platforms know these personal traits because they surveil users to build profiles that advertisers tap. Online platforms thus invite personalized narrow-casting, and those messages can be delivered to recipients anonymously.

The major online platforms created their advertising engines to prosper. But government agencies that rely on information about populations -- such as intelligence agencies, military units, and police departments -- quickly grasped the power of social media data, user profiling and micro-targeting. More recently, political consultants also have touted data-driven behavioral modification tactics as must-have campaign tools.

Thus, in 2016, these features enabled Trump's presidential campaign to produce and deliver 5.9 million customized Facebook ads targeting 2.5 million people. That micro-targeting was the principal technique his campaign used to find voters in swing states, as Brad Parscale, his 2016 digital strategist and 2020 campaign manager, has repeatedly said. In contrast, Clinton's campaign ran 66,000 ads targeting 8 million people.

Television advertising never offered such specificity. TV ads are created for much wider audiences and thus are far more innocuous. As Emma L. Briant, the British academic and propaganda expert who unmasked the behavioral modification methods deployed on online platforms, noted, these systems can identify traumatized people and target them for messages intended to provoke fragile psyches.

"What they have learned from their [psychologically driven online] campaigns is that if you target certain kinds of people with fear-based messaging -- and they know who to go for -- that will be most effective," she said, speaking of past and present Trump campaigns, pro-Brexit forces and others.


Steven Rosenfeld covers democracy issues for AlterNet. He is a longtime print and broadcast journalist and has reported for National Public Radio, Monitor Radio, Marketplace, TomPaine.com and many newspapers.
 

Comments

Fred W

New Content

Democrats don't have to create phony content on Facebook and Twitter: they already have the media on their side. The phony Russiagate story and Ukraine phone call story are ten times more powerful, in my opinion, than the examples you give, interesting and informative though they be.

Submitted on Saturday, Nov 9, 2019 at 12:47:21 AM
