From Washington Monthly
It's high time the federal government did something about it.
If the political world feels increasingly off kilter, there's probably a good reason. Numerous forces are combining to roil not only the United States but the entire world: increasing inequality, rising housing costs, wage declines and job destabilization driven by technological change, public policy, and globalization, and, of course, the COVID pandemic and the looming specter of the climate crisis.
But a huge part of the malaise involves angry political polarization, particularly asymmetric polarization by white supremacists in America and Europe, and ethnonationalists more broadly around the world. Declining trust in institutions goes hand in hand with a newly fractured media landscape, in which legitimate media companies and journalists struggle to survive financially while disinformation thrives. Those destabilizing forces in turn make it difficult for democratic institutions to take serious action on inequality, economic reform, and climate.
At the heart of much of this bedlam are the deliberate actions of social media companies, which have destroyed the revenue model for journalism -- often through outright lies -- and created engagement algorithms that incentivize hateful polarization and disinformation. And no social media company has been more guilty of both than Facebook.
Two big stories dropped this week highlighting Facebook's ongoing role in sabotaging both journalism and democracy in the pursuit of profit.
The first is a devastating story by Karen Hao at the MIT Technology Review on how Facebook's artificial intelligence unit learned to efficiently drive engagement on the platform by recommending increasingly incendiary and extremist content and groups. Then, when the teams that built this monster began to realize what they had unleashed and took steps to curtail it, the company (largely at the direction of Mark Zuckerberg himself) refused to do anything significant -- choosing instead to deflect the problem toward issues of bias rather than polarization and disinformation.
To be sure, bias is also a problem with both the advertising and content algorithms. Facebook was rightly facing attacks for serving ads for certain products and benefits only to whites and privileged groups at the expense of minorities and the underprivileged, or for targeting the latter with socially destructive advertising. But by appearing to aggressively self-regulate on that front, Facebook got away with doing nothing about the fact that its engagement algorithms were leading users to false and extremist content.
Worse, Facebook's efforts at controlling bias were manipulated by Trump and the conservative media's endless treadmill of self-pitying victimhood into specially privileging the very conservative disinformation that was the biggest driver of asymmetric polarization:
"Facebook did not grant me an interview with Zuckerberg, but previous reporting has shown how he increasingly pandered to Trump and the Republican leadership. After Trump was elected, Joel Kaplan, Facebook's VP of global public policy and its highest-ranking Republican, advised Zuckerberg to tread carefully in the new political environment.
"On September 20, 2018, three weeks after Trump's #StopTheBias tweet, Zuckerberg held a meeting with Quià ±onero for the first time since SAIL's creation. He wanted to know everything Quià ±onero had learned about AI bias and how to quash it in Facebook's content-moderation models. By the end of the meeting, one thing was clear: AI bias was now Quià ±onero's top priority.'The leadership has been very, very pushy about making sure we scale this aggressively,' says Rachad Alao, the engineering director of Responsible AI who joined in April 2019...
"But narrowing SAIL's focus to algorithmic fairness would sideline all Facebook's other long-standing algorithmic problems. Its content-recommendation models would continue pushing posts, news, and groups to users in an effort to maximize engagement, rewarding extremist content and contributing to increasingly fractured political discourse."
Facebook is not, of course, the only social media organization that has contributed to this. YouTube (a subsidiary of Google's parent company Alphabet) in particular is famous for leading users down a pipeline to radicalization, guiding the unsuspecting from videos on anything from Star Wars to fitness to economics straight to Jordan Peterson, Prager University, or Ben Shapiro in just minutes. But Facebook's algorithms have been particularly aggressive, and their consequences especially devastating. Facebook, more than any other factor, was responsible for supercharging the ratcheting hate that led to the genocidal massacres of the Rohingya in Myanmar, and its efforts to contain the damage have been pitiful.
David Atkins is president and founder of The Pollux Group, Inc., a qualitative research consultancy specializing in emerging technologies and the changing trends in consumer and socio-political behavior created by the Millennial Generation.