Reprinted from Strategic Culture
Espen Egil Hansen is editor of Aftenposten, Norway's largest newspaper.
(Image by AFTENPOSTEN/NICK UT)
We all live in the Age of the Algorithm.
So here's a story that not only encapsulates the age, but dwells on how the algorithm obsession can go horribly wrong.
It all started when Facebook censored the iconic "napalm girl" photo of Kim Phúc, recognized all over the world as a symbol of the Vietnam War. The photo was featured in a Facebook post by Norwegian writer Tom Egeland, who wanted to start a debate on seven photographs that changed the history of war.
Not only was his post erased; Egeland was also suspended from Facebook.
Aftenposten, the number one Norwegian daily, owned by Scandinavian media group Schibsted, duly relayed the news, alongside the photo.
Facebook then asked the paper to erase the photo, or to render it unrecognizable in its online edition. Yet even before the paper responded, Facebook censored both the article and the photo on Aftenposten's Facebook page.
Norwegian Prime Minister Erna Solberg protested it all on her Facebook page. She was also censored.
Aftenposten then slapped the whole story on its front page, alongside an open letter to Facebook founder Mark Zuckerberg signed by the newspaper's editor-in-chief, Espen Egil Hansen, accusing Facebook of abuse of power.
It took a long 24 hours for the Palo Alto colossus to back down and unblock the content.
An opinion wrapped up in code
Facebook has since been engaged in heavy after-the-fact damage control. That does not change the fact that the napalm girl imbroglio is a classic algorithm drama: the application of artificial intelligence to evaluate content.
Facebook, just like other Data Economy giants, outsources content filtering to an army of moderators working for companies from the Middle East to South Asia, as Facebook's Monika Bickert confirmed.
These moderators may have a hand in deciding what should be expunged from the social network, based on what users flag. But that input is then run through an algorithm, which makes the final decision.
It doesn't take a PhD to note that these moderators may not exactly excel in cultural competence, or in analyzing context. As for the algorithms themselves, they are incapable of understanding cultural context and are certainly not programmed to interpret irony, sarcasm or cultural metaphors.
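The pipeline described above, where user reports feed a human moderator's flag and an algorithm then makes the final call, can be sketched as a toy example. Everything here (the function names, the keyword list, the decision rule) is invented purely for illustration; it is emphatically not Facebook's actual system. The point the sketch makes is the article's own: a keyword-level rule matches the words but is blind to the journalistic or historical context around them.

```python
# Toy sketch of a report -> moderator -> algorithm moderation pipeline.
# All names, keywords, and rules are hypothetical, for illustration only.

BLOCKLIST = {"napalm", "nudity"}  # a context-blind keyword list


def moderator_review(post_text: str, user_reports: int) -> bool:
    """A human moderator flags a post that users have reported."""
    text = post_text.lower()
    return user_reports > 0 and any(word in text for word in BLOCKLIST)


def algorithm_decision(post_text: str, moderator_flag: bool) -> str:
    """The algorithm makes the final decision. It sees only keywords
    and the moderator's flag, never the photo's historical context."""
    text = post_text.lower()
    if moderator_flag or any(word in text for word in BLOCKLIST):
        return "remove"
    return "keep"


post = "Iconic Vietnam War photo: the napalm girl"
flag = moderator_review(post, user_reports=3)
print(algorithm_decision(post, flag))  # a keyword match removes the post
```

A rule like this would treat a Pulitzer-winning war photograph and genuinely objectionable content identically, which is precisely the failure mode the napalm girl episode exposed.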