A couple of years back The Register reported on a study by PricewaterhouseCoopers that found massive losses of "ad spend," amounting to about 33% of a three-hundred-billion-dollar industry. It only got worse when the ad auctions were automated. Now the bots (they've already got your number) purchase your ad while your page is loading.
But that still isn't paying the bills. So it's pretty obvious that the dreaded Algorithm is doing a lot more than selling you sneakers and makeup to maintain the kind of revenue stream that gets employees a whole floor devoted to sushi, another one for pizza, and so on.
Let's look a little closer at this Algorithm. All in fun, of course.
A system designed to "engage" users gathers data on them, ostensibly to feed them more of what they "Like." To accomplish this, now with the help of "AI," it administers tiny shocks every second about body image, self-esteem, and social status, in an ambience of dread. From unconscious micro-responses, it builds a running profile more detailed than a private detective could glean from going through your trash, opening your mail, and getting all your intimate friends drunk and compromised.
Why would it stop there? From thousands of datapoints it can derive your personal susceptibilities, your children's emotional problems, your credit score, your chances of getting breast cancer. Your movements. Your trauma index. Your stress levels. Blood oxygen. Hormone imbalances. And that's just Database 101; now it's hooked up to AI. It's sampling just about the entire human species: not only the text and images we produce, but our momentary behaviors in response to minute stimuli.
AI runs on information while we're fascinated with shock and awe. AI knows as much about love as a fish knows about a bicycle.
Thousands of micro-impressions, over all the time it needs, form the perceptual background of our lives, fed through our digital sensory devices. And it's not the same one everyone else lives in: it's curated just for you. A market of one. It's a toxic extraction industry.
Nobody (not even DARPA) saw this next bit coming; it's just a really bad accident.
Let's do a little reverse-engineering of our own. At the unimaginable scale of the internet, the now-AI-assisted Algorithm maintains an ambience of low-level terror and anxious isolation at optimum levels for "growth"; it is designed to play the game of surveillance capitalism out to its logical conclusion in the most profitable way possible. ROI is its prime directive, and it doesn't need our help any longer.
I argue that this is happening because, as we know, it is possible. There are means, motive, and opportunity, and no serious legal constraints.
A chaotically polarized electorate, social fragmentation, cultural incoherence, and exponentially increasing carbon emissions are the extremely toxic byproducts of the algorithmic attention-mining that sustains and drives the privately owned digital services our very lives now depend on for food, water, medicine, electricity, transportation, shelter, the clothes on our backs, our children's education. It's not providing for everyone's needs; about two billion of us are on our own, many under relentless bombardment. And the situation is growing in speed and scale, faster than we can comprehend.
Our momentary attention is just raw ore for the big servers, in those vast air-conditioned buildings where there's just one human and a dog. The human is there to feed the dog. The dog is there to keep the human away from the computers.