ENIAC
(Image: ENIAC, via Wikimedia Commons; author not given)
The first general-purpose electronic computing machine was created for the Army in 1946 at the University of Pennsylvania's Moore School of Electrical Engineering. It was called ENIAC, the Electronic Numerical Integrator and Computer. As we know now, ENIAC was just the beginning, the Neanderthal and the Cro-Magnon. It would have children and grandchildren and great-great-great-grandchildren that would perform millions of calculations per second.
In the early 1990s the Web was born when Tim Berners-Lee proposed HTML, the foundational markup language for Web pages. The letters ML stand for "markup language," and the HT stands for the coinage "hypertext."
We share a common illusion: we imagine that the data computers and servers hold is a substance, like some natural resource waiting to be mined from the ground. In fact, the information that drives our automated culture comes from you and me, we the people, in the form of massive data dumps. It amounts to an elaborate form of string-pulling, a sophisticated puppetry that creates a daunting successor reality in information. The irony is very real, since we have become psychologically victimized by the very technological innovations we created. It is an ambiguous irony, though, because it is not clear how much choice we truly have. Once the patterns of life are changed, we lose the ability to grasp what we have forgotten from a previous time. We have all been lost in the process of growing up.
The idea of artificial intelligence gives humans cover to avoid accountability, by duping ourselves into believing that computer networks and machines can take on more of our responsibility. Algorithms represent not emotion or meaning, but only statistics and correlations. The servers that drive the network are narcissists; they have no clue where value comes from. It is impossible for high-frequency trading to incorporate information about the real world, because there is no time for that information to get into the feedback loop. Digital information, after all, is just people in disguise. In this way, digital modernity resembles a sort of gentle blackmail: once we become accustomed to having an information service, our cognitive styles and capacities become shaped by the availability of that service.
If John Maynard Keynes, the great and prescient demand-side economist, had seen the emergence of machines such as ENIAC and their eventual computational power, he would have known his mission was doomed. He would have seen that judgment would be overwhelmed by efficiency. He once said that "judgment is to capitalism what the secret police are to a totalitarian state. Without judgment," he stated, "you will have financial anarchy or worse."
Keynes likened it to what happened in the 14th century, when the plague came off a boat in Venice as a germ no bigger than a pinhead in the blood of a rat. That rat found another rat to bite, or a person. And so it went: within ten years, a third of Europe was gone.
Keynes also anticipated the environmental debate when he said: "We are capable of shutting off the sun and the stars because they do not pay a dividend." A companion thought, usually attributed to John F. Kennedy, completes the point: "The greatest enemy of the truth is very often not the lie; deliberate, contrived and dishonest; but the myth, persistent, persuasive and unrealistic. Belief in myths allows the comfort of opinion without the discomfort of thought."
The way digital networks have been designed, by fashion rather than by necessity, creates ultra-valuable central nodes that invite temptation from bad actors, whether those actors are traditional legitimate players or not.
We must avoid the last trap door: falling back into thinking of people as components, and of a central computer server as the only point of view for defining efficiency and testing efficacy.
Man cheats in his own way, and he is only honest who is not discovered.
Tim Duff
Tonka Bay, Minnesota