The goal of the city-state, from its inception, has been to produce robots-that is to say, free labor, perfect slaves, the ultimate instrumentality of nature. Human slaves, animal livestock, the working class-all of these were merely violations committed as substitutes for what was truly desired. Its accomplishment would mean casting off the yoke of drudgery and sickness. It would mean a return to the state of health after this disease/pregnancy had finally run its course. Machina ex deo.
The rise of the robot proletariat is based on two contingent facts: 1) Science will produce robots that can perform the work of human beings, and 2) Capitalism will embrace robots that can perform the work of human beings.
Imagine that you're a factory owner. You can either hire a team of human workers, who need to be paid, have expenses, get tired, take breaks, unionize, etc., or you can purchase a team of robot workers that are super strong, super fast, and super efficient-who will work tirelessly day and night, 24/7, without breaks; who have no families, don't talk amongst themselves, and don't care if you treat them with disrespect. The perfect slave force. Which do you choose? The nature of capitalism is that it must cut costs wherever possible and exploit any opportunity to adopt the most economically efficient model. And as Marx described, each economic stage contains the seed of its own destruction and gives birth to the next.
If one company replaces its floor workers with robots, all its competitors will have to do so as well, or be driven out of business (the same principle applies to countries). As the demand for robot labor rises, the law of supply and demand will require that more and more money be poured into their development and specialization. More and more jobs will be taken by robots, and thus more and more people will be laid off. Unemployment will rise, and the masses will demand an ever-expanding social welfare network to compensate for this. And there is no reason why they shouldn't have it-since less work needs to be done.
There was a time when people worked to produce things that were necessary for life. Now people consume things so that other people can have jobs. And, for our system to operate, it needs to be this way. The way our system is set up, you need money to live, and we need you to need money to live. Without the constant threat of homelessness and starvation we might not be able to get people to do the things no one wants to do (or at least, that's the excuse that's given and that people have so far accepted). This causes us to do crazy things like planned obsolescence. Innovations are suppressed all the time because of corporate interests. We do all sorts of things like this that are highly inefficient and wasteful (from the perspective of human prosperity rather than corporate income).
But once robot labor guarantees the necessities of life, the first move will be to ensure that all people have access to them-basic food, basic shelter, basic clothing, and basic medical care will be considered rights. They will be guaranteed to every citizen, and one day every human being. If you want luxury items-better food, better housing, better clothing, more specialized medical care-you can work a more specialized job to earn money to obtain these things, but the things necessary for you to survive will be guaranteed.
As time goes on, this process will advance incrementally. Better and better robots will be produced-more specialized, more intelligent-to replace more professions. People will still work, but their work will be personal work, work that makes their lives feel meaningful. All jobs that people wouldn't do without pay will be mechanized, and eventually money itself will be abolished. The situation will be that of an all-you-can-eat restaurant-the resources will be so abundant due to unlimited robot labor that you as an individual can have as much as you want without fear of breaking the bank. All energy will be green energy (thermal, solar, wind, hydro, seismic, etc.). Advances in technology will allow us to capture more and more energy and power everything using less and less energy (advances in the energy efficiency of machines), eventually providing us with more energy than we will ever need. All material products will be "cradle to cradle"-100% recyclable back into the system or the environment without pollution or degraded quality in the materials.1 But the people of the future will also want different things. Material interests will be progressively less emphasized as personal and spiritual interests come to the forefront. Having a lot of things will no longer be the symbol of prosperity for which people strive. When people can have anything they want materially, they will no longer be obsessed with material things.2
The entire zeitgeist of such a world will be different, as will the rules of engagement. As my friend and colleague Cameron Belle puts it, robot Bob doesn't have a family to feed, nor does he have any sense of personal ambition. So whether or not you buy product X that he produces will be of no interest to him. Marketing of products in the mass manner that currently exists will cease to be. No one will be shoving advertisements down your throat from morning till night, trying to convince you that you want and need things you don't want or need. Rather, you will come home to your wall / high-definition television / supercomputer, or merely pull out your handheld computer / phone / music player / radio / television (etc.), describe what type of thing you'd like (be it food, clothing, music, movies, art, furniture, etc.), browse and select the particular thing you want, modify it to fit your own particular taste, and either download it (television and the Internet are becoming more integrated every day: On Demand and DVR features for television, and programs viewable on the websites of various networks for the Internet, although corporate interests have considerably slowed down this process of merging, which was expected to have happened already) or have it sent to you. Stores as we know them won't exist, of course, though there will be plenty of public areas where people can meet and enjoy themselves.
All citizens may one day be given a daily allotment of online credits that they can invest in the public works projects of their choice. Think the train station needs a little attention? Send some credits its way. Want more books in the local library or more paintings in the local art museum? What type? Send credits and put out an online proposal suggesting that others invest in this. Have an area of scientific inquiry you'd like to see explored? Put out an online proposal and see who'd like to invest and participate in it. What schooling is necessary? Get trained in it. Want a statue erected? A temple built? A park set up? Write up a proposal and see who invests in it. Think that statue Fred proposed would be an eyesore? Invest against it. The credits would be a way to tell the machines which areas to focus time, energy, and resources on, based on what the public wants.
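The credit mechanism described above can be made concrete with a small sketch. Everything here is an illustrative assumption on my part-the class names, the daily allotment of 10 credits, and the rule that priority equals credits for minus credits against are hypothetical details, not part of the proposal itself.

```python
# Hypothetical sketch of the public-credit system: citizens receive a daily
# allotment and invest credits for or against public proposals; the machines
# then prioritize work by net public support. All specifics are assumptions.

class Proposal:
    def __init__(self, title):
        self.title = title
        self.credits_for = 0
        self.credits_against = 0

    @property
    def net_support(self):
        # Net support: total invested in favor minus total invested against.
        return self.credits_for - self.credits_against


class CreditLedger:
    DAILY_ALLOTMENT = 10  # assumed daily grant per citizen

    def __init__(self):
        self.balances = {}   # citizen -> available credits
        self.proposals = {}  # title -> Proposal

    def grant_daily(self, citizen):
        self.balances[citizen] = self.balances.get(citizen, 0) + self.DAILY_ALLOTMENT

    def propose(self, title):
        self.proposals[title] = Proposal(title)

    def invest(self, citizen, title, amount, support=True):
        # Credits are spent when invested, whether for or against.
        if self.balances.get(citizen, 0) < amount:
            raise ValueError("insufficient credits")
        self.balances[citizen] -= amount
        proposal = self.proposals[title]
        if support:
            proposal.credits_for += amount
        else:
            proposal.credits_against += amount

    def priorities(self):
        # The machines would focus time, energy, and resources on proposals
        # in descending order of net public support.
        return sorted(self.proposals.values(),
                      key=lambda p: p.net_support, reverse=True)


ledger = CreditLedger()
for citizen in ("alice", "bob"):
    ledger.grant_daily(citizen)

ledger.propose("statue")         # Fred's statue
ledger.propose("library books")  # more books for the local library

ledger.invest("alice", "library books", 6)
ledger.invest("bob", "statue", 4, support=False)  # the eyesore objection
ledger.invest("bob", "library books", 3)

order = [p.title for p in ledger.priorities()]
print(order)  # library books (net +9) outranks the statue (net -4)
```

Allowing investment *against* a proposal, as in the statue example, is one possible design choice; another would be to count only positive investment and let low support speak for itself.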
We are programmed by evolution to care about ourselves, to not want to die, to be selfish, to demand respect, to seek to better our situation, to try to dominate all other forces, etc. But robots would be programmed by us. They simply wouldn't have those impulses. Who would force grueling and unwanted physical labor upon bodies of flesh and bone (like those of horses) when there are bodies of plastic and metal (like those of cars) readily available? Who would force repetitive, boring mental tasks upon creative/intuitive minds (like those of artists) when there are systematic/automatic minds (like those of computers) readily available? In short, who would force round pegs into square holes when there are square pegs? The queen ant or bee need never fear revolution, nor do her subjects desire it. It is because robots will be designed to serve human beings rather than preserve their own existence that a robot revolution like that described in The Terminator series or The Animatrix, in which the robots rebel because the humans threaten their autonomy, is unlikely. This is what Isaac Asimov called the Frankenstein complex-humanity's irrational fear that robots, like humans, must necessarily revolt out of resentment over being oppressed.
But this idea goes back farther in the West than Mary Shelley. Christians dreamed up the idea of Lucifer: that God had created a being to be his servant, and that this servant, mysteriously, desired something entirely contrary to the purpose for which God made him. But of course this is absurd. How did the desire to rule ever enter Lucifer's heart to begin with? No explanation is given. That Lucifer is modeled after Prometheus (whose title Shelley passed to her Professor), who brought humanity the light and wisdom of the gods and taught them how to trick Zeus, seems to provide us with the origin of this tale of rebellion. But the Christian version is distorted. Prometheus, a Titan, was born-born before Zeus, even. But Lucifer, an angel, was made-made by God for his purposes. That the former would rebel is natural; that the latter would is inconceivable, unless God is perhaps either a poor craftsman (giving his creation the desire to rebel without knowing it) or a cruel one (intentionally giving Lucifer such a desire, then cursing him when he sought to actualize it). That the history of Japanese mythology is not so cancerously distorted is perhaps the reason the Japanese are far more open to the design and production of robots than Westerners.
A more plausible robot revolution in theory, though one I would regard as still highly improbable, is that presented in the movie I, Robot, itself a bastardization of Isaac Asimov's ideas. In this movie, robots try to take over the world because mankind is destroying itself and they are programmed to protect human life whenever possible. As Asimov himself noted, "eye-sci-fi" is generally different from written science fiction because the science fiction audience that goes to movies is not identical to the one that reads books. The former is more interested in "special effects" and violence; the latter is more interested in compelling theoretical discussions of ideas. Even in Asimov's works the robots come to take positions of power as caretakers of humanity, though through nonviolent means (and the loopholes in Asimov's famous "Three Laws of Robotics"3 may suggest the inadequacy of all categorical imperatives). Indeed, robots would arguably make the perfect political "leaders"-or more accurately, public servants. They wouldn't care about themselves or have personal interests-they would only care about serving human beings. If government ever becomes the panarchy-the bureaucratic instrument of the people-that I imagine, we may come to regard governmental positions as being just more jobs we humans don't want to do. For mature discussions on robots in human society and programming robots compatible with human interests, check out the following:
Robots in the work force:
Programming "Friendly AI":
Hegel showed us how masters are physically dependent on slaves. Masters need the work slaves do in order to survive; slaves don't need the work masters do in order to survive. Nietzsche showed us how slaves are morally dependent on masters. Slaves develop moralities based around serving others-without someone else to validate them, they have no reason to live. Masters develop moralities based around enjoying themselves-they are the meaning of their lives. Masters exist by the effort of slaves and for their own purposes; slaves exist for the purposes of masters and by their own effort. The degree to which real-life slaves have found meaning in their own lives and sought to assert their own autonomy is the result of their moralities not being pure slave moralities-rather, their will to masterhood and autonomy, so essential to biological life, remained a continuing and conflicting influence against the programming their self-appointed masters sought to impose on them. But robots would be more an extension of our will-having grown out of our intentions-than a harnessing of the will of others, be they human slaves or animal livestock. It's not as if we would be starting off with autonomous organisms with their own intentions and forcing them into being instruments of our will. Rather, we would be starting with far more basic building blocks-with raw materials, which don't seem to particularly care what shape they are in-and constructing this extension of our will from the ground up. They would be true slaves, like the hand to the brain. But, then again, "true slaves" aren't quite "slaves" in the traditional sense at all-since their will is never being contradicted. Robots would be for human society what the neocortex is to the limbic system-an extension that increases the overall significance of the system, but not something that could ever replace the foundation upon which it was built.