Singularity is one of those wonderful words, like democracy or virtue, which are sort of like the old ditty about the pelican, “whose beak can contain more than his belly can.” That is, these words contain within themselves an extraordinary amount of meaning, far more than even their fiercest partisans (or critics) are aware.
In the case of Singularity, those meanings begin with the obvious. For most everyone in the transhumanist community, it is defined (to quote Wikipedia) as “the hypothetical future emergence of greater-than-human intelligence through technological means.” It is that moment, probably not excessively far away, when the human will be confronted with the superhuman…a superhumanity which may be organic, or non-organic, or, perhaps most likely of all (at least at first), an amalgam of both.
But, I think, there is an unexamined nuance here. I suspect most of us think of the Singularity as being a, well, “singular” thing. We have somewhere in the back of our minds the idea that the Singularity comes along once and only once in the lifetime of a civilization or of a species. It is like evolving warm-bloodedness, developing a central cortex, or inventing language—once done, it isn’t repeated.
Yet, I wonder if that’s the case.
I’m going to argue that a possible (albeit not inevitable) future for humanity involves not one but many Singularities, with super-intelligence obtained repeatedly, in regular cycles, perhaps interspersed with fairly dramatic Dark Ages, for as long as Homo sapiens exists.
To support this proposition, which is admittedly not intuitive, I’m going to have to work my way through several basic premises. So, please bear with me, and let’s begin with the first of them.
Premise #1) When the Singularity occurs, not everyone will take advantage of it.
Let’s assume that when super-intelligence is possible, it will be widely and easily available. That may not be the case, but I see no reason why it couldn’t be fairly democratic. It might be possible, for instance, to cheaply produce nanotechnical devices that you could ingest, and which could then swim through the bloodstream to the brain and link you directly to machine or other human intelligences. After that, particularly if the nanobots were self-replicating, the transformation of the whole human race would be largely a matter of handing out pills. The Singularity would happen in the streets…or, at least, on the street corners, in free samples given out to passers-by.
But will they take them?
I submit that many people won’t. It is, after all, a fairly scary thing. Imagine having someone say to you, “Hey, here’s a cup of nanobot cola. Swig it down and by morning you’ll be completely, utterly, and totally remade. You’ll have powers beyond your wildest dreams, but you’ll never…ever…ever be the same again.” I’m betting a lot of folks would respond to that suggestion with another proposal, far more forcefully stated, about where you could stick your damn ‘bots.
Oh, by the way, that’s just ordinary folks. There will be lots of other people for whom increasing human intelligence will be not only frightening, but actually hateful, even something to be violently opposed. Think about the opposition that already exists, from Right, Left, and Center, to stem cell research, or even childhood vaccinations. Or think about the more unhinged of our Neo-Luddites.
And I’ve not touched on our elite groups, who would see transhumanism and the Singularity as serious threats to their own positions. You can almost already hear the leaders of various religions, thundering down from a hundred thousand different pulpits, decrying the “secular religion” of the godless transhumanists, which could actually deliver things like near immortality and the end of want, rather than promising them in some unspecified future. As for our presidents, kings, CEOs, and university deans…surely, they would be all for the enhancement of human intelligence so long as it was their monopoly, but once it was not…once the peasants could participate…’twould be a different kettle of neurons.
So, I predict that after the Singularity, we’ll see the human race divide between those people who decide to remain as they were and those who don’t. Which brings me to…
Premise #2: Transhumans will be hard to see.
The individuals who elect to be more than human will evolve very, very rapidly. They’ll be able to improve themselves continuously, after all. They’ll be adding new powers, new abilities, new facets of themselves, all the time. Which means, I suspect, that they’ll soon evolve right out of sight.
What do I mean by that? Well, as a character in one of my short stories says, consider the ant. It is a very successful being. It lived long before us. It will exist long after we’re gone. But does it see us? Does it think about us as we think about it? As a living being? As an entity with its own goals and aims?
I submit that it doesn’t. First, I’m not sure it has concepts like “entity” and “living,” but, second, even if it did, I’m pretty sure it would see us as just big, warm, moving, something-or-others. And our constructions? Our cities and farms? No different from any mountain or field.
My point, then, is that once transhumans were as far removed from us as we are from insects, they might be pretty much invisible to us. Or, more precisely, we’d see them, but not know what we were looking at.
Premise #3: The Singularity will bring social chaos.
When the Singularity occurs, particularly if those who elected to become transhuman were basically disappearing from view (a la Premise #2), society as we know it might well collapse. The population would be dividing into two radically different sorts of people. Nations, social institutions, economies, even families would be ripped apart. Indeed, there might be fairly dramatic conflicts between humans and the early transhumans, as the former attempted to prevent the latter from consolidating their place in the world.
But, whatever the process, the result will be (to quote Mr. Yeats) that “things fall apart; the centre cannot hold; / Mere anarchy is loosed upon the world.” Normal humans might well regress all the way to something like pre-industrial society.
Then, after that, they (meaning normal humans) could easily forget what had happened, particularly if records were lost during the chaos following the Singularity. After all, what would people see if they looked back in time, as it were, a century or two later? Well, they’d see governments and economies falling apart. A population crash. Incidents of civil violence. Maybe barbarians of some sort invading from the peripheries of civilization. Given that evidence, historians might look back and say, “oh, there was a period of social chaos at the end of the twenty-first century, but it wasn’t much different from similar periods of chaos that have happened before and will doubtlessly happen again.”
Which brings me to…
Premise #4: Industrial society would reappear.
No matter how total the chaos, no matter how much “anarchy is loosed upon the world,” eventually things would settle down again. Eventually, normal humans would rebuild their civilization.
That might not be easy. Depending on the degree of the chaos following the Singularity, H. sapiens could regress all the way to the Neolithic, or at least to agrarian empires. They might also be laboring under an anti-technical bias taught by their elites—machines are evil, etc. (It wouldn’t be the first time something like that happened. One of the things that kept our own, post-Roman Dark Ages so depressingly dark was the doctrine, preached widely enough by Church Fathers, that excessive concern about the physical world detracted from one’s chances for salvation.)
But, sooner or later, the pressures of the real world…of making a living…of easing one’s troubles and increasing one’s pleasures…would make themselves felt. Technologies would be rediscovered or reinvented. Civilization would once again become urban. Farmers would be once more transformed into factory workers. Oxen and horse-drawn carriages would once again give way to steam and internal combustion. And, eventually, in some lab or university, some latter day Babbage would reinvent computing, some second Lady Ada would recreate programming, and, before much longer, a few hundred years at most, someone would have the bright idea of linking brain and CPU…
In short, a second Singularity, a second family of Transhumans, another period of chaos, and the whole cycle would start all over again.
Of course, I’m leaving out the possibility that one or more groups of Transhumans might elect to intervene and make certain that the entire human race made the upgrade at one point or another. But, they might not. Maybe there will be something in Transhuman mores that would prevent them from doing so, say, the idea that super-intelligence is like freedom, and cannot be imposed without the consent of those who receive it. Or, maybe, they’ll elect to preserve us as potential replacements should anything go wrong with one or another particular path to transhumanity. Or, finally, maybe they’d keep us around for curiosity’s sake, the way we do zoo animals. (In one of my stories, a character suggests that transhumans will move off the planet entirely, leaving normal humans behind in a sort of world-wide nature preserve, with ourselves as H. sapiens sapiens in the mist.)
In short, if any of these factors prevented transhumans from forcibly extending super-intelligence to everyone, and if any of my four premises are at all right, then I can see no reason why a cycle of Singularities would not be pretty likely. Maybe not inevitable, but likely.
Which means that the Singularity is a multiplicity, not a thing but a cycle, not a consummation but a recurrent phenomenon. Singularities will appear every few generations, as regular as clockwork, until all of humanity is either “transhumanized” or goes extinct. The analogy that almost instantly springs to mind is that other Singularity, the astrophysicist’s Big Bang, which many a scientist now suggests was only one Creation among many, one that will be repeated in time.
That, in turn, brings up a disturbing question. To wit, how would you know whether or not you were living in a post-Singularity society? How could you tell if it had happened, particularly if, as I’ve suggested, transhumans were more or less invisible to you?
I think we can safely assume that humanity hasn’t yet had a Singularity. If there had been one, I suspect we would find the remains of a previous industrialized culture in our environment. Indeed, given the human propensity for littering on a titanic scale, it would be impossible to escape the knowledge that someone had been there before us. The tons of discarded soft drink cans, landfills full of broken computer keyboards, and pounds of pennies piled up in wishing wells, or the equivalent of all these things, would tip us off. (Though it would make a great high concept for fantasy fiction. The lost civilizations of Atlantis and Lemuria did not sink beneath the waves. They ascended into posthumanity.)
But human cultures in the future may not be so lucky.
Perhaps, then, if only as a first exercise in a brand new science, the Sociology of the Singularity, we could establish a kind of rubric by which a culture could tell if its past included a Singularity. It might look something like this:
- If you are living in a society which emerged from a Dark Age of some sort, and,
- You have overwhelming evidence that societies more rather than less technically advanced preceded your own, and,
- Your society has strong (but fading) taboos regarding technology and computers and/or the use of those things to improve human intellects, and,
- Not often, but every once in a while, there are credible (as opposed to merely popular) reports of artifacts or individuals that appear and demonstrate inexplicable powers and behaviors, then,
You have had a Singularity.
And almost certainly, another one’s coming.
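Purely for fun, and in the spirit of that brand-new science, the rubric above can be sketched as a simple checklist: a conjunction of four observations, all of which must hold at once. Every name below is my own invention for illustration, not an established model.

```python
# A playful, illustrative sketch of the four-point rubric above.
# The rubric is simply a logical AND: all four signs must be present
# before a culture should conclude its past included a Singularity.

from dataclasses import dataclass


@dataclass
class CultureObservations:
    emerged_from_dark_age: bool            # criterion 1: a Dark Age in the past
    evidence_of_more_advanced_past: bool   # criterion 2: more-advanced predecessors
    fading_tech_taboos: bool               # criterion 3: strong but fading taboos
    credible_anomalous_reports: bool       # criterion 4: occasional credible anomalies


def had_singularity(obs: CultureObservations) -> bool:
    """Return True only if every criterion of the rubric is satisfied."""
    return all([
        obs.emerged_from_dark_age,
        obs.evidence_of_more_advanced_past,
        obs.fading_tech_taboos,
        obs.credible_anomalous_reports,
    ])


# Our own culture, by the argument in this essay: no industrial ruins
# beneath us, no taboos of the right shape, hence no prior Singularity.
us = CultureObservations(False, False, False, False)
print(had_singularity(us))  # False
```

The point of the conjunction is that any one sign alone (a Dark Age, say, or a taboo) has mundane explanations; it is the coincidence of all four that would be telling.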
Victor Storiguard is a writer and editor whose career stretches back for over a quarter of a century. Originally a journalist writing about computers, he has now moved on to science fiction. Much of his work deals with “transhumanism” and “posthumanism.” Critics have said that his tales combine hard technology with the appeal of fairy tales and myth. He can be reached via his blog at victor-storiguard.blogspot.com