Singularity 101 with Vernor Vinge

The Singularity. Ray Kurzweil has popularized it and, by now, some of our readers no doubt drop it frequently into casual conversation and await it like salvation. (The second “helping”?) But many more are still unfamiliar with the concept.

The contemporary notion of the Singularity got started with legendary SF writer Vernor Vinge, whose 1981 novella True Names pictured a society on the verge of this “event.” In a 1993 essay, “The Coming Technological Singularity,” Vinge made his vision clear, writing that “within thirty years, we will have the technological means to create superhuman intelligence. Shortly after, the human era will be ended.”

We caught up with Vinge at the 2008 Singularity Summit in San Jose, California, where he opened the proceedings in conversation with Bob Pisani of CNBC.

Vinge’s most recent novel is Rainbows End.

h+: Let’s start with the basics. What is the Singularity?

VERNOR VINGE: Lots of people have definitions for the Singularity that may differ in various ways. My personal definition for the Singularity — I think that in the relatively near historical future, humans, using technology, will be able to create, or become, creatures of superhuman intelligence. I think the term Singularity is appropriate, because unlike other technological changes, it seems to me pretty evident that this change would be unintelligible to us afterwards in the same way that our present civilization is unintelligible to a goldfish.

h+: Haven’t there been other Singularities throughout history?

VV: Some folks will say there have been singularities before — for instance, the printing press. But before Gutenberg, you could have explained to somebody what a printing press would be and you could have explained the consequences. Even though those consequences might not have been believed, the listener would have understood what you were saying. But you could not explain a printing press to a goldfish or a flatworm. And having the post-Singularity explained to us now is qualitatively different from explaining past breakthroughs in the same way. So all these extreme events — the invention of fire, the invention of the printing press, the evolution of cities and agriculture — are not the right analogy. The technological Singularity is more akin to the rise of humankind within the animal kingdom, or perhaps to the rise of multi-cellular life.

h+: Is the Singularity near?

VV: I’d personally be surprised if it hadn’t happened by 2030. That doesn’t mean that terrible things won’t happen instead, but I think it is the most likely non-catastrophic event in the near future.

h+: Should we be alarmed by the Singularity?

VV: You are contemplating something that can surpass the most competitively effective feature humans have — intelligence. So it’s entirely natural that there would be some real uneasiness about this. As I said, the nearest analogy in the history of the earth is probably the rise of humans within the animal kingdom. There are some things about that which might not be good for humans. On the other hand, I think this points toward something larger. Thinking about the possibility of creating or becoming something of superhuman intelligence is an example of an optimism that is so far-reaching that it forces one to look carefully at what one has wanted. In other words, humans have been striving to make their lives better for a very long time. And it is very unsettling to realize that we may be entering an era where questions like “what is the meaning of life?” will be practical engineering questions. And that should be unsettling. On the other hand, I think it could be kind of healthy, if we look at the things we really want, and look at what it would mean if we could get them. And then we could move forward from there.

h+: What signs would you look for which indicated that the Singularity is near?

VV: There are a number of negative and positive symptoms that a person can watch for. An example of a negative symptom would be if you began to notice larger and larger software debacles. In fact, that’s sort of fun to write about. One of the simplest of positive signs is simply to note whether or not the effects of Moore’s Law are continuing on track.

The fundamental change that may be taking place is that humans may not be best characterized as the tool-creating animal but as the only animal that has figured out how to outsource its cognition — how to spread its cognitive abilities into the outside world. We’ve been doing that for a little while now: ten thousand years. Reading and writing are the outsourcing of memory. So we have a process going on here, and you can watch to see whether it’s ongoing. So, for instance, in the next ten years, if you notice more and more fragments of human cognition being placed in the outside world — if human occupational responsibility becomes more and more automated in areas involving judgment that haven’t yet been automated — then what you’re seeing is rather like a rising tide of this cognitive outsourcing. That would actually be a very powerful symptom.
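Vinge’s “is Moore’s Law on track?” symptom can be made concrete with a little arithmetic. The sketch below is purely illustrative: the function names, the baseline figures, the two-year doubling period, and the tolerance are all assumptions for the example, not data from the interview. It projects a count forward under exponential doubling and checks whether an observed value is still close to that curve.

```python
# Illustrative sketch of checking whether a trend is "on track" with
# exponential doubling, in the spirit of Vinge's Moore's Law symptom.
# All numbers below are made-up round figures, not measured data.

def projected_transistors(base_count, base_year, year, doubling_years=2.0):
    """Project a count forward from a baseline, doubling every `doubling_years`."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

def on_track(observed, projected, tolerance=0.5):
    """True if the observed value is within +/-50% of the projection."""
    return abs(observed - projected) / projected <= tolerance

# Hypothetical baseline: 1e9 transistors in 2008, projected ten years out.
proj_2018 = projected_transistors(1e9, 2008, 2018)  # 2**5 = 32x the baseline
print(proj_2018)                     # -> 32000000000.0
print(on_track(2.0e10, proj_2018))   # within 50% of the projection -> True
print(on_track(5.0e9, proj_2018))    # far below the curve -> False
```

The point of the toy check is the shape of the test, not the numbers: pick a doubling period, project forward, and compare what actually shipped against the curve.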


  1. To those using the Borg analogy: you are off base. To those who think we will see it coming, likening the Technological Singularity to prior so-called singularities: I think you are missing the point too. The Borg were a fairly primitive cyborg construct that does not scale to what we could be looking at when you have a confluence of AI, nanotech, and biotech.

    Vinge’s point about the printing press is accurate IMHO, because the printing press was not the culmination of an exponential process, as technological change is today. I believe that, given the exponential nature of technological change, if an HE AI were developed, we would be looking at a matter of days before the world we (hopefully still) live in became unrecognizable in its post-Singularity state.

    • Agreed. The post-Singularity AIs, with their orders-of-magnitude-greater-than-human intelligence, will be able to make a massive number of such “one-off” inventions within a very short amount of time.

  2. It’s an interesting concept. Personally, I’m not as optimistic as Kurzweil. The human brain is far too complex to map every single thought process, synaptic transmission, etc. Computers are fundamentally serial by design, having to complete one task before moving on to the next; the brain, on the other hand, is completing thousands, if not millions, of processes at any given moment. Quantum computing will be the only way to tackle the Singularity progression. However, I still have my doubts as to even that being able to compete, but we shall see.

  3. Beyond the event horizon of a spacetime singularity, the laws that govern the known universe are no longer applicable; the nature and attributes of space, time, and energy are fundamentally changed. The singularity itself cannot be directly observed; only the effect that it has on matter and space outside of the event horizon (gravitational lensing, polar x-ray emission, etc.) can be used to determine its location and existence. Mass and/or information continues to be absorbed, but no mass or information is forthcoming. Perhaps similar restrictions occur in all true states of singularity; this might explain why there has never been a documented case of contact with an advanced race or consciousness. When a race evolves to the point where a social/intellectual/technological singularity occurs (a critical mass of intellect), the flow of information and resources is halted and/or reversed.

  4. LOL. Sorry, I keep seeing stuff on this topic. Personally I see the opposite: not a singularity but a branching deviation that grows and, yes, evolves. The evolution doesn’t happen overnight as some imply, but on a longer time frame just outside most people’s perception, and so it seems more natural to humans. After all, not all humans want gene therapy or modifications, and they will not all want to link to an interconnected hive-like sentience. Human deviation will go on as long as there is a human race. Now the better question is: when do we become something so different from humans that the evolved state seems alien or deity/godlike? As to the poster on 1000-core chips or the company site: that doesn’t touch quantum photonic chips. :P Through great diversity, wealth and knowledge will evolve faster, and I mean wealth as not just tangible trinkets. Oh well, just two for the pot, so to say.

    • I think that unless there is a Skynet-type startup, it will be like every other creation: a slow start, popularity among early adopters, and final mass acceptance once it becomes inexpensive enough for most people. But not everyone will want it.
