Friday Brain Food: Steve Omohundro on Self Improving Artificial Intelligence

(art © e11en vaman, www.facebook.com/ellenvaman)

“Let an ultraintelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultraintelligent machine could design even better machines; there would then unquestionably be an ‘intelligence explosion,’ and the intelligence of man would be left far behind. Thus the first ultraintelligent machine is the last invention that man need ever make.” ~ I.J. Good

Self-improving machines will lead to a whole-system transition impacting all aspects of human affairs.

Today’s food for your brain is a 2007 lecture by Steve Omohundro at the Stanford University Computer Systems Colloquium covering the fundamental principles of “self-improving systems,” i.e., computer software and hardware that improve themselves by learning from their own operation. Listen all the way through to the discussion at the end of the video, where Steve covers some of the more interesting implications of self-improving machines.
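As a rough illustration of that definition (my own minimal sketch, not anything from Omohundro’s lecture): a program that measures its own runtime and uses those measurements to adjust its own behavior is already a very weak, toy form of a self-improving system, nothing like the recursive self-improvement Good and Omohundro describe. The `SelfTuningSort` class, its `cutoff` parameter, and the tuning rule below are entirely hypothetical.

```python
# Minimal sketch: a hybrid sort that times its own runs and adjusts the cutoff
# at which it switches from merge sort to insertion sort -- i.e., software that
# "improves itself by learning from its own operation" in the weakest sense.
import random
import time

class SelfTuningSort:
    def __init__(self, cutoff=8):
        self.cutoff = cutoff      # current switch-over point (learned over time)
        self.history = []         # (cutoff, seconds) observations of its own runs

    def _insertion_sort(self, a):
        for i in range(1, len(a)):
            key, j = a[i], i - 1
            while j >= 0 and a[j] > key:
                a[j + 1] = a[j]
                j -= 1
            a[j + 1] = key
        return a

    def _sort(self, a):
        if len(a) <= self.cutoff:
            return self._insertion_sort(a)
        mid = len(a) // 2
        left, right = self._sort(a[:mid]), self._sort(a[mid:])
        merged, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i]); i += 1
            else:
                merged.append(right[j]); j += 1
        return merged + left[i:] + right[j:]

    def sort(self, a):
        start = time.perf_counter()
        result = self._sort(list(a))
        self.history.append((self.cutoff, time.perf_counter() - start))
        self._improve()
        return result

    def _improve(self):
        # Every few runs: pick the cutoff whose runs have had the fastest
        # average time so far, then nudge it randomly so nearby values
        # keep getting explored.
        if len(self.history) % 5 == 0:
            by_cutoff = {}
            for c, t in self.history:
                by_cutoff.setdefault(c, []).append(t)
            best = min(by_cutoff, key=lambda c: sum(by_cutoff[c]) / len(by_cutoff[c]))
            self.cutoff = max(1, best + random.choice([-2, 2]))

if __name__ == "__main__":
    sorter = SelfTuningSort()
    for _ in range(50):
        sorter.sort([random.random() for _ in range(2000)])
    print("learned cutoff:", sorter.cutoff)
```

The point of the toy is only that the feedback loop runs through the program’s own observed performance; the interesting (and unsettling) cases in the lecture are systems that redesign their own algorithms rather than tune a single parameter.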

Good, I. J. (1965), “Speculations Concerning the First Ultraintelligent Machine”, in Franz L. Alt and Morris Rubinoff (eds.), Advances in Computers (Academic Press), 6: 31–88, archived from the original on 2001-05-27, retrieved 2007-08-07.

5 Comments

  1. This all assumes you can design a self-improving intelligence using human intelligence. If you can’t, then no level of nanotechnology will spontaneously generate a superintelligent computer. A spreadsheet that runs a million times faster is not more intelligent. As for human intelligence being self-improving, I haven’t seen any evidence of that. We are no more intelligent than our ancestors; we just have better tools for memory and communication across time.

    • Actually, there is some evidence on this point, namely known algorithms and actual research results that indicate otherwise. Self-improving algorithms already exist.

      I will be posting further on this subject.

  2. Mike, isn’t human intelligence an example of a self-improving system? What exactly are you claiming we “don’t need”? The person you are talking to sounds ignorant regardless of their supposed IQ.

  3. I spoke to a guy with a “verified 192 IQ” who told me he was “building supercomputers that I know will one day kill me.” Need I say more about the shortcomings of IQ-dominant thinking without balance from intuition and emotional intelligence? To build something that you know will kill you, in the name of “inevitable evolution”, is a perfect example of the ignorance of man playing God with the words “manifest destiny” stamped on his forehead. In truth, we don’t need these things. What we need is the development of the self, intuition, and emotional intelligence. They alone can save us. Supercomputers will do nothing but increase the speed at which we annihilate ourselves.
