Planning for the End of the Era
Most of us don’t claim to know the future. Even futurists say they are predicting “futures,” not “the future.” Those who give definite predictions are just entertainers.
But we are all forced to make predictions about the future when we decide whether and how to save and invest. You can’t avoid it: Whether you save or spend your monthly salary, whether you buy stocks or put money in a savings account, your decisions today will affect your life in years to come.
As transhumanists, we expect tremendous changes in coming decades, leading up to a shift known as the Singularity. We think we have deeper knowledge of the probability distributions than most people. This gives us a powerful information advantage as we invest, and should let us beat the market. On the other hand, if we choose to ignore these extraordinary insights in planning for the future, our time spent learning about the Singularity is just personal entertainment, like learning the minutiae of Star Trek.
But it’s not that simple to get rich. Different thinkers come to completely opposite conclusions about the effect that knowledge of the Singularity should have on our investment decisions. Some say that Singularity-aware investment is essential; others say that the coming Singularity renders retirement savings moot. This brings us back to square one: Knowledge which leads to contradictory conclusions is worse than useless.
In this essay, I’ll explain the source of this disagreement, and summarize your options for treating the future seriously.
Three Schools
Eliezer Yudkowsky has distinguished three definitions for the term “Singularity”:
(1) Accelerating Change: Technological progress is increasing exponentially.
(2) The Event Horizon: We can’t hope to make any predictions about what will come after the Singularity point.
(3) Intelligence Explosion: Near-human level artificial intelligence will improve itself, then improve itself again recursively, until it reaches far-above-human levels.
Accelerating Change
If you follow the “Accelerating Change” school, you should invest big, now, before other investors catch on. Ray Kurzweil, the leading advocate for this position, says: “Given that the (literally) short-sighted linear intuitive view represents the ubiquitous outlook, the common wisdom in economic expectations is dramatically understated …. But the law of accelerating returns clearly implies that the growth rate will continue to grow exponentially, because the rate of progress will continue to accelerate” (The Singularity is Near, 2006).
You can’t know exactly which companies will succeed, but as the economy grows, an index-fund investment across the entire market will gain in value. As computers replace humans, the value of human labor will shrink and the value of owning the capital–the technology which produces more and more–will grow. By investing now, you can grab a piece of that. And technological companies should go up even more than the market as a whole: Buy a tech-fund now!
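To make that contrast concrete, here is a minimal sketch (in Python, with purely hypothetical rates and amounts) of how the “common wisdom” of a fixed growth rate understates the outcome if, as Kurzweil claims, the growth rate itself keeps accelerating:

    # A minimal sketch (hypothetical numbers) contrasting the "linear intuitive view"
    # with Kurzweil-style accelerating returns over a 30-year horizon.

    def constant_growth(principal, rate, years):
        """Compound a portfolio at a fixed annual rate."""
        for _ in range(years):
            principal *= 1 + rate
        return principal

    def accelerating_growth(principal, rate, acceleration, years):
        """Compound a portfolio at a rate that itself grows each year."""
        for _ in range(years):
            principal *= 1 + rate
            rate *= 1 + acceleration  # the growth rate keeps growing
        return principal

    start = 100_000  # hypothetical starting investment
    print(constant_growth(start, 0.07, 30))            # ~761,000: the "common wisdom" outcome
    print(accelerating_growth(start, 0.07, 0.05, 30))  # roughly ten times larger under accelerating returns

Under these made-up numbers the accelerating scenario ends up about an order of magnitude ahead after thirty years; the point is only the shape of the gap, not the particular figures.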
Critics of Kurzweil say that accelerating change will bring riches disproportionately to the wealthy, widening the gap between rich and poor. If you happen to be one of the world’s poorest people, this could put a damper on your retirement plans. Kurzweil counters that progress soon enough brings wealth to all, and so even the poorer among us can plan to benefit from accelerating change.
If you believe Kurzweil that everyone will benefit greatly, then you might want to do the opposite: Don’t invest, don’t save for retirement. If we are looking forward to a world of universal staggering wealth, why bother to rush ahead, struggling to become a trillionaire in a world of real-value billionaires? Relax, and enjoy life, while abundance for all takes care of retirement. Economist James Miller has described the expected trends as the Singularity approaches: People will live it up rather than invest for retirement (with the side-effect that interest rates will rise).
If you don’t want to save up for retirement, that doesn’t necessarily mean relaxing and waiting for the pleasant or unpleasant end to come. If you are an altruist, you can work hard to help others, for example by developing new technologies which make life better for everyone, as Kurzweil has done so many times.
Kurzweil himself, however, does not believe that looming superabundance makes saving worthless. FatKat, one of his many ventures, was a hedge fund that used advanced algorithms to beat the market.
Life Extension
Life extension technology is one aspect of accelerating change. Learning that you have a much longer life expectancy than you thought should make you want to save more. You’ll have to gather enough of a nest-egg to retire for a long, long time, not just for a few decades. On the other hand, you’ll have more time to put the money together, since life-extension will come with health-extension, and the retirement age of 65 will become irrelevant. Retirement will still exist, but it will have nothing to do with resting during your fading years.
Even on today’s non-transhumanist assumptions, you need to put aside enough money to live mostly on the interest; you can’t rely too much on spending down the principal. With an indefinite retirement horizon, retirement with life extension will be like today’s early retirement: If you can save and invest enough money to make your finances self-sustaining, you can step out of the rat-race, while still young and healthy.
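As a rough illustration of that constraint, here is a minimal sketch (with hypothetical spending and return figures) of the arithmetic of a self-sustaining nest-egg when the horizon is indefinite:

    # A minimal sketch of the "live on the returns" constraint for an indefinite
    # retirement horizon. All rates and spending figures are hypothetical.

    def required_nest_egg(annual_spending, real_return):
        """With an unbounded horizon you cannot spend down the principal,
        so the nest-egg must cover your spending from returns alone."""
        return annual_spending / real_return

    def is_self_sustaining(nest_egg, annual_spending, real_return):
        """True if the portfolio can fund spending forever without shrinking in real terms."""
        return nest_egg * real_return >= annual_spending

    spending = 40_000   # hypothetical annual spending
    real_return = 0.03  # hypothetical real (after-inflation) return
    print(required_nest_egg(spending, real_return))              # ~1,333,333: roughly 33x annual spending
    print(is_self_sustaining(1_000_000, spending, real_return))  # False: 3% of 1M is only 30,000

At a 3% real return, funding $40,000 a year indefinitely takes roughly 33 times your annual spending; spending down the principal, which works over a few decades, stops being an option.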
The most obvious decision triggered by our transhumanist realizations about life extension is simply: Buy an annuity before the insurance companies figure out what’s going on.
On the other hand, if the world as we know it will come to an end in a few decades, it’s pretty worthless to plan for your centuries of life using the same economic assumptions as today.
Event Horizon
All of that follows from the Accelerating Change school of the Singularity. The Event Horizon school leads to the opposite decision. In a world reworked beyond our current comprehension, petty human concepts like “savings,” “investment,” “retirement,” and “money” will mean nothing. The size of your brokerage account is unlikely to make a difference to your personal outcome.
Many followers of William Miller, a Messianic prophet who predicted the end of the world on October 22, 1844, sold their possessions before the fated day. This made no sense: Whether the world ended in destruction or utopia, money would do no one any good, nor would their possessions. Why bother?
The Volatile Future
Peter Thiel, founder of PayPal, early investor in Facebook, and major sponsor of transhumanist non-profit endeavors, has a different take on investment planning during the approach to the Singularity. His perspective might be called the Volatile Singularity: it foresees either an Event Horizon, in which economic value loses its meaning, or Accelerating Change towards superabundance. At the 2007 Singularity Summit, he described the possibilities: On the one hand, we might face the destruction of civilization as we know it, most likely for the worse though perhaps for the better, and in any case rendering any investment moot. On the other hand, we might be rushing towards a Kurzweilian much-improved future, in which case investments can hit the jackpot. Thiel recommended a strategy which succeeds under extremes of welfare or disaster, as opposed to the ordinary middle ground. For example, insurance companies will rake in profits during times of universal well-being, whereas in times of widespread disaster, governments will be forced to bail them out; or, if things get really bad, civilization will collapse, and no investment strategy will mean much.
Intelligence Explosion
The Intelligence Explosion school thinks similarly to the Event Horizon school: When an intelligence vastly more powerful than any human is the most powerful entity in our world, it will work either towards human welfare or towards other goals, which will probably mean severe harm to human welfare. Either way, you won’t care too much about your pension.
But a different type of planning decision distinguishes the Event Horizon from the Intelligence Explosion schools of thought. Proponents of the former don’t think they can influence a completely unpredictable future. Proponents of the latter work to increase the chances of a superintelligence which seeks our wellbeing by correctly designing the initial “seed AI.” Even a small shift in the probabilities has a tremendous expected utility for the entire human race, the creators of the seed AI among them.
And in fact, employees of the Singularity Institute, which favors this latter school of thought, have no retirement savings plan, and they invest their valuable personal time in increasing the chances of a positive Singularity. They put their money where their mouth is.
Invest Differently
We know something very important that most people don’t about the possible futures we face. You’d think that, as a result, transhumanists would share elements in their investment strategies, and that they would beat the market. But there is no coherent direction shared by those who take the Singularity into account in their planning.
This confusion results from a mistaken reliance on present-day assumptions in addressing a completely unknown future. Whether you analyze the Singularity as Accelerating Change, Event Horizon, or Intelligence Explosion, taking the forecasts to their logical conclusion means estimating that most of today’s societal frameworks will be completely disrupted. A leap to a new way of thinking is essential.
We need to find new and better ways to invest than buying stocks or putting money in a pension account. Developing the future technologies that will bring us a positive Singularity is one way to do this. You can do so as an investor, putting down your money in expectation of personal profit; as a donor, seeking return on your investment for the benefit of all humanity; or as a scientist or engineer, creating beneficial technologies while also building up a career which successfully rides the shock wave.
Developing technology for money’s sake is one thing; a specific effort to maximize the welfare of the human species is another. For ordinary technologies, like Customer Relationship Management software, personal wealth maximization has only a little to do with the future of the human race.
One step beyond that is targeting investment into technology that might radically improve the human future–e.g., life extension–while also making you wealthier. But it can be dangerous to let your heart guide your money.
Even further towards extremes of power, there are technologies so powerful that ordinary investing has little to do with them, but they have other attractions: With the most powerful technologies, improving the lot of humanity can align strongly with your personal goals. To take one historical example, Enrico Fermi and other refugees developed the atom bomb to keep Hitler from conquering the world; Nazism had also been their own personal worst enemy. Conversely, technology also has its risks, and many Manhattan Project scientists were also leaders in warning the world against the dangers of global destruction, which would wipe out not only all humanity, but also the scientists themselves.
As we move onwards towards more powerful technology than exists today, we get to greater-than-human intelligence. The world-changing potential of superintelligence will radically affect its inventors’ lives, as part of all humanity’s. But at this level of change, ordinary economic calculations fail, as hinted by the mathematical meaning of “Singularity.”
Purely altruistic reasons can motivate contributions of money and personal effort towards the development of extreme world-improving technologies. But can doing so be a better choice, from a purely self-centered perspective, than ordinary investments? This can occur if the following hold with the right probabilities and utility values: The Singularity is expected to bring either extreme satisfaction of human value on the one hand, or total destruction of all human value on the other; the initial conditions, to be set up by a lab’s-worth of scientists and engineers, have a good probability of tipping the results either way; and ordinary economic investments–equity, loans, real estate, etc.–will likely lose their meaning when the economy as we know it undergoes a shift into something completely different.
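A minimal expected-value sketch, with every probability and utility invented purely for illustration, shows the structure of that comparison:

    # A sketch of the conditions in the paragraph above. The numbers are hypothetical;
    # only the structure of the comparison matters, not the particular values.

    def expected_utility_of_contributing(p_good_without_you, p_shift, u_good, u_bad):
        """Your contribution nudges the probability of a good Singularity by p_shift."""
        p_good = p_good_without_you + p_shift
        return p_good * u_good + (1 - p_good) * u_bad

    def expected_utility_of_investing(p_good_without_you, u_good, u_bad, u_savings_if_economy_survives):
        """Ordinary investments pay off only in futures where the present economy still means something."""
        return (p_good_without_you * (u_good + u_savings_if_economy_survives)
                + (1 - p_good_without_you) * u_bad)

    # Hypothetical values: a good outcome is worth 100 (arbitrary units), a bad outcome 0,
    # your contribution shifts the odds by 0.1%, and savings add little in either extreme.
    print(expected_utility_of_contributing(0.5, 0.001, 100, 0))  # 50.1
    print(expected_utility_of_investing(0.5, 100, 0, 0.01))      # 50.005

Even a 0.1% shift in the odds of a good outcome can dominate the value of savings that matter only in futures where today’s economy survives; change the made-up numbers and the conclusion can flip, which is exactly why the conditions above must hold with the right probabilities and utility values.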
There would still be a free-rider problem. If the above conditions hold, you can still wait for someone else to contribute while you focus on your short-term goals. But if you find that insufficient investment has occurred to date, then contributing to these extreme world-improving technologies, even with no hope of ordinary economic return, would be your personal optimal choice.
That particular combination is a very long shot. But in facing a collapse of all our expectations, we have to decide how to balance our portfolio, perhaps by hedging our bets. There are three elements: Ordinary investments, to leverage the accelerating economy (and also a safer bet if all this transhumanism stuff is poppycock); diversified bleeding-edge tech investments, on the understanding that technology is going to move ahead faster than most people expect; and contributions to long-term research dedicated to tipping the human future away from the worst and towards the best possibilities.
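For concreteness, here is a minimal sketch of such a three-part split. The weights are placeholders for illustration, not a recommendation:

    # A sketch of the three-part hedge described above, with placeholder weights.

    portfolio = {
        "ordinary_investments": 0.60,          # index funds, bonds: leverage ordinary growth, and a fallback
        "bleeding_edge_tech": 0.30,            # diversified bets that technology outruns expectations
        "long_term_research_donations": 0.10,  # tipping the odds toward the best outcomes
    }

    def allocate(monthly_savings, weights):
        """Split a monthly savings amount across the three buckets."""
        assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights should sum to 1"
        return {bucket: round(monthly_savings * w, 2) for bucket, w in weights.items()}

    print(allocate(1_000, portfolio))  # e.g. {'ordinary_investments': 600.0, ...}

How you set the weights is exactly the judgment call this essay has been circling: how much probability you assign to each school, and how much you value the outcomes under each.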
Returns come with risk, and for world-changing technology, both the risks and the potential returns are enormous. But inaction, or the pursuit of conservative strategies, also carries risk. The decision is up to you.
Joshua Fox works at IBM, where he co-founded IBM’s redaction product and now manages its development. He has served as a software architect in various Israeli start-ups and growth companies. On the transhumanism side, he is a Research Associate of the Singularity Institute for Artificial Intelligence. Links to his talks and articles are available at his website and blog.
This all sounds woolly and wild to me.
– a newcomer to the transhumanist discourse who finds talk of a “superabundance” of intelligence or riches in the future amusing.
@srsly Today’s world appears to be a superabundance of knowledge [if not intelligence] and riches to someone from the year 1000. Poor people are fat! People fly in the air. From Beijing, you can see the streets of London at will. I don’t know if new wonders will increase, but this at least proves that they are possible.
What about all the technologies that will be turned into consumer products along the way, and that most people reading this will want? Cybernetic enhancements, household nanotech, hardware for personal AI, etc.
It’s not really important, but Eliezer Yudkowsky was not the first to distinguish the Three Schools. This idea, laying out three definitions (by Vinge and Kurzweil) and analyzing how they are connected, was presented on February 22, 2007 at a Moscow transhumanist seminar:
http://www.transhumanism-russia.ru/content/view/383/104/
http://www.transhumanism-russia.ru/documents/presentations/sing_med.ppt
This was before Eliezer presented this idea at a Singularity Summit and elsewhere. I don’t think he knew about this presentation, since it’s in Russian, but just setting the record straight.
@Danila, if someone can take the time to translate and publish that Russian-language presentation, I’d love to see it.
Nice article. I think the most widely understood definition of the singularity is: the rise of super-intelligence–aka the INTELLIGENCE EXPLOSION–which could (hopefully does) include human intelligence augmentation.
The other ideas aren’t really “definitions” at all.
ACCELERATING CHANGE is the reason we think the singularity will happen, not the definition of the singularity.
And EVENT HORIZON is kind of a hypothesis about how closely the metaphoric “technological singularity” matches the characteristics of a black hole. Or maybe you could look at EVENT HORIZON as simply what Vinge said: that it is/was difficult for SciFi authors to come up with plots for periods after an intelligence explosion. Certainly, there is difficulty in prediction, but as we move forward in time, the so-called event horizon moves further into the future too. Maybe humans will reach a point where they have zero predictive capacity: moment to moment, nothing makes sense = complete feeling of insanity. Maybe, intelligence-enhanced transhumans will be smart enough to understand what’s going on in the world and predict some of the future, just like humans can predict some of the future right now. Either way, EVENT HORIZON isn’t really a definition so much as an observation/hypothesis/question about the texture of the future.
Anyhow, if you assume there will be a near-term intelligence explosion, then the only two realistic scenarios for humanity are total riches (and life extension) or total wipe out (or worse, life extension coupled with eternal torture). Either way, that kind of puts the kibosh on worrying about saving for the future. Either you won’t need the savings (because your future ability to create wealth will be enormous) or you won’t need the savings (because you’ll be dead).
More here, if you’re interested: http://www.vancewoodward.com/journal/2011/11/12/predicting-the-future.html
and here
http://www.vancewoodward.com/journal/2011/7/30/blow-your-money-now-when-its-worth-something.html