AI VS IA
What is the nature of The Singularity and where will it first appear?
Most believe it will come from Artificial General Intelligence (AGI): a software system and computer that outperform any human. There was also a recent article in H+ about emergent global intelligence, a natural evolution of our current model of using computer systems in synergy with humans, where we become components of an emerging global mind. A third option is the augmented intelligence (IA) scenario, which paradoxically reaches The Singularity while never allowing it to occur: we internalize any and all technological enhancements, using these augmentations to directly increase our own intelligence.
Several problems are resolved using this approach. Science still has a relatively poor understanding of what makes a human being sapient. To create an artificial intelligence that surpasses any human, we must first understand the base functions of intelligence itself. To make an analogy, it is not enough to know all the parts of an engine; to design a different engine you must know why those parts work and how they function together to generate power. Keeping the analogy, it is much easier to refine and improve an engine than to design a novel one. You do not need to understand every aspect of a carburetor or fuel injector to recognize that one performs better than the other.
Neuroplasticity simplifies matters. If a particular area of the brain is technically augmented, the brain can potentially respond and develop, or be induced to do so. This includes not only adapting directly to the enhancement itself but potentially a wider increase in overall processing capability. And although our expanded sensory abilities and cognitive enhancements will encounter errors, noise, and uncertain, complex environments, our brains are fortunately already built to compensate for these effects.
The IA approach can also produce greater total gain for everyone. Supercomputers capable of housing a superhuman intelligence might be large and expensive. Due to their cost, and potentially their owners' desire to reserve the machines for competitive advantage, we might not see much networking and knowledge sharing between AGIs. Initially their numbers will be small, of course, and there will be little interaction between them; over time there will be many AIs, and they will communicate and share information. In contrast, IA builds on existing low-cost consumer technologies like mobile devices, smartphones, and augmented reality glasses.
We should also keep in mind the virtues of human experience and embodiment: decades of accumulated knowledge, and the constant bombardment of chaotic information and stimulation that creates the moment of inspiration. The spark that drives innovation will be hard to replicate in a machine that is spoon-fed information. The IA approach easily leverages human knowledge and experience.
In either event, it is unlikely that any emergent or built AGI will be given free rein. Deep down, humans dearly value their agency. They like being in control and fear things they cannot control. And no matter the fail-safes and shackles placed on an AGI, humans will worry about it escaping or somehow usurping us. Intelligence augmentation does not carry this risk.
However, the one thing that does stand in the way of an augmented intelligence path to The Singularity is bioconservatism. There is the worry that by enhancing ourselves we will make ourselves less human — a valid concern. Eventually we could become something other than, or more than, human via this path. Humans will augment themselves until their minds have become effectively posthuman. At that moment the IA singularity will have occurred, as humans will have become something unrecognizable to our current baseline humanity.
If transhumanism is widely embraced, as we all hope it will be, then small and simple enhancements may not be seen as taboo. Perhaps the IA path will unfold as a slow and gradual change instead of an explosive leap. What is considered average intelligence would slowly shift, and with it our definition of being human.
The Singularity is defined as the moment a greater-than-human intelligence emerges; but if we transform humanity into that intelligence, we technically fail to create something beyond ourselves and instead evolve into this future superintelligent being.
Perhaps the IA path will allow us to have our cake and eat it too.