By Ben Goertzel. These last couple of months have been fascinating (and exhausting) for me. I’ve largely taken a break from my AI research and spent most of my time organizing a new AI-meets-blockchain project — SingularityNET — and...
Category: Friendly AI
The Asilomar AI Principles Should Include Transparency About the Purpose and Means of Advanced AI Systems
By Bill Hibbard. The recently published set of 23 Asilomar AI Principles is intended to guide the development of artificial intelligence (AI). Two principles, 7 and 8, call for transparency about the reasons for harm caused by AI and transparency...
One thing we can do now is advocate for AI technology to be developed as openly and transparently as possible — so that AI is something the whole human race is doing for itself, rather than something foisted on the rest of the world by one small group or another. In collaboration with the Ethiopian AI firm iCog Labs, we have created an online petition in support of transparent AI.
Professor Hugo de Garis, Dr. Ben Goertzel and Ivar Moesman discuss various aspects of the exponential growth of technology, artificial intelligence, Bitcoin and 3D printing.
Human-level AI will be surpassed in the mid-2020s, though many people won’t accept that this has happened. After this point, the risks associated with advanced AI will start to become practically important.
When we talk about artificial intelligence (AI) — which we have done a lot recently, including my outline of liability and regulation issues on The Conversation — what do we actually mean?
We’ve already started discussions around driverless cars, but there’s so much more to deal with when it comes to artificial intelligence.
As artificial intelligence technology advances, a portion of our population cannot help but react with panic.
Despite worries about threats from artificial intelligence, debates about the proper role of government regulation of AI have generally been lacking.
Do you fear babies? Then don’t fear AGI-. Should babies fear you? Then don’t fear AGI+.