Adam Ford Interviews MIRI’s Luke Muehlhauser


Interview on AI, the Singularity, and its importance.
Questions/Talking points:
What is Intelligence?
Differences: AI & AGI
Spectrums of General Intelligence
Forecasting AI
Accelerating Returns
Convergent Outcomes
Is Consciousness Required for an Intelligence Explosion?
Could we Detect Consciousness in AI?
The Singularity: Three Major Schools
Thinking Clearly about the Singularity
Avoiding Common Anthropomorphisms when Thinking About the Future
Speed Bumps
Signs that a Singularity is Imminent
Accelerators
Large Datasets, Automated Science & Quantum Computing
What is the Likelihood of a Hard Takeoff?
Approaches to Safe AI: Theory or Practice?
Avoiding an Uncontrolled Intelligence Explosion
Benefits of a Controlled Intelligence Explosion
Increasing the Likelihood of a Controlled Intelligence Explosion
Should we let AI Decide what is Ethical?
Oracle AI
De Novo AI
Friendliness & Reverse Engineering The Brain
Convincing Arguments for the Importance of AI
What would Friendly AI Look Like?
Coherent Extrapolated Volition
CEV: Extrapolating Volition to a Single Point


For more videos of lectures and interviews with thought leaders, please subscribe to Adam Ford’s YouTube channel.

