You can take on pretty much any plant in the world.
You versus the plant and that plant is going down.
Liron explains the nature of superintelligence, highlighting the orders-of-magnitude difference in thinking speed between humans and machines.
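To make that speed gap concrete, here is a minimal back-of-envelope sketch; the neuron firing rate and CPU clock figures are commonly cited ballpark assumptions of ours, not numbers taken from the clip.

```python
# Rough arithmetic behind the "orders of magnitude" speed claim.
# Both figures are ballpark assumptions, not measurements from the episode.

neuron_firing_hz = 200      # approximate upper bound on biological neuron firing rate
cpu_clock_hz = 3e9          # a typical modern processor clock rate

serial_speed_gap = cpu_clock_hz / neuron_firing_hz
print(f"raw serial speed gap: ~{serial_speed_gap:.0e}x")  # ~2e+07x

# Even a "mere" million-fold speedup means one wall-clock second of human
# thinking corresponds to roughly 11.6 subjective days for the machine:
print(f"subjective days per human second: {1e6 / 86_400:.1f}")
```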
John explains how intelligence levels and goals are independent of each other, and how we might end up with a universe tiled with endless Philadelphia Eagles logos!
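This orthogonality idea can be sketched in a few lines: the same brute-force planner can be pointed at any goal, so capability and goal content come apart. The action names and utility functions below are illustrative assumptions of ours, not anything from the episode.

```python
# Orthogonality in miniature: one planning capability, interchangeable goals.
from itertools import product

def plan(actions, horizon, utility):
    """Brute-force search: return the action sequence that maximizes `utility`."""
    return max(product(actions, repeat=horizon), key=utility)

actions = ["gather_resources", "build_hospital", "stamp_eagles_logo"]

# Goal A: tile everything with Philadelphia Eagles logos.
eagles_utility = lambda seq: seq.count("stamp_eagles_logo")
# Goal B: something humans might actually endorse.
hospital_utility = lambda seq: seq.count("build_hospital")

# Identical "intelligence" (the search), wildly different outcomes:
print(plan(actions, horizon=4, utility=eagles_utility))
print(plan(actions, horizon=4, utility=hospital_utility))
```

Nothing in the search routine itself prefers hospitals to logos; the goal is a free parameter.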
John explains how AGI moral structures may very well differ from ours, and how an AGI might regard us the way we regard creatures of lesser intelligence.
Liron gives his take on the corrigibility problem: the crushingly hard open scientific problem of designing AI systems that don’t resist modification, even when being modified might interfere with their original goals.
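To see why this is hard, here is a toy model (our own illustration, not any published formalism): a plain expected-utility maximizer evaluates “let the humans modify me” like any other action, so resistance falls out of ordinary goal pursuit.

```python
# Toy off-switch dilemma: why a goal-directed agent resists modification.
P_MODIFY = 0.5      # assumed chance the operators modify/shut down the agent
GOAL_VALUE = 100.0  # utility the agent assigns to achieving its original goal

def expected_utility(action):
    if action == "allow_modification":
        # If modification happens, the original goal goes unachieved.
        return (1 - P_MODIFY) * GOAL_VALUE
    # Resisting preserves the goal with certainty (in this toy world).
    return GOAL_VALUE

best = max(["allow_modification", "disable_off_switch"], key=expected_utility)
print(best)  # -> disable_off_switch
```

Naive patches, such as paying the agent a bonus for accepting shutdown, just hand it a new quantity to optimize, which is part of why the problem remains open.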
To expect AI alignment to happen by default is like expecting a rocket randomly fired into the sky to land exactly where we would like it to.
Liron gives his thoughtful analysis of this visual analogy.
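The analogy can even be simulated. In this Monte Carlo sketch (all numbers are arbitrary assumptions of ours, not from the clip), rockets are fired at random angles and we count how often one happens to land on a narrow target:

```python
import math
import random

random.seed(0)
V, G = 3.0, 0.0098                       # launch speed (km/s), gravity (km/s^2)
TARGET_KM, TARGET_WIDTH_KM = 800.0, 1.0  # a narrow landing zone, chosen arbitrarily

def landing_distance(angle):
    # Ideal ballistic range on flat ground: v^2 * sin(2*theta) / g
    return V**2 * math.sin(2 * angle) / G

trials = 100_000
hits = sum(
    abs(landing_distance(random.uniform(0, math.pi / 2)) - TARGET_KM)
    < TARGET_WIDTH_KM / 2
    for _ in range(trials)
)
print(f"hit rate with random aim: {hits / trials:.4%}")  # a fraction of a percent
```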
AGI is not your next gadget; it’s your portal to an utterly different Earth!
Liron explains how we are facing a discontinuity akin to the one our planet went through when life was born and began terraforming the previously dead surface.
Liron gives us an intuition for how hard the AI alignment problem is at its core, contrasting the two things we are asking of the AI at once: recursively self-improving its capabilities to far-superhuman levels, while remaining biased in exactly the ways humans are biased and preserving our fragile human values.
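One way to feel the tension is a toy compounding-drift model (our illustration, with made-up numbers): each self-improvement step doubles capability but copies the fragile value specification with slightly imperfect fidelity.

```python
capability, value_fidelity = 1.0, 1.0
GROWTH = 2.0       # assumed capability multiplier per self-modification
FIDELITY = 0.99    # assumed per-step chance the value specification survives intact

for _ in range(30):  # thirty rounds of recursive self-improvement
    capability *= GROWTH
    value_fidelity *= FIDELITY

print(f"capability: {capability:.1e}x, values intact: {value_fidelity:.0%}")
# -> capability ~1.1e+09x while value fidelity has quietly eroded to ~74%
```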
Curated Sources
Quality External References
Recommended Reading
Handpicked Essential Reads
Open Letters
Statements signed by hundreds of AI experts, professors and tech leaders
Books
Channels
Creators contributing to raising AI risk awareness
AI-Safety Orgs
Institutes and Organizations
In Mainstream News
Articles from major news outlets exploring AI risks in traditional journalism