Ted had a fascinating conversation with Sean Lane, founder and CEO of Crosschx. After talking about the power of bots to scale humans, we explored an analogy about hammering nails. If today we are all pounding in nails with rocks, the coming democratization of AI (over the next 2-5 years and beyond) will give everyone not just hammers, but nail guns. With those nail guns, people will build as-yet unimaginable cities, and those cities will be the base from which AGI gets built.
Sean then played the “When will it happen?” game. Adding his answers to ours from the last episode (0054), here are the predictions:
- Sean 2027/2045/2075
- Ted 2027/2042/2099
- Brandon 2040/2070/2110
When asked whether we’re being alarmist on this show by calling out the existential risk from AGI, Sean posited that raising the alarm helps create the conditions under which the risk gets mitigated, so we’ll never be proven right. The harm we’re alarmist about will never come to pass in part because we’re raising the alarm.
We also discussed a new (to us) way to think about why the paperclip maximizer thought experiment matters: because virtually all AI researchers know the thought experiment, “maximizing paperclips” has become a failure mode to check for in any AI system. The good guys will therefore:
- Make darn sure their own systems can’t turn the solar system into paperclips.
- Be on the lookout for bad actors who might be intentionally trying to maximize paperclips.