#127 | One might fire, or not fire—and it would come to absolutely the same thing
On Trying To Stop The Unstoppable Progress Of AI
“The development of full artificial intelligence could spell the end of the human race….It would take off on its own, and re-design itself at an ever increasing rate. Humans, who are limited by slow biological evolution, couldn’t compete, and would be superseded.” — Stephen Hawking (2014)
“I visualise a time when we will be to robots what dogs are to humans, and I’m rooting for the machines.” — Claude Shannon
In the haunting “The Stranger” by Camus, the protagonist epitomises nihilism. He lives in equanimity, unfazed by turmoil. His mum’s death provokes some sadness, but only because it seems to him it should. Later, he commits murder with cool detachment:
“And then it crossed my mind that one might fire, or not fire—and it would come to absolutely the same thing.”
One thousand one hundred important people just signed a petition calling for a temporary halt to the development and release of powerful large language models. This pause would hypothetically give regulators a window to catch up. But the horse has bolted; the bullet has left the barrel. The Rubicon, crossed.
I see some nihilistic traits among AI’s harshest critics. Yes, there is a lot of noise. But do they really care about the outcome? A petition won’t block the road between us and extinction (and, at this rate, they claim extinction is certain). If they were truly concerned, they might do more than add an e-signature.
Very few of them work at the edge of AI; fewer still are concerned with pausing progress. AI’s trajectory this year has been exponential, and it will continue. The notion that government-driven regulation might be quick enough to course-correct is absurd. Any substantial policy would be ruled out on the grounds that it might erode the US’s head start over China.
“I had lived my life one way and I could just as well have lived it another. I had done this and I hadn’t done that. I hadn’t done this thing but I had done another. And so?” — The Stranger
Our existence unfolds. And with it, so does AI. I’m super optimistic about how we will use these tools; my outlook is informed by the following:
Fear is more basic than optimism. Expressing it requires less personal risk, and it looks smarter to be negative when faced with the unknown. And artificial intelligence is unknown. People with boundless optimism appear disconnected from reality; the crazies. But are they really always wrong? This time it’s different, say the naysayers. This time will spark the End Of Days! But perhaps, like every other technological innovation, this one will dramatically improve our quality of life. Perhaps this time it’s not different.
There will be change. Those who embrace the change will do okay. Regardless, the question is not whether we embrace the change, but how. Per Camus’s protagonist, the path we take is up to us; the destination is the same. But rather than succumbing to nihilistic indifference, we can be optimistic. Let’s run (skip?) enthusiastically towards history while it’s still in the oven.
My week in books
Amp It Up by Frank Slootman. This is motivating. The subtitle says it all: “Leading for Hypergrowth by Raising Expectations, Increasing Urgency, and Elevating Intensity”. If you want a summary, my friend Ernest has kindly written one.
A quote:
As one of my former bosses observed: “No strategy is better than its execution.” But those folks actually have it backward. Strategy can’t really be mastered until you know how to execute well. That’s why execution must be your first priority as a leader. Worrying about your organization’s strategy before your team is good at executing is pointless. Execution is hard, and great execution is scarce—which makes it another great source of competitive advantage.
Live well,
Hector
PS. A noteworthy comment from Noahpinion’s blog above:
“AI researchers consistently proclaim the potential of their creations to wreak havoc, causing mass unemployment, ceaseless post-truth propaganda, or even human extinction. This hardly seems like an optimal PR strategy.”
True!! Imagine if Pfizer had constantly mentioned a 3% chance of their new COVID vaccine wiping out the vaccinated within a decade. The panic!