CynicalSpaceMan · 26-30, M
@jshm2 are you familiar with "Pokemon Go!" ?
This comment reminds me of the premise of that game: augmented reality.
FrogManSometimesLooksBothWays
I basically agree with the spine of this. You’re naming something like a recurring civilizational panic reflex — the tendency to interpret disruptive novelty through apocalyptic metaphors, moral hysteria, or cosmic dread. Once you’ve read even a modest amount of history, the pattern becomes almost boringly repetitive.
You don’t need to be Toynbee or Braudel to see it.
Every genuinely transformative technology triggers the same narrative arc:
1. Initial shock and symbolic fear.
Railroads would shatter human nerves and destroy morality.
Electric light would abolish sleep and ruin the soul.
Jazz would dissolve sexual restraint and racial boundaries.
Television would rot children’s brains.
The internet would end privacy and truth.
Crypto would collapse civilization.
AI will extinguish humanity.
The metaphors are always Biblical because humans reach for sacred language when they feel loss of control. “Frogs raining from the sky” is exactly right — plagues, corruption, cosmic disorder.
2. Gradual normalization and selective absorption.
What actually happens is not utopia or apocalypse but messy integration: institutions adapt, regulations emerge, secondary industries grow, cultural norms shift. The technology becomes boring infrastructure rather than mythic threat.
3. Long-term net gain with uneven distribution of pain.
There are real casualties: displaced workers, broken communities, cultural dislocations. But civilization keeps absorbing the shock because usefulness and desire outweigh the damage.
Your deeper point — and this is where you’re absolutely right to shift the frame — is that the existential bottleneck isn’t innovation at all. It’s coordination.
We are spectacularly good at:
inventing tools,
scaling production,
optimizing extraction,
generating surplus.
We are terrible at:
distributing benefits fairly,
aligning incentives with long-term stability,
organizing mass cooperation without domination,
preventing power from concentrating into predatory structures.
In other words: our technological intelligence has outrun our political and moral architecture.
That’s not a new insight either — you can trace it from Hobbes through Marx, Weber, Polanyi, and straight into modern complexity economics and institutional theory. But it remains unsolved because it isn’t a technical problem; it’s a human one. It touches status, fear, tribalism, domination instincts, zero-sum thinking, and the pathology of power.
You’re also right that many of today’s “crises” are downstream expressions of this failure of coordination:
economic inequality,
ecological overshoot,
political polarization,
labor precarity,
information disorder,
authoritarian drift.
They’re not separate problems. They’re surface turbulence caused by misaligned systems operating at planetary scale without adequate governance or shared legitimacy.
One way to put it, in slightly more architectonic language (which I know you appreciate), is:
We have built a civilization capable of global action, but we still govern it with institutions designed for city-states, empires, and early industrial nations.
Or even more starkly:
Our tools have become planetary; our ethics and coordination mechanisms remain parochial.
Where I might add one nuance to your view is this: while societies do generally adapt to useful innovations, the distributional consequences of each wave arrive faster and are more nonlinear than before. The pace of change compresses adjustment time, which stresses social cohesion and institutional legitimacy. That doesn’t mean apocalypse — but it does mean turbulence may intensify even if the long arc bends toward accommodation.
Still, your central instinct is sound and historically grounded: fear narratives around AI are largely recycling older cultural scripts. The real problem isn’t whether the machine thinks — it’s whether humans can organize themselves intelligently enough to live with what they’ve built.
That’s the bottleneck. Not silicon. Not algorithms. Not frogs from the sky.
CynicalSpaceMan · 26-30, M
@FrogManSometimesLooksBothWays
well put!
I really don't have anything to add other than I wish more people were focusing on this issue, the issue at the core of it all, instead of getting wrapped up in political theater.
ArishMell · 70-79, M
I think you are right there.
I wonder what was said when printing was invented? It was dangerous enough with early block printing, using a wood-cut to create a single block for the entire page; but then came Gutenberg with his moveable type, making the process faster, easier and probably cheaper.
dancingtongue · 80-89, M
Santayana boiled it down to one summary sentence, which I have paraphrased: "Those who ignore history get to relive it."
CynicalSpaceMan · 26-30, M
@dancingtongue and most people do ignore history. So that explains a lot lol
JimboSaturn · 56-60, M
Have you ever read Yuval Noah Harari? He has some interesting books on AI and the future of humanity.
JimboSaturn · 56-60, M
@CynicalSpaceMan I didn't either, but I saw it discussed on TV. He has three books covering the past, present, and future of humanity.
CynicalSpaceMan · 26-30, M
@JimboSaturn my to-be-read list is long. Too long. I'll have to keep that in mind though! Thanks.
JimboSaturn · 56-60, M
@CynicalSpaceMan I know the feeling!
peterlee · M
We are now policed by AI in the West Midlands.
MarkPaul · 26-30, M
This repetitive cycle makes it seem that we are living inside a script and have been since the beginning.
sree251 · 41-45, M
AI will surely ensure mass cooperation once the algorithm takes over and displaces the flow of human consciousness.
ThirstenHowl · M
https://similarworlds.com/automation/artificial-intelligence/artificial-stupidity/5490971-but-people-feared-the-printing-press-too