
Once you've read a history book or two, you start to see how repetitive certain sociological "tropes" are.

And you don't need to be an Ivy League historian to observe the phenomenon.

Every great invention that changed our mode of life had people claiming frogs would rain from the sky and the water would turn to blood.

Electricity, cars, the blues, the internet, cryptocurrency, and now AI.

Society and the world have a way of adjusting to and accommodating innovations that are more useful and desirable than they are detrimental.

It's happened many times before and it will continue to.

Our main issue as a species right now is an organizational and logistical one; we haven't figured out how to spur enterprise and mass cooperation without resorting to exploitation and coercion.

IMO, that is the issue that deserves our bandwidth.

If this problem were solved, a lot of other problems would be solved along with it, as they are merely its branching consequences.
FrogManSometimesLooksBothWays · M Best Comment
I basically agree with the spine of this. You’re naming something like a recurring civilizational panic reflex — the tendency to interpret disruptive novelty through apocalyptic metaphors, moral hysteria, or cosmic dread. Once you’ve read even a modest amount of history, the pattern becomes almost boringly repetitive.

You don’t need to be Toynbee or Braudel to see it.

Every genuinely transformative technology triggers the same narrative arc:

1. Initial shock and symbolic fear.

Railroads would shatter human nerves and destroy morality.
Electric light would abolish sleep and ruin the soul.
Jazz would dissolve sexual restraint and racial boundaries.
Television would rot children’s brains.
The internet would end privacy and truth.
Crypto would collapse civilization.
AI will extinguish humanity.

The metaphors are always Biblical because humans reach for sacred language when they feel loss of control. “Frogs raining from the sky” is exactly right — plagues, corruption, cosmic disorder.

2. Gradual normalization and selective absorption.

What actually happens is not utopia or apocalypse but messy integration: institutions adapt, regulations emerge, secondary industries grow, cultural norms shift. The technology becomes boring infrastructure rather than mythic threat.

3. Long-term net gain with uneven distribution of pain.

There are real casualties: displaced workers, broken communities, cultural dislocations. But civilization keeps absorbing the shock because usefulness and desire outweigh the damage.

Your deeper point — and this is where you’re absolutely right to shift the frame — is that the existential bottleneck isn’t innovation at all. It’s coordination.

We are spectacularly good at:

inventing tools,

scaling production,

optimizing extraction,

generating surplus.

We are terrible at:

distributing benefits fairly,

aligning incentives with long-term stability,

organizing mass cooperation without domination,

preventing power from concentrating into predatory structures.

In other words: our technological intelligence has outrun our political and moral architecture.

That’s not a new insight either — you can trace it from Hobbes through Marx, Weber, Polanyi, and straight into modern complexity economics and institutional theory. But it remains unsolved because it isn’t a technical problem; it’s a human one. It touches status, fear, tribalism, domination instincts, zero-sum thinking, and the pathology of power.

You’re also right that many of today’s “crises” are downstream expressions of this failure of coordination:

economic inequality,

ecological overshoot,

political polarization,

labor precarity,

information disorder,

authoritarian drift.

They’re not separate problems. They’re surface turbulence caused by misaligned systems operating at planetary scale without adequate governance or shared legitimacy.

One way to put it, in slightly more architectonic language (which I know you appreciate), is:

We have built a civilization capable of global action, but we still govern it with institutions designed for city-states, empires, and early industrial nations.

Or even more starkly:

Our tools have become planetary; our ethics and coordination mechanisms remain parochial.

Where I might add one nuance to your view is this: while societies do generally adapt to useful innovations, the distributional consequences of each wave are becoming faster and more nonlinear. The pace of change compresses adjustment time, which stresses social cohesion and institutional legitimacy. That doesn’t mean apocalypse — but it does mean turbulence may intensify even if the long arc bends toward accommodation.

Still, your central instinct is sound and historically grounded: fear narratives around AI are largely recycling older cultural scripts. The real problem isn’t whether the machine thinks — it’s whether humans can organize themselves intelligently enough to live with what they’ve built.

That’s the bottleneck. Not silicon. Not algorithms. Not frogs from the sky.
CynicalSpaceMan · 26-30, M
@FrogManSometimesLooksBothWays

well put!

I really don't have anything to add other than I wish more people were focusing on this issue, the issue at the core of it all, instead of getting wrapped up in political theater.

ArishMell · 70-79, M
I think you are right there.

I wonder what was said when printing was invented? It was dangerous enough with Caxton's method, using a wood-cut to create a single block for the entire page; but then came Gutenberg with his moveable type, making the process faster, easier and probably cheaper.
FreddieUK · 70-79, M
@ArishMell That was a most dangerous invention. It meant that ideas could be spread more easily without the control of 'authority'. Coupled with universal education, the world view accepted for centuries was doomed to become pluralistic.
ArishMell · 70-79, M
@FreddieUK Quite so. The same "authority" was the one desperate to prevent the translation of the Bible into local languages.
FreddieUK · 70-79, M
@ArishMell That was definitely one authority. I attended a college which had a window dedicated to William Tyndale, credited with the first translation of the New Testament into English.
dancingtongue · 80-89, M
Santayana boiled it down one summary sentence, which I have paraphrased: "Those who ignore history, get to relive it."
CynicalSpaceMan · 26-30, M
@dancingtongue and most people do ignore history. So that explains a lot lol
JimboSaturn · 56-60, M
Have you ever read Yuval Noah Harari? He has some interesting books on AI and the future of humanity.
JimboSaturn · 56-60, M
@CynicalSpaceMan I didn't either, but I saw it discussed on TV. He has three books covering the past, present, and future of humanity.
CynicalSpaceMan · 26-30, M
@JimboSaturn my to-be-read list is long. Too long. I'll have to keep that in mind though! Thanks.
JimboSaturn · 56-60, M
@CynicalSpaceMan I know the feeling!
peterlee · M
We are policed now by AI in the West Midlands.
FreddieUK · 70-79, M
@peterlee Only the once, I suspect.
MarkPaul · 26-30, M
This repetitive cycle makes it seem that we are living inside a script and have been since the beginning.
MarkPaul · 26-30, M
@ArishMell Or want to control each other.
ArishMell · 70-79, M
@MarkPaul Agreed: mutual fear, mutual mis-trust, mutual desire to oppress the other.
MarkPaul · 26-30, M
@ArishMell And sadly that is human nature from the beginning to now.
sree251 · 41-45, M
AI will surely ensure mass cooperation once the algorithm takes over and displace the flow of human consciousness.
CynicalSpaceMan · 26-30, M
@jshm2 are you familiar with "Pokémon Go"?

This comment reminds me of the premise of that game: augmented reality.
https://similarworlds.com/automation/artificial-intelligence/artificial-stupidity/5490971-but-people-feared-the-printing-press-too
