Very well considered.
I must admit I've not often thought about that, but I have long noticed that so many people expect perfection of systems or services. They fail to realise two aspects.
The first is that no human-made system can ever be 100% perfect in design or operation - people make mistakes or misjudge things, yet somehow that is treated as if it were wilful wrong-doing. The fault is often not with the operators, but with a system insufficiently protected against simple errors.
The second is that artificial things can wear out, break down unexpectedly, or be damaged by external causes such as fires, floods and criminals - yet somehow so many imagine this should be impossible. Then, when the unfortunate operators try to perform repairs or servicing to minimise break-downs, they are vilified if the necessary work temporarily inconveniences the users - they (the operators) can't win!
I can think of four text-book examples of this, and all come down to people who are surely not perfect themselves expecting perfection in others. Sorry - I am hopeless at remembering dates (I am not perfect!), but these are all within the last few decades:
1) Italy: A senior geological service manager was arrested after an earthquake proved far more severe than predicted. Earthquakes are notoriously unpredictable.
2) Britain: The weather-forecaster Michael Fish was vilified over a storm being worse than predicted. Storms are not easy to forecast, and were even harder in his day (1987). Besides, he was right: we did not have a "hurricane".
3) Germany: A train left a station, entered a single-track section and collided with one already approaching from the other end. First action? Prosecute anyone the police thought had caused it, the lower the rank the better.
4) Greece: Almost the same type of railway accident - the prosecutor's victim was the station-master who gave the driver "right away".
NB: I do not know the outcomes of either prosecution.
The common thread is the reactions of very imperfect people to events they could not understand, driven largely by basic technical ignorance but also by an inability to imagine imperfection in other people and the systems they operate.
Natural events are very difficult to predict; but to find the underlying cause of the railway accidents I would ask whether the railways had any built-in protection against human operational errors. If not, then yes, human error - but way back in the system's design, by people who did not foresee all possibilities, including human fallibility in subsequent operation.
,,,,,
There is, though, another aspect of that fallibility, with results that make it impossible to identify quite what went wrong, as the dead cannot speak. This is lost attention, or other momentary lapses.
The locomotive driver who caused the 1952 Harrow & Wealdstone Station railway disaster was apparently matched by the driver in the 1975 Moorgate crash on the London Underground. We cannot know, but I wonder if the respective drivers had somehow become mentally "lost", maybe only for seconds, but long enough for catastrophe. (Less excusable that way at Harrow, though.)
Some of the so-called "Bermuda Triangle" disappearances may be similar, but all of those resulted from people failing somehow - usually incomprehensibly, though investigations revealed that a few losses were down to wilful negligence and/or pure ignorance.
;;;;;
In the end, perhaps the greatest human weakness is the inability to acknowledge simple human fallibility, and to understand it. Blame is much easier, even when the failure is by no means intended.
Recently, the physicist Prof. Brian Cox pointed out on the radio that the human brain is the most complex system known in the cosmos - mechanisms like quantum physics being comparatively simple.
Yet even in full health it can still let us down!