ON THE DISASTER TRAIL
What Does This Guy Know Anyhow?
In 2007, the post-apocalyptic movie I Am Legend starred Will Smith as the only living man in New York City, having survived a virus that decimated most of the world by either killing humans or turning them into vampire mutants that only came out at night.81 Capitalizing on the popularity of the blockbuster that raked in nearly $600 million in revenues, the magazine Popular Mechanics published an article titled “I Am Legend’s Junk Science: Hollywood Sci-Fi vs. Reality”82 that sought to determine if a number of aspects of the movie were scientifically plausible. It asked whether an abandoned Manhattan infrastructure could really be covered in tall weeds after only three years, whether a retrovirus could spread and wipe out humanity that fast and that thoroughly—recognizing that there is yet no known virus that turns humans into vampires—and whether the Brooklyn Bridge (and the Manhattan Bridge next to it)83 could be destroyed in the manner depicted in the movie. This last question arose because the movie showed both bridges missing about three-quarters of their center span, the remaining quarter projecting outward from one of their towers but not supported by cables, while their east span all the way to the Brooklyn shore was still supported by their suspension cable.
When contacted by the magazine to comment on whether this kind of damage was realistic, I mentioned that it was odd, to say the least. This is because of the way suspension bridges are designed: the main cables run from the shore to the first tower, to the second tower, and then to the opposite shore, like a giant clothesline—one that sags a lot. Smaller vertical cables hanging from the main cables are attached to the bridge deck at various intervals to support it. Therefore, just as would happen if one cut a clothesline in the middle, taking out the cables in the middle of a suspension bridge would cause all the spans to collapse, not just the middle one; only the towers would remain standing. And even if the cables were somehow fixed to the top of the towers—they typically are not—and the mid-span of a suspension bridge were bombed, the forces from the shore spans would pull the towers laterally with considerable force, and it is unlikely that the towers could withstand forces of that sort. I suppose a writer could imagine a scenario where the cable was welded at the top of the towers, or got jammed or stuck there by debris, but that is science fiction. In truth, to complicate matters (and something I did not mention to Popular Mechanics, to keep things simple), unlike the Manhattan Bridge, the Brooklyn Bridge is a hybrid design: it is both a suspension bridge and a cable-stayed bridge, which means that the bridge deck is supported in two different ways (much like a person can wear both a belt and suspenders at the same time to make pants fight gravity).84 In cable-stayed bridges, the tension force in each cable is resisted by the deck in compression between the cable and the tower, such that the middle span could be removed without collapsing the rest of the bridge. In fact, cable-stayed bridges are constructed by adding pieces of deck (and the cables supporting them) one piece at a time, projecting outward from each tower, until everything meets at mid-span.
However, if the suspension cable were cut and cable-stayed action became the mechanism to carry the Brooklyn Bridge’s deck, the suspension cables could not be left standing up in the air as shown on the I Am Legend movie posters.
It was nice that Popular Mechanics went through all that trouble with that article, to debunk “fake-science” and “fake-engineering,” given the nonsense that Hollywood usually produces in attempts to jazz up stories with visual effects (see the later section on earthquake movies). However, the thing I found most interesting when the article was first published online was the section at the end where readers could post comments. The first comment was, “What does this idiot know about bridges anyhow?” That eloquent commentator did not expand further, making it unclear whether he was frustrated that I did not expand on the hybrid concept described above, or if he was simply frustrated about everything in life. Apparently, trolls come from a different engineering school where the phrase “everybody can be an anonymous expert when online” is engraved on their diploma. (To Popular Mechanics’ credit, the practice of allowing any wannabe to add comments at the bottom of their online articles has been discontinued, and all comments to previous articles have since been deleted.)

Terrorist Attacks
BOOM
It is a gorgeous day. Not a cloud in sight along the entire eastern seaboard. It is September and fall colors are soon to adorn the northern states. It still feels like summer in the southern ones. Noisy kids fill the school cafeteria—at least in those schools that still allow them to have a decent lunch break.1 Then, at noon sharp, in one hundred schools across the entire eastern time zone, with the fury of a fireworks finale, the usual cafeteria boisterousness is drowned by the sound of banging metal parts and supersonic bullets leaving the barrels of AK-47s. A butchery—two orders of magnitude worse than Columbine2—by commandos of armed fanatics who orchestrated a synchronized mass murder to hurt innocent people. A horrible deed intended to strike Americans where it would hurt the most—their kids. Fortunately, this has not happened. If it had, it would have shocked the nation, spreading the message that nobody is safe anywhere at ground level, unleashing chaos in every neighborhood. One can only speculate whether unprecedented national outrage would escalate to the point of launching nuclear warheads in retaliation.
There is no shortage of opportunities for anybody who wishes to be a terrorist. Banking on the promised reward of (maybe) seventy-two virgins awaiting in paradise,3 some have been eager to turn themselves into human bombs, but others have done it for free.4 And some have even used children for this purpose.5 Nobody would mind suicidal believers exploding in private, without taking hostages in their delusion, but unfortunately, that is not how they operate.
Sadly, there is an infinite number of ways to disrupt the workings of a modern society. One highly effective way to throw a wrench into the gears is to take out one large piece of the infrastructure. Another way is to scar the nation’s psyche. Driving commercial airplanes into the World Trade Center Towers and the Pentagon managed to do both, as it shut down the entire airspace for days and left the population in shock at the sight of the towers collapsing. It was the violation of a way of life.
It has often been said that, for many Americans, the terrorist attacks that occurred on September 11, 2001 (a.k.a. 9/11) were a wake-up call. Everybody alive then remembers where they were on that day. To make things worse, in the weeks that followed, five people were killed and seventeen sickened when they received letters laced with anthrax—a toxic bacterium.6 A lot of stuff and people suddenly became suspicious.
Behind the scenes, all government security and intelligence agencies worked around the clock. More visible to the general public, much energy focused on screening the traffic driving across bridges and through tunnels to Manhattan and on confiscating nail clippers and other similarly dangerous weapons from the luggage of airport passengers. On these fronts, things ramped up over time. A gigantic Department of Homeland Security was created, and the government provided airports with equipment making it possible to scan through clothes and see passengers effectively naked—but not in color, for a modicum of modesty.
Yet, only three years later, experts were already commenting that the country had returned to complacency.7 For example, it was reported that a more serious anthrax attack could produce a hundred thousand deaths and that the health care system did not have the capacity to handle the influx of victims arriving all at once with respiratory problems due to anthrax or other bio-terrorism agents—and that this is still the case today.8
Note that the same is potentially true for any other rapidly ramping-up contagious disease of natural origin, as highlighted by the recent COVID-19 pandemic in many countries.
This complacency was also highlighted for critical infrastructure. While massive resources were invested to make airports safer, many targets other than airports remained unprotected, including some that could be highly attractive to terrorists seeking to send the nation’s economy into disarray. Many scenarios are possible. A sample of these, well known to both the government and its enemies, has been described in detail by experts in many books that can be shipped overnight from anyone’s favorite bookstore—so these are not state secrets. The simplest to execute involves driving a tanker truck through the flimsy security gate of a chemical plant and exploding it next to tanks holding deadly chemicals, spreading toxic fumes for miles around. Among the fifteen thousand chemical plants and refineries that store hazardous material across the United States, the Environmental Protection Agency (EPA) had identified hundreds where such an attack could kill up to a million people.9 If targeting maximum deadly impact, it does not take a PhD in decision-making to judiciously pick a tank close to residential areas or transportation hubs. More elaborate schemes consist of running speedboats next to large transatlantic tankers carrying either oil or—for a bigger bang—liquefied natural gas (LNG) as they enter the ports of Long Beach and Boston, and using “shaped” charges to penetrate the hulls of the ships.10 A shaped charge is a piece of steel that can be turned into a sharp projectile by using a little bit of explosive—a simple and inexpensive trick that has been used quite effectively to penetrate the tanks of US forces during recent wars in the Middle East. The resulting oil spills or LNG fires would be disastrous and deadly on their own, polluting waters in urban centers and incinerating buildings within a certain radius of the ignited LNG burning at 3,000°F. Sinking the vessels would be a bonus. Either way, however, the actual objective of the terrorists would be to create major economic losses for the nation by shutting down these ports for an extended period. Half of the LNG coming into the United States transits through Boston, and all main pipelines to the oil refineries in California are located in Long Beach, which also happens to be the largest port on the West Coast, where $10 billion of goods transit every year.11
As far as anthropogenic disasters go, acts of terrorism are the most vexatious because they are disasters inflicted on purpose, unpredictably. The question is not whether they will happen, but where. More specifically, where are the vulnerabilities? Where are the straw huts and where are the brick houses?
SELF-REGULATING BODIES
A nuclear power plant may be subjected to massive regulatory scrutiny and be designed to resist some of the most extreme load conditions—including earthquakes, tornadoes, and a direct hit from a crashing airplane, to name a few—but what is reasonable to expect for the rest of the infrastructure? Governments in most capitalistic countries, including the United States, do not necessarily wish to get involved in regulating every industry, because regulations require allocating resources for inspection and enforcement, which can quickly get out of hand. Nobody has endless resources.
And why harass responsible organizations when no visible harm is being done? The assumption of responsible behavior is the key idea here, bolstered by the belief that irresponsible acts by an enterprise would effectively be tantamount to brand suicide—thus automatically cleansing the market of dubious players. This frees the government to focus on regulating only those industries where errors and failures produce great dangers, typically in response to public perception and thus political pressure. If Tiny Tim breaks a tooth on a one-inch nail that should not have been in his brand-new cereal box and posts a video of this discovery on YouTube, a nationwide recall will follow, together with possibly immeasurable damage to the brand (on top of a lawsuit by Tiny Tim’s newly found personal injury attorneys). The problem almost takes care of itself. If a nuclear power plant emits massive amounts of radioactive material into the air for days, the problems do not stop with the bankruptcy of the shamed power company.
With that mindset, many governments worldwide have typically adopted a “hands-off” approach when it comes to a large part of the private sector and industrial infrastructure—and particularly so in North America.12 Successful self-regulation is a good way for an industry to keep the government off its back, because the consequence of government oversight is a mountain of paperwork that adds nothing to productivity and profits. By analogy, except for accountants who can earn a living out of the process, nobody enjoys working on income tax forms. It is a drudgery that requires plowing through a maze of forms and nonsensical rules, takes an inordinate amount of time, and brings no profit—a tax refund is not a profit, it is a recovered loss. Being under the thumb of government regulations can be like filling out income tax forms every single day of one’s life.
The best way to keep bureaucrats at bay is to have a good track record when it comes to preventing accidents and failures. Take the chemical industry, for instance. It produces thousands of products that are needed across the world by businesses and individuals, and this requires producing and handling some of the most hazardous and toxic materials in existence. Some would argue that maybe it should not handle such dangerous products, but that would be a fundamental misunderstanding of chemistry. It may take dangerous products to produce safe ones. To illustrate the point, take the extremely poisonous chlorine gas—first used as a lethal weapon in World War I.13 Then take sodium, which is so reactive that it explodes when it comes in contact with other products—particularly water. Put chlorine and sodium together, and, boom, out of the explosion comes regular salt—a perfectly safe staple of life. Mother Nature has a pretty dark sense of humor indeed. Of course, nobody would produce salt that way, given that it is much easier, safer, and cheaper to simply let water evaporate out of salt-water fields and collect the remaining salt, but some mundane staples of life have no such safer alternative and must be produced by combining some nasty, toxic, hazardous gases, liquids, and solids.
For a long time, as far as operations were concerned, the chemical industry had been for the most part self-regulating in the United States, because it had a relatively good safety track record (perhaps less shiningly so in some less developed countries), with some oversight from the EPA and the Department of Transportation14—understanding that regulations and the number of regulating agencies have tended to increase over time,15 including, for example, the Toxic Substances Control Act updated in 202116 to better monitor the more than seven hundred new chemicals introduced into the marketplace each year.17
However, safety under normal operations does not imply security against extreme events. Post-9/11, some experts expressed concerns that little had been done to secure hundreds of chemical plants in the United States to prevent a terrorist attack that could injure or kill up to a million people.18 The same was said about the trains that ship hazardous materials through neighborhoods and the barges that do the same through unmonitored inland waterways.19 Realistic scenarios that could lead to such catastrophic outcomes (such as the ones outlined above) are in the public domain.
Therefore, in the hope of keeping terrorists away from hazardous chemicals that could be weaponized, in 2007 Congress authorized the Department of Homeland Security to establish (at the same time as many other Security Acts20) the Chemical Facility Anti-Terrorism Standards (CFATS) program.21 This spawned the “Protecting and Securing Chemical Facilities from Terrorist Attacks Act” that became law in 2014 and that regulates high-risk facilities by imposing some security standards.22 This likely helped plug some security holes (using plenty of paperwork to fill them, some might be tempted to say), but are the security measures in place effective? Hard to know with certainty, because there has been no such attack yet—other than an ill-conceived one that failed in France.23 The alternative approach, though—when everybody is self-regulating—requires answering the question: Who is in charge of the common good?
THE TRAGEDY OF THE COMMONS
The “tragedy of the commons” is a theory elegantly postulating that when rational individuals share a common resource, they will inevitably deplete it.
Politely said, each individual will attempt to get a greater share of the common resource to receive more benefits out of it than the others do.24 Since each member of the group is naturally inclined to act that way, the resource will eventually be exhausted. The name “tragedy of the commons” was coined in the 1960s by an ecologist concerned about human overpopulation, who co-opted an 1833 story about cattle herders bringing their cows to graze in a common park.25 With each herder trying to bring as many cows as possible to feed in the free public park, the combined effect of this selfishness forever spoiled the resource. Apparently, the take-away message from the story was that even in the nineteenth century, there were people acting like modern-day CEOs. In the 1960s, with the common resources of clean air and water being polluted, fish stocks being depleted, and the earth being stressed by accelerating population growth, it was a timely topic to repackage.
To call that a theory is maybe a stretch, because it is not a profound discovery; most everybody knows freeloaders who empty the bowl of chips before others have had a chance to get any. However, it underscores the fact that if a resource is shared by a collective, some will always try to benefit by taking more than their proportional share. Turning the problem around: if benefits ensue from a common resource—or a shared market—and an investment is required to make that resource more sustainable or secure with no immediate return on the investment, the theory implies that nobody will be motivated to be the first to incur costs in improving the resource. In other words, nobody wants to be the sucker who foots the bill for the benefit of all the others waiting on the sidelines.
Thus, when it comes to making an investment to enhance the security of a private-sector infrastructure—such as, hypothetically, a chemical plant—who will do it first if the cost of producing the same chemical product as competitors will be higher for the one making that investment? As a result, short of a concerted effort, very little—if anything—beyond what is legally mandated gets done to enhance security against terrorist attacks. That is a first “tragedy of the commons.”26 To make things worse, if one CEO decides to “bite the bullet” and spend on some measures to increase the security of the company’s chemical plant (to be “a good corporate citizen” if for no other reason), at the potential cost of some loss in market share, it may not necessarily be of any value down the line.
When a terrorist blows up the plant of a more vulnerable competitor, the government, forced to act decisively for political reasons, could step in and indiscriminately require all plants to be temporarily shut down—with every company losing money at the same rate, irrespective of whether or not they had invested to make their plant more secure.
Then, since there are always different technologies that can be used to achieve specific objectives, the government may decide to impose on each plant a standardized and completely different set of security measures than those initially implemented by the proactive CEO, rendering the initial investments useless. Using a very simplistic analogy—security measures in chemical plants obviously being a lot more complex than that—imagine a CEO spending hundreds of thousands of dollars installing metal detectors everywhere, when no competitor does, only to be told after a terrorist attack months later that metal detectors are not acceptable and that body scanners are now required. To add insult to injury, imagine that to make the adoption of its policy more expedient, the government offers to subsidize the purchase of body scanners for every plant, so that it costs hardly anything for those who never bothered with security. That is the pernicious effect of the tragedy of the commons.
Arguments against enhancing the safety and security of industrial facilities and infrastructure are easy to find.
Suffice it to say that more than a hundred people die every day in the United States in motor vehicle accidents—that is more than thirty-five thousand per year, and more than 70 percent of them were not drunk.27 By comparison, how many died of terrorist attacks in an average year? Of course, that is a moot point, since one could likewise argue that comparatively few people die in mass shootings—only twenty-six per year—but that does not make it a reason not to care.28 Clearly, safety is a relative concept, since nobody is proposing a driving ban to save thousands of lives every year.
CASCADES
Near the close of a business day, the CEO of a large company learned that he had lost $100 million in a deal that went sour during the day. Mad as hell, he convened an emergency meeting of his top executives and screamed in their ears that they were nothing more than a bunch of idiots. Rushing out at the end of the meeting, the executives returned to their respective departments to yell at all their staff, to threaten to fire them all, and to tell them to “scram!” The army of disgruntled employees left the office fuming, drove home, slammed the door, and shouted at their spouses.
Minutes later, the frustrated spouses snapped at their kids for not having picked up their toys, washed their hands, or whatever other thing had been repeated millions of times but not done yet at that very moment. The upset kids went out of the house, heads down, hands in pockets, and kicked their dogs. By the end of the day, thousands of dogs in town were left puzzled, wondering what they had done to be kicked in the ass, out of the blue.
A cascading failure is exactly like that: namely, a series of events that cascade through a chain of related systems, one after the other as a domino effect, often in unexpected ways, because of a single triggering event, leaving those unaware of the linkages along that chain wondering what just happened—like the poor dogs above. In today’s modern society, multiple complex systems operate in entangled ways that are not fully understood.
A regional power outage triggered by an earthquake makes the first domino fall. Whoever did not invest in an onsite backup power generation system is vulnerable. Water utilities that rely on massive amounts of power to pump water across geographical obstacles or up into water towers will suffer. The processes to treat drinking water may also fail, leaving the water supply exposed to contamination and the spread of diseases. Sick people will not show up to work, and ambulances trying to reach affected citizens will need to navigate a city gridlocked by nonfunctioning traffic lights. And so on—all disrupting the economy.
When Hurricane Sandy made landfall in New Jersey close to Atlantic City on October 29, 2012, it had become a post-tropical cyclone with hurricane-force wind gusts exceeding 75 mph.29 More importantly, it caused a significant storm surge along the New Jersey and New York coastlines.
Beyond the facts that the hurricane reportedly damaged or destroyed six hundred fifty thousand houses,30 that power outages were extensive with six hundred thousand people without electricity for nearly two weeks, and that the New York Stock Exchange was closed for two days,31 it is also significant that many cascading failures occurred. In particular, a large tank ruptured at the refining storage facility in Sewaren and spilled 335,000 gallons of fuel into the adjacent waterway.32
Pollution also occurred from wastewater treatment plant failures, with nearly one billion gallons of raw sewage spilling into the adjacent bay at one plant alone.33 And most importantly, 8 percent of the total US oil refining capacity was located in the affected area and damaged, severely disrupting the nation’s production and requiring intervention by the federal government and the temporary suspension of a number of rules—such as the EPA’s clean gasoline requirements34—to ensure overall fuel availability throughout the country.
The cascading effects from any hazard can be far reaching and hard to imagine. In 1998, an ice storm dumped freezing rain and drizzle for more than eighty hours in many parts of eastern Canada and the northeastern United States, but predominantly in southern Québec near Montréal.
Accumulation of ice reached three to four inches in many locations. The weight of the ice coating power lines became excessive and the transmission towers that carried these lines collapsed. Hydro-Québec reported fifteen hundred damaged towers, with nearly 1.4 million households without power. The outage affected more than three million people for several days—more than 40 percent of the province’s population of 7.3 million at the time.35 In large parts of Montréal’s south shore, 150,000 people were without power for up to three weeks following the storm.36 The list of infrastructure systems, organizations, companies, and individuals that were severely impacted by that ice storm and power outage goes on and on, and most people would be able to name a lot of the problems that occurred if asked today. Closed shopping malls and businesses, roof collapses under the weight of ice, treacherous travel conditions, carbon monoxide poisoning in stranded vehicles or when propane grills were run indoors, electrocution from downed power lines, and many more, easily come to mind. Yet most people would not think to mention losses to the agricultural sector.
Indeed, many barn roofs collapsed under the weight of ice, crushing livestock to death, with great losses to farmers—a problem exacerbated when some insurance companies played hardball and shrewdly classified the ice storm as an Act of God.37 More broadly, though, when power failed, heating and ventilation systems stopped. Many farm animals literally froze to death or died of asphyxiation—and that kind of frozen meat is not allowed to end up in supermarket freezers.38 Pigs, poultry, and cows died by the thousands, and farmers had to wait weeks to get the carcasses carted away by companies that dispose of dead animals.
Dairy farmers were also hard hit, first because this is a highly mechanized large-scale industry and, without power, only a small percentage of the cows could be milked the old-fashioned way, and second because whatever milk could be collected (using emergency generators or otherwise) could not be cooled, picked up by tanker trucks, and treated at the processing plants.39 Country roads were icy and sometimes blocked by fallen trees, but, more importantly, the processing plants were not operating. A total of 13.5 million liters of milk40 from 5,500 Ontario and Québec dairy farmers had to be dumped.41 The problem was further compounded by the fact that, in a production environment, dairy cows that are not milked regularly become vulnerable to mastitis, an infection of the udder that is extremely painful and can lead to death.42 When mastitis is not fatal, problems ensue with the quality of the subsequent milk production.43 Some of the farmers had to kill or sell their cows.44 The agricultural sector that depends on production from trees also suffered, as the massive ice accumulations bent trees and snapped many in half. For example, roughly 20 percent of the trees used for maple syrup production in Québec were lost.45 This example highlights that all industry sectors—even some that do not intuitively come to most people’s minds, such as farming—depend on the existing infrastructure, which has been optimized to the maximum. While optimization is good when it comes to operations in normal conditions, it also means operating without redundancy, thus relying on a fragile equilibrium. Optimized systems are slim, efficient, cost-effective—all good things—but not resilient.
ON THE DISASTER TRAIL
The Ultimate Tragedy of the Commons
In a Communist country—in theory—everything is part of the commons (both words come from the same Latin root, communis).46 In 1990, the European Conference on Earthquake Engineering was held in Moscow, which gave me a chance to visit the USSR in the years of Gorbachev, Glasnost, and Perestroika. The country talked about changes, openness, and transparency, but hard-core Communism still ruled. It felt as if I had time-traveled to the 1940s. Phone receivers weighed as much as a barbell, subway stations were decorated like Renaissance museums, and folks drove banged-up Ladas that embodied the art of frugality in automobile manufacturing. Changes: Thousands of people lined up at the first McDonald’s ever to open in Russia, to get a taste of Western decadence—with longer wait times there than for Avatar Flight of Passage in Disney World nowadays.47 Hard-core Communism: After sitting for one hour at a table in a restaurant waiting for a menu, I was forced to conclude that the word “service” was absent from the Russian language—but I found out later that the word “bribe” was not.
At the conference, the Russian engineers seemed obsessed with complex mathematical approaches for problems that the rest of the world typically solved using simple computer algorithms (I later learned that there were no personal computers in the Soviet Union in 1990).
Strangely, nobody bothered to close the curtains during the keynote lecture, so the screen remained white—drenched in sunlight—throughout the entire presentation, while the audience could distinctly hear the slides change in the projector. Equally strange, most toilets in the brand-new convention center were clogged. As part of my visit, one of the lead engineers of the USSR National Laboratory where earthquake engineering research and experiments were being conducted gave me a private tour of their facilities. Walking through their enormous laboratory, I noticed that the place was filled with a large number of technicians—which is usually a good thing—but that they all seemed to be sitting around idle, talking and smoking. I asked my host if this was a break time. He responded, “The government pretends to pay us and we pretend to work.” Then, after the tour, he invited me to meet his family for dinner at his apartment. It was in a complex of multistory residential towers. It reminded me of the infamous subsidized housing development projects that had been constructed in some big US cities. There was garbage on the ground, dirty walls everywhere, even dirtier staircases, and plenty of things falling apart for lack of maintenance.
However, as soon as we entered the apartment, the place was warm, bright, clean, nicely decorated, and welcoming.
The whole family was charming and delighted to have a visitor. The contrast between the inside and the outside was striking. I asked my host why it was that the apartment was so nice while the exterior of the building looked so drab. He responded that people had pride and took care of what was inside their own apartments. Outside, however, that was the government’s responsibility. The divide was clear in their minds. Why would anyone take the initiative of being the stupid one working outside to make the grounds look better while others would not contribute? It was either “all in” or nothing, and the “all in” part fell within the domain of the Communist government’s responsibility. The ultimate tragedy of the commons.





Annoying Doomsday Scenarios
ZOMBIES, VAMPIRES, AND THE PANDEMIC OF THE DAY
Those who believe that the outcome of World Cup games can be predicted by an octopus, that other future events are best predicted by throwing spears of asparagus in the air (asparamancing),1 that cancer can be removed from the body by touch alone or by reaching for it after making an incision on the stomach with two fingers (psychic surgery), that the electrons your body needs to balance its overabundance of free radicals must be absorbed by walking barefoot in dirt (body earthing), or that the ultimate remedy against every known disease that can at the same time make your skin softer and enlighten your third eye is to drink urine (urine therapy), might find this section offensive and can easily skip it without breaking out in hives.
Earthquakes, hurricanes, volcanoes, and other natural hazards unleash powerful forces of gigantic proportions. Yet, the elephant in the room—one of the biggest disasters ever—is actually smaller than an elephant, smaller than an ant, and smaller than a single cell. Technically, it is not even alive. It is a microscopic, self-replicating invader. Not satisfied with creating a local disaster, it seeks to wreak havoc globally. Meet the pandemic.
If there is one hazard that needs little introduction by now, it is the pandemic. After months of social distancing, home confinement, travel bans, rationed toilet paper, and scrutiny of data, riding the waves of new cases, intubations, deaths, and upturn in the stock market, it is fair to say that little explanation is needed. Since COVID-19, everybody worldwide can now viscerally relate to what a pandemic is.
Previously—like most disasters described above—it was a vague concept; something that happened in underdeveloped worlds; something that happened in medieval times; something that happened when medicine did not know any better; something that the average Little Pig did not plan for because that kind of Big Bad Wolf would never come. Unfortunately, as for all the other hazards above, this false sense of security was unwarranted.
In the aftermath of the 2014–2016 Ebola epidemic, a Global Preparedness Monitoring Board was tasked with providing a frank opinion as to whether or not the world was ready for a pandemic; it published its findings in September 2019, four months before the existence of COVID-19 was first reported.2 The Board concluded that the world was not prepared at all to deal with a rapidly spreading lethal virus that had the potential to kill up to eighty million people worldwide and wipe out 5 percent of the world’s economy—a catastrophe that it considered to be a real possibility given that the World Health Organization had identified 1,483 epidemic events in 172 countries between 2011 and 2018.3 A prescient warning if there ever was one.
This false sense of security and lack of preparedness is not surprising. In many ways, a pandemic is an earthquake in slow motion. It is something everybody knows is possible at any time but is more convenient to ignore. Then, like every other ignored threat, when it happens, it is a mess.
Nobody is pleased to see disasters unfold—be they pandemics, earthquakes, hurricanes, or whatever else derails life from its peaceful course. Except for those who get spiritual or supernatural titillation out of it. In that category, there are groups who sincerely believe that a raging pandemic is the long-awaited “end of the world” that will, at last, allow them to meet their creator, or fuse with the universe, or whatever else is supposed to happen on the big doomsday, and who are driven to congregate to fulfill this destiny. Orders that prohibit large gatherings for public health reasons are pointless to fanatics eager to meet their end. Nothing prevents such folks from exercising their freedom of religion, but during a world pandemic, they should do so packed in a stadium, all doors welded shut (with all the utilities working and periodic food drops for humanitarian reasons), to live their dream secluded from those who do not wish to be infected by their contagious joy.
Cynics would add that apocalypticists4 of all denominations should be locked together on a remote island and left to count their dead to resolve which creed or cult held the truth in the end—if they do not kill each other in a religious war before the virus does.
For the rest of the population that, given a choice, prefers to survive pandemics, the best course of action is evidently to prepare. This means planning on how to contain the spread of the disease by preventing transmission from infected humans. In this case, coming back to the Three Little Pigs, whether the huts are built of straw, wood, or brick does not matter. None will stop a virus—although stronger huts could prevent the spread of “diseases” transmitted by the bite of vampires or zombies by keeping them out.
Previous chapters have dealt with earthquakes, hurricanes, floods, and some anthropogenic disasters, because these shared a common attribute: each of these hazards could have devastating effects on the built infrastructure, and consequently on the population. Release the straitjacket of that commonality, and the list of possible causes of disasters becomes much longer. It can include droughts, snowstorms, avalanches and landslides, animal plagues, locust invasions and other insect-related problems, food chain poisoning, famine, concentration of the world’s wealth in the top 0.1 percent of the population, illiteracy, political instability, mass riots, dictatorships, war, massive solar flares and geomagnetic storms, asteroid impact, extraterrestrial invasion, the in-laws moving in, and more. Each of these by itself can be—and has been—the topic of many books. Here, the primary focus is on hazards that affect everybody as a consequence of damage to the infrastructure, but, without too much of a stretch, many of the matters addressed in the subsequent chapters are generally applicable to other hazards that do not affect infrastructure. All these hazards are “earthquakes” that can produce disasters of their own.
For some people, the earthquake will actually be an earthquake.
For others, the earthquake may be losing something/someone precious as a consequence of their negligence.
For alcoholics, the earthquake may be killing somebody when drunk driving.
For drug addicts, the earthquake may be hitting the absolute “bottom of the barrel.” For chain smokers, the earthquake may be a heart attack.
The way humans prepare (or not) and react to rare high-consequence events remains relatively the same, irrespective of the nature of the disaster.
ON THE DISASTER TRAIL
Mopology Wisdom
One of my first summer jobs was in a general hospital, working as one of the many “mopologists” hired to help keep the maintenance operations running while members of the regular mop squad took summer vacations. In the process, I learned two important lessons.
The first one was taught by the regular employee I was assigned to shadow during my first week of employment, to learn the ropes. After punching the time card, I followed him to the closet where all the mops, buckets, towels, rotary floor polishers, and other miscellaneous cleaning products were stored. He sat on a bench, looked me straight in the eyes, and said, “Listen carefully, kid. When the end of August comes, you kids are going back to school, but we are going nowhere. So don’t make us look bad,” which was a stern order to respect the regular pace of work and to not even think of being zealous. He then sat on a bucket and read his newspaper for thirty minutes. Thereafter, we worked for about ninety minutes, spent a good forty-five minutes in the cafeteria for the morning fifteen-minute union break, worked another hour, went to lunch fifteen minutes early and repeated the pattern in the afternoon.
Amazingly, we managed to squeeze a solid five hours of work into an eight-hour workday.
The second lesson was learned progressively over the summer. As maintenance people from various wards of the hospital took vacations somewhat in succession, the “kids” hired to fill in for the summer were rotated from wing to wing. So, one week, I would be mopping the maternity ward, the next week the oncology department, then the burn center, and so on. Most of the mopping and waxing of floors happened around and under occupied hospital beds, but maintenance employees were strictly forbidden to talk to the patients—presumably because the private life of patients was none of our business. That rule, however, was absolutely disregarded by all, first because talking is more fun than mopping, but more importantly because many patients are bored stiff and lonely—they can barely squeeze more than a few words out of the nurses and doctors, and visitors are in many instances few and far between.
What struck me were the contrasts in patient behavior from ward to ward. In the department of terminal pulmonary diseases, those with last-stage lung cancer or throat cancer, final-stage emphysema, end-stage cystic fibrosis, and the like knew that they had reached the end of the line. It was mind-boggling to see some patients smoke cigarettes through the tracheostomy tube sticking out from their throat, in some desperate attempt to lighten the gloomy mood. These patients were not the talkative type, particularly those who had to rely on the “robot voice” of an electro-larynx speaker to communicate because larynx cancer had taken away their vocal cords.
In contrast, the cardiology department was a blast.
Patients arrived by ambulance nearly dead, but those who survived and made it out of the intensive care unit were as giddy as if they had miraculously found a spare life in a Cracker Jack box (in those days, each box contained a prize).5 They had been given a second chance. They should have been dead, but because of advancements in research, knowledge, and technology, instead they were alive, waiting for the green light to walk out on their own two legs, inspired to adopt a healthy lifestyle of exercise without junk food—best intentions at the very least, successful in many cases. The “big earthquake” had hit hard and had been good for them, as they had survived.

Interlude
THE GALLERY OF HORRIBLE EARTHQUAKE MOVIES
After all the grim facts of the previous chapters, it is time to relax a bit—before more grim facts in the next chapters.
Toward that goal, reviews of disaster movies are presented below, focusing on earthquakes only, to keep the list to a manageable size. The deeply curious who may wish to test their pain tolerance can find where to watch these flicks through the Internet Movie Database (IMDb.com).
Note that the purpose here is not to comment on the quality of the scripts (because most storylines stick to the formula for the genre) or the acting (because some movies showcased A-list actors on the marquee, while others hired those in the studio parking lot who held their “Will work for food” sign the highest). Rather, the purpose here is to comment on the credibility of the whole thing, or sometimes of the earthquake damage as depicted by a film industry that must embellish for box-office benefits. Remember that, as with all art forms, beauty is in the eye of the beholder; there will always be a critic to call the worst film of all time a masterpiece, and vice versa.
Warning: Do Not Trust Hollywood! It loves disaster movies but is clueless about how disasters actually happen. Any teacher who believes that asking kids to watch a two-hour flick is worth some points as part of a science project should be tarred and feathered and thrown out of town.
Earthquake (1974)
For unknown sociological reasons, the 1970s were sort of the heyday of disaster movies.1 Topics covered ranged from a sinking ocean liner (The Poseidon Adventure) to an air disaster (Airport) to a high-rise fire (The Towering Inferno), and many more, including, of course, an earthquake. Earthquake is the granddaddy of earthquake movies. Beyond being a “classic,” it is the film that launched “Sensurround” in movie theaters, a technology that pumped sub-audio waves at 120 decibels to immerse the audience in earthquake waves.
Without that rumble, it is just another cookie-cutter disaster movie, with an all-star cast—although it is a “classic.” As for the earthquake damage: B for effort—we are talking about mid-70s special effects here, after all. But it is a “classic”—in case that was not made clear.
Aftershock: Earthquake in New York (1999)
Surprising. Bravo for looking at an earthquake outside of California (although, arguably, Hollywood went overboard, for effect). Notwithstanding the exaggeration, one of the most credible depictions of earthquake damage of the bunch. A-.
Aftershock (Again, due to a severe shortage of imagination for original titles) (2010)
Spoiler alert: This Chinese production cleverly used the 1976 Tangshan and 2008 Sichuan earthquakes as bookends to human drama, weaving thirty-two years of China’s history into a tearjerker that is surprisingly watchable (and probably even more so in its IMAX version). Although predictable (by those who know their seismic history), it is a testimonial to the 250,000 who died during the 1976 event. B+.
The Impossible (2012)
Not surprisingly, in the aftermath of the 2004 Indian Ocean earthquake and tsunami, and the 2011 Tōhoku earthquake and tsunami, it was just a matter of time before Hollywood tried to cash in. Although, in fairness, this one is actually watchable. Once upon a time on the web, a film critic called Mr. Cranky2 reviewed movies from the perspective that all of them are terrible, inflicting various levels of suffering. Instead of using stars, Mr. Cranky’s ratings used bombs, and ratings ranged from 1 bomb (“almost tolerable”) to 4 bombs (“as good as a poke in the eye with a sharp stick”).
Uniquely bad movies were rated either a bundle of dynamite sticks (“so godawful that it ruptured the very fabric of space and time”) or an Atomic Explosion (“Proof that Jesus died in vain”), the latter awarded to masterpieces like the 2011 remake of Conan the Barbarian. If one were to use Mr. Cranky’s rating scale, The Impossible would deserve only “1 bomb”—which is a compliment.
The Great Los Angeles Earthquake (1990)
Nothing memorable here. A made-for-TV movie produced after the 1989 Loma Prieta earthquake to capitalize on the public’s sudden awareness of the risk in California, it preceded the Northridge earthquake (Los Angeles) by about three years (both the movie and the real earthquake were about hidden faults, but the similarities stop there). A generous C grade, because there have been worse.
10.0 Earthquake (2014)
The largest earthquake ever recorded was a magnitude 9.5.3 The USGS has stated that a magnitude 10 is impossible because the fault rupture that would be needed for that to happen would have to extend around the entire planet (no such fault exists).4 But, who cares?
In Hollywood, everything is possible. Fun geological fact: As well demonstrated in multiple dramatic sequences of this masterpiece, when a fault opens up and propagates, swallowing everything in its path, it does so at the speed of a pickup truck trying to escape, following it as it turns left and right, as in a wild car chase.
10.5 (2004)
Why stop at 10.5, instead of going all out to 11 and becoming the Spinal Tap of disaster movies? Also, why stop at literally separating Southern California from the United States instead of splitting the entire continent along the Rockies? Obviously, two huge lost opportunities, by a timid director. Beyond that, a supreme package for those ignoramuses who like three-hour doses of nonsense. A well-deserved F across the board. Mind-boggling fun fact: A sequel was produced as a TV mini-series.
The Day the Earth Moved (1974)
As the movie’s budget apparently could only afford to destroy five homes and a gas station in a hellhole in the middle of the desert, they were trashed beyond artistic license (with the means available in 1974). A few minutes might be of interest to those interested in nonstructural damage. Another non-negotiable F.
Aftershock (Again. Evidently, there are always lots of aftershocks) (2012)
The genius who got the brilliant idea of mashing an earthquake movie with a slasher horror flick succeeded in plunging the seventh art to new lows (and burning $2M in the process). Watching the film’s trashy characters flub their lines, soaked in gallons of fake blood, will make your eyes bleed. Films like this one perfectly highlight the failure of the letter grade system, because F covers a far too wide numerical range (from 0% to 60%). This one earns a solid F—of the 0% kind.
Disaster Wars: Earthquake versus Tsunami (2013)
It is a rare event to find all critics agreeing on the rating for a movie. Exceptionally, in this case,5 they concurred that this one deserves a 0/10 grade (only because negative grades cannot be given). Bears no resemblance to actual earthquakes or tsunamis, but shows that movies can technically be made by randomly recruiting the cast at a Walmart. Forcing detainees at Guantanamo to watch this might be a violation of human rights.
San Andreas (2015)
Sure, chunks of concrete fall out of nowhere with no rhyme or reason; sure, the ground moves in ways that would baffle any respectable seismologist, but . . . IMAX 3D!!! How not to love nonsense when it is projected on a screen measuring 70' × 50' with 30,000 watts of sound? All done with big-name actors (a first since 1974). Just think of it as the Fast and Furious of earthquake movies, and enjoy the ride. Kudos for recognizing that shattered glass falling from buildings is a hazard (not all buildings have tempered glass). No kudos for destroying the Golden Gate Bridge . . . again, after Godzilla (2014), the kaiju of Pacific Rim (2013), and just about everything else (a video compilation of Hollywood attacks on the Golden Gate Bridge can be seen on YouTube).6 San Andreas is as much about earthquakes as Star Wars is about rocket science, but lots of bonus points for its sheer entertainment value, for the link on earthquake preparedness in the lower left corner of the movie’s official website,7 and for Sia’s eerie slowed-down version (first minute only)8 of “California Dreamin’” from the Mamas and the Papas. All this raises it up to an A-.
San Andreas Quake (2015)
Apparently purposely named9 (and released at about the same time) as the big-budget San Andreas movie—with a slightly longer title but a massively smaller budget. The targeted audience is those same folks who, driving to Orlando, would get off the highway and follow signs to DazeneyWorld, pay admission, and wonder if the drunk clown and the ramshackle House of Mirrors are the Goofy and Space Mountain that everybody talked about. To avoid at all costs, unless one is fond of special effects done with paper and scissors. A solid F.
Mega Fault (Director’s cut!) (2009)
Claim to fame: Features the only earthquake in the world capable of spontaneously igniting heads (with lame CGI flames). This could be seen at twenty-six seconds into a thirty-second clip once posted on YouTube.10 That says it all. If the grading scale were not truncated at F, this one would deserve a Z.
Pandora (2016)
Straight from the high-tech country that offered the world the exploding Android tablet batteries, and applying the same quality-control standards to movie scripts, this Korean dud dramatizes a nuclear power plant meltdown following a small magnitude 6 earthquake. It is deserving of a Razzie award (these awards were created to honor the worst of cinematic under-achievements).11 Computer animations of the meltdown, reactor explosion, and panicked mass evacuation must have chewed up the budget, because the earthquake damage itself was limited to less than one minute of fallen suspended ceiling panels, toppled file cabinets, overturned fruit stands, ripped awnings, one fallen telephone pole, and rocks rolling down the hill (hardly exciting stuff, except for a specialized kind of engineer fascinated by damage to nonstructural components). Not a single crack in the village’s old buildings, which is magical for a quake that brought down a nuclear power plant. The movie also offers undeniable proof to those not fluent in Korean that bad acting cannot hide behind subtitles. Will be greatly enjoyed by those who consider nonstop screaming to be legitimate dialogue. C+ on the strength of professional camera work.
The Quake (Skjelvet) (2018)
If Hollywood has no shame in stretching a credible magnitude 8 earthquake threat for California into a magnitude 10, why should Norwegians hesitate to inflate the tiny magnitude 5.4 Oslo earthquake of 1904 into an 8.5 balloon? ACT I: The hero is moping for seventy-two minutes, depressed because nobody believes him. ACT II: A thirty-foot-tall earthquake wave ripples across town like a surfer wave in Hawaii, a clear indication that there were illegal substances hidden in the CGI team’s office. ACT III: The hero suddenly becomes an accomplished acrobat and attempts to save his wife and kid from a building that defies the laws of gravity. Total US box office gross: $6,235 (not a typo).12 Highly recommended for devoted cinephiles with a fondness for dejected, apathetic, miserable characters. For everybody else, it is a definite C- (for the special effects, which incidentally won the Norwegian equivalent of an Oscar—a trophy that may be worth something on eBay).
The Wave (2015)
An enormous chunk of mountain falls from two thousand feet up into a fjord, and the resulting 250-foot-tall splash wave ripples upstream toward a small village nestled at the end of the fjord. OK, more tsunami-like than earthquake, but that kind of wave is something that actually happens every now and then in Norway, with seven hundred people killed in a 1934 event (among many examples). Beyond that, the whole plot unfolds per formula, complete with predictions by a scientist nobody believes who becomes the hero saving (almost) everybody in the end. It was branded as Norway’s first disaster movie; one wishes it had been the last, because—sadly—its sequel was “The Quake (Skjelvet).” The Wave is surprisingly watchable: B-.
San Andreas Megaquake (2019)
Guess what! Scientists are predicting that in two days an earthquake will sink California into the ocean. What an original, never-heard-before, unique plot! The world needed that movie like the producer needed a kick in the groin. It is so unique and original that Meryl Streep and Tom Cruise probably would have signed up for the lead roles—if there had been a script. This masterpiece is to cinema what forks scratching plates are to music. A solid F (for FUBAR).
Quake (1992) Has the dubious merit of: (1) being the only movie with someone who believes that the best way to survive an earthquake while driving on a highway is to accelerate and bang into all the surrounding cars for over a minute; (2) trying to convince the audience that a San Francisco highway can have only four cars on it; and (3) offering the worst-ever genre-bending combination by blending a bad earthquake movie with an even worse psycho thriller. All while “borrowing” newsreel images of damage in San Francisco from the 1989 Loma Prieta earthquake, which is either artistic appropriation or pure cinematographic laziness. If ever offered the choice between watching these seventy-eight excruciating minutes of nonsense while sitting on a comfortable sofa, or wearing a winter coat in a sauna while listening to two hours of screaming by Yoko Ono, definitely go for the second option without any hesitation.
The Earthquake (2016) This one is not a laughing matter. On December 7, 1988, a devastating earthquake struck Armenia (then part of the USSR). It killed more than 25,000 people and wiped out many cities, leaving more than half a million homeless in a freezing, Soviet winter. This was a grim earthquake and this is a grim movie—almost a memorial. No know-it-all scientist whom everybody ignores, no hero saving the day against all odds, no clichés, no nonsense. Meet a few folks, witness earthquake destruction right away, and, from there, the rest is coping and suffering amid rubble in ways strikingly similar to what was seen thirty years ago.
Professionally done. Not a fun watch, but a rare A!
Earthquake (the band) (1971–79) Gotcha! This is not fiction but rather a poorly edited concert movie by the rock band called Earthquake. The very same band that will be inducted into the Rock ’n’ Roll Hall of Fame hundreds of years from now, when the Hall runs out of ideas for nominees—but still more enjoyable than some of the movies mentioned above.
Giving it a major F-C-F-A-C-F rating (fingers on six guitar strings).
Geo-Disaster (2017) Everything at the same time: a super volcano, a mega earthquake, and a massive twister. Why not? A cinematographic achievement can launch the career of an actor, but if any Hollywood doors were left open after this disaster orgy, they could only have been those leading to the porn industry. Grade inflation is the new norm, so this one gets a D-.
San Francisco (1936) What better way to close this list than with another classic? There are movies where earthquakes are not the topic but rather an accessory, to offer comic relief (such as for The Three Stooges to escape a prison damaged by a three-second tremor)13 or, better yet, to provide a significant plot twist, as in the 1936 award-winning San Francisco. Looking through the lens of the 1930s, San Francisco got its quake right, offering an earth-shattering rendition of the 1906 “big one,” with showers of bricks, stones, statues, and even a piano, killing those at the receiving end by closely (albeit not perfectly) replicating the manner in which unreinforced masonry buildings of the era collapsed. Starring Oscar-winning actors Spencer Tracy and Clark Gable (the King of Hollywood of the 1930s, best remembered as the dude who, frankly, did not give a damn for his dear in Gone with the Wind), the 1936 blockbuster also gave the city of San Francisco one of its official anthems.14 The film is watchable (and broadcast pretty much each year close to the anniversary of the 1906 earthquake), including the earthquake scenes, which is why it gets an A (for its time).
Unfortunately, this demolition derby is far from over. Now that everybody has an HD video camera and free movie editing software on their smartphone, the barrier of having to buy expensive film stock to produce a disaster has been removed. Given that this barrier did not prevent the above in the past, it will be amazing to see how bad things become in future releases now that it is gone. While music recording software can fix bum notes and put everything in tune, there is still no software that can convert a bad script into a good one.

MEET THE LITTLE PIGS

The Wonderful Ability to Forget
WHO CARES?
In 2019, a 5,631-square-foot Colonial brick home built in 2000, with five bedrooms, four full and two half bathrooms, an additional 1,000 square feet of finished basement, a two-story foyer, 12-foot ceilings elsewhere, hardwood floors, granite countertops, pro appliances, adjacent butler’s pantry, masonry fireplace, built-in bookcases, central vacuum system, whirlpool tub, wet bar, security system, and three-car garage; located on a lot of 0.64 acres, 108 feet by 250 feet with perimeter gardens, stone walls, lawn sprinkler system, and tennis court, in the city’s best school district and only twenty-eight minutes from downtown Buffalo, New York, was offered for $995,000. It had been listed for over eight months, because those who do not wish to vacuum 5,631 square feet of livable floor space could find many other homes nearby, not too shabby either but with only 4,000+ square feet, for hundreds of thousands of dollars less, and 3,000-square-foot ones for half that price.
The same year, for those who break out in hives at the sight of a snowflake, a more modest three-story, 2,981-square-foot Mediterranean-style home, built in 2001, with three bedrooms, three bathrooms (twelve rooms total), tile floors, French doors, 10-foot ceilings, Palladian windows, hurricane shutters, security system and cameras, two-car garage, with second- and third-floor oceanfront balconies, on a 0.51-acre oceanfront lot with private walkover to a quiet beach, a heated saltwater pool with spa, professionally landscaped gardens and sprinkler system, thirty-seven minutes from Saint Augustine, Florida, sold for $925,000.
Many similar homes nearby were less expensive, by as much as 40 percent when not oceanfront.
Meanwhile, on the San Francisco Peninsula, hard-earned dollars did not quite stretch that far, but detached homes in the million-dollar range were common. A 990-square-foot single-story ranch-style stucco home built in 1948, with two bedrooms, one bathroom, hardwood floors, and a one-car garage, on a 0.1-acre, 50' × 90' lot, with a narrow patch of lawn in front of the house and another one in the backyard, in the pristine and sunny Brentwood neighborhood, only twenty minutes from downtown San Francisco when driving at 3 a.m., but an hour away on a good day when traffic is bumper-to-bumper, was offered at $868,000. A similar home, but with an extra bedroom, could be found for an extra hundred thousand dollars. At the time, a couple of dozen million-dollar homes were listed for sale on websites such as Zillow.com or Realestate.com. They typically stayed listed for only a few days, snatched by those quick to make an offer, the exception being homes with backyards abutting an eight-lane freeway, as it generally took a few weeks to find connoisseur buyers appreciative of such prime views. This reality of the San Francisco market also applies to those homes located in the southwestern-most corner of Daly City, nudged between a loop of Highway 1 and the Mussel Rock open space, which also happens to have the special privilege of being inside what is known as an Alquist-Priolo zone. Repeat visits to Zillow or Realestate.com show that it is as difficult to find a home for sale there as in any other specific neighborhood elsewhere across the peninsula.
This implies that—most interestingly for the topic at hand here—as far as real estate value is concerned, the price tag for 1,100-square-foot homes near San Francisco stays roughly the same, irrespective of whether the home sits inside or outside an Alquist-Priolo zone.
To reemphasize, if Joe and Jane were shopping for a home in San Francisco—for a measly million dollars, maybe not the mansion of their dreams, but a home nonetheless—whether that home is within or outside that zone will not be a factor.
OK, so what is this Alquist-Priolo zone anyway?
Simple.
In 1972, California passed the “Alquist-Priolo Earthquake Fault Zoning Act,” which effectively prevents new construction from being built within 750 to 1,000 feet of a seismically active fault.1 The locations where such faults are known to exist are documented in geological maps produced by the State.2 The law grandfathered in existing construction, but required real-estate transactions to disclose if properties transacted lie within that zone. Making the zone where construction is prohibited a wide band rather than a slim line was wise, as it certainly helped prevent protracted legal debates over whether the fault slices through one’s property or is “actually” on the neighbor’s lot. Most notably, single-family wood-frame homes are exempted from the law,3 but earthquakes do not care about human laws: no exemption can make the fault disappear. In fact, on the University of California Berkeley campus, the football stadium was built right on top of the Hayward fault in the 1930s, with the fault slicing it in half—so it was built as two separate half-stadiums, to the extent possible (and retrofitted in 2011 to allow similar movement).4 The west and east halves have since moved by five inches with respect to each other due to creep along the fault, making it undeniably clear where the fault is located.5 Incidentally, if the Hayward fault were to rupture during a game, the NCAA football rulebook is silent as to whether or not it would be an automatic first down if the ball is on the ground when the west half of the stadium moves south by ten yards.
Anyone with a pen, a ruler, a map of the San Francisco Bay Area, and the skills to draw a straight line could easily trace on that map the path followed by the San Andreas Fault west of San Francisco. Going north to south, that segment follows the long and narrow Tomales Bay in Point Reyes National Seashore, takes Highway 1 for a few miles, plunges into the Pacific Ocean at Bolinas, travels underwater six miles offshore west of the Golden Gate Bridge, and hits shore in Daly City, from where it makes a beeline to Southern California.
More precisely, the San Andreas Fault hits the southwestern-most corner of Daly City, nudged between a loop of Highway 1 and the Mussel Rock open space, which is why parts of that neighborhood are inside an Alquist-Priolo zone. That area was developed in the 1960s, before the Alquist-Priolo Earthquake Fault Zoning Act, which is why roughly three hundred homes, lined up along a handful of city blocks there, fall within the designated fault zone. In fact, some even have the distinction and privilege of sitting right on top of one of the world’s most famous seismically active faults—the very same one whose rupture destroyed San Francisco in 1906. It is possible nowadays to use web-based real estate software and maps from the California Geological Survey to check the price of homes sitting on top of the fault, or left or right of it, and notice that there is no difference whatsoever in assessed values. The house whose kitchen will be sheared off twenty feet away from its living room by the next major earthquake is worth as much as any other not sitting on the fault. Given the narrow width of some lots—typically 33' × 105'6—many will get quite a change of scenery after the next significant rupture of the San Andreas Fault. When it comes to the Big Bad Wolf, houses straddling a fault might be expensive straw huts, but straw huts nonetheless.
At first, this might seem stunning—but not really.
It would be hard to argue that the good citizens of Daly City did not know the whereabouts of the San Andreas Fault when urban sprawl spilled over that fault zone in the 1960s—but they could be given the benefit of the doubt. That Joe and Jane, moving from Nebraska to California a few years ago, ended up buying a house on top of a fault might also be excusable; maybe they were ignorant of the forces unleashed by an earthquake, since they likely never experienced any—and at 6 percent commission on a million-dollar sale, some real estate agents might excel at reassuring customers when doubts arise.
However, what is harder to explain is why a different pair of Joe and Jane, moving from elsewhere in California, would not know a thing or two about earthquakes. Either they are blissfully unaware of faults and Alquist-Priolo zones, or something in their subconscious fogs it all up and makes them rationally accept the risk, because nothing beats the view from the house or from the park nearby, or because the fresh salty air is so rejuvenating, or simply because the real estate market is so hot that one grabs whatever is available (incidentally, not only hot, but literally on fire: the charred skeleton of a house heavily damaged during a two-alarm fire, listed at $850,000 and twenty-five miles from San Francisco, received multiple offers and sold for $1 million,7 greatly broadening the meaning of fixer-upper). Or maybe they expect FEMA to bail them out when their home is sheared into two halves by the San Andreas Fault, and are planning to use that money to rebuild. Maybe they hope to be allowed to locate their brand-new house on the very same lot, but if not, they can always build on top of a fault in another state where there are no Alquist-Priolo zone restrictions. Or maybe they will move away, never to have to live through another such traumatic experience. Like those who have suffered massive losses during a hurricane and relocate inland—anywhere away from a coast—as if they suddenly discovered that hurricanes are more powerful in real life than they appeared to be on television. It is always after a disaster devastates the local community that it hits home and becomes real.
However, unless a damaging earthquake actually happens, nobody leaves in fear of earthquakes. In fact, more than half a million Californians move away from the state every year,8 and few mention earthquakes as the cause. The leading reasons for this exodus include the high cost of housing, high taxes, and being fed up with the political culture.9 Pollution, crime, poverty, uncontrolled immigration, traffic, and the cost of living are other reasons mentioned.10 Earthquakes? Not on the list. Until the next big one, of course.
DENIAL OF DISASTER
To some degree, denial of risk when it comes to rare and extreme events may be rooted in human nature and urban legends, irrespective of the type of hazard. For example, prior to 2016, many residents of Florida living along the coast from St. Augustine to Melbourne were convinced that this part of Florida would never be in the path of a hurricane—in spite of the fact that maps showing the path of all known past hurricanes provide clear evidence to the contrary. As if enticed by the songs of mermaids, blind to reality, they would confidently explain to anyone asking that this protection from hurricanes was due to either the shape of the ocean floor, the shape of the coast where Cape Canaveral projects into the ocean, or both—although no oceanography expert had ever uttered such nonsense.
Many real estate agents and homebuilders bought into that fantasy, believing it firmly, as it was particularly convenient to reassure potential buyers moving to the coast.
Fortunately, in 2016, when Hurricane Matthew hugged the Florida coastline south to north, about thirty miles offshore,11 short of landfall but still producing significant wind and storm-surge damage along the shores north of Cape Canaveral, this urban legend was put to rest (at least, for a while, until people forget again).
When it comes to explaining why some believe they are immune to disasters, it is not that everybody’s brain has turned to mush, but rather that everybody is predisposed to hear the story they wish to hear. To some, urban legends are sufficient; to others, preconceived notions get in the way and can lead to the strangest, highly nonscientific explanations—like the one occasionally heard that earthquakes in California are divine punishment in retaliation for the gay lifestyle that the state has condoned.
Denial of risk is also the indirect expression of the fact that immediate concerns always take precedence over long-term ones. Concerns about whether today’s weather will affect the planned family picnic trump concerns about global warming of the planet over future decades (which may prevent many more future picnics).
Even when a threat is acknowledged as a reality, denial of risk implies transferring the consequence of risk to others. Those living inland will rationalize that global warming may be real, but are convinced that it will mostly flood those on the coast, while those living on the coast will rationalize that it will flood other parts of the coast than where they live. Other deniers of risk will argue that if the threat were real, everybody would be doing something about it now—which never seems to be the case. Or they may concede that the threat is real but that it pertains to an event so far off in the future that it is not sensible to make sacrifices today for something that might happen down the line—or that may actually never occur in one’s lifetime.12 Whether such denial of risk is a healthy and defensible mechanism to be able to live in an uncertain world is a topic best left to psychologists, anthropologists, and other such specialists. However, because of the natural inclination to deny risk, promoting disaster resilience is not something that people will readily embrace, which is a challenge, particularly to engineers and other professionals interested in enhancing the resilience of communities.
NOT WANTING TO KNOW
Some forms of denial are more damning than others. Like shades of gray, they range from honest, unconscious denial reflexes, all the way to the deliberate deception of calculating minds. In the latter case, when strong interests are at stake and it is clear that recognition of the risk posed by a hazard can deeply hurt the wallet, skepticism is expected. If Big Tobacco can still deny that smoking is harmful,13 in spite of more than fifty years of reports by the US Surgeon General and unequivocal statements by the Centers for Disease Control and Prevention identifying smoking as the leading cause of preventable disease, disability, and death in the country,14 it should not be surprising when a single earthquake does not necessarily “convince” those whose bottom line is at stake. Following the 1906 earthquake that devastated San Francisco, with the ensuing conflagration engulfing two-thirds of the city, many folks with something at stake downplayed the earthquake damage. Interestingly, most fire insurance policies in effect at the time did not provide coverage if a building had suffered damage before a fire. Maybe not surprisingly, as the city was still burning, the California governor said that the destruction was mostly due to fire and, likewise, many of the city residents insisted that their property had not suffered any earthquake damage before the fire. Even though photographs taken before the conflagration revealed extensive earthquake damage, within a few weeks of the earthquake, the San Francisco Real Estate Board passed a resolution to only refer to the disaster as “the great fire” instead of “the great earthquake”15—which shows that fake news existed before the term was coined.
Public relations “spins” occur not only following earthquakes. When a major hurricane struck Miami in September 1926, killing 372 people, damaging 8,600 homes, and leaving 43,000 homeless, at a time when the population of Miami was approximately 100,000—estimated to be more than double the 42,753 from the 1920 census,16 as it was in the midst of a construction boom—the mayor issued a public statement to reassure visitors that the hospitable, enjoyable, comfortable city of Miami would be there for their winter vacation.17 Looking at pictures of the devastation in Miami and along the shoreline,18 it would appear that the mayor forgot to remind the visitors to bring their camping gear for the trip, as the sunshine and warm weather might still be there, but maybe not the hotel rooms.
In a similar mindset, after the moderate magnitude 6.4 Long Beach earthquake of 1933 that killed 120 people and produced roughly $40 million in damage—in the midst of the Great Depression, when a gallon of gas cost ten cents and a loaf of bread seven—the Los Angeles Times and many other newspapers spent much of their coverage downplaying the event and urging their readers to spread the news nationwide that there was no greater place on earth than California if one wanted to be free from the dangers of natural elements.19 In spite of the denial of some, because the Long Beach earthquake destroyed nearly half of the city’s schools and several more in nearby Los Angeles—fortunately, at 5:54 p.m., long after classes, as these collapses would have killed thousands of children otherwise—those who strongly believed that buildings should not collapse during an earthquake seized the occasion. State legislation was enacted, banning unreinforced masonry buildings because that type of construction had repeatedly shown its vulnerability to earthquakes, and most schools in the state had been built using this construction material. The legislation also required schools to be designed to resist a small horizontal earthquake force—a tiny force by today’s standards, but it was a start. Some local school officials complained that this requirement was too stringent and expensive, but their denial was put to rest when the federal government supplemented the funding from bond issues.20 Then, in the late 1960s and early 1970s, after many more damaging earthquakes had demonstrated the need to fix the deficiencies of many old schools, many voters blocked the passage of bond issues intended to fund this work.
Some of the opponents stated that no kid had ever been hurt by an earthquake while in school—which was true, since none of the past destructive earthquakes in California to date at this time had occurred during school hours.21 That is equivalent to telling a flock of turkeys, weeks before Thanksgiving, that there is no reason to worry about the future because their experience so far proves that nothing bad ever happens to turkeys.
Many systemic denials are rooted in concerns over financial hardship. In other words, dollars trump facts. Yet massive cost increases routinely occur for a number of reasons in all things that exist, and the extra cost incurred by these increases often far overwhelms the cost of other measures that can have major long-term benefits when it comes to resisting hazards. For example, since 1956, the Federal Highway Trust Fund has been collecting a tax on each gallon of gas sold, as a way to fund ground transportation programs across the nation, and most notably highway projects. Great idea, except that the framers of the bill did not—or did not want to—think in percentages. By the end of 1993, the tax was 18.4 cents per gallon. Twenty-six years later, in 2019, it was still 18.4 cents per gallon.22 In 1993, on average nationwide, a gallon of gas sold for $1.17. In 2019, it was $2.47 (and $3.58 during the peak in oil prices that occurred in 2013).23 In other words, the 15.7 percent tax of 1993 became a 7.4 percent tax in 2019 (and a 5.1 percent tax in 2013)—a great discount, thanks to the framers of the bill. Every attempt at changing the rate has been a major political fight. Yet, as basic transportation expenses are inescapable, the Highway Trust Fund regularly runs a deficit of tens of billions of dollars, and special transfers from the Treasury general fund24 are needed to fill the budget hole—without necessarily fixing all the other real potholes that the fund was intended to fill in the first place. Strangely, campaigning politicians often promise to fix the nation’s decrepit infrastructure via massive infrastructure programs, but, after the election, they instead fight endlessly among themselves, arguing that there is no money for such wasteful programs. All in possible denial that changing the Highway Trust Fund formula to a percentage of sale price rather than a fixed number of cents per gallon would be a financially sound approach, compatible with widely accepted user-pays concepts, and more sensible than rolling out the red carpet for foreign-owned-and-operated toll roads as is happening now.25
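The arithmetic behind that shrinking percentage is simple enough to check. The following is a minimal sketch in Python, using only the figures quoted above (the variable names are illustrative, not from any official source):

```python
# Effective federal gas tax rate, using the figures quoted in the text:
# a fixed 18.4 cents per gallon against average pump prices of
# $1.17 (1993), $3.58 (2013), and $2.47 (2019).

FEDERAL_GAS_TAX = 0.184  # dollars per gallon, unchanged since 1993

for year, price in [(1993, 1.17), (2013, 3.58), (2019, 2.47)]:
    effective_rate = FEDERAL_GAS_TAX / price * 100
    print(f"{year}: ${price:.2f}/gal -> {effective_rate:.1f}% effective tax")

# Output: roughly 15.7% in 1993, 5.1% in 2013, and 7.4% in 2019,
# which is the "great discount" described above.
```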
Incidentally, contrasting with a swift bipartisan vote that went smoothly for $8 trillion over ten years in military spending,26 it took more than eight months of haggling, drama, in-fighting, political posturing, and debate to agree on a 2021 infrastructure bill27 that budgeted half a trillion dollars of new spending over ten years (2022–2032), as a small but positive step toward the additional $2 trillion needed to meet national infrastructure needs, as consistently reported by the American Society of Civil Engineers over past decades.28 The fear of greater costs has often been used as the “deal killer” in multiple initiatives proposed to enhance the safety of construction against multiple hazards. Special interest groups have often argued that any new requirement that makes homes better in this regard is prohibitively expensive, alleging that implementing such measures could go as far as killing the housing market. This is an argument of dubious merit, given the fact that the cost of residential homes in many locations across the country has increased wildly in the past decades (except for a few years due to the bursting of the “housing bubble”), generally without any improvements made to their ability to resist any hazard, and new homes have continued to be built and sold nonstop.29 In many parts of the country, cost does not even bear any relationship to the quality of construction, let alone protection measures against various hazards; it is rather a matter of supply and demand, overarching political and economic factors, and, of course, location, location, location.30 This makes it ill-advised to use the fear of prohibitive costs as an excuse against enacting logical hazard-protection measures in construction practices.
On account of ignorance and a good dose of California’s pioneering spirit, it might be possible to forgive the Chambers of Commerce and other promoters who wrote newspaper editorials and telegraphed messages to all of America and Europe that diminished the extent of damage after each large earthquake of the nineteenth century and early twentieth century, in an attempt to reassure the oncoming masses. It is obvious that statements like “earthquakes are trifles as compared with runaway horses, apothecaries’ mistakes, accidents with firearms”31 (made after the devastating 1868 earthquake),32 or “one Western cyclone will do more damage than all the earthquakes California has ever known”33 (in newspapers after the milder 1892 one),34 or “no place on earth offers greater security to life, and greater freedom from the dangers of natural elements, than Southern California”35 (affirmed after the 1933 Long Beach earthquake), were dictated by the self-interest of developers.36 In light of today’s knowledge, it is much harder to forgive people who make similar comparisons nowadays. For example, stating that earthquake damage over the past twenty years has been insignificant compared to that from many other hazards is disingenuous; it compares the relative impact of various hazards during a few decades when no major earthquake has happened. Like the turkeys before Thanksgiving mentioned previously, some people fall for that stuff—Paul Joseph Goebbels (Nazi Reich Minister of Propaganda, 1933–1945) could not have done better himself. Truth is, a single major earthquake by itself can produce enough mayhem to turn these damage statistics on their head.
NOT INVENTED HERE AND PROFESSIONAL DENIAL
Bridges are designed to carry truck traffic on an everyday basis, and engineers are compelled to ensure that collapse will not occur under a truck overload. As such, maximum truck weights and maximum load per axle are specified, and bridges are designed to resist these loads—and, in fact, to resist greater loads than that, because “safety factors” are built into the design process to account (to some extent) for the probability of overloads. In addition, to protect against renegade members of the trucking industry who may be tempted to disregard the law of the land, weigh stations are located at multiple points along highways. There, trucks must slowly proceed over a scale to confirm compliance with the legal load limit. Violators are subject to fines, and in most states, the driver of an overloaded truck will be charged with a misdemeanor,37 which, technically, is a crime punishable by up to twelve months in jail—or maybe more if facing a judge who had a truck back up into his brand-new BMW earlier in the day.
In short, tons of trucks drive on roads every day, everybody is familiar with the fact that they are heavy, and engineers use safety factors when designing bridges because nobody wants to see bridges collapse under the weight of the trucks they carry.
Bridges have collapsed in the past, as shown earlier, for a number of reasons, but extremely rarely due to truck overloads. There could be room to revisit the magnitude of the safety factors applied to truckloads when designing bridges, but what would be the point? There is broad acceptance of the existing load factors by the engineering community, and no outcry over the extra cost incurred in each bridge designed to protect against collapse due to truck overloads. These safety factors have been “calibrated” partly on past practice and partly on observed truck traffic, and no pressure exists to reduce them. Trucks are everywhere, every day, noticed by everybody, and nobody is compelled to debate whether the failure of a bridge under truck traffic every 2,500 years would be a more acceptable outcome than once every 500 years. Arguably, there may also be no incentive to have such a discussion because, when a bridge failure happens, the engineer of record for that bridge is held accountable—which makes it an individual responsibility/liability.
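To put those 500-year and 2,500-year figures in perspective, the standard return-period relationship can be used to estimate the chance of seeing such an event at least once during a structure's life. A minimal sketch in Python, assuming independent years and a 75-year service life purely for illustration (neither assumption is taken from any particular bridge specification):

```python
# Probability of at least one occurrence of a "T-year" event during a
# service life of n years, assuming each year is independent (a standard
# simplification used here only for illustration).

def exceedance_probability(return_period_years: float, life_years: float) -> float:
    annual_probability = 1.0 / return_period_years
    return 1.0 - (1.0 - annual_probability) ** life_years

for T in (500, 2500):
    p = exceedance_probability(T, life_years=75)
    print(f"{T}-year event over a 75-year life: {p:.0%} chance of occurring")

# Roughly 14% for the 500-year event versus 3% for the 2,500-year event:
# the kind of trade-off such a debate would be about.
```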
In contrast, when it comes to earthquakes, they are not part of everybody’s daily lives. Hardly anybody has lived through one or expects to. As such, they are intangible concepts—they do happen around the world, here and there, but they are essentially something “not invented here.” The consequences of earthquakes elsewhere are not perceived as highlighting possible similar vulnerabilities at home—it is somebody else’s problem.
Now, consider the following scenario. Imagine a small army of rogue drivers, with an axe to grind, who would pack up their trucks solid, way above the legal limit, and start to drive together with the intent of collapsing big bridges. Imagine that the trucks themselves can carry that load without their axles snapping, and imagine that the roads themselves would not crumble under the heavy axles.
Imagine that the heavy axle load of the first truck would not punch through the bridge deck and get jammed, thereby stopping the entire convoy right at the entrance of the bridge due to failure of a single small structural member.
Imagine that this criminal convoy of overweight trucks is actually capable of collapsing an entire bridge by grossly overloading it. Then, at best, it will travel until it encounters the first bridge unable to sustain this unusual burden, which will then collapse, bringing down all the trucks with it. That would effectively be the demise of the convoy. The consequence of that overload would be the collapse of a single bridge.
By analogy with the overloaded truck, imagine an earthquake bigger than what has been considered in the design of bridges—which is not a big stretch of imagination, as many bridges are expected to suffer significant damage when a severe earthquake strikes, irrespective of where in the world that happens. Whereas the convoy of superheavy trucks managed to damage a grand total of one bridge, the “overload earthquake” will damage all at once an overwhelming number of bridges within miles of its epicenter—hundreds of miles in some cases. This can disable the entire road transportation network of a region for weeks—or months. In such a case, given that the state of practice will result in thousands of damaged structures simultaneously, responsibility/liability generally becomes a social problem rather than an individual one. The pressing need to immediately respond to the emergency pushes any blame and accusations to an undetermined future.
Furthermore, when scrutiny does occur, the engineering community will collectively explain that severe damage is the expected outcome given existing seismic-design specifications (notwithstanding the subtle nuances between the definitions of expected damage and collapse). Such a “forgiving” situation hence provides little incentive to change practice.
Furthermore, engineers are people too. In spite of the inescapable evidence, the fact that seismic maps are consensus-based products of the best scientific minds, and the fact that seismic design provisions in design codes and standards are similarly consensus-based requirements developed by top engineers, there will still be professionals convinced that either earthquakes do not occur, or that if they occur, they will most certainly be less severe than expected. This is expressed, incidentally, by the fact that some engineers consider the design rules to protect against earthquakes or other extreme events to be “too conservative,” or in some cases not warranted—which is effectively “denial,” or at least “skepticism.” While denial, as indicated earlier, is a deliberate or subconscious belief that catastrophic outcomes will never occur, or will only occur to others, skepticism is a judgment, made with or without a deep understanding of all the facts and data, based on the conviction that “bad science” has been used. Coupled with those two is often the “gambler” attitude that while earthquakes will occur, the rare “design earthquake” will not occur in the gambler’s lifetime. Not here.
In most universities, the baccalaureate civil engineering degree provides general courses with basic exposure to design codes. Emphasis is generally on gravity loads and on generic codified earthquake or wind loading requirements, presented in a way that allows them to be applied without special knowledge of the nature of these loads or of the philosophy underlying these design requirements. This is also true of the graduate civil engineering curriculum in many universities. With some notable exceptions, the places where this seems not to be the case are where disasters have struck before. Without such a precedent, there is apparently little incentive to do otherwise.
REMEMBERING THE PAST IS NOT A HARBOR OF THE FUTURE
On September 19, every year, Mexico City holds an earthquake drill. Partly in commemoration of the 1985 earthquake that killed thousands on that date, partly as an earthquake preparedness measure, at 11 a.m. on that day, all buildings are evacuated. Everybody then goes down the stairs of their building—calmly, because everybody knows that it is a drill, that everybody will reach the street safely, and that it will be back to work after that bit of morning exercise.38 The older residents remember the traumatic devastation of 1985, with its more than four hundred collapsed buildings and ten thousand casualties.39 Yet, to those in their thirties or younger, this is only a historical event—much like Pearl Harbor is to nearly all Americans today: something to read about in textbooks.
Understandably, without the benefit of having lived through a disaster, without any sensory imprint to sharpen the senses, the younger Mexicans cannot approach the drill as seriously as their seniors can, and might go down the stairs while texting their friends. Textbook memory is worthless compared to lived memory.
That is not a Hispanic cultural trait but rather a universal one. How many people will admit to being so fed up with fire drills that they close their office door and continue working when an alarm is triggered? Those with an office on the ground floor or near an exit can argue that they will smell the smoke and have time to run out before roasting when it is real, but what about those who cannot be bothered with going down two, five, ten, twenty, or thirty flights of stairs? Some people argue in favor of unannounced fire drills, to include the element of surprise as part of the drill and to avoid it simply being an exercise in herding blasé participants, whose behavior bears no resemblance to what would happen in an actual fire, to the exits. Others argue that unannounced drills have the undesirable consequence of many people ignoring the alarms during an actual fire, thinking that it is another bloody, annoying drill—until they realize that it is not the case and it is too late to escape.40
Anyway, two hours after the September 19, 2017, earthquake drill, thirty-two years to the day after the commemorated 1985 disaster, a real earthquake hit Mexico City. Apparently, for those who trust bogus statistics, September 19, like January 17 (for Northridge and Kobe), is a very propitious date for earthquakes. Indisputably, people who survive an earthquake learn a few things. That does not necessarily mean that they learn everything that could potentially be learned. After the 1985 earthquake, the Mexican building code was tightened, reconstruction happened, major efforts were made to improve earthquake preparedness, and some corruption problems were resolved.41 Still, during the 2017 earthquake, 220 people died and forty-four buildings collapsed in Mexico City—smaller numbers than in 1985, but a far cry from zero, and disappointing given that it was a much smaller earthquake this time around (magnitude 7.1 instead of 8.0, although striking much closer).42 Most of Mexico City is built on an old lakebed, which means that most buildings in town are sitting on top of more than 150 feet of soft clay.43
Such a thick deposit of soft soil has its problems. One can think of Mexico City as sitting on a gigantic bowl of Jell-O: small shaking at the bottom of the bowl is enough to madly wiggle the top. Likewise, earthquake vibrations at the bottom of the lakebed are greatly amplified by the time they reach the surface, where everything bounces around more wildly—so to speak. This makes it possible for distant earthquakes that normally would have no impact hundreds of miles away from the epicenter to become destructive events. The devastating magnitude 8 earthquake that killed so many in 1985 had its epicenter three hundred miles away; the 2017 one was about one hundred miles away. Considering that seismic waves travel at eighteen thousand miles per hour,44 it takes twenty seconds for these waves to travel one hundred miles. This inspired Mexico, in response to the 1985 disaster, to develop an early-warning system. As one of Mexico’s points of pride, it was launched in 1993.45 Thanks to the early-warning system, hundreds of thousands of Mexicans received an alarm on their cell phone, warning them of the incoming seismic waves racing toward town like a runaway train. Whether anyone can do anything that makes a difference in so few seconds is debatable, and depends on a number of factors, but the warning system was credited for the fact that fewer casualties occurred than would have been the case without it.46
However, another reason why fewer people died—other than the fact that the 2017 earthquake was much smaller than the 1985 one—is because Mother Nature can be an ally by weeding out bad buildings with each earthquake. If the source of the problem (that is, bad buildings) is eradicated by the earthquake (because they collapse) and no new problematic buildings are added to the existing inventory, this should help achieve a better survival rate in future earthquakes. That is the approach taken by Chile when it comes to unreinforced masonry buildings. Recognizing that this kind of construction is known to be at high risk of collapsing during an earthquake, laws were enacted years ago to stop construction of such buildings there, but it is most difficult to force people to abandon the existing ones constructed prior to the ban, and fixing them to make them safer during earthquakes is more expensive than most owners can afford. As a result, the Chilean strategy is simply to wait for them to be eventually “wiped out” of the building inventory, one earthquake at a time, one collapse after the other, and to outlaw their reconstruction.
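Returning to the warning-time arithmetic quoted in the paragraph above, here is a back-of-the-envelope sketch in Python using the eighteen-thousand-miles-per-hour figure from the text (actual seismic wave speeds vary with wave type and geology):

```python
# Back-of-the-envelope travel time of seismic waves, using the 18,000 mph
# figure quoted in the text; real wave speeds depend on wave type and geology.

WAVE_SPEED_MPH = 18_000

def travel_time_seconds(distance_miles: float) -> float:
    return distance_miles / WAVE_SPEED_MPH * 3600  # hours converted to seconds

for label, miles in [("2017 earthquake, ~100 miles away", 100),
                     ("1985 earthquake, ~300 miles away", 300)]:
    print(f"{label}: about {travel_time_seconds(miles):.0f} seconds of travel time")

# About 20 and 60 seconds, respectively: the narrow window an early-warning
# system has to work with, minus detection and alert-delivery delays.
```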
However, be it cigarette smoking or shoddy construction practices, bad habits not only die hard, but also tend to return after some time. As such, surviving an earthquake is good in that it brings better things forward, but time is an enemy that can erode gains in awareness. Similar to the often-heard truth that “institutions have no memory”—and are thus prone to repeat the mistakes of the past—societies equally lose memory because hard-earned wisdom can be forgotten within a generation, if not sooner.
ON THE DISASTER TRAIL
Not a Big Deal
When my wife, son, and I arrived in Berkeley in the early 1980s, the area was already suffering from a severe housing shortage. We were lucky, midway through the first academic year, to be able to move into one of the university apartments in the Smyth-Fernwald complex. The hillside residence complex had been built in 1945 as dormitories for women to ease the housing crisis created by the large number of GIs enrolling at the university when World War II ended, and the dormitories were converted into apartments for married students in 1970.47 To put it mildly, these were Spartan accommodations, but conveniently close to campus. As a bonus, we had a great view of the Bay Area and—as a bonus to a structural engineer—of the Bay Bridge and the Golden Gate Bridge (fourteen miles away as the crow flies). During our time there, we were surprised to hear many Californians comment that earthquakes were “not a big deal.” When learning that I was studying earthquake engineering, some even went as far as adding that this was all very interesting but that there was no reason to worry about earthquakes—they were problems in the old days, maybe, but not nowadays. These were obviously not words from engineers, but still mind-boggling statements considering that the 1906 San Francisco earthquake had happened right across the bay.
A few years after we left California, during the 1989 Loma Prieta earthquake, one of the Smyth-Fernwald buildings moved off its foundation, but since the epicenter of the earthquake had been quite far away, structures on the Berkeley hills did not suffer any major damage. However, this wake-up call reminded the university that it was pretty much sitting on top of the Hayward fault (mea culpa: our apartment was 550 feet from it), so it embarked on a massive program to identify and retrofit its seismically vulnerable buildings. The Smyth-Fernwald housing complex was flagged as problematic, because not only did the Hayward Fault run through it,48 but also the hill slope on which the complex was built was not considered to be seismically stable. Therefore, the apartments were demolished in 2013.49 A sad outcome for something that is “not a big deal.”

Airport Proctologists
POST 9/11 AIRPORT SECURITY
Once upon a time, it was common for airline pilots to invite little kids—accompanied by at least one parent—to visit the cockpit during long-haul flights. With kids sitting on their knees, some pilots would even temporarily turn off the automatic pilot system, to veer the plane a tad bit left, a tad bit right, just for fun.1 In some rare cases, a lucky passenger would even be invited to sit in the cockpit’s jump seat through landing.2 That was eons ago, when the fun was in the entire flying experience, rather than limited to the goofy arrival and departure announcements by flight attendants of discount airlines.
Every now and then, concerns were raised that easy access to the cockpit was an invitation to hijackers. There certainly was no real barrier preventing a passenger from reaching and threatening the pilot—and many did.3 According to the Guinness Book of World Records, the first recorded hijacking took place in 1931 in Peru,4 when an eight-passenger tri-motor airplane—the kind of rust bucket with a square fuselage of corrugated sheet metal that Charles Lindbergh and Amelia Earhart flew, but slightly larger—was surrounded on the ground by members of a revolutionary army. In-flight hijacking started in the 1940s, more so with commercial airline flights,5 but certainly not exclusively.6
Then, through the 1950s and 1960s, it became more common, as all kinds of attention-seeking psychopaths and rebels with a political or social grievance found it trendy to seize control of airplanes. Between 1968 and 1972 alone, there was an attempted plane hijacking every 5.6 days.7 In response, by the end of 1972, the US Federal Aviation Administration (FAA) required screening of all passengers and carry-on luggage, which led to the broad implementation of metal detectors and X-ray machines in airports. It helped a bit, as the number of hijackings dropped to a handful in a good year and a few dozen in a bad year.8
Still a significant number, including many cases where hard lessons were learned too late—when learned at all.
For instance, in 1999, a Japanese man wrote to All Nippon Airways, multiple Japanese government agencies, and newspapers that he had discovered multiple security flaws at the Haneda airport. It may not have helped the credibility of his claims that he also asked in the same letter to be hired as a security guard, so he was ignored by all. Yet the flaws were real, and to prove his point, he took a flight from Osaka to Haneda with an eight-inch-long kitchen knife in his checked luggage. He collected his bag on arrival and used the unguarded passageways that he had identified as a security flaw to reach the departure gates without going through any security checks. Technically, he could have made his point very clear by pulling the big knife out of his luggage in the middle of the departure gates area—even running around while swinging it wildly and shouting “Banzai!” if needed, for more spectacular effect. Instead, he quietly boarded an All Nippon Airways Boeing 747. After take-off, he pulled the knife out of his bag, entered the cockpit, forced the co-pilot out, stabbed the pilot, and took command of the plane for a ride—a frightening idea given that he had never piloted before. The airplane dropped tens of thousands of feet, down to an altitude of one thousand feet, before he was neutralized.9 In response, after the hijacking, the Japanese authorities reviewed security procedures in all airports across the country and apparently eliminated all such security flaws.
Typically, hijackers took passengers hostage, seeking ransom money, a free ride to some country offering political asylum, or concessions such as the release of political prisoners.
Some governments complied with such demands, some did not. For example, in 1985, Trans World Airlines Flight 847 to Rome was hijacked after taking off from Athens. It was diverted to Beirut, where nineteen passengers were traded for fuel; then to Algiers, where twenty more were dropped; back to Beirut, where the hijackers killed a passenger, dumped his body on the tarmac, and sent seven passengers who had Jewish names to a Beirut prison; back to Algiers, where seventy more were released; and back to Beirut, where they managed to get the release of an Islamist accomplice from a Greek prison in exchange for eight Greek passengers.10 All that bouncing around like a ping-pong ball happened while the United States refused to negotiate with the same hijackers—who had requested the release of 766 prisoners held captive in Israel in exchange for the forty US passengers on board—on the basis that it would only encourage further hijacking.11 After it appeared that no more juice could be squeezed from various governments, all hostages were eventually released and the hijackers walked away unpunished, since Beirut was the capital of a rogue country (Lebanon) in the middle of a fifteen-year civil war.12
Things changed, though, in 1994, when the Armed Islamic Group of Algeria hijacked an Airbus 300 and its more than two hundred passengers, with the goal of crashing it into the Eiffel Tower, or at the very least blowing it up above Paris.13 The mujahidin had boarded Air France flight 8969 with machine guns and explosives, but thanks to a lack of subtlety, these armed criminals were identified by ground security before the plane took off. To make it worse, it never occurred to the hijackers that the majority of the passengers on board would be Algerian compatriots—more problematic than a planeload full of infidels. The standoff lasted three days and ended up with a twenty-minute raid of the plane by the French counterterror unit and snipers that, in the end, killed all the hijackers.14 In response, Air France stopped flying to Algeria for eight years.15 Beyond that, the French gave medals to the airplane crewmembers who survived the ordeal, posthumous honors to those who died, accolades to members of the counterterror units, and considered the entire operation a great success. Somehow, overlooked by everybody worldwide was the fact that, for some players, the hijacking game had changed; the objective was no longer to use passengers as hostages in exchange for various concessions, but rather to use commercial planes as weapons—like missiles or bombs.16 Parenthesis: In the eyes of the author, who is a structural engineer, anyone who wishes to destroy a landmark structure—regardless of the cause of action or legal, theological, social, or political theory pled or asserted—shall be deemed nothing more than a complete moron.
Irrespective of the directives, legal opinions, and pedagogical wisdom of Piaget, the Summerhill School, Child Protection Services, UNICEF, and the Global Initiative to End All Corporal Punishment of Children, bullies who laugh while stomping on the sand castles of other kids are on the path to becoming adult morons and deserve a damn good slap on the back of the head before it is too late. Do not touch the Eiffel Tower!
In spite of all this, cockpit doors in aircraft prior to 2001 remained a rather thin divide between the pilots and their passengers. In fact, some international airlines did not require the doors to be locked during flights,17 which makes one wonder how many passengers looking for the toilet mistakenly found themselves entering the cockpit—and how many were too drunk to realize their mistake. US airlines required that the cockpit be locked, which was undoubtedly of great assistance to drunkards who randomly pushed all visible doors, but both pilots and flight attendants had keys, and the flimsy cockpit doors were not exactly breach-resistant—they had been punched through on multiple occasions by deranged passengers.18 Then, 9/11 happened, which was a “big earthquake” for the airline industry. Except for those who have been without contact with world history since 9/11—either for having lived in a monastery or for having spent every hour of their waking life playing video games while flunking high school—everybody knows that on September 11, 2001, nineteen fanatics hijacked four airplanes. Three were flown into US landmarks, namely the two World Trade Center towers in New York City and the Pentagon in Washington, while the fourth one crashed in a field on its way to the White House when the passengers learned what had happened to the other planes and fought to regain control of the plane.19 The entire US airspace was closed for three days while government agencies tried to figure out how it happened and how to tighten security in case more of the same was waiting in the wings.
In the months that followed, many heads of US agencies, and even the president of the United States, stated that nobody had foreseen that terrorists could possibly ram hijacked airplanes into buildings.20 Hard to believe, given statements to the contrary by many other former US officials, such as the former FAA security chief, the former head of the CIA’s counterterrorism operations, and even officials at the World Trade Center and Pentagon, who all had expressed concerns that commercial airplanes could be used as weapons. The FAA’s very own self-published history recognized that in 1972 the hijackers of Southern Airways Flight 49 threatened to crash the DC-9 into the Oak Ridge National Laboratory’s nuclear reactor if their demand for a $10 million ransom was not met.21 In 1974, the perpetrator of another failed hijacking had documented on an audio tape that his intention was to crash the plane into the White House. Similarly, terrorists captured by the FBI in the mid-1990s confessed that they had been planning to dive a hijacked airplane into the CIA headquarters. Even the North American Aerospace Defense Command (NORAD) ran simulations from 1991 to 2001 that considered a commercial aircraft hijacked for the purpose of crashing it into a landmark building in the United States.22 Not to forget that “not having foreseen” such a scenario would imply that after the Air France flight 8969 experience, everybody assumed an ostrich position and buried their heads in the sand to make the problem go away.
Regardless, on 9/11, the big earthquake had now happened and action was called for. In response, the FAA recommended the hardening of existing cockpit doors, and all major US airlines implemented these measures within six months (it only became a mandatory requirement in November 2003). The Transportation Security Administration (TSA) was also created to put the federal government in charge of airport security, taking over this responsibility from miscellaneous private security companies. That is when nail clippers and forks became potential terrorist weapons and hour-long lineups at security screening became common occurrences.
Then, on December 22, 2001, on American Airlines Flight 63 from Paris to Miami, another fanatic attempted to detonate plastic explosives stashed in his shoes—unsuccessfully, as passengers noticed his repeated attempts to light the damp fuse sticking out from his shoe and ganged up to restrain him.23 In response, the TSA started requiring that all shoes be removed and screened separately.
On August 10, 2006, British Intelligence arrested twenty-five fanatics who had been plotting to use liquid bombs to blow up multiple airplanes over the Atlantic Ocean. In response, the TSA started prohibiting the transportation of liquids, aerosols, and gels in carry-on luggage, except in minute amounts. This also resulted in countless water bottles being thrown into garbage cans near airport security screening, and a boom in the sale of water bottles past security screening.
On December 25, 2009, on Northwest Airlines Flight 253 from Amsterdam to Detroit, another fanatic failed to detonate plastic explosives hidden in his underwear and set his pants on fire instead, inflicting second-degree burns on his thigh and genitalia24—maybe with the side benefit of making it harder for this “underwear bomber” to reproduce.
In response, the TSA used the event to accelerate the implementation of full-body scanners across US airports—the wonderful technology that reveals the surface of the skin under clothes, which many consider to be a kind of indecent strip search and an invasion of privacy.25
In response, in response, in response. Always in response. That is at the crux of the problem. It is fortunate that explosives have never been found in certain body cavities of terrorists; otherwise, in response, passengers would now also have to be screened by a TSA proctologist when going through security.
SECURITY MEASURES: FOOLPROOF OR FOR FOOLS
In spite of best efforts and best intentions, it remains that no security measure is foolproof, because there is no such thing as absolute safety. For example, it has long been recognized that even if a vault door as thick and secure as those used in banks were installed to secure access to the cockpit—which would be prohibitive in airplanes due to its excessive weight—that by itself would not be a guarantee of ultimate safety. Simulations have shown that a pack of terrorists would only need three seconds to rush into the cockpit when the pilot opens the door to go to the toilet.26 Therefore, after all airplane cockpit doors were replaced by the more secure ones mandated by the TSA, a new procedure was developed that required flight attendants to block the aisle with the service trolley to keep passengers at bay when the pilot leaves the cockpit to use the front toilet.
Another security hole plugged!
Does all of the above now make airplanes impossible to hijack? Having grandma and grandpa “strip searched” by a machine after throwing away their water bottle might make everybody feel that amazing measures are now in place to keep terrorists at bay. But, arguably, an unstated objective of creating the TSA and tightening up the screening measures in airports may have been, in large part, to give the public the perception of safety and thus save the commercial airline industry from bankruptcy. Surely, the bar has been raised for anyone planning to hijack an airplane, but that has not made hijacking impossible. In fact, in 2017, inspectors from the Department of Homeland Security whose job is to travel incognito from airport to airport with guns and explosives in their carry-on luggage to test the effectiveness of TSA screening found that they were able to smuggle them through security checkpoints 80 percent of the time. This was seen as a major improvement over 2015—fourteen years after 9/11—when the TSA failure rate at detecting these weapons was reported to be 95 percent.27 When attempting to create a security barrier by building on job experience and skills, it certainly does not help when one in four TSA screeners quits the job within six months;28 in part, because screeners are among the lowest-paid federal government employees, with a starting salary close to minimum wage—less than if they worked at the airport sandwich shop.29 Yet, even if airport security screening were watertight, that would not make hijacking impossible. Anyone who has flown business class has observed that rules in that cabin are more relaxed. While the masses in economy class get plastic forks and knives—because a butter knife might be too much temptation for a budding hijacker—the elite in business class on international flights sometimes enjoy real cutlery. While the proletarians in economy class will be yelled at by the flight attendant if they attempt to go to the toilet while the seatbelt sign is on—and has been on for three hours—the rule does not always apply to the privileged few who shelled out four times more money on the airfare. Nothing prevents a malicious group of individuals from buying all the seats in the business cabin, to have an easier time storming a cockpit defended by a 120-pound flight attendant and her service trolley—or from buying all the seats in an airplane, for that matter, to eliminate the presence of heroic passengers who might interfere. Where is it said that someone on a suicide mission cannot splurge on a business class ticket and the comforts it will bring for an hour or so before dying?
However, it may not even be worth the logistical trouble for terrorists to develop such a plan when so many other options exist. Bombs can be detonated in trains, as was done in 2004 when ten near-simultaneous explosions in four trains in Madrid left 193 people dead and roughly 2,000 injured,30 or in subways, as was done in 2005 when bombs exploded on three London Underground trains as part of a coordinated operation that killed 52 and left more than 700 injured. Even easier to achieve, trucks can be used as weapons, as was done in 2016 when a nineteen-ton cargo truck was purposely driven into a crowd celebrating the national holiday in Nice, killing 86 and injuring 458—roughly as deadly as the London Underground attack, without even needing explosives.
Yet, while hundreds of billions of dollars are being spent on airport security, nothing is done to control access to trains and subways31—except maybe for the addition of security cameras that can provide after-the-fact information.
The truth of the matter is that nothing happens for a number of reasons. First, trucks and trains cannot be broadly “weaponized” easily—meaning that they cannot be crashed into the White House, as could be done with an airplane. People who take a train assume the risk that comes with it, while innocent people not on the train will not become victims. Second, there is truth in the “official line” that providing rail stations, subway stations, and bus stations with the type of screening done at airports is impractical and would bring the transportation network to a halt.32 The normal activities of life cannot stop because there are ill-intentioned people and nutcases in this world. Nobody wishes to see massive concrete barriers installed along the edge of every sidewalk in crowded cities just in case someone goes on a truck rampage.
However, there are many instances—universally, not just on terrorism-related issues—when things can be done and should be done to prevent events with massive undesirable consequences, but are not done. Instances when, after the fact, the population will insist that these preventive measures should have been implemented before. That is not only true for terrorism, but also for every other type of low-probability, large-consequence event, symbolically called “earthquakes” for the current purpose.
“WHY WAS IT NOT DONE BEFORE?” ASKED THE OUTRAGED
Like it or not, as illustrated by the above example, it is part of human nature to react after the fact, rather than plan beforehand. While all would agree in principle with the wisdom of expressions like “an ounce of prevention is worth a pound of cure,” all that sagacity tends to be thrown out the window when the said ounce of prevention requires an effort or entails some costs. That is when the idiom “we’ll cross that bridge when we get there” takes over, because there is always something more important, urgent, enjoyable, or rewarding that will take precedence in the grand list of things to be done and that will monopolize all available time and resources. Everybody can think of things that should be done at home or at work “when time allows.” In fact, the excuse “I’ll do it when I get around to it” is so often heard that those tired of waiting can purchase wooden tokens with a large “TUIT” engraved on both faces, to give friends or relatives the round “tuit” that they so desperately need.
As a result, when things get done, they are often done “in response” to the disaster that could have been prevented or mitigated if the actions had been taken before the event. All of a sudden, the extensive damage, the body bags, the tragic stories of loss and pain, provide inescapable evidence that something should have been done, and push the topic to the top of the list. “Something needs to be done, now!” is trumpeted. As if patching potholes in a poorly designed road could compensate for failure to have designed a good road in the first place. As if asking a professor, after having flunked the final exam, if there is any extra work that could be done to pass the class, instead of having done the assignments to learn the material before the exam (yes, that question gets asked). As if trying to shove some of the toothpaste back into the tube makes any sense.
It obviously cannot. Nonetheless, the post-event flurry of activities to clean up and rebuild—be it well intentioned or simply the product of weasel politics—gives the impression that those in charge are all doing what they are supposed to do, in a capable and responsible way. If Penn and Teller (the illusionists renowned for revealing how magic tricks are done) had to describe this, they might call it an effective misdirection to hide the fact that the very same people failed to do what should have been done years earlier. It recasts the “do not fix it until it is broken” dunces from before the event as “let’s fix the damage” men and women of action after the event—conveniently blurring the fact that the whole thing might have been set up to fail by the inaction of those very same men and women.
“Even a fool may be wise after the event,” as stated in The Iliad.33
AHEAD OF THE GAME: WHAT WAS DONE BEFORE, NOT AFTER!
There are obviously examples that will come to mind of individuals and organizations that are preparing ahead of time for disasters. Particularly when it comes to agencies and corporations, it may appear today that they are “ahead of the game” and truly getting ready to prevent disasters.
Indeed, some of them truly are. However, in some cases, these outstanding activities to prepare against future damage have happened after massive losses or near misses, or because the sight of damage to others prompted them to assess their vulnerability.
For example, consider the Loma Prieta earthquake of 1989—known as the World Series Earthquake because it struck thirty minutes before the start of Game 3 between the Oakland Athletics and the San Francisco Giants, making it the first major US earthquake to happen live on television.34 When the Goodyear blimp originally intended to provide overhead views of the baseball game flew around town instead and broadcast views of the extensive damage to the Bay Bridge and elevated freeways, as well as of the fires burning in San Francisco, the message was loud and clear. Most of the civil infrastructure in Northern California had been built over decades during which no major damaging earthquakes occurred, and for the most part in an era when knowledge of how to design it to avoid fatal damage during an earthquake was seriously lacking. In some cases, prior to the Loma Prieta earthquake, the vulnerabilities of this infrastructure were known, but there were either other more pressing priorities to address or no budget to do anything about it.
When the shaking started, for some, it was unfortunately fatal. For those who survived, however, the earthquake turned out to be a good thing.
Seeing the Oakland Athletics win the World Series when playing resumed, after a roughly two-week delay, might have somewhat lifted the spirits of the residents on the east side of the San Francisco Bay. However, losing the World Series would not have been as bad as losing access to water because of an earthquake. Following the earthquake, the East Bay Municipal Utility District (EBMUD), which supplies water to more than one million people living along the east side of San Francisco Bay, decided to assess the vulnerability of its entire distribution system. Results did not look good, in part because most of the utility’s water at the time came from farther east and reached the thirsty East Bay by crossing the Oakland hills through a tunnel built in 1929. It so happened that the east-west tunnel intersected the Hayward fault, one of the most dangerous faults in California. Although forgotten by most—and overshadowed by the 1906 San Andreas Fault earthquake—the 1868 earthquake that occurred along the Hayward fault was one of the most destructive in California’s history.35 The past six earthquakes on that fault have happened on average 150 years apart (ranging from 95 to 183 years).
Seismologists expect a magnitude 7 earthquake to occur on that fault anytime36—in fact, looking at the average, simple math shows that it is overdue.
The EBMUD study showed that the tunnel was likely to be ruptured by slippage along the Hayward fault, cutting the water supply to eight hundred thousand people for a period of up to six months.37 By 2007, nearly twenty years after Loma Prieta, construction of the complex engineering solution to overcome this deficiency was completed. Looking at it with today’s eyes, it may be seen as outstanding foresight and evidence of thinking ahead of time. However, in retrospect, this seismic improvement program largely happened in response to the Loma Prieta earthquake.
In response—again.
Likewise, shortly after the Loma Prieta earthquake, California Governor Deukmejian appointed an Independent Board of Inquiry to focus on bridges. The Board was tasked to report on why so many bridges collapsed or suffered significant damage during the earthquake, and to recommend what to do with the more than twenty-two thousand bridges in the state (half of those being maintained by the California Department of Transportation, also known as Caltrans) to prevent future similar destruction.38 The Board’s final report stated: “The fiscal environment at Caltrans in the last two decades seems to have inhibited giving the necessary attention to seismic problems. Many items ranging from research on earthquake engineering to seismic retrofitting were placed in low priority because of the limited possibility of funding due to budget constraints.”39 Reading between the lines, this was a polite way to tell the governor that Caltrans needed a round “tuit” when it came to fixing the state’s bridges, but that the government—and thus the governor—always had more important things to do. Decades before, after the 1971 San Fernando earthquake during which many bridge spans simply fell off their supports, Caltrans had embarked on a project to add restrainers to tie spans to their supports; yet, it took seventeen years to execute this $54 million program—effectively $3.2 million per year, out of a state budget of approximately $50 billion.40 When it came to the Department of Transportation’s budget, the priority in California was not to fix bridges but to invest in roadwork to relieve traffic congestion—obviously, still a work in progress fifty years later.
Why is so much being said about what California does, and so little about what other states do, when it comes to earthquakes? Simply because most other states have not had damaging earthquakes in recent history—the “not so recent” ones being long forgotten. Therefore, while some of these other states had the prescience to retrofit key bridges, others still have not found a round “tuit” in their pockets— but they sure will not fail to do so after the next damaging earthquake.
In response . . .
Examples also abound for all other kinds of “earthquakes” considered here. For instance, there has been a back-and-forth dance in coastal regions about whether or not to bury power lines to minimize power outages due to line damage during windstorms. In 2019, Florida passed legislation requiring power companies to move outage-prone portions of their distribution network underground.41 Some utilities, like Florida Power and Light, already have 40 percent of their lines buried, as this is an effective solution to eliminate damage to the power lines from flying debris and falling tree limbs during hurricanes. In contrast, North Carolina decided against doing so after assessing that it would be too expensive, translating into possible electricity rate increases of 125 percent.42 Note that buried power lines are in principle watertight, unless damage occurs to the PVC pipes in which they run, which could happen in the case of uneven soil settlement. Repairing buried power lines is substantially more time consuming and expensive, but it will always be possible to “cross that bridge when getting there.” It may or may not be wise to wait for a disaster before acting. Arguably, to make an enlightened decision, one would need to know the odds that the Big Bad Wolf will come. Answering this question requires diving into the world of statistics and probabilities, for a quick dip, hopefully without drowning.
ON THE DISASTER TRAIL
Kiwi Airport Security
At one point during a three-month stay in Christchurch, New Zealand, in 2010, I had to take a domestic flight from Christchurch to Wellington on Air New Zealand. After grabbing my boarding pass at the check-in pod in the old domestic terminal (replaced since by a more modern facility), I proceeded to the departure doors. I spotted the X-ray machine and metal detector ahead—in my mind, standard airport equipment in the post-9/11 era—and lined up waiting for my turn. When the officer checked my boarding pass before pushing the plastic bin in which I had deposited my laptop, wallet, and keys through the X-ray machine, he looked at me and said, “You are at the wrong place. Your flight leaves from there.” He pointed at Gate 6, behind me. Sure enough, there was my gate, outside of the security-controlled area. It had never occurred to me that there could be a departure gate where nobody needed to go through a metal detector and an X-ray machine before boarding the plane. Anybody could have bought a ticket online, printed their boarding pass, and walked inside the Wellington-bound plane without so much as an inquisitive stare. The entire Osama bin Laden fan club of New Zealand could have boarded the twin-engine turboprop, and it would have been greeted by the same smiling Air New Zealand flight attendants as everyone else. No need to hide anything in shoes, water bottles, underwear, or body cavities.
After landing, I mentioned to my host in Wellington my surprise, not at the lax security, but at the total absence of any security check. He smiled and said, “Why would any terrorist want to blow up kiwis? That would not make much of an impact now, would it?” All a matter of perspective, evidently.

Statistical Hocus-Pocus
LYING WITH STATISTICS
When it comes to understanding hazards and how to cope with them, having a good understanding of statistics is valuable. Unfortunately, understanding statistics is not always as simple and intuitive as calculating batting averages. In fact, human intuition and experience too often lead to false expectations when it comes to statistical outcomes.
Imagine that a coin has been flipped five times and that on each of the five times, the outcome was tails. If one had to predict the outcome of flipping a sixth time, what is the best bet? Most people would argue that the next toss is more likely to be heads.1 They reason that hitting tails six times in a row is quite a rare event, so after five “tails” in a row, “heads is about due.” Yet, this is not only false, but a misconception so common that it has been given its own name: the “gambler’s fallacy.”2 In reality, past results do not in any way affect the outcome of the next toss: when flipping a coin, there is always a fifty-fifty chance of each outcome. Of course, the coin could also land on its edge and not fall on either side, but that is an amazing feat of equilibrium with a very low probability of occurrence when tossing the coin on a hard surface, so that possible outcome is discounted here as it is beside the point.3 The gambler’s fallacy exists because human intuition has some interesting biases when it comes to games of chance.
One of these biases is to treat random events as if they were not random. The human experience in many ways suggests that long sequences of repeated identical outcomes are never sustained. If it is sunny for ten days in a row, then a rainy day ahead is expected; make that twenty sunny days in a row and the rainy day ahead is believed to be even more of a sure bet. Therefore, in the above example, since tails comes up five times in a row, then, in fairness, it should now be heads’ turn. Unfortunately, a coin being tossed has nothing to do with fluctuations in weather, and a coin could not care less about human experiences and biases; it is only a stupid piece of metal that always has an equal chance of landing on heads or tails—or on its edge once in an eon to stun everybody.
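For readers who would rather let the coin speak for itself, the short simulation below (a sketch in Python, not part of the original text) flips a virtual coin many times and confirms that, even after five tails in a row, the sixth flip still comes up heads about half the time.

```python
import random

random.seed(1)

# Flip a virtual coin in batches of six and keep only the batches that
# start with five tails in a row; then check how often flip six is heads.
trials = 1_000_000
qualifying = 0
heads_on_sixth = 0

for _ in range(trials):
    flips = [random.random() < 0.5 for _ in range(6)]  # True means heads
    if not any(flips[:5]):        # the first five flips were all tails
        qualifying += 1
        if flips[5]:              # the sixth flip came up heads
            heads_on_sixth += 1

print(heads_on_sixth / qualifying)  # hovers around 0.5, not "heads is due"
```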
Likewise, imagine a lottery game that requires picking six numbers between 1 and 70. The gambler’s fallacy in this case is to think that picking the set of numbers 1, 2, 3, 4, 5, 6 is less likely to win than 4, 11, 27, 33, 59, 65. Ask someone next to the lottery ticket vending machine, “I have never played this game before. Should I pick 1, 2, 3, 4, 5, 6 for my lottery ticket?” and watch reactions. Most people would confidently argue that it is better to pick a nonsequential set of numbers, or state outright that one would have to be nuts to pick a series of consecutive numbers. However, in reality, the probability of winning is the same as selecting nonconsecutive numbers. In fact, for any combination of six numbers, there is the same 1 in 130,000,000 chance of winning, no matter how laughable the sequence of numbers selected. The only sure way to win is to buy 130,000,000 tickets to cover all possible combinations, which, incidentally, might be a good idea if the jackpot on a given week exceeds 130,000,000 times the price of an individual ticket, as long as the grand prize that week does not have to be split with another winner.
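As a rough check on the “1 in 130,000,000” figure, a few lines of Python (a sketch, assuming a generic pick-six-of-seventy game like the one described above) count the possible tickets:

```python
from math import comb

# Number of ways to choose 6 numbers out of 70 when order does not matter
total_combinations = comb(70, 6)
print(total_combinations)        # 131,115,985, roughly the 1 in 130,000,000 quoted above

# Any single ticket, whether 1, 2, 3, 4, 5, 6 or 4, 11, 27, 33, 59, 65, is
# exactly one of those combinations, so each has the same chance of winning.
print(1 / total_combinations)
```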
Interestingly—as there are always exceptions to every rule—some contrarians (math or psychology students?) purposely pick combinations that most people avoid. Notably, 2,014 people won the grand prize of 5,000 times their bet when North Carolina’s Pick 4 lottery winning combination on June 22, 2019, turned out to be 0, 0, 0, 0, with a record payout of $7.8 million.4 Given that the odds of winning are 1 in 9,999, and that the lottery is held twice daily, the odds for a winning combination with four identical numbers are 1 in 999, or once every 2.73 years; this relatively short return period creates “visibility” and might explain why such “quads” are among the most popular bets placed in this specific lottery. Indeed, the 1, 1, 1, 1 winning combination in 2012 was North Carolina’s previous largest payout, at $7.5 million.5
Psychologists have had a ball trying to explain why the gambler’s fallacy exists in the first place and why it is so hard to overcome—from psychological to neurological studies, and everything in between. To put it bluntly, statistical literacy is not ingrained in human genes because it apparently has not been essential for survival—just as ignorance in itself has never been a barrier to the evolution of the species. As a viral online joke says: “We hate math, say 4 in 10—a majority of Americans.” Given that none of the World Statistics Congresses since 1887 have been held in Las Vegas, it might be fair to assume that it is likely easier for a trade association that wants to hold a convention there to get comp rooms and massive hotel discounts if its members do not fully grasp why slot machines are also called one-armed bandits.
It did not take long for the advertising industry to realize that hazy statistics can be used advantageously to influence a mathematically challenged public. How to lie with statistics is an art and a game that can be played in many ways. Some of the most basic tricks consist of:
Conducting surveys on samples having built-in biases, which then casts the results in misleading ways: For example, when the 1950s Kinsey report on human sexuality came out, it presented surprising statistics on the number of American males who reported having had sex with prostitutes (69 percent) and homosexual experiences (37 percent). Closer scrutiny of the data revealed that the sample of those interviewed, possibly due to the difficulty in gathering sexuality data in the 1950s, included a substantial number of prison inmates, including incarcerated sexual offenders.6
Playing on the fact that most people confuse median and mean: A mean is simply the sum of a series of numbers divided by how many numbers there are in that series. Take $250,000 + $500,000 + $10,000,000. If these were the prices for three homes sold in one week in a certain city, then the mean price of homes that week would be reported to be $3,583,333—a value severely affected by the one mansion with a $10 million view, but a value nonetheless that real estate agents would not hesitate to quote to qualify the other two homes as massive bargains. The median is the middle point for which half of the numbers in the series are greater, and half are lower. In the above example, it would be $500,000—the number the Chamber of Commerce is more likely to use when it wishes to report affordable housing to attract businesses. To confuse matters, since most people do not know the difference, both medians and means are casually called “averages” depending on what point one wishes to make.
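The same three hypothetical sale prices can be run through Python’s statistics module to see how far apart the two kinds of “average” land (a small sketch, not part of the original list):

```python
import statistics

home_prices = [250_000, 500_000, 10_000_000]

print(statistics.mean(home_prices))    # 3,583,333.33..., the real estate agent's "average"
print(statistics.median(home_prices))  # 500,000, the Chamber of Commerce's "average"
```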
Presenting incomplete comparisons: For instance, self-driving cars are being promoted as being safer. This begs the question: “Safer than what?” or “Safer under what circumstances?” The National Highway Traffic Safety Administration reported in 2015 that 94 percent of accidents are due to driver errors,7 so many have concluded, logically, that if the human factor is eliminated, cars will be safer. This is a big claim. For sure, nearly 100 percent of accidents are due to driver errors, because it is quite difficult nowadays to have a car crash unless there is a driver in the car. Yet some have reported that self-driving car accidents have occurred at twice the rate for regular cars.8 Still, is driver error a comprehensive basis of comparison? How about all the other factors? Drilling down into the data shows that roughly 5 percent of accidents nowadays are due to mechanical failures (namely, failures related to tires, brakes, steering, suspension, transmission, and engine) or environmental conditions (obstructed view, glare, ice, snow, fog, etc.).9 However, if self-driving cars turn out to be safer when cruising well-marked highways on sunny days, will they also be safer in heavy rain, on ice- or snow-covered roads, on gravel or muddy roads, in construction detours, at intersections operated by a police officer, or when hackers remotely take control of the car?10 Will there be other new problems? Recent surveys indicate that the majority of people would be afraid to ride in a self-driving car.11 Indeed, for anybody who has suffered the frustrations of working with personal computers or who has owned “smart” but unreliable electronic products, it might be a significant leap of faith to trust driverless cars operating with hardware and software created by the same computer programmers. As structural engineers like to say, if buildings were designed the way computers are designed, the world would be in serious trouble: after a building has crashed, it is too late to close all of its windows, reopen them, and expect everything to return to normal as if it were a laptop.
Headlining selected facts and neglecting the impact of various variables and parameters: A study was reported to have found that death rates due to accidents double for each ten miles per hour increase in speed limit. That would be an attention-grabbing headline for sure, but useless in itself. It means absolutely nothing because the conclusions have been separated from their context.
Such a headline may suggest that reducing speed on highways by ten miles per hour would have a dramatic impact on safety, but that would pervert the fact that the numbers cited actually refer to fatalities among pedestrians struck by cars—definitely not statistics related to freeway speeds. In this case, the fatality rate doubles, from 31 percent to 60 percent, for victims hit by cars at speeds of 35 and 45 mph, respectively.12 The fatality rate also triples, from 10 to 31 percent, if comparing 25 mph and 35 mph, but only increases by 25 percent if comparing 45 mph and 55 mph. In other words, the fact that the data follows a highly nonlinear curve is conveniently omitted by the headline. Note that results also vary most significantly if looking at fatalities for thirty- and seventy-year-old pedestrians, with the older victims being more frequently fatally wounded, irrespective of speed. It also is interesting that according to different data sets, the fatality rate (all ages) at 45 mph is either 40, 50, or 80 percent.13 The results also vary from country to country, as roads and sidewalks in the United States, England, and Germany are definitely not the same. Hence, everybody knows that speed kills, but heralding simplifications is also harmful.
Presenting data using clever graphs that mislead by design: An example is using axes that do not go to zero, or graphic elements whose sizes misrepresent the nature of the data. This is well illustrated by the beautiful maps of the world that come with all countries coated in the same dull-gray latex material that is used to hide numbers on lottery tickets. By using a penny to scratch off the latex that covers specific countries, world travelers reveal the map colors of all the countries that they have visited. This is equivalent to putting pins on a world map, but presumably more fun and visually appealing. However, it is also misleading. Take two budding globetrotters who have each visited four cities. The first one visited Shanghai, Moscow, Montréal, and San Francisco, and the second one Tokyo, Stockholm, Bogota, and Cairo. The first one will get to scratch off almost half the map, with China, Russia, Canada, and the United States revealed in full colors on the gray map, giving the impression of having traveled half the world already. By comparison, the second one will scratch off countries of smaller size in square miles, giving the exact opposite impression, even though the second trip involved traveling more miles and visiting more populous cities. The same kind of visual trickery is used all the time when presenting statistical results.
Using correlation between events to establish relationships that do not exist: There is obviously a one-to-one correlation between days when sidewalks are hot and days when people suffer from sunburn, but it would be crazy to claim that hot sidewalks produce sunburn.
Such strong correlations are used all the time to convincingly link unrelated phenomena. For example, it is often said that graduates of Ivy League schools are more likely to become the wealthiest members of the country, compared to graduates from other universities.
This does wonders for these universities’ reputation and recruiting, but it fails to consider that each Ivy League school’s entering class already includes a large number of students from the country’s wealthiest families.14 They could fail tons of classes, collect less than stellar grades, graduate by the skin of their teeth, and still have a wealthy future ahead—some may even become president of the United States, on slim academic accomplishments.
Hiding the variability that exists within samples or because of sample size: In particular, for marketing purposes, instead of asking one thousand people to participate in a blind taste test to determine if they prefer the product from company A or B, and reporting the results for the total number of participants, it is better to organize one hundred such blind tests on groups of ten people each. As if flipping a coin ten times, it will turn out that, in many groups, half the people will indicate that they prefer A and the rest B. In some other groups, six will prefer A and four B (or the reverse). In a smaller number of groups, maybe seven will prefer one over the other. Inevitably, when repeating the experiment often enough, with many more groups of ten, it is bound to happen that in one of those groups, participants will prefer either A or B by an even bigger percentage. It is just “luck of the draw,” but this will make it possible to report that “in a blind test” (meaning, in one of these many groups, but not saying it like that), more people preferred A over B, with precise percentages cited to back this up.15
Using large variability to mask reality: For example, millions of people write with the hope of becoming successful novelists. It is difficult to know what the average income of fiction writers is, but the extremes are well known—and top earnings well publicized. In 2018, James Patterson was the highest-paid author, with income from book sales of approximately $86 million.16 Yet, a survey in the UK revealed that, on average, authors earned less than the minimum wage.17 This is not surprising. There are more than three hundred thousand new books published in the United States alone each year—or more than eight hundred per day.18 About fifty thousand of those are fiction—thus, almost one hundred fifty per day.19 Hence, for every Harry Potter success story, there are tens of thousands of titles that go nowhere. Likewise, for nonfiction, in spite of numerous best sellers, the sheer number of unsuccessful books drags the average sales number to less than two hundred fifty copies per title.20 To make things worse, a quarter of all Americans do not read any book (and that ratio is increasing fast), and most people read fewer than five books per year.21 The odds of becoming rich by pouring blood, sweat, and tears into writing a novel are similar to those of hitting the jackpot by buying a lottery ticket for a few dollars and no work at all. Yet thousands of people write novels (instead of buying lottery tickets). Obviously, some do it because they are compulsively driven and because they have something to say, but those who do it driven by the appeal of fame and fortune are seriously misguided by the pernicious attraction of extreme values in statistical sets having enormous variability.
In fact, statisticians have grown so masterful at transforming the truth by playing games with the numbers that this is often derided by jokes. A case in point is the story of the statistician who was arrested by TSA agents at the airport because they found a bomb in his carry-on luggage. When asked why a serious scientist like him would do something so stupid as trying to bring a bomb on board an airplane, he replied that he had read that the probability of a bomb being on an airplane was 1 in 10,000, which made him uneasy about taking an airplane. However, for that given probability, the odds that there would be two bombs on an airplane can be calculated to be 1 in 100,000,000 (equal to 1/10,000 times 1/10,000), which is a much lower risk and made him feel much safer, which proved the benefit of carrying his own bomb.
The examples in the bulleted list above are only some of the tricks of the trade used to make the numbers say whatever is desired simply by casting statistical results in different lights. So, why does all that matter when it comes to earthquakes and other hazards? This deserves some scrutiny.
THE CURSE OF INCOMPLETE INFORMATION
Given the regular abuse of statistics to manipulate the public (to make people think favorably of anything, be it a specific brand of toothpaste or a politician), it should not be surprising that most people have become skeptical—if not cynical—when facing claims that are presented as facts supported by statistics. Even when those statistics have been developed by reliable sources based on complete data sets, the news media reporting on the findings often distort reality when presenting the information—accidentally or intentionally twisting reality by some of the methods outlined above—which does not help. If that happens when all the facts and all the data are known, and statistics only consist of tallying the information, imagine what happens when it comes to predictions that must rely on statistics to forecast the future by extrapolating from existing data. It should not be surprising that even fewer people pay more than passing attention in that case.
Then, one step further, who cares when these predictions are based on incomplete or sparse data? That, unfortunately, is what scientists predicting extreme events and hazards must deal with.
At one end of the spectrum, those predicting weather-related events benefit from continuous streams of data fired from arrays of instruments scattered across the world. There are weather stations collecting air temperature, barometric pressure, wind speed, humidity, precipitation, cloud cover at multiple elevations, jet stream activity, ocean temperature, wave directions, swell patterns, ocean currents, and the number of butterflies flapping their wings—and all in real time. Data is collected by ground-level stations, antennas, airplanes, geostationary satellites, and grandpa’s arthritic joints. Yet, the fact remains that nobody can exactly predict where and when hurricanes or tornados will strike. Consider the following:
On Tuesday August 27, 2019, the National Hurricane Center of NOAA predicted that Tropical Storm Dorian would hit Florida a bit south of Cape Canaveral, five days later, at 2 a.m. on Sunday, September 1.
By the end of the day, it revised the severity of the predicted wind speeds, upgrading the tropical storm to a hurricane status, with the eye of the hurricane arriving instead at 8 p.m. Sunday.
The next morning, on August 28, it further increased the severity of the upcoming impact, upgrading the storm to a major hurricane expected to hit the coast one hundred miles farther north, on the upcoming Monday (that is, still five days ahead). The Florida governor declared a state of emergency for the counties in the path of the hurricane.22 By midafternoon on August 29, NOAA predicted that Dorian would be a Category 3 hurricane, with the eye of the storm hitting the coast near West Palm Beach, two hundred miles south of the previous prediction.
By early morning Friday, August 30, the hurricane was still forecast as a direct hit on West Palm Beach, with Category 4 wind forces, but with the eye of the hurricane arriving only by Tuesday morning, and subsequently weakening along a northward path up the middle of the state. By Friday evening, the forecasted northward path was revised to instead “hug” the entire eastern coastline of Florida, with sustained disastrous hurricane-force winds all the way.
Then, the Saturday morning NOAA forecast predicted that the eye of the hurricane would not hit Florida after all, but rather curve up north and stay nearly one hundred miles away from the coast, reaching the latitude of Cape Canaveral sometime during the night from Tuesday to Wednesday. This provided a temporary sigh of relief to all Florida residents, as the tropical-storm winds of roughly 70 mph forecasted to be felt on the coast, while still strong, were far less devastating than the near record-level winds of 180 mph at the eye wall, since Dorian had grown by then to be a Category 5 hurricane.23 Throughout the Labor Day weekend, the predicted path of Hurricane Dorian remained essentially the same, forecasted to be at least a Category 4 when it would be skirting the coast, but the predicted distance of its path away from the coast as it moved north progressively shrank, bringing it closer to Florida than forecasted a couple of days earlier. By Monday morning, the predicted northward path of the eye of the hurricane had become so close to the coast of Florida that a mandatory evacuation order was issued for the hundreds of thousands of people living east of the Intracoastal Waterway, over a distance of roughly three hundred miles, from Palm Beach County24 (which includes Boca Raton and Palm Beach) up to the northern state line. This called for evacuating the barrier island parts of nine counties along the coast, including the cities of Palm Beach, Melbourne, Daytona Beach, Saint Augustine, Jacksonville, and everything in between.
Similar evacuation orders were issued for the coast of Georgia and South Carolina. While everybody in Florida waited for Dorian, the Category 5 hurricane decided to park itself over the Bahamas, battering the islands for a solid forty hours, dumping more than thirty inches of rain there while wind gusts of up to 220 mph made water levels surge by twenty-three feet.25 While the eye of the hurricane was nearly stationary, moving no faster than one mile per hour over the Bahamas,26 the various computer models crunched new predictions nonstop and reached somewhat of a consensus that the storm would eventually get in gear as a Category 4 hurricane along a northwestern path, hugging the Florida coast so closely that any small deviation from the predicted path could spell disaster on land.
By noon Tuesday September 3, when the storm did get in gear, it was declared to be a Category 2 hurricane, with maximum winds of 110 mph. While a 110 mph wind is not a small breeze, the downgrading from 220 mph brought a breath of fresh air to those that were bracing for the worst along the Florida coast.
Considering that halving the wind speed reduces the wind forces acting on buildings and other infrastructure by a factor of four, that was indeed a significant relief.
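The factor of four comes from the fact that wind pressure on a structure grows with the square of the wind speed, a standard result in wind engineering. A back-of-the-envelope check (a sketch, using the speeds quoted above):

```python
# Wind pressure is proportional to the square of wind speed (q = 0.5 * rho * V**2),
# so comparing the 220 mph gusts first feared with the 110 mph maximum winds declared:
force_ratio = (220 / 110) ** 2
print(force_ratio)   # 4.0: half the wind speed means one quarter of the wind force
```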
Yet, somehow, the hurricane lost more strength than predicted, as if none of the computer models had previously taken into account the fact that, in all of its rampage, a stationary hurricane also eventually cools the ocean, thereby depriving the storm of the warm water that serves as its fuel. The southernmost counties along the coast that had previously declared mandatory evacuation lifted these orders by midday that Tuesday.
In the twenty-four hours from Tuesday evening to Wednesday evening, the storm moved north, staying offshore far enough to spare Florida from destructive winds or storm surge. The whole coast of Florida was left unscathed, leaving the news media scrambling for something newsworthy to show; the wind and rain had been so banal that the few fallen trees and instances of localized flooding (on a few streets and parks here and there) were given extensive and repeated live coverage throughout the day. Across the state, all curfew orders were lifted and displaced people returned to their homes.
All in all, as far as Florida was concerned, Dorian amounted to a lot of anxiety and a near-miss. It did mess up some parts of the Outer Banks in North Carolina in the end,27 and later, as a tropical storm, part of Nova Scotia in Canada,28 and—of course—it was disastrous for the Bahamas. However, for Florida, the bullseye of the storm for more than a week: nothing.
Note that the NOAA forecasts focus on where the center position of the hurricane will most likely be, which is what everybody is interested to know, but they also provide a “cone of uncertainty” that is intended to illustrate uncertainty in the prediction. This cone is created by a set of circles drawn along the forecast track at the positions where the center of the hurricane is expected to be over each of the next five days ahead. It is significant that the radius of these circles is 198 miles for the forecast five days ahead and 68 miles two days ahead.29 These radii have been determined by comparing past forecasts against the actual tracks of past hurricanes and calculating how large the circles should be for those actual tracks to fall within the circles 60–70 percent of the time. This implies that when forecasting any hurricane path, even the prediction that the eye of the hurricane will remain within the cone of uncertainty is in itself wrong at least 30 percent of the time.30 Not surprisingly then, on August 27, the 400-mile diameter of Hurricane Dorian’s five-day circle of uncertainty encompassed the entire Florida east coast. A day later, it engulfed all of Florida and Georgia’s Atlantic coast. From Thursday to Saturday morning, except for the panhandle part of the state, all of Florida was within the cone. By Sunday evening, the entire Atlantic coast of New Jersey, Delaware, Maryland, Virginia, North Carolina, South Carolina, Georgia, and all of Florida (except for the bit of the state south of West Palm Beach) were encompassed within the cone of uncertainty.31 The path of a drunken sailor might have been more predictable than that.
The meanderings of Hurricane Dorian provide a good example of uncertainties in landfall predictions that ping-pong from one impact point to another—driving everyone crazy in the process, because a difference of fifty miles matters tremendously considering that the most destructive hurricane winds develop roughly over a radius of forty miles from the center of the hurricane. Wild fluctuations in predictions of where a hurricane will intersect the coast make the planning of emergency preparedness and response—to say the least—challenging.
No doubt, Category 5 hurricanes will hit Florida head-on in the future and each strike will produce extensive and heartbreaking damage—it is not a question of “if” but “when.” However, as far as human nature is concerned, ten days of flip-flopping predictions on landfall, wind speeds, and storm surge levels often harden cynical responses instead of raising awareness of the risk—a behavior well captured by the fable of the boy who cried wolf. In fact, this tendency to ignore valid warnings is so ingrained in our DNA that psychologists, philosophers, scientists, and experts from many other disciplines refer to it as the “Cassandra complex,”32 in reference to the princess from Greek mythology who was cursed with the ability to make 100 percent accurate prophecies that nobody would ever believe. Indeed, the last day of Dorian’s visit to Florida was not even over when some of those who had defied the evacuation orders and had stayed home through the storm started posting videos on YouTube, bragging that hurricanes are nothing to get excited about, peppering their “I told you so” commentaries with a pride in not buying into the end-of-the-world herd mentality—like proud conspiracy theorists.
These are the folks who do not believe that it is possible to die during a hurricane, because nothing in their life experience has approached this reality—and it is exactly because they have not yet personally experienced the devastating punch of 150 mph winds coupled with fifteen feet of water in the street that they could be the ones most likely to die in a future hurricane.
Obviously, cynics will not miss the opportunity to underscore that, given the inability of meteorologists to reliably predict if tomorrow will see rain or sunshine, only fools would bet the bank on a five-day weather forecast of any kind—even more so when dealing with forces of nature unleashed, like hurricanes. But the point remains that given the debatable accuracy in predicting the path of an extreme event like a hurricane that lasts days and for which the data that goes into the models are measured both directly and remotely on a continuing basis with hundreds of thousands of sensors, what confidence can possibly be developed for hazards at the other end of the spectrum, where the data needed to make predictions cannot be directly accessed and is grossly incomplete?
This is the case for those predicting extreme geological events like earthquakes and volcano eruptions, as they must contend with the fact that the totality of human history spans a mere few thousand years, which is effectively nothing on a geological timescale. For example, the Rocky Mountains have been growing upward for eighty million years.33 Within that mountain range, the much younger (“only” ten million years old) Grand Teton’s summit, at 13,775 feet, is growing on average one millimeter per year,34 in fits and starts, as earthquakes happen and push it up a notch at a time.35 By comparison, the Richter scale that quantifies the size of earthquakes (as the length of the wiggle recorded by a Wood-Anderson seismograph) was invented in 1935—less than a century ago.36 That is about eight hundred thousand centuries too late to get a full set of data and reliable statistics on the size of all the earthquakes that created the Rocky Mountains.
Relatively little is known about these earthquakes. So how much actual data is there to predict when future earthquakes will actually strike and make these mountains rise a bit more in the future? Focusing on the part of the Rocky Mountains located in Colorado, the largest “recorded” earthquake in that state occurred in 188237 at the north boundary of the Rocky Mountain National Park, although it was qualitatively “recorded” by written descriptions of the event rather than quantitatively measured by an instrument. Based on interpretation of the information provided in this written record, geologists have estimated this earthquake to have been of magnitude 6.6, with an error of plus or minus 0.6—or, in other words, of a magnitude somewhere between 5.8 and 7.2 when considering the uncertainties. Therefore, since a magnitude 7.2 earthquake produces recorded wave amplitudes roughly twenty-five times larger than a magnitude 5.8 one, getting highly reliable predictions based on statistics calculated using data points with that kind of uncertainty is like trying to predict when the drunken sailor mentioned earlier will fall, and the severity of the injury.
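Where does the “twenty-five times” come from? On the Richter scale, each unit of magnitude corresponds to a tenfold increase in the recorded wave amplitude, so the two ends of the estimated range can be compared in one line (a sketch using the magnitudes quoted above):

```python
# Each unit of Richter magnitude is a tenfold increase in recorded amplitude,
# so the ratio between a magnitude 7.2 and a magnitude 5.8 event is:
amplitude_ratio = 10 ** (7.2 - 5.8)
print(round(amplitude_ratio, 1))   # about 25.1, the "twenty-five times" in the text
```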
SWEEPING HAZARDS UNDER STATISTICAL RUGS
Those who get heartburn when merely thinking about mathematics will likely need some Ultra Strength Tums or Rolaids through the rest of this chapter, but the numbers are not necessarily that overwhelming when put in context.
A major challenge lies with conducting statistics using incomplete data. Imagine a student who got a grade of 72 percent in Professor Bruneau’s Steel Design class. If the student is fully satisfied by the knowledge of having passed the course, the story stops there. However, if the student is curious to know if 72 percent is a grade close to the class average, there is no way to know unless a comparison is made. Given that Professor Bruneau has not posted the class average for the exam, the student may ask a couple of friends what grade they received. If the two friends respectively got 88 percent and 66 percent, this would correspond to an average of 75.3 percent. If the student had asked two other friends that instead got 95 percent and 79 percent, the average would have been 82 percent. Big difference. It is not possible to know with certainty what the real class average is from a sample of only three grades.
However, if the distribution of all the grades in the course follows a known distribution, such as a normal distribution (commonly called a “bell curve”), it is possible to infer the probability that the real average is close to either 75.3 percent or 82 percent, or to the average of any other three numbers for that matter. Calculating the certainty in these estimated averages—which do not necessarily correspond to the truth—is exactly what Bayesian statistics is all about. As one would expect, the greater the number of data points used in that estimate, the closer the estimate is to the truth.
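The two estimates in the example above can be reproduced directly; the point is how much they swing depending on which friends happen to be asked (a sketch using only the grades quoted above):

```python
import statistics

my_grade = 72

# Two possible "surveys" of two friends each, as in the example above
sample_1 = [my_grade, 88, 66]
sample_2 = [my_grade, 95, 79]

print(round(statistics.mean(sample_1), 1))   # 75.3
print(round(statistics.mean(sample_2), 1))   # 82.0
# Three data points cannot pin down the true class average: the estimate
# moves by almost seven points depending on which friends are asked.
```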
Unfortunately, in this case, Professor Bruneau can attest that the distribution of grades in Steel Design has never followed a normal curve—the actual curve looks more like a three-hump camel than a bell. Therefore, the students would be kidding themselves; all their estimates of the average would be more in error than expected, since they were calculated assuming an incorrect distribution of the real data (yes, it can be tough taking Professor Bruneau’s class).
Likewise, imagine that an earthquake fault has produced one major earthquake every one hundred years, like clockwork, with the only thing changing from event to event being the magnitude of the earthquake—say, ranging from magnitude 6 to magnitude 8. If it were possible to record every one of those earthquakes over millennia, then it would be possible to perform statistics on that data and determine the average size of the earthquakes produced by that fault, and the probabilities of getting the larger earthquakes. However, if the civilization near that earthquake fault has existed for only two hundred years, then only two earthquakes have been experienced. Can the size of earthquakes expected in the future be deduced from these two earthquakes? Can they be relied upon as representative of the average for all events that have happened there over time immemorial? How much more reliable would the predictions be if civilizations had existed near that fault for three, four, or five hundred years? How many events does it take for the average of this small sample to approach the real average?
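One way to get a feel for these questions is a small Monte Carlo experiment (a sketch under purely illustrative assumptions: one earthquake per century, with magnitudes drawn uniformly between 6 and 8, so the true long-term average is magnitude 7):

```python
import random
import statistics

random.seed(42)

TRUE_AVERAGE = 7.0   # average magnitude of the hypothetical fault

def estimated_average(events_observed):
    """Average magnitude computed from a short observation window."""
    observed = [random.uniform(6.0, 8.0) for _ in range(events_observed)]
    return statistics.mean(observed)

# Typical error of the estimate after observing 2, 5, 10, and 50 earthquakes
for n in (2, 5, 10, 50):
    errors = [abs(estimated_average(n) - TRUE_AVERAGE) for _ in range(10_000)]
    print(n, round(statistics.mean(errors), 2))   # the error shrinks slowly as n grows
```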
This is the statistical game played when forecasting extreme events.
Thankfully, some information on earthquakes that have occurred prior to civilization can be inferred from the geological traces they have left. This detective job is what paleoseismology is all about.38 Clues left by large earthquakes can be found in the landscape (using aerial photographs or laser sensing), or in trenches dug across faults where evidence has been buried by thousands of years of sediment deposits. These findings can provide some useful “guesstimates” on past earthquakes, but with nowhere near the resolution provided by modern instruments. Like asking more friends about their grades and getting “somewhere between 65 percent and 85 percent” as an answer—it can help to some degree, with a hefty dose of additional “statistical massaging.” Not surprisingly, that ends up providing predictions that are, at best, somewhat informative. For instance, the Hayward fault is considered to be one of the scariest in the world. According to the USGS, it is a seventy-four-mile-long “tectonic time bomb”39 due to produce a magnitude 7.0 earthquake any time soon. Paleoseismology studies have identified twelve large earthquakes that have occurred on that fault over the past 2,000 years, from 100 to 220 years apart. The last one of those—and the only one since the establishment of Western civilization—is the 1868 earthquake, more than 150 years ago. Would anybody wish to live right next to that fault? Yes, actually, three million people do (some even literally live on top of it). When the next magnitude 7 earthquake happens there, and the fault ruptures from Oakland to San Jose in California, it will damage one million homes, produce $100 billion in financial losses, ignite firestorms that will burn down eighty thousand single-family homes, leave part of the San Francisco Bay Area without water for up to three months, and leave ten feet of broken glass piled up in the streets of downtown San Francisco (and presumably people buried below all that glass).40 The number of deaths and injuries is anybody’s guess.
More statistics: The odds of a severely damaging earthquake in the Bay Area over the next thirty years are currently pinned at 75 percent.41 With those odds, taking earthquake insurance coverage would seem to make sense.
Not so easy. After the 1994 Northridge earthquake, even though only half of the $20 billion in residential damage was insured, the massive number of claims delivered quite a blow to the insurance industry. Within a year, 93 percent of the insurance companies in California had stopped issuing homeowners insurance—or severely restricted it—to minimize their future losses rather than those of the policyholders. The state had to create a not-for-profit, publicly managed entity, called the California Earthquake Authority, to fill the void.42 According to the free earthquake insurance premium calculator on their website, the cost to insure an average $1,000,000 single-story home in Berkeley, California, would be $4,000 a year—with a $150,000 deductible! Roughly double that premium to bring the deductible to $50,000—there is no option to reduce the deductible further.43
With such great deals on insurance premiums, how many Californians have earthquake insurance? Barely 10 percent.44 Many believe that FEMA will come to the rescue. It actually will, but only to some degree, as FEMA might limit the size of its checks to roughly $35,000 per claim,45 and may try to keep actual payouts at less than that.46 With real estate going at more than $700 per square foot in Berkeley,47 that $35,000 check from FEMA would “cover” fifty square feet of the $1,000,000 house—the size of a cute walk-in closet.
That is for a 75 percent chance of earthquake damage over the next thirty years. If statistics showed that there was a 99 percent chance instead, would it make any difference? Some may argue that statistics—and predictions based on statistics—are not credible. Yet, even if predictions could be 100 percent accurate, would it really matter? It depends on the odds.
UNDERSTANDING AND MISUNDERSTANDING THE ODDS
Would there be volunteers for the following experiment?
Imagine a six-lane boulevard in a North American suburb, the type of road where a squirrel dashing across at rush hour is certain to be mashed into a paste between the treads of a tire, yet a road on which hardly anybody drives in the middle of the night. This experiment seeks volunteers to cross the road blindfolded, at 3 a.m., when the chance of being hit by a car is calculated at one in a million. The reward if making it safely to the other side of the six-lane road is a small sum of money, the exact amount determined by increasing the value of the reward until a volunteer comes forward. If $50 is not sufficient to find a volunteer, the reward is increased to $75. Still no volunteers, then $100. At some point, someone will volunteer, because given these odds, there is a 99.9999 percent chance of collecting the money. Definitely better odds of winning than in all lottery games, except that in this case there is also a 0.0001 percent chance of ending up like the squirrel. Now how big would the reward have to be if the volunteer was asked to repeat the crossing every night for a year? How about for fifty years? If it paid $100 per crossing, this would amount to $36,500 per year. Some jobs that pay less than that have a bigger risk of casualties: according to the US Bureau of Labor Statistics, one out of every one thousand fishermen lost their lives in 2017, for an average yearly salary of $28,310,48 so being paid $36,500 per year is relatively good for a task less risky than chasing a tuna.
Would the ability to recruit a volunteer change if the same probabilities were explained differently? In a first approach to describing the odds, if there is a one-in-a-million chance of being hit during a single crossing, this means that, on average, someone is going to get hit every 1,000,000 crossings. At 365 crossings per year, this implies that this experiment will produce one victim roughly every 2,736 years. Presented this way, it is likely that volunteers could still be found to collect a living wage for one minute of work per day.
There is yet another way to recast the same information. If the volunteer is selected to be eighteen years old at the start of the experiment—always good to keep minors off the street—and asked to retire at sixty-eight years of age, that will amount to a fifty-year career that officially consisted of “crossing the road to get to the other side.” What are this volunteer’s chances of enjoying retirement? Doing the math, it turns out there is a 2 percent chance this volunteer will be hit by a car at least once during this period of employment. Is the volunteer still up for it? How about if the number was 10 percent? There might be no takers, unless the salary was significantly increased, because the odds are far less attractive presented this way. Yet, this 10 percent is the same as a 1-in-200,000 probability of a fatal crossing, or roughly one victim every five hundred years.
By the way, for those who have an interest in understanding how this number is obtained, the math works out this way:
If the return period for a hazard is once every 500 years, this is the same as saying there is a 0.2 percent chance of it happening in any given year (1 divided by 500 equals 0.002).
This also means that there is a 99.8 percent chance it will not happen (1 minus 0.002).
The probability that it will not happen in two consecutive years is 99.6 percent (obtained from 0.998 times 0.998).
The probability that it will not happen at all in fifty consecutive years will be 0.998 multiplied by itself 50 times, which is approximately equal to 0.905.
This leaves the chance that it will happen at least once over a fifty-year period as 1 minus 0.905, equal to roughly a 9.5 percent probability (rounded up to 10 percent in the previous paragraph).
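For readers who prefer to let a few lines of code do the multiplication, here is a minimal sketch of the same arithmetic in Python; the function name is only for illustration, and the usual assumption of independence from one year to the next applies:

def prob_of_exceedance(return_period_years, exposure_years):
    """Chance of seeing the event at least once during the exposure time."""
    annual_chance = 1.0 / return_period_years            # e.g., 1/500 = 0.002
    chance_of_no_event = (1.0 - annual_chance) ** exposure_years
    return 1.0 - chance_of_no_event

# The 500-year event over the fifty-year period used in building codes:
print(f"{prob_of_exceedance(500, 50):.1%}")   # about 9.5 percent, i.e., roughly 10 percent
# The same event over a single year:
print(f"{prob_of_exceedance(500, 1):.1%}")    # 0.2 percent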
This is how return periods and probabilities of exceedance for earthquake levels are determined in building codes. It means nothing more than that. There is no implicit warranty that two large earthquakes cannot occur in consecutive years. It is only statistics. A low probability is not an impossibility; there is always somebody winning the lottery, even though the probability of winning is low—incredibly low, for that matter.
When telling someone that there is a 10 percent chance something horrific will happen to a building that will be owned for fifty years, it may spark interest in buying insurance to cover that risk. Tell that same person that the same building that will be there for fifty years can face something horrific that happens on average once every five hundred years, and watch the yawning. Yet, statistically, it is the exact same risk—only packaged differently. Packaging matters.
Things get further complicated when questioning—statistically again—the confidence level in these predictions. When the beautiful and talented meteorologists who deliver the weather forecast after the news say that there is a 10 percent chance that it will rain tomorrow, those are the calculated odds that it will rain. If that statement is followed by the rare qualification that this prediction is correct 90 percent of the time, that is the confidence level.
When a poll predicts that Savy Pierrot will win 56 percent of the vote during an election, and then states that “the margin of error of that prediction is—say—plus or minus five percentage points, 19 times out of 20,” it essentially says two different things at the same time. First, it establishes that the poll accuracy is limited because the pollsters only contacted a limited number of people. Say 400 people were called Monday evening, and 225 of them indicated they would vote for Savy Pierrot (56 percent). Second, it states that different results could have been obtained if an altogether different set of 400 people had been called that very same evening. There is randomness in that process, so maybe only 195 people (49 percent) in the second group of 400 would have expressed their support for Savy if they had been called. Or 240 (60 percent) in another group. And so on. In other words, the poll result is not an absolute value, because there are fluctuations in the numbers that would be obtained when calling completely independent groups of 400 people. Statistics can be run to quantify that error. The experiment can be made by calling 100 different groups of 400 people and seeing how much scatter there is in the results. This would provide 100 different results. Using made-up numbers to make a point, say the results from all 100 polls ranged from 150 (37.5 percent) to 325 (81.2 percent) votes for Savy. This huge spread can be quantified by stating that the error is roughly plus or minus 22 percent—a reality, but one that would certainly make the polling company look “unreliable” to its clients. However, out of those 100 poll results, because most of the results are usually bunched up closer together, it may be wise not to count the “outliers.” If the five poll results most distant from the mean are discarded, then maybe the spread becomes 204 (51 percent) to 244 (61 percent) votes for Savy, which makes the error become only plus or minus 5 percent. Then, to acknowledge that the outliers have been thrown out, the polling company will state that the results reported are true nineteen times out of twenty (that is, 95 percent of the time, because 5 of the 100 polls have been discarded).49 The polling company can even “tighten” the error if it is willing to say that the results of its polls are true only 90 percent of the time. Does it mean that Savy Pierrot will win? Probably, but certainly not a sure thing, as history has often shown—besides, poll results can get significantly less reliable if the group of people sampled is not truly representative of the population, but that is a different story.
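To see where those fluctuations come from, here is a minimal simulation sketch in Python, assuming (purely for illustration) that 56 percent of the whole electorate truly supports Savy Pierrot and that 100 independent groups of 400 people are polled; the simulated scatter will typically be tighter than the deliberately exaggerated numbers above, with the trimmed spread landing near the plus-or-minus-five-point margin that pollsters quote:

import random

random.seed(1)                        # fixed seed so the sketch is repeatable
TRUE_SUPPORT = 0.56                   # assumed "real" support in the population
POLLS, PEOPLE_PER_POLL = 100, 400

results = []
for _ in range(POLLS):
    votes = sum(random.random() < TRUE_SUPPORT for _ in range(PEOPLE_PER_POLL))
    results.append(100 * votes / PEOPLE_PER_POLL)      # support, in percent

results.sort()
print(f"full spread: {results[0]:.0f} to {results[-1]:.0f} percent")
# Drop five polls from the tails (a rough version of the "nineteen times out of twenty" trick):
trimmed = results[2:-3]
print(f"nineteen-out-of-twenty spread: {trimmed[0]:.0f} to {trimmed[-1]:.0f} percent")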
The same goes with every prediction in life, from natural hazards to engineered products. If a gizmo purchased from NOTACME is advertised to be effective 99 percent of the time, one would need to collect feedback on personal experiences from a large number of purchasers of the product in order to perform a statistical analysis that would either confirm or discredit NOTACME’s claim. Doable, but tough. Demonstrating this with numbers, to be 99 percent sure of the results (the confidence level), with an error no greater than plus or minus 0.25 percent in the results (the confidence interval), one would have to collect data on approximately 1 percent of all customers (the sample size divided by the total number of units). This is equivalent to saying that, by conducting a survey of ten thousand customers out of the one million who purchased the gizmo in question, it can be determined that the gizmo worked as intended for 99 percent of the users, with the caveat that there is a 1 percent chance that this survey gives wrong answers, off by more than 0.25 percent. In short, to have an ironclad certainty that the survey results are 100 percent representative, without any error, with a bet-the-house absolute confidence in the results, one would have no choice but to sample all the clients. And that is when dealing with quantitative data that does not change over time—otherwise, that is another ballgame altogether, like when trying to rely on polls to predict election results.
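Those sample-size numbers come out of a textbook formula, sketched below in Python under the assumption that the surveyed proportion sits near the advertised 99 percent; the function name and the numbers plugged in are illustrative only:

from math import ceil
from statistics import NormalDist

def sample_size(p, margin, confidence, population):
    """Customers to survey for a given margin of error and confidence level."""
    z = NormalDist().inv_cdf(0.5 + confidence / 2)   # about 2.58 for 99 percent confidence
    n0 = z ** 2 * p * (1 - p) / margin ** 2          # infinite-population estimate
    return ceil(n0 / (1 + (n0 - 1) / population))    # finite-population correction

# 99 percent confidence, plus or minus 0.25 percent, one million purchasers:
print(sample_size(p=0.99, margin=0.0025, confidence=0.99, population=1_000_000))
# prints roughly 10,000, that is, about 1 percent of all customers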
If all those numbers are nothing but a blur at this point, not to worry. Even the stars of the medical profession get it wrong most of the time,50 as frequently demonstrated.51 To illustrate, a group of 160 gynecologists was asked how many women who test positive from the results of a routine mammography actually truly have breast cancer, if: (1) 1 percent of all women have breast cancer; (2) 90 percent of the women with breast cancer test positive; and (3) 9 percent of the women who do not have breast cancer receive a false positive, like a false alarm. In a multiple-choice question with four answers to choose from, 79 percent of the gynecologists (who typically have more than twenty years of schooling) said that women who test positive have a 90 percent chance of having cancer, which is the wrong answer. Based on the above statements, out of 1,000 women who take the mammography, 10 would have cancer, of which 9 would test positive, and 990 would not have cancer, of which 89 would test positive. Therefore, of the 98 that tested positive, only 9 actually had cancer, or roughly 10 percent—maybe a sad reduction in the number of potential clients for all these gynecologists compared to what they thought was the right answer. Fortunately, 21 percent of the gynecologists picked the right answer—although, as is the case in multiple-choice exams, each of the four answers had a one-in-four chance (25 percent) of being picked by those who had no clue what the correct answer was. Since 25 percent is more than 21 percent, this means that medical doctors did worse in answering this question than, say, kindergarten kids who would have picked answers randomly—which provides an interesting perspective on the value of getting a second opinion when it comes to medical diagnostics. (If baffled by the math, do not worry: most people need multiple reads to figure this one out.)
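For those who would rather check the count than reread the paragraph, here is a minimal sketch of the same head-count in Python, using exactly the numbers stated above:

# Count an imaginary group of 1,000 women, exactly as in the text above.
women = 1000
with_cancer = women * 0.01                 # 10 women actually have cancer
true_positives = with_cancer * 0.90        # 9 of them test positive
without_cancer = women - with_cancer       # 990 women do not have cancer
false_positives = without_cancer * 0.09    # about 89 false alarms

all_positives = true_positives + false_positives
print(f"{all_positives:.0f} positives, of which {true_positives:.0f} truly have cancer")
print(f"chance of cancer given a positive test: {true_positives / all_positives:.0%}")
# prints about 9 percent, i.e., roughly 10 percent rather than 90 percent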
OBFUSCATING REALITY (AND THIS SUBTITLE)
There are over 450 types of sharks in the world.52 As incredible as it may seem, at least to those who have never been scuba diving, most types of sharks are harmless. Some are shy, some are curious, some are bottom feeders—like the nurse shark, which some have called the “couch-potato of the shark world.”53 Only about a dozen types of sharks are deemed dangerous, and most human attacks have been by three species only,54 which include the great white shark that so often makes the headlines. The same goes with lawyers. There are dozens of different types of lawyers, with practices covering family law, criminal law (prosecution and defense), corporate law, constitutional law, immigration law, tax law, real estate law, civil law, employment law, administrative law, personal injury law, appellate law, bankruptcy law, malpractice law, and many more, all with many subdisciplines. Like sharks, only a few are dangerous and known to attack engineers.
The same way many will avoid swimming at dawn or dusk when sharks are known to feed, engineers take precautions to avoid troubled waters. For example, members on a code-writing committee were discussing ways to help engineers understand the intent of seismic design provisions for bridges by explicitly describing in the code itself the expected performance of structures designed by these provisions. One suggestion was to describe the level of damage expected to occur in reinforced concrete columns in terms of crack widths corresponding to various bridge damage levels. If designing a critical bridge for which immediate access by all vehicles following a rare severe earthquake was the objective, then cracks in columns would have to be quite small, contrary to those in less important bridges that could be allowed to suffer more damage. This idea, while valid and effective to help engineers appreciate the expected extent of damage for various design scenarios, was killed by the specter of lawyers walking around bridge columns with measuring tapes following earthquakes, checking crack widths with the resolve of gold-diggers. The approach adopted by the code instead refers to maximum displacements reached during the earthquake—something that occurs only for an instant during the earthquake itself and that only engineers can calculate using computer programs, leaving no evidence after the fact.
Likewise, when dealing with extreme events, statistics is the most effective lawyer repellent. This is absolutely not the reason why statistics are being used, but it is a convenient side effect. Statistics eliminates certainty. No certainty, no guarantees. No guarantees, no broken promises—and thus, no actionable failures. Or maybe this presumption is wrong and, to the contrary, it will lead to more protracted, complex, and convoluted court cases.
Time will tell.
THE EARTHQUAKE VANISHING ACT
Magicians love to make things disappear. At the top of his career as an illusionist, David Copperfield even made the Statue of Liberty disappear.55
Yet, no magician has ever succeeded in making a magnitude 7 earthquake disappear.
That exceptional distinction belongs to seismologists, with a helpful dose of statistics. Indeed, thanks to the wonderful world of probability theories, statistics can not only be used to make blatant lies masquerade as credible truths, but it can just as easily make expected magnitude 7 earthquakes disappear from the engineering criteria used to design buildings. The trick to accomplish this feat is no secret, because it has been published in the scientific literature, but it is admittedly far less exciting than making an elephant disappear.
The trick goes as follows. The “North American Craton,” also known as Laurentia, is essentially the massive geological rock shield that covers all of North America east of the Rockies.56 It is nowhere near as prone to generating earthquakes as the more seismically active West Coast of the continent, but large earthquakes happen there too, every now and then, for reasons that are far less understood than along, say, the ring of fire around the Pacific. Within the craton itself, there are some relatively well-defined and active sub-zones where significant earthquakes happen with enough regularity to have an impact on the national seismic maps that are generated for use in the engineering design of infrastructure. However, over the entire craton, it is also recognized that sizeable earthquakes can pretty much occur anywhere, albeit far less predictably and far less frequently.
Interesting arguments arise among seismologists about the largest earthquakes that could occur there, and how frequently.57
The Canadian seismologists tasked to develop seismic design maps for the National Building Code of Canada had to make decisions in this regard. To do so, they compiled data on all of the world’s cratons having similar geological features, and determined that magnitude 7 earthquakes can reasonably be expected to occur anywhere along the North American Craton,58 particularly given the evidence that earthquakes of that size in stable cratons have occurred in modern times (for example, the 1988 magnitude 6.7 Tennant Creek earthquake in Australia’s Central Craton).59 They also determined—among many statistics established—that such magnitude 7 earthquakes occurred at a rate of 0.001 per year per 50,700,000 km² (roughly twenty million square miles), and that earthquakes of magnitude 6 or greater were four times more likely to occur.
Given that the North American Craton is roughly 10,700,000 km², this works out to roughly one earthquake of magnitude 6 or greater every one thousand years. Magnitude 6 earthquakes can produce quite a lot of damage, as demonstrated by the magnitude 6.3 Christchurch earthquake that killed 185 people in 2011 in New Zealand—of all places, one country where earthquakes are expected to occur and that is, generally, better prepared for that eventuality than many others.
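The scaling behind those figures fits in a few lines of Python; this minimal sketch simply applies the quoted rates to the quoted areas, and its rounded outputs land near the one-per-thousand-years figure above:

# Smear the quoted rates over the craton's area.
RATE_M7_PER_KM2 = 0.001 / 50_700_000     # magnitude 7 earthquakes per year per km²
RATE_M6_PER_KM2 = 4 * RATE_M7_PER_KM2    # magnitude 6+ quoted as four times more likely
CRATON_AREA_KM2 = 10_700_000             # rough area of the North American Craton

m7_per_year = RATE_M7_PER_KM2 * CRATON_AREA_KM2
m6_per_year = RATE_M6_PER_KM2 * CRATON_AREA_KM2
print(f"one magnitude 7 roughly every {1 / m7_per_year:,.0f} years")             # ~4,700 years
print(f"one magnitude 6 or greater roughly every {1 / m6_per_year:,.0f} years")  # ~1,200 years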
Note that other seismologists performing similar studies would typically come up with different numbers—sometimes more critical, sometimes less. When it comes to the design of critical facilities, such as nuclear power plants, offshore platforms, and nuclear waste disposal sites to name a few, where the design earthquake is intended to be the one with a ten-thousand-year return period, these debates are important. This lack of consensus among experts is not uncommon or unsettling when seismological models are created—it is accounted for, and defined as, “epistemic uncertainty.”60 Interestingly, seismologists have been brilliant and deliberate when expressing the differences in predictions obtained from their various models, by plotting their results on log-log graphs, rather than “normal” diagrams. In a log-log graph, the distance between each major tick-mark corresponds to a tenfold increase. In other words, in such a log-log graph, the major tick marks along the vertical axis correspond to 0.1, 1, 10, and 100, instead of 1, 2, 3, and 4. This makes a 100 percent disagreement between, say, the numbers 10 and 20 graphically look like only a 14 percent disagreement61—which is quite a clever way to save face when comparing models that predict such dramatically different results. However, even such variations in return period estimates are not significant when it comes to the earthquake vanishing act. What matters is the size of the area over which the earthquake is “smeared,” and the distance from a potential epicenter to any point on the map where engineered infrastructure of interest is located. This is because anybody can survive a magnitude 7—or even a magnitude 8—earthquake, provided it happens far enough from where one stands. In fact, as mentioned earlier, there is on average one magnitude 8 earthquake occurring somewhere on the globe every year—in most years, nobody hears about it because it happens far away from civilization.
Coming back to the North American Craton, when considering that a magnitude 7 earthquake can occur somewhere within it, what matters is not the temporal dimension, but the spatial one.
The chances that any specific location on the massive craton will be close to the epicenter when it happens are so low that it effectively makes no difference whether it is considered or not when developing the probabilistic seismic maps of, say, the National Building Code of Canada. In other words, the magnitude 7 exists, but it is its impact that vanishes.
As a result, if one thinks of a magnitude 7 earthquake that happens every four thousand, one thousand, or even five hundred years as a dart thrown at a map of the Canadian part of the North American Craton (which is effectively all of Canada minus the Atlantic provinces and the land from the Rockies to the Pacific), the chances are greater that it will happen in the middle of nowhere, or in a place where there is not much infrastructure to damage.
Canadian moose, geese, and beavers will get shaken up, but nobody will care—in a sort of earthquake engineering version of the metaphysical “If a tree falls in a forest and no one is around to hear it, does it make a sound?” However, if the dart unfortunately bullseyes on Toronto, seismologists will be doing a lot of hand-waving and fancy footwork trying to explain to the Torontonians looking at their damaged buildings, bridges, arenas, and infrastructure, that they were simply “sheer out of luck”—fubar style—because even though the dart had a negligible probability of ever landing there, it was not a zero probability. Such is the perverse effect of messing with statistics. Like other events with an insanely low probability of happening—be it dying from an asteroid impact or having the Detroit Lions win the Super Bowl—an insanely low probability does not mean it cannot ever happen. Do not bet on it, do not wait for it, do not prepare for it, but do not look for culprits when it happens.
To repeat, absolute certainty does not exist. Everything is possible, only with different probabilities of occurrence.
Severe earthquakes are rarer than smaller ones; winning $1 million does not happen as frequently as finding a coin in the street.
At the same time, not all low-probability events have the same consequences. Parking the brand-new car in the driveway to see it crushed by a falling tree during a wind storm is not as bad as being hit by lightning, which itself is not as bad as losing everything and all loved ones in a collapsed building.
It is all a question of “Risk.”
ON THE DISASTER TRAIL
Sports as a Proxy—Football for All
It was while watching a documentary on World War I that it jumped out at me. There, on the scratched black-and-white film, arms waving out of every door and window, the train was leaving the station with a bunch of exuberant kids on their way to play and win the ultimate game. Same crazy optimism on the cruisers’ decks as they left port.62 As an ice-hockey historian described it in part 3 of Hockey: A People’s History63 (a ten-hour Canadian documentary that only Canadians might care to watch),64 going to World War I was better than any sports match; it was an irresistible drive to be part of the home team and fight side-by-side on a great adventure.
In any competition between two teams of even strength, the odds of victory are roughly 50 percent. Yet, all that youthful exuberance was about the thrill of victory. In all that excitement, few seemed to acknowledge that in the game of war, there is also the agony of defeat—as can be seen strikingly emphasized in the 1970s ABC Wide World of Sports intro.65
The recruits on their way to fight in Europe acted like a mob of sports fans on their way to a football game, cheering for the home team and expecting no less than a full and dominating victory.
In that moment, watching the old war footage, it became clear that a characteristic of youth is an uncanny ability to see the world only from its own point of view. Undoubtedly, German trains were likely leaving home with the same testosterone-laden drive to head-butt other rams, expecting no less than a full and dominating victory.
Fortunately, nowadays, all that head-butting energy can be released through organized sports, where the misguided belief that a win is guaranteed can be crushed without life loss or—in most cases—debilitating injuries. Even better, professional sports can provide that same outlet to passionate sports fans while keeping them safer than players.
From that deep observation, it is tempting to formulate the hypothesis that the desire of a nation’s youth to go to war is inversely correlated to that country’s professional sport infrastructure. In other words, lots of professional football, baseball, basketball, soccer, and hockey teams, equals lots of opportunities for frenzied youth to spend their rambunctious energy and learn that the home team can lose big time on any given day against any given team— maybe suffering some emotional pain, but without losing limbs or life to learn that lesson. Conversely, the countries whose citizens are most eager to become soldiers—or even terrorists in the absence of a national army—sorely lack professional sports.
Therefore—clearly outdoing all the wishes of past Miss Americas in the quest for world peace—I hereby postulate that to demilitarize the world and reduce the risk of wars, it is essential to establish multiple professional sports leagues in every country on earth, and use conscription to draft all kids into amateur sports teams. Dress everybody in well-padded uniforms and let them bang each other to exhaustion on a football field or an ice rink or equivalent—a well-controlled warring environment that is safe for the planet and its citizens.
Problem solved! And maybe the first Nobel Peace Prize awarded for simultaneously enhancing both the safety of the world’s citizens, and their fitness.

Life’s Casino
RISK TOLERANCE AND RISK AVERSION
When a contractor was hired to fix a small leak in the roof of the NOTACME Corporation headquarters building, it did so expeditiously. However, as work started, it was found that the roof underlayment contained asbestos. As far as the contractor was concerned, this was not a problem but a blessing, because a site condition not mentioned in the original contract can be handled by a change order—in essence, extra work at a higher profit margin, adding the special protection necessary to deal with asbestos. When the employees learned that asbestos was present in the roof, they requested that the entire roof be replaced. Taking this request into consideration and investigating, NOTACME learned that removing asbestos from a roof is one of the most expensive asbestos abatement operations, due to access and containment requirements, and the cost estimate for that operation came up at $120 per square foot.1 For NOTACME’s modest 100' × 100' building, that added up to $1,200,000, plus disposal fees, permit fees, and other ancillary expenses. In the process, NOTACME also learned from government agencies that a roof containing asbestos is not hazardous, that the best course of action is to leave it in place because it does not affect conditions inside the building,2 and that asbestos is not a banned substance in the United States.3 Besides, NOTACME learned that, incredibly enough, asbestos is a product found in rocks and that natural erosion of those rocks releases a small amount of asbestos fibers in the air nearly worldwide.4 NOTACME gladly informed its employees that the roofing material presented no risk. Yet, no matter how many facts NOTACME provided, many employees found plenty of websites stating that asbestos can cause asbestosis and different types of cancers (mesothelioma being one),5 and that one fiber of asbestos in the workplace is one fiber too many. Although the legal system has typically found non-persuasive the claim made in many lawsuits that a single inhaled fiber of asbestos is sufficient to cause cancer,6 the NOTACME employees took this rejection of the “one-fiber theory” as clear evidence that the justice system is in bed with big business. That asbestos roof had to go.
Three months after the discovery of asbestos in the roof, with still no action from NOTACME, the employees met with the management. They threatened to sue the company and go on strike until the issue was resolved. The crisis had reached its climax, and NOTACME had to budge: given the alternative of millions in lost revenues due to work stoppage, it agreed to hire the contractor to replace the roof at once.
The employees were ecstatic. The risk of death by cancer caused by asbestos in an unsafe workplace was at once removed. All went to celebrate this victory at the nearby bistro. Congratulations, backslapping, cheering, fist pumping, and singing were sustained for hours by profuse amounts of wine and beer. By midnight, the party was over.
Some staggered to their cars, struggled to put the key in the ignition, lit up a cigarette, and drove home completely plastered. Others did the same on their motorcycles— wearing a T-shirt, shorts, sandals, and no helmet.
What is wrong with that picture?
HAZARD VERSUS RISK
Before going any further, we must clarify the confusion that often arises between hazard and risk, which are often used interchangeably or listed as synonyms in dictionaries. In essence, to borrow a definition from experts in occupational health and safety,7 a “hazard is any source of potential damage, harm or adverse health effects on something or someone.” As such, a knife, electricity, a wet floor, and a coconut in a palm tree are all potential hazards. What ensues from this is that a “risk is the chance or probability that a person will be harmed or experience an adverse health effect if exposed to a hazard.” Avoiding the risk requires not sleeping under a palm tree (although the claim that falling coconuts kill 150 people every year is apparently an urban legend, and only a few documented cases of such an exotic way to die actually exist).8 If the relationships with the hazards are handled properly, the risks can be managed.
Some risks are, in theory, easy to manage: Who would use a hair-dryer in a shower, dive in shallow waters, light a cigarette next to a gas pump, or text while driving? For most people, the risk is obvious and therefore avoided—others will end up upside down in a ditch, crushed by an airbag, cell phone firmly in hand. However, there are many instances where risk is not an easy thing to assess and to manage. In fact, this is so true that “risk manager” is a professional career in itself and its practitioners include (and engage) experts from various disciplines. There exists a Society for Risk Analysis, a Risk Management Society, a Risk Management Association, a Professional Risk Manager International Association, a Global Association of Risk Professionals, a Federation of European Risk Management Associations, and many more international, national, local, and discipline-focused groups that tackle the broadest imaginable range of risks. Every possible risk has been scrutinized, researched, dissected, and quantified. Financial risks, health risks, societal risks, nuclear war risk, and so on.
Multiple studies have continuously attempted to explain how biases, beliefs, social and cultural factors, trust and credibility, and many other factors affect the perception of risk by individuals, and why that perception often (if not always) differs from that of experts. All done in a rigorous, scholarly way, for the benefit of academic and professional scrutiny—which is infinitely more credible, reliable, and actionable than the (hopefully more enjoyable) approach taken here.
Given that risks are part of nearly all human endeavors, there exist many context-dependent definitions of risk. It has been said that past efforts by the “definition committee” of the Society for Risk Analysis to come up with a single, consensual, exact, and all-encompassing definition of risk have failed.9 This should not be surprising—as the saying goes, if you need something done now, do it yourself; if you can wait a few weeks, delegate the task to a trusted employee; if you are willing to wait forever, ask a committee to do it. So, for now, to keep things simple (manageable?) and yet reasonably accurate and general, suffice to say that “risk is the possibility of an unfortunate occurrence”—to borrow one of the seven definitions of risk provided in the glossary of the Society for Risk Analysis.10 In the present context, the “unfortunate occurrences” of interest are at a massive scale—not of the “toe stubbed on the bed leg” type, but rather of the “massive casualties and losses from a disastrous event” type. More specifically, the focus here is on risks that have a low probability of occurring, but that can have massive consequences when they do—typically called low-probability, high-consequence events in the academic literature—because academics are notoriously famous for never coming up with catchy names for whatever they study.
To appreciate how risk is handled for low-probability, high-consequence events, it pays to look at how people deal with risk in a general sense, as this reflects trends ingrained in human nature. In all human activities, some risks are avoidable, some not, and it is sometimes amazing to see the risks that are taken for the pleasure of it.
RISKS FOR REWARDS
Everybody takes risks for rewards. Why? Because of the rewards, obviously.
Risk is always a matter of perspective. Most people would not think of jumping from an airplane with a parachute for fun—but some do. Likewise, most of the roughly forty thousand members of the US Parachute Association (USPA) would not think of BASE jumping for fun —but some do.
BASE jumping is the extreme sport that consists of parachuting off a building, a bridge, a tower, a cliff, or any point that seems too close to the ground (“BASE” stands for “Building, Antenna, Span, and Earth,” although some have claimed it should stand for “Ballsy Awesome Suicidal Exhibitionists”).11 The USPA reported nineteen deaths from over three million “normal” parachute jumps in 2012. By comparison, statistics show that one in sixty BASE jumpers typically die from that sport12—which makes it unlikely to become an Olympic discipline anytime soon. Many of those still alive are typically not deterred by the stories they can tell of hours spent in intensive care, neurosurgeries, or at funerals.13 If to the normal parachutist, the combination of adrenaline, goose bumps, butterflies in the stomach, increased blood flow, perspiration, and constricted blood vessels produced by freefall14 is a rush worth the jump, then BASE jumpers must be adding a spiking fever to the cocktail. Statistics on accident rates simply do not matter.
Although the comparison with extreme sports is—well— extreme, the same could be said for most human activities.
The fifty-five million people who enjoy the breath of fresh air and awesome views that come with alpine skiing15 are not deterred by the numerous skiing-related injuries that occur every year. Interestingly, those who enjoy a breath of fresh air and the more mundane lower altitude views that come with bicycling face the same risk of severe injuries as those practicing alpine skiing.16 Or, for an even more mundane example, the thirty unprovoked shark attacks that have occurred in Florida every year17 have not prevented some of the more than 125 million tourists visiting Florida18 from making 810 million day visits19 to its beaches. Likewise, one would be hard pressed to attempt convincing an avid snowboarder to abandon the slopes to play monopoly instead based on the fact that only 1 person in 100 million per year dies while playing board games, compared to one death per 2.2 million snowboarders.20 Relative safety is not what attracts people to philately, coin collecting, or gardening—although the latter can be deadly to those allergic to bee stings and contact with certain plants.
All activities have risks, and free individuals undertake activities in exchange for rewards. Whether the rewards consist of money, medals, grades, food, fun, excitement, self-esteem, love, acceptance, titillation, power, casino chips, or Pokémon eggs, does not matter. If the rewards are deemed to be worth it, nobody calculates risks—or cares about risks. It is exactly because people are willing to take risks for rewards that syphilis, gonorrhea, and AIDS continue to exist. That is, people are willing to tolerate high risks as long as the risks are self-imposed.
When risks are imposed by others, strangely, a completely different zero-tolerance policy seems to apply. Nobody cares about risk calculations and statistics: zero risk becomes the only acceptable norm—which is never a problem when the dollars to reduce that risk to zero will be paid from someone else’s pocket. This is the mindset that has, in part, given rise to the asbestos abatement industry, with $3 billion in revenues per year in the United States alone.21 It is the same mindset that has planted a poisonous kiss on the cheeks of the Keystone XL pipeline22 and handicapped the nuclear industry in the United States. Yet there is a cost. A study reviewing the US federal government expenditures on various regulatory programs enacted in the mid-1980s calculated that the Occupational Safety and Health Administration regulations of asbestos cost $89.3 million per life saved.23 This was cheaper than its other regulations on formaldehyde, which ran at $72 billion per life saved, but significantly more expensive than its regulations on concrete and masonry construction, which cost a measly $1.3 million per life saved.24 When medical doctors routinely start their voice recorders and recite the boring list of possible complications, infections, debilitating conditions, and death risks—and their probabilities—when consulting with a patient before performing any surgery, they effectively make it part of the contract that it is the client who wishes to assume all these risks in hope of the rewards. Although it should be clear from the outset that a breast enhancement surgery or vasectomy usually does not happen without consent, the medical professionals apparently wish to make it absolutely clear—and on record—that they are only willing to perform the surgical procedure (for a small fee) because the patient begged for it, irrespective of all the stated risks and dangers.
So, to recap, there is a definite double standard when it comes to risk. Individuals have a definite willingness to take risks—even high risks— when these are self-imposed. Some will even resist—in the name of life, liberty, and the pursuit of happiness—well-intentioned attempts by others to impose measures intended to reduce the high risks of some activities. They will actively fight against laws that would make motorcycle helmets mandatory or that ban smoking in public places. Yet the same individuals have a zero tolerance for risks imposed on them by others and will not balk at the prospect of massive costs to abate minuscule risks when those costs are borne by others. The golden rule does not apply here. The new twist says, “It is OK to do unto others as you would NOT have them do unto you.” One should not confuse things, as generalizations can sometimes be misunderstood. Without doubt, activism is not necessarily bad. A lot of it serves worthy causes. If the 10,000 lives lost per year in drunk-driving crashes25 were only those of the intoxicated drivers, few people would care.
Except for the poor parents of teenagers (because teenagers are apparently victims of an evolutionary flaw that has made the development of sound judgment lag far behind that of their physical ability to take self-destructive risks), most sober people would not shed a tear for dead drunk drivers if they were not exposed to any risk by them. However, since thousands of innocent people are killed every year by drunk drivers, this high-probability risk justifiably warrants being eliminated. In the foreseeable future, emerging technologies will likely eliminate that risk,26 and devices intended to prevent powering of an engine by anyone who had too many mojitos will become standard equipment—possibly as cheap to implement as alternating wipers, power windows, and pine air freshener trees hanging from the mirror. Even then, though it will be a positive outcome that thousands of senseless deaths will have been eliminated, the risk of death resulting from drunk driving accidents will never be zero.
Can anything be?
ABSOLUTE SAFETY IS A MYTH
As has often been said, life is the only disease with a 100 percent casualty rate. Everything else is less certain.
Although one may find solace in the belief that everything is ordered for the greater good, all the evidence is to the contrary. Life is a chaotic process. No computer can predict what will happen next. Things get complicated by the fact that the behavior of some types of chaotic systems can be affected by the prediction itself. If a renowned climatologist looks at a blue sky and predicts that it will rain in ten minutes, this forecast will not suddenly make gray clouds appear—literally—out of the blue. On the other hand, if a renowned and highly respected economist predicts that the stock market will drop by 10 percent within a few hours, it might stir a panic that will affect the market instantly and precipitate its dive. The weather and stock market are both chaotic systems, but the first one is insensitive to predictions while the second is not—although the volatility in response to the prediction depends, of course, on the credibility of the person making the prediction and whether that person was dead serious or dead drunk when making the prediction.
In light of the challenges in predicting what a chaotic system will actually do, anyone promising absolute safety is either sadly deluded or an exceptional liar. Absolute safety is a pie in the sky.
The fact that an offshore platform is designed to resist earthquakes and iceberg collisions that are deemed to happen on average once every 10,000 years27—and 100-foot-tall waves too—does reduce the risk significantly and makes it possible to sleep at night. Yet that is not absolute safety. Which is in part why oil producers evacuate staff from their offshore platforms when it is predicted that they will be in the path of an incoming hurricane.28 And, of course, even if none of the natural hazards could ever destroy a building, or a bridge, or an offshore platform, or a nuclear power plant, there will always remain the ultimate risk: an operator error, thanks to an overworked, distracted, or distraught employee—or one of the Homer Simpsons of this world.
WHAT IS A PREDICTION WORTH, ANYHOW?
A lot of research dollars and efforts are invested in developing the ability to predict where and when earthquakes, hurricanes, tornados, and other extreme events will strike. These are valid endeavors and valuable tasks to better understand the world and what is at stake.
However, what if, thanks to massive scientific breakthroughs and the astounding insights of geniuses, it became instantly possible to predict with 100 percent accuracy when and where earthquakes will strike? Would the world be better off? How would humans react—or how would everything that has been said so far change? The answer, as always, is “it depends.” What would most people do if an official warning came that “A magnitude 8 earthquake having its epicenter right under your feet will strike in exactly one minute”? What?
Say that again. “A magnitude 8 earthquake having its epicenter right under your feet will strike in exactly fifty-five seconds.” Damn it! The clock is ticking.
As a reflex, those living in a bungalow might throw themselves (and the kids) out the door at once. Once out, some will wonder if they have enough time left to run inside and grab their laptop or some other prized possession. Not having prepared anything ahead of time for such a rare occurrence, or given the topic any thought until now, the daredevils storming back inside are more likely to come out with the goldfish bowl and wedding photos than precious paperwork or survival kits. That is all nice and dandy, but what about those living on the tenth floor of a building, on the fifteenth floor, on the twentieth floor, and so on? Panic?
Jump out the window? Brace in a doorway? Duck under a desk? Freeze in place and hope for the best? These are all things that can be done within the first few seconds of shaking, when one realizes that an earthquake is happening, without any warning. So, for many people, a one-minute warning is not a particularly useful prediction. It may allow time for the dentist to stop drilling, for trains to slow down, for people to exit elevators, for some operations and facilities to initiate safe shutdown,29 but it will not change the outcome for things at risk of failure or collapse.
How about: “A magnitude 8 earthquake having its epicenter right under your feet will strike in exactly one hour”? What? Say that again. “A magnitude 8 earthquake having its epicenter right under your feet will strike in exactly fifty-nine minutes and fifty-five seconds.” Damn it!
The clock is ticking. Some will have time and the presence of mind to walk into an open field and ride out the waves safe from any falling hazard or collapsing buildings or power lines—that is, if there is an open field nearby. What about those downtown? Panic? Not having prepared anything ahead of time for such a rare occurrence, nor given the topic any thought until now, who knows if their building can survive a magnitude 8? Should they “ride it” in place or should they rush to the streets or to the subway? How many people can be packed onto the baseball stadium field, and will the gates be locked? A slightly more useful prediction but, again, maybe not for everybody.
How about: “A magnitude 8 earthquake having its epicenter right under your feet will strike in exactly one day”? What? Say that again. “A magnitude 8 earthquake having its epicenter right under your feet will strike in exactly twenty-three hours, fifty-nine minutes, and fifty-five seconds.” Damn it! The clock is ticking. Pack the car? Not having prepared anything ahead of time for such a rare occurrence, nor given the topic any thought until now, many will rush to the highways, get stuck in massive traffic jams, at a standstill for hours, possibly running out of gas, not sure where hotel rooms will be available. Is this a more useful prediction?
How about: “A magnitude 8 earthquake having its epicenter right under your feet will strike in exactly one hundred years, to the second”? What? Say that again. “A magnitude 8 earthquake having its epicenter right under your feet will strike in exactly 36,524.24 days, five hours, fifty-nine minutes, and fifty-five seconds.” Oh. No need to worry then.
So where is the “sweet spot” between one hour and one hundred years where the prediction would be optimum in terms of making a difference?
Two days? Mayhem. Massive evacuation traffic jams again—except for those cowboys determined to ride the storm and guard the castle.
One week. Ghost town again, minus the cowboys.
One month. More time for packing, more orderly evacuation, better braced cowboys. Yet, nobody has time in a month to do much, if anything, to change the earthquake resistance of the buildings and infrastructure. Some of these structures will survive the shaking and some will collapse, depending on their year of construction, type of structural system, and care taken during their design and construction to address the issue of seismic survivability. Lives will be saved—except for cowboys crushed by debris—but the extent of damage will remain the same as if there had been no prediction at all. A small victory, but no less a disaster.
How about thirty years? That should be enough time to strengthen/retrofit the infrastructure, shouldn’t it? Would that make any difference? Maybe. Financial buzzards could start factoring future damage into present dollar values and build that into the cost of mortgages and loans, but that would not prevent the disaster. Hard cash would need to be spent on upgrading the infrastructure to ensure its satisfactory performance during that predicted earthquake, so many of those considering making these investments will have tons of questions that would first need to be answered.
Is this really a 100 percent sure prediction? Then, predicting magnitude is fine, but what about the amplitude of the design parameters that engineers use in their calculations; these can vary by a factor of ten among a bunch of earthquakes having the same magnitude. How long will it take to do any work to strengthen the infrastructure and will there be a shortage of labor if everybody waits to the last minute before doing something about it? What is the point of investing money in something like that when one could die from so many other causes over a thirty-year period?
Besides, why worry about damage from one earthquake when, even in the absence of any threatening hazard, the American Society of Civil Engineers’ Report Card for America’s Infrastructure30 gave the nation’s existing infrastructure a grade no better than a solid D+, thus deserving of a serious spanking for lack of effort. This report called for trillions of dollars in investment to bring the existing deficient infrastructure to a safe and acceptable level, and it has been making this call for decades with only timid action by all successive governments. Twenty percent of all highway pavement is in poor condition and $160 billion is wasted every year in time and fuel on badly congested roads;31 $45 billion is needed to repair more than two thousand aging dams that risk failure;32 $123 billion is needed to repair the more than fifty-six thousand deficient bridges across the country;33 $1 trillion is needed to address the water needs of the nation while fixing the two hundred forty thousand water main breaks that waste over two trillion gallons of treated drinking water each year;34 and many more problems exist with levees, energy production and transmission, solid waste, hazardous waste, schools, and so on. For decades, all of this has been known, and the funding has not been sufficient to address these pressing issues to the extent the problems warrant. So why would there be sudden action taken to do something about one earthquake predicted to happen at some point (any point) in the future? When people are perfectly happy to drive cars with four patch-covered tires that have no treads left, they probably could not care less about a flashing warning light on the dashboard indicating an airbag failure; the risk of an accident is not at the forefront of their mind.
From that perspective, the concept of “let the earthquake clear up the place and restart brand-new” does not seem too bad an idea. Aren’t earthquakes Black Swans anyway?
ON THE DISASTER TRAIL
To Be or Not to Be (in a Crane)
In construction, a large number of temporary structures are used. In particular, some relatively “flimsy” bridges can be used for traffic detour while an existing bridge is being replaced, or a tall tower crane may be used on a high-rise construction site. Typically, these temporary structures are supposed to be there anywhere from a few months to a couple of years—although I am aware of some temporary structures that have been “temporary” for more than seven years.
In the early 1990s, I had casually mentioned as part of a technical presentation that temporary structures should also probably be designed to resist earthquakes—at the time, most were not. Some people were rather upset by that comment. They argued that because temporary structures only exist for a short period of time—say a year—there is only a remote probability that they will be struck by an earthquake. That certainly is one way to look at it. From that perspective, nowadays, most of the time (but not always), in many countries (but not all), temporary structures are simply not designed to resist any earthquake forces,35 or are designed to resist an earthquake that has a 10 percent chance of being exceeded during the existence of the temporary structure—say a year, instead of the fifty-year span considered for regular structures. There is an engineering logic in computing the size of the design forces based on the “lifespan” of the temporary structure—if looking at it from the perspective of each temporary structure on its own. This is like recognizing that smoking is dangerous but saying that the risk of getting cancer by smoking an entire pack of cigarettes once, on a single day, is infinitesimally low.
However, looking at it from another perspective, if a crane operator is hopping from one tower crane to another, from project to project, one year at a time, this adds up to an entire career spent in temporary structures. Who would like to spend a lifetime working in the elevated cabin of a tower crane knowing they are not going to be safe during an earthquake? This is where the packs of cigarettes, day after day, add up over a lifetime.
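The cigarette arithmetic can be sketched in a few lines of Python, assuming (as the convention above implies) that each one-year structure is designed for a shaking level with a 10 percent chance of being exceeded during that year, and that years and sites are independent; the forty-year career length is just an illustrative assumption:

def return_period(exceedance_prob, exposure_years):
    """Return period of the design shaking for a given exposure and probability."""
    annual_prob = 1 - (1 - exceedance_prob) ** (1 / exposure_years)
    return 1 / annual_prob

print(f"{return_period(0.10, 1):.0f}-year design event for a one-year structure")
print(f"{return_period(0.10, 50):.0f}-year design event for a regular structure")

# A forty-year career spent in one-year structures, each designed for the 10-year event:
career_years = 40
print(f"{1 - 0.90 ** career_years:.0%} chance that the design shaking is exceeded "
      "at least once on the job")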
When Taipei 101 opened in 2004, at 1,671 feet to the top of its spire, it was the tallest building in the world—and remained so until 2010.36 One of Taipei 101’s lesser-known features is that the restrooms in one of its eighty-fifth-floor restaurants have floor-to-ceiling windows that force users to admire the view while standing at the urinals—thankfully, the surrounding buildings in sight are ten- to thirty-story dwarfs. Another lesser-known fact not mentioned in the Taipei 101 tourist brochure is that on March 31, 2002, an earthquake struck Taipei. At that time, construction crews were in the process of erecting the steel frames between the fifty-third and fifty-sixth floors. One of the four tower cranes in use at that level failed and dropped 750 feet to the ground. Five people were killed, including the crane operator.37
That is the curse of risk calculations for temporary structures. (Note: Prompted by Japan following the 1995 Kobe earthquake, during which many tower cranes suffered damage,38 the International Organization for Standardization published a new standard in 2016 (ISO 11031) for the design of tower cranes to ensure that they will not collapse and endanger the public during severe earthquakes.39 Progress is a process.)

Black Swans
PREDICTIONS THAT FAIL
As the saying—often attributed to various authors, humorists, quantum physics experts (Niels Bohr), baseball gurus (Yogi Berra), and many others—goes, “It’s hard to make predictions, especially about the future.” All the might of statistics and all the wizardry of risk managers are useless when it comes to predicting something that has never occurred before. For instance, it would not have been possible to calculate the odds of having a bunch of terrorists simultaneously hijacking commercial airliners to crash them into national landmarks. As early as 1998, the CIA had warned the president that al-Qaeda terrorists were planning to hijack commercial airplanes.1 Hijacking was not a novel idea and, as mentioned earlier, the fact that some groups had planned to crash airplanes on landmarks was already known, but nobody foresaw the exact set of events that unfolded on September 11, 2001. Yet, even if someone had imagined such a scenario—as a real possibility rather than as a bad screenplay for an even worse movie—any person taking a passing interest in the idea could not have calculated the probability of it happening. Nor could anyone have calculated the odds that this would have led to the invasion of Iraq.
This kind of unpredictable event that is extremely rare and of catastrophic consequences is typically called a “Black Swan Event.”2 It can also be recognized by the fact that following the black swan event itself, armchair quarterbacks of every ilk offer their expertise to explain that it was all foreseeable and predictable—as there is never a shortage of geniuses willing to share predictions that are solidly anchored in hindsight and nothing else.
Throughout Western civilization, nobody could conceive that swans could be any color other than white. It was simply impossible. This remained an unalterable truth until 1697, when black swans were discovered in Western Australia. There it was. Philosophers (e.g., Karl Popper3 and Hume4) have used this historical event to emphasize that a single observation unpredicted by the prevailing theory (or belief) is sufficient to invalidate that theory (or belief). The global financial crisis of 2007–08 is possibly the most famous contemporary black swan event, both because it occurred at a time when the term was being popularized in a bestselling book about the fragility of financial markets (The Black Swan: The Impact of the Highly Improbable, by Nassim Nicholas Taleb),5 and because nobody from the financial elite saw it coming. All banks, investment firms, financial institutions, and even—as Taleb himself likes to remind everybody6—Nobel Prize winners in economics were caught with their pants down. Just as unexpectedly as with the terrorists who had used airplanes as missiles, nobody anticipated that top Wall Street investment firms would end up being accused of having deliberately defrauded their own clients7—and nobody predicted that all, except one,8 of the top executives who were part of the worldwide financial meltdown would evade jail and get rewarded with millions of dollars in bonuses.9 As bad as these events were, there can be worse. The problem arises when experts in many domains develop forecasting models entirely based on past data, and rely exclusively on these models to make decisions. Some have called this folly—or smug arrogance—because, by definition, past data does not include things that have not happened yet. As such, this reliance on highly sophisticated forecasting tools misleads their users into a false sense of control and security, thereby increasing their exposure and vulnerability to black swan events.
The plain truth is that some events are unpredictable, mostly because people are unpredictable. Even the most cautious parents who had the foresight to put plastic protectors over all electrical outlets to prevent their kids from filling them up with playdough, will be surprised when their little devils wade in the toilet bowl, put the cat in the washing machine, or try to catch a skunk—creativity that they had not foreseen in their wildest dreams.
Given that black swan events of catastrophic consequences cannot be predicted, the only sensible way to prepare for their occurrence is to build robustness into the systems they can affect. Trusting forecasting tools based on past data will provide no protection—and, in fact, provides a false sense of security that can lead to reckless actions, which can in turn increase exposure and vulnerability. The only way out is to assume that something terrible and unforeseen can happen and plan accordingly.10
10 As the saying goes, hope for the best but plan for the worst.
Unfortunately, almost the opposite often happens, where people plan for the best and ignore that the worst could even happen. For example, in an attempt to obtain authorization to operate the Shoreham nuclear power plant it had constructed, the Long Island Lighting Company decided to conduct a drill to demonstrate that evacuation of the population within a set radius of the plant was possible following a nuclear incident. The Three Mile Island nuclear accident had occurred roughly halfway through construction of the Shoreham project, so it had become a political hot potato and the target of much grassroots opposition. With the local emergency response agencies refusing to participate in this evacuation simulation, the Long Island Lighting Company paid 1,800 of its employees to drive buses and tow trucks, and to play the role of traffic officers and emergency responders.11 The Long Island Lighting Company thoroughly documented that all the drills it conducted worked smoothly and showed the effectiveness of its evacuation plan. However, serious doubts were expressed throughout the hearings for the nuclear power plant as to whether the bus drivers paid to participate in the drills would as eagerly dash into the radioactively contaminated zone during a real emergency, or would even bother to show up at all. What bus driver would rationally rush to pick up children from schools within a certain radius of the power plant, and drive them to predesignated safe havens, rather than rush to take care of their own kids and leave the world to fend for itself—all for nothing more than minimum wage? Somehow, the state of panic that would exist when trying to evacuate a population where the air has been fouled by a radiological incident cannot quite be replicated by an evacuation drill executed with pure air and no stress. A beautiful sunny day is never quite as worrisome as a day when rain is washing down radioactive particles. The county and the state rejected the evacuation plan,12 and as no evacuation plan was ever approved, the $6 billion plant had to be decommissioned and dismantled.13
One unfortunate problem with the black swan theory is that it has been a victim of its own success. The fate that awaits every highly popular concept is that it eventually becomes a buzzword—meaning that it becomes profusely misused to the point of becoming meaningless. Even more so nowadays, when every person who contributed to creating a crisis of inordinate proportions will rush to call the event a black swan, as if to disengage from any responsibility. This fundamentally misses the key point in the definition: a black swan event is something that no sane person could reasonably have expected to happen in real life. Negligence, incompetence, ignorance, and lack of foresight cannot hide under a coat of black swan paint.
As such, the COVID-19 pandemic of 2020 is definitely not a black swan event, contrary to what some have claimed.14 Governments and corporations worldwide had been warned repeatedly of the risk of a pandemic and of how rapidly it could spread. Not warned by unknown scientists toiling in obscure labs, but warned by some of the most reputable national and international agencies and organizations, as well as by high-profile speakers with international name recognition.
None other than Bill Gates, cofounder of Microsoft and one of the richest men on earth, warned in a 2015 TED talk (which can be seen on YouTube15) that the world was not ready to face a pandemic. This was in the aftermath of the Ebola outbreak that killed more than ten thousand Africans. The world had been lucky to escape a global Ebola pandemic because an infected person cannot spread the virus before developing symptoms, and when the symptoms appear, they are brutal. Ebola triggers a cascading meltdown of the immune system, blood vessels, and vital organs. The virus kills on average 50 percent of those infected,16 which is, ironically, what makes the heroic task of confining an Ebola outbreak possible. During a security conference in Munich in February 2017, Gates further warned that within ten to fifteen years, a rapidly propagating pathogen would kill thirty million people in less than a year.17
Beyond philanthropic billionaires, many other organizations forecast a world pandemic and its consequences, and stressed the need for preparedness. For decades.18 In all serious analyses of the situation, there was never any doubt;19 when it came to the topic of a worldwide pandemic, it was never a matter of “if” it would happen, only a matter of “when.”
In fact, in October 2019, barely a few months before the COVID-19 pandemic started, a Washington think tank on national security ran a simulation to determine what could happen during a “highly transmissible coronavirus” outbreak.20 A coronavirus scenario was used because both the SARS and MERS viruses at the root of recent epidemics were also of the coronavirus family. The scenario assumed that governments would be too slow to react and would impose travel bans and border closures only after the virus had spread worldwide through international air corridors. It assumed that governments would have to pump massive amounts of money into the economy to prevent its total collapse, and that development of a vaccine that could effectively end the crisis would take up to a year. One of the main conclusions of the pandemic simulation conducted by invited experts in global health, biosciences, national security, emergency response, and economics was that early and preventative actions are absolutely critical if there is any hope of minimizing the mess created by a pandemic.
Not a reassuring thought in an era where trust and cooperation between countries, levels of government, companies, and citizens have been eroded to an all-time low.21
The key conclusion from the pandemic scenario was that “leaders simply don’t take health seriously enough as a U.S. national security issue.”22 In fact, such simulations have been played multiple times in the past by numerous organizations. In 2001, the “Dark Winter” exercise was conducted to simulate an epidemic caused by a malicious smallpox infection. In this “senior-level war game” focusing on a biological attack, the roles of National Security Council members were played by former senior government officials.23 In this case, a smallpox vaccine existed, but the US supply at the time was estimated to be only seven to twelve million doses, and production of new vaccines to replenish the stocks would have taken two to three years because the facilities to produce smallpox vaccines had been dismantled after 1980.24 The simulation was stopped after it had stretched over thirteen virtual days, during which the disease spread to twenty-five states and fifteen other countries, leaving the world with massive casualties and under the threat of a “breakdown in essential institutions, violation of democratic processes, civil disorder,” and compromised governments that had lost the confidence of their citizens.
In 2018, the “Clade X” exercise similarly considered “an outbreak of a novel parainfluenza virus that is moderately contagious and moderately lethal and for which there are no effective medical countermeasures.”25 Again, national security and epidemic response experts26 played the role of National Security Council members. On the first day of the tabletop exercise, they were briefed about the recently discovered spread of a new virus, called “Parainfluenza Clade X,” that killed 10 percent of those infected and for which a vaccine would take twelve months to develop. The scenario considered that each infected individual transmitted the illness to two to three other people, and that the virus incubation period was approximately five to seven days. Impressively, that is similar to the playbook for the COVID-19 virus that appeared two years after that simulation. Throughout the Clade X pandemic scenario, air traffic shut down, a quarantine was imposed by the federal government, a “national state of emergency” and a “public health emergency” were declared, the capacity of intensive care units was exceeded, and public demand for surgical masks and respirators surged. Four months after the initial outbreak, almost nine million people were infected and three million dead, with 10 percent of those in the Americas. Projecting ten months ahead, twenty million people had died in the United States, the gross national product had dropped by 50 percent, the stock market had collapsed to 10 percent of its previous peak value, and the health care and health insurance system had gone bankrupt. Civil unrest led to widespread looting and violent clashes, with the police and the military securing borders. Some governments collapsed.
A vaccine finally became available, but only five million doses per month were to be produced. Who gets it first?
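To get a feel for why such scenario numbers escalate so quickly, the short sketch below compounds the transmission figures quoted above—two to three new infections per case, treating the five-to-seven-day incubation period as a rough six-day gap between generations of infection—into a deliberately naive, unmitigated chain of infections. The numbers it prints are illustrative only; real exercises (and real outbreaks) include interventions, immunity, and randomness that this toy calculation ignores, which is why it overshoots the exercise’s nine million cases by a wide margin.

# A naive back-of-the-envelope sketch (not the Clade X model itself) of how
# quickly infections compound when each case infects about 2.5 others roughly
# every six days, with no mitigation and no immunity.
def naive_cumulative_infections(r0=2.5, generation_days=6, horizon_days=120):
    generations = horizon_days // generation_days
    # Sum the cases produced in each successive generation of spread.
    return sum(r0 ** g for g in range(generations + 1))

# Four months of unmitigated spread from a single case:
print(f"{naive_cumulative_infections():,.0f}")  # roughly 150 million under these naive assumptions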
Approximately 150 people were invited to attend the 2018 tabletop exercise, and videos of the more than five hours of National Security Council discussions that took place through the entire simulation are freely available online for anyone interested in watching.27 With all the knowledge in hand, and the “close calls” of the 2003 SARS and 2014–16 Ebola epidemics,28 it is simply not possible, in all seriousness, to call a pandemic—including the COVID-19 one—a black swan event.
So, to be 100 percent clear:
A future world pandemic is not a black swan event (be it Ebola or any other deadly virus not yet known to exist);
A future destructive magnitude 6 earthquake near New York City or Boston is not a black swan event;29
A future destructive magnitude 7 earthquake near Charleston, South Carolina,30 or near Salt Lake City, Utah,31 is not a black swan event;
A future destructive magnitude 8 earthquake near Memphis, Tennessee,32 or anywhere along the US West Coast is not a black swan event;
A future destructive magnitude 9 earthquake near Seattle33 is not a black swan event;
A future destructive Category 5 hurricane anywhere along the Atlantic coast or the Gulf of Mexico is not a black swan event;
None of the destructive hazards described earlier in this book are black swan events—except for the extraterrestrial invasion.
Any of these events has the potential to wreck the economy to various degrees and to inflict massive deaths and injuries. Those who will be hit by any one of these events will suffer in predictable ways. If they survive, hopefully, they will have become more knowledgeable and be able to build on their firsthand experience of the disaster to make judicious decisions to prepare a better future.
However, none of them will be legitimately able to claim that they have survived a black swan event, simply because none of these events are black swans. The threats posed by all of these events are well known; the cold facts have been repeated multiple times by experts from many disciplines; the risks have been surgically assessed by academics and other credible sources using specialized tools and models. In some cases, the majority of those who had to make decisions—from individuals to government officials—decided that the best course of action was to “cross that bridge when we get there.” That is perfectly fine—but, after the disaster, leave the poor black swans alone.
INCONSEQUENTIAL SWANS
Not all unforeseen events are black swans either. Things that nobody could have expected happen all the time, but not always with significant consequences.
For example, a magnitude 6.9 earthquake occurred on December 23, 1985, in the Nahanni National Park Reserve in the Northwest Territories34—nothing dramatic given that nobody lived within a hundred miles of the epicenter. The shaking, which started with the first in a string of earthquakes in October that year and peaked with the December event, startled the roughly three thousand people who lived scattered over the seventy-five thousand square miles in and around the Dehcho First Nations administrative region.35 The seismic waves found nothing of significance in their path to damage, but they rattled the confidence of seismologists who, prior to the 1985 earthquakes, considered this region to be a “relatively quiet earthquake zone.”36 In developing the seismic maps of the National Building Code of Canada, to be conservative, seismologists had predicted that earthquakes as large as magnitude 6 could happen there.37 It turned out that not only does a magnitude 6.9 produce ground motions roughly eight times larger (and release more than twenty times more energy) than a magnitude 6.0, but the motions recorded on December 23 showed peak horizontal ground accelerations greater than 2g (that is, twice the acceleration of gravity), setting a world record at the time.
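For readers who want to check that arithmetic, the short sketch below simply evaluates the standard magnitude scaling relations used above: recorded wave amplitude growing as 10 raised to the magnitude difference, and radiated energy as 10 raised to 1.5 times that difference.

# Quick check of the magnitude comparison above, using the standard scaling
# relations: amplitude ~ 10**dM and radiated energy ~ 10**(1.5*dM).
def magnitude_ratios(m_small, m_large):
    dm = m_large - m_small
    amplitude_ratio = 10 ** dm        # ground-motion amplitude
    energy_ratio = 10 ** (1.5 * dm)   # radiated seismic energy
    return amplitude_ratio, energy_ratio

amp, energy = magnitude_ratios(6.0, 6.9)
print(f"amplitude: ~{amp:.0f}x, energy: ~{energy:.0f}x")  # about 8x and 22x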
Indeed, every now and then, earthquakes happen where none have been observed before. This was the case for the 1988 magnitude 6 Saguenay earthquake that struck in the middle of a national park, mostly shaking up all the resident moose, Canada geese, and other wildlife (it is unknown whether any were traumatized in the process). It cannot really be called a black swan event either because, while it produced some minor damage in cities up to two hundred miles away, none of it was significant.
Typically, where no significant earthquake activity has been observed in the past, none is predicted to occur in the future; but once an earthquake happens there, a new seismic region is born. Evidently, earthquake engineering is an expanding business.
SIMCITY ON STEROIDS
When driving on a highway, if a little ten-year-old kid riding in the back seat sees warehouses lined up along the road and asks, “Is this industrial zoning?” one can safely bet that this kid has been playing SimCity. SimCity is a city-building simulation game in which one takes a virgin piece of land, divides it into residential, commercial, and industrial zones, and starts building the infrastructure of a virtual city38—as in some sort of über urban planner power trip. The player then sets up the city budget and tax rates to attract people and grow the city. This is an open-ended game with no goal other than trying to build a satisfactory outcome, either by maximizing population, profitability, aesthetics, quality of life, or any other self-imposed objective. Although there is no end to the game, the key is to find an equilibrium to avoid chaos and urban decay—without using any of the cheat codes that can be found online and that can provide infinite water and electricity or infinite money to spend.39 The rules that govern the behavior of the SimCity residents and infrastructure interdependencies are hidden, but can be inferred to some degree by trial and error and some common sense. Practically speaking, this video game is an agent-based model. Agent-based models are computer programs that simulate the actions and interactions of individual agents (such as people) that follow simple decision-making rules deemed to be rational, each agent seeking to maximize, one small incremental step at a time, a specific benefit such as reproduction, wealth, social status, and so on, depending on the model.40 Agent-based modeling has been used to study how specific situations can evolve over time, for a multitude of applications, ranging from the spread of epidemics, to traffic problems, to warfare (obviously), with variable levels of success.41 Tens of thousands of players have exchanged tricks on social networks to figure out how to achieve a successful SimCity. Some have reported that once the basics of water and power are provided, neighborhoods of concentrated dirty manufacturing industries only need lots of policing and shorter distances for freight trips to be “happy,” so it is pointless to invest further resources there, whereas achieving a happy residential neighborhood requires loading up on schools, hospitals, parks, non-polluting industries, low traffic noise, low crime, and no garbage. Some successful players have argued that to beat the system, the strategy for a successful city is to surround it with over-polluted failed industrial cities to which all pollution and garbage is exported.42 Others have figured out that each individual virtual citizen in the simulation responds to the same simple rules, with the population distributing itself as a homogeneous mass moving around no differently than water, sewage, or traffic, taking jobs in the first random building they encounter that is hiring, irrespective of skills or prior employment, easily moving from a commercial to an industrial job.43 Astute folks have even unmasked faulty rules that can lead to illogically dense traffic in a long dead-end street with only one house at its end,44 traffic jams on small roads running parallel to empty mega-highways,45 or intersections forever blocked by masses of pedestrians going in circles and nowhere else.46 Providing a step-by-step guide on how to “win” in SimCity by building prosperous and stable cities would be a book by itself.
The point is that the outcome of any agent-based modeling construct depends on the assumptions built into the model. The expectation when running an agent-based simulation is that if the rational response of each person or agent to a given input is accounted for by the model—such as having a percentage of the population move away when taxes are increased—then running the model over time will result in an accurate simulation of the outcome. In other words, the belief is that running the program will effectively aggregate all of the cause-effect and action-reaction rules that have been embedded into the model, taking into account average responses and variations in responses from agent to agent, and provide a clear and reliable prediction of the future. As such, the program will consider at any point in time the respective response of each individual to taxes, public services, pollution, infrastructure quality and effectiveness, and their tolerance and reactions to shortages or abundance of each, and act accordingly—moving to or away from a city. The beauty of an agent-based model is that it computes evolution in accelerated time—something that is hard to do by any other means. The drawback is that the resulting “path of evolution” that the program will simulate depends entirely on the specific rules embedded in the model. Changing one rule changes the outcome. To make things worse, if the evolution that is simulated is a societal model that relies on a number of rules that define how an individual behaves, it does not take long to realize that missing an important action-reaction rule, or a sudden unpredicted change in how people behave over a short or long time, will throw a wrench into the entire prediction process—much like having the in-laws unexpectedly move in can change the dynamics of a marriage.
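To make the idea concrete, here is a deliberately tiny sketch of such a rule-driven simulation: one agent attribute (a tax tolerance), one departure rule, and one arrival rule, all with made-up numbers. It is not SimCity’s hidden rule set, only an illustration of how changing a single rule or parameter changes the simulated path of evolution.

# A minimal agent-based sketch of the "people leave when taxes exceed their
# tolerance" rule discussed above. All numbers are invented for illustration.
import random

random.seed(1)

class Resident:
    def __init__(self):
        # Each agent tolerates a different maximum tax rate before moving away.
        self.tax_tolerance = random.uniform(0.05, 0.25)

def simulate(years, tax_rate, newcomers_per_year=50):
    city = [Resident() for _ in range(1000)]
    population_history = []
    for _ in range(years):
        # Rule 1: residents whose tolerance is exceeded leave the city.
        city = [r for r in city if r.tax_tolerance >= tax_rate]
        # Rule 2: a fixed number of newcomers arrive, a crude stand-in for jobs,
        # services, and everything else a richer model would track.
        city.extend(Resident() for _ in range(newcomers_per_year))
        population_history.append(len(city))
    return population_history

# Changing a single rule parameter changes the whole simulated outcome:
print(simulate(10, tax_rate=0.10)[-1])  # population after 10 years at a 10% tax rate
print(simulate(10, tax_rate=0.20)[-1])  # same rules, higher tax, different trajectory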
Experts on agent-based modeling and decision-making models will argue that everybody makes rational decisions and reacts to actions in ways that are in their best interest.
Ask them why it is, then, that so many people smoke in spite of overwhelming evidence that it is a major health hazard, and they will respond that people who smoke have made the rational decision that the enjoyment they gain from smoking is worth the resulting reduction in life expectancy. This is tantamount to saying, with an ironclad confidence in the ability to model human behavior, that those who jump off a bridge (without a parachute) have reached the rational decision that it is a logical action in response to a broken heart or other setback—or maybe that the jumpers (still without a parachute) have made the rational decision that the enjoyment they gain from the sensation of flying as they fall is worth the resulting reduction in life expectancy. Political scientists will be quick to counter that people make decisions against their best interest all the time—some will emphasize that it is actually the entire purpose of political science to figure out how to manipulate people to vote for things that are clearly against their best interest.
Ask a prominent economist why SimCity-like models have not been developed to predict economic cycles and recessions, and the likely response will be that most factors that have a large impact at that scale are unpredictable.47 Nobody can predict the impact of changes in national and international political regimes on the economy and stock market, or rumors from out of nowhere that can suddenly lead to a bank run or some illogical financial panic, or when a real estate market bubble will pop. How does one model the fact that an international crisis, a pandemic, or the emergence of a green movement might suddenly change the rules of the game, or maybe even shift the short-term fixation on immediate profits of some companies into a long-term view focused on survival?
Sociologists and psychologists who have studied the mass hysteria that can be created by unfounded fears and rumors would concur. Who could have predicted that the tense political and religious quarrels of the inhabitants of Salem would escalate into a full-blown, hysterical witch-hunt that culminated in two hundred accusations, thirty guilty verdicts, and the hanging of nineteen citizens48—likely all innocent victims, as any self-respecting witch would have used witchcraft to escape death? Yet are the Salem trials and executions worse than the modern anti-vaccination movement that has led to the resurgence of previously eliminated diseases?49 If someone wished to construct an agent-based model to predict the impact of the next mass hysterias, delusions, and groupthink disasters, what fortune-tellers should be consulted to figure out the rules that “rational agents” would follow in such cases?
All of that is to say that unpredictable events are—well— unpredictable.
ON THE DISASTER TRAIL
Let It Be
I was stunned. A local radio station that had simply played, beginning to end, the “red album” and “blue album” had the gall to call it the greatest Beatles special ever aired. It was 1979 and the two greatest-hits double LPs had been out since 1973. Playing them back-to-back without a single commentary—but plenty of commercials between songs—was no feat. Anybody with a little bit of “Beatles knowledge” could have done better, and for some unexplained reason, I sat at the typewriter and gave it a shot. By the time I was done, I had created an eighteen-hour series embracing the totality of their recorded opus, from the Star-Club in Hamburg up to the breakup, with commentaries or anecdotes for each song. It even included seamless mixes of the Beatles’ covers of songs by other artists with the original versions of these songs, to highlight the sharpness added by the newer versions. That was the “ultimate Beatles special” I would have liked to have heard in the first place.
It was a fun challenge for a mini-Beatlemaniac like me to create that fictitious radio show, but what was I supposed to do with it? Was it good enough to put on the airwaves?
There was only one way to know, and it was to try to sell it.
For sure, the local community radio would have taken it for free, but if I started at the bottom, I would never know if anyone higher in the radiophonic hierarchy would have been interested. I therefore had no choice but to start at the top and work my way down until I found the perfect match.
With no prior experience in radio, my prediction was that I would be laughed at and kicked out of every station—possibly until I landed back at the community radio. This low expectation was a direct consequence of my one prior experience as a door-to-door salesman. It did not help that the product then was a glossy fishing/hunting magazine—something nobody needs. My coworker who covered the houses on his side of the street sold eight subscriptions on his first day. He was a “foot-in-the-door” salesperson who literally put his foot between the doorframe and the door—old-style—such that the potential customer could not escape by slamming the door. My single sale was to a couple that happened to be coming back from a fishing trip, still unpacking and more than a bit giddy from the experience—and possibly from alcohol.
With nothing to lose, ready to face as many rejections as there were radio stations, with the latest Bureau of Broadcast Measurement (BBM) ratings in hand, I showed up unannounced at the no. 1 station in town. The program director seemed amused and talkative. He wanted to know more about the breadth of my musical tastes and interests, which thankfully were much broader than the Beatles. When he asked what the latest LP I had purchased was, things got a bit awkward. Facing the program director of a station focused on pop and rock, I stuck to the truth and admitted that it was a country music album—actually, the only country music album I have ever purchased to this day—because it was a concept album about the American Civil War that was so brilliantly done that I could not resist.
“White Mansions?” he asked.
“Exactly!” I replied, surprised.
He could not believe it. He thought nobody else in town knew about this album—so did I, for that matter. He not only bought my Beatles special, he also hired me as a part-time content producer, developing shows on progressive rock and retro music. Absolutely no agent-based simulation could have predicted that outcome.
The eighteen-hour series L’Épopée des Beatles (The Beatles’ Saga) ran during the fall BBM that year and the ratings it received crushed the competition—which is evidently easier to do when headlining with the world’s best rock band. My prediction that I would tumble down the stairs of the radio world all the way to near bottom had failed badly, but it was an unpredicted event with a positive outcome—hence, not a black swan event.

The Brain
WHOSE BRAIN?
Serious research in neurology has been conducted in the past decades to determine why teenagers typically engage in more risky activities than any other age group. From these expensive studies, neuroscientists discovered what every parent of teenagers could have told them for free: overnight, the brain of a healthy kid transforms into a mass of Jell-O in which synapses and neurons are drowning and thus unable to perform normally. Every part of teenagers’ bodies grows at accelerated speed into adulthood, except the brain, which lags behind. As a result, teenagers can physically do anything adults do (and often better), but have no pragmatic perception of the future and therefore have not fully grasped that death can be as close as, for example, the distance between the beer in their right hand and the steering wheel in their left one. They may have an acute perception of risk when filling out questionnaires designed to determine if they understand the consequences of their actions, but that is theory. When it comes to practice, they still engage in incoherent and risky behavior,1 be it binge-drinking, unprotected sex, smoking anything that burns, sexting, truancy, speeding, snorting Smarties,2 performing unrehearsed daredevil stunts and tricks, or listening to Justin Bieber’s music, to name only a few.
Statistics confirm this all the time: this segment of the population loves to partake in high-risk behavior and does not bother with the nuance of “calculated risks.” Circumventing the mysteries of why teenagers act brainless,3 the focus here is on so-called “normal” adult brains that are considered fully developed and on their occasional irrationality.
OUR SEGMENTED BRAIN
To say the least, knowledge of how the human brain functions is far from complete—and what is being discussed here is not the proverbial joke that members of one sex are clueless in understanding how the other sex’s brain works.
Yet, in spite of all the advances of the past decades in mapping how thoughts travel and jolt neurons here and there in all the nooks and crannies of the brain, no reliable roadmap exists, and many of these roads are at best foggy pathways. Still, one characteristic of the human brain has long been proclaimed and believed by all: it has a split personality.
Going back millennia, traditional Chinese medicine considered the mind to consist of: (i) the “Yi,” which provided wisdom, correct judgment, and clear thinking, and qualified as being active, calm, and peaceful; and (ii) the “Xin,” connecting the mind to the heart, feelings, emotions, and desires, and qualified as passive, excited, energized, and confused. The sum of the Yi and Xin creates humanity and personality.
The contemporary version states that there is a right and a left side of the brain, where the right hemisphere harbors the intuitive artistic side of this schizophrenic relationship and the left hemisphere is the domain of the logical thoughts and mathematical mind. Much of that perception arose from the study of patients having suffered severe brain damage—victims who were willing to let other people tinker and probe inside their heads in hope of relief and remission. Neuropsychologist/neurobiologist Roger Wolcott Sperry famously did so (the tinkering bit, that is), receiving a Nobel Prize for his work on “split-brain” patients.4 The idea, in its most simplistic “distinct cleavage” form, quickly found its way into popular culture—resulting in the tagging of individuals as being “left-brained” or “right-brained,” which is more polite than saying nerdy or artsy.
The two-minds idea is now deeply entrenched in contemporary views, including statements such as this one: “The intuitive mind is a sacred gift and the rational mind is a faithful servant. We have created a society that honors the servant and has forgotten the gift”—a quotation widely but incorrectly claimed to be from Einstein.5 The truth is evidently somewhere in between. It is generally recognized that both sides of the brain are typically orchestrated to work together if successful results are to be achieved in any activity.6
Yet, irrespective of facts and legends, all of this underscores the fact that being human requires juggling the demands and expectations coming from two very different sides of our personality (or brain).
One side, after stepping on the scale, asserts that it is time to start going to the gym, and to replace French fries and ice cream with Brussels sprouts, broccoli, and other weird veggies. The other side fully recognizes that it is the right thing to do, but after calculating the effort and discipline required to do it, concludes that the priority is to first finish off the box of Twinkies—otherwise they will go to waste, which would be a shame—and then see if there is a gadget on the shopping channel that could “bring the gym to the living room couch” instead. Therein lies the eternal tug-o-war between wisdom and willpower: the knowledge of what should be done versus the fact that it does not get done, for a number of reasons—valid or not.
The business model of many profitable gyms is built on that dichotomy: wisdom will bring many to pay upfront for a one-year membership, but willpower (or lack thereof) will make them progressively reduce the number of visits per month, eventually down to zero. This is in part why gyms that can host three hundred people can get away with selling six thousand memberships.7 Likewise, an entire industry of self-help books thrives because it offers the low-hanging fruit of easy initial commitment for a few bucks to appease the demands of wisdom, knowing full well that the willpower to implement the advice will be lacking. At worst, this will make it possible to sell another book down the line, fueling a sort of perpetual motion.
PERCEPTION, MISPERCEPTION, AND OUTRIGHT IRRATIONALITY
Assuming someone can muster the willpower to act, other brain-created obstacles stand in the way. Taking action is good, but what action exactly?
Everybody is aware that there are many optical illusions that can fool the brain. For example, there is the “scintillating grid illusion” (a version of the Hermann grid illusion)8 that is created by a grid of gray lines drawn on a black background, with white dots placed at all the intersections. Staring at a white dot makes the dots farther away look black. When eyes wander across the grid—as fixing a single point does not come naturally—the dark dots appear and disappear at various locations across the board, producing a scintillating effect and driving the mind nuts without the help of 1960s music or hallucinatory drugs.
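For the curious, the illusion described above is easy to reproduce. The short sketch below (assuming the matplotlib plotting library is available) simply draws gray lines on a black background with white dots at the intersections, which is all it takes for the phantom dark dots to appear as the eyes wander over the grid.

# Draw a scintillating-grid-style figure: gray lines on black, white dots at
# the intersections. Exact sizes and spacing are illustrative choices.
import matplotlib.pyplot as plt

fig, ax = plt.subplots(figsize=(6, 6), facecolor="black")
ax.set_facecolor("black")
positions = range(0, 11)
for p in positions:
    ax.axhline(p, color="gray", linewidth=4)  # horizontal gray line
    ax.axvline(p, color="gray", linewidth=4)  # vertical gray line
for x in positions:
    for y in positions:
        ax.plot(x, y, "o", color="white", markersize=8)  # white dot at each crossing
ax.set_xlim(-0.5, 10.5)
ax.set_ylim(-0.5, 10.5)
ax.axis("off")
plt.show()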
Straight lines can look curved, static wheels seem to turn, shapes of equal size appear to differ, and a myriad of other tricks9 are all illusions that fool the brain.10 Among the hypotheses proposed to explain why this happens, a simple one is that survival requires the rapid recognition of familiar patterns, and that through the long process of human learning, simplified models are created by the brain as “shortcuts” to accelerate things.11 The illusions arise because these shortcuts lead to expected answers, which happen to be wrong in the case of the illusion. The point here is that the brain can be fooled all the time—as any respectable illusionist would be able to prove in the blink of an eye.
Likewise, intentionally or not, humans can follow other paths that lead to erroneous judgments. Indeed, humans can be convinced of pretty much anything. Joseph Goebbels (minister of propaganda in Nazi Germany, as mentioned earlier) built his career on the fact that a lie repeated often enough can become a truth—a principle that nowadays produces a lot of confused people when reporters and politicians across the entire ideological spectrum accuse each other of spreading “fake news.” The list of irrational things that humans have held to be true at various times throughout history is endless. If anything, this suggests that the human brain is wired with an ability to believe just about any nonsense, as long as it is presented with authority or in ways that please expectations. While it is possible to forgive those who lived in the Middle Ages for believing that tomatoes were a creation of the devil,12 that black cats were witches in disguise,13 and that Catholic priests were trustworthy, it is much harder to explain in this era of infinite access to knowledge what motivates the Monty Pythonesque logic of flat-earthers and the proponents of loony conspiracy theories.
Talking to someone who is wedded to the moon landing conspiracy theory14 can be a surreal experience; it feels like talking to someone who lives in a parallel universe. They typically are lost in an alternate reality that has been constructed in a way that blinds them to facts. Their brain has been conditioned to certain “shortcuts.” As a result, they are navigating a mental Hermann grid where all the white dots are seen as black. Attempts to hold a rational debate are futile, although some make it a point to do so.15
While conspiracy theorists may seem off the charts in their beliefs, they are not that different from everybody else. It turns out that there is some seriously odd “wiring” in the brain—everybody’s brain—that makes people interpret all information in ways that confirm their beliefs. The phenomenon is called confirmation bias.16 Not only do humans prefer and seek information that reinforces their existing mindset, but they also avoid, discredit, distort, or conveniently forget information that conflicts with their views—a process often called selective exposure.17 When the beliefs are political, it leads to divided factions that become further divided and an unwillingness to compromise, which can lead to war and death. When the beliefs are those of religions or cults, it leads to entrenchment in self-righteous proclamations of unalterable truths, which again can lead to war and death.
Although it does not always have to escalate to extremes, this “odd brain wiring” is pervasive in all activities. At insignificantly low levels of consequences, confirmation bias and selective exposure can simply lead to ridicule and unintended consequences. In 1979, Monty Python produced the silly movie Life of Brian, which tells the hilarious misadventures of a poor sap mistaken for the “real deal” by a mob searching for the Messiah. Despite his best efforts to chase them away, this eventually leads him to crucifixion—a story unfolding in parallel to the life of the “real deal” shown far in the background in some scenes to confirm that Brian is not Him. It is nowadays considered by many to be one of the best comedy films of all time.18 When the soundtrack of the film offended one South Carolina Presbyterian minister (who did not see the actual movie), he shared his concern with the wife of Senator Strom Thurmond (who did not see the movie or hear the soundtrack), who in turn shared her outrage with her husband,19 who did not need to see or hear anything to take the necessary measures to have all showings of the film suspended in South Carolina. Many other religious groups also publicly condemned the film.20 Of course, as censorship always does, this turned out to be better publicity than any money could have bought.
While censorship is fundamentally an attempt to control the thinking of others, expurgating nonconforming ideas is at the same time a form of selective exposure by removing from one’s realm any material that is not compliant with ideas already anchored in the brain. This helps perpetuate perceptions even if counter to evidence from reality. It should stand to reason that when tangible proof is provided about certain facts, people should immediately renounce previously erroneous ways, but no. Unless the evidence is brutally physical—like putting the hand of a guru on a hot stove to see if his claim that the mind can control pain holds true—in most cases, the first reaction of someone confronted with any proof is to question the validity of the proof. In fact, the more some beliefs are questioned, the more entrenched those beliefs become. People defend their beliefs against all odds, like kids weaving convoluted explanations to patch up glaring holes in the charming narrative supporting the existence of their beloved Santa Claus—except that kids eventually figure it out, some never forgiving their parents for having fooled them with this enchanting fairy tale.
All cults thrive on this wonderful characteristic of the human brain. In the 1950s, psychologists infiltrated a doomsday cult whose members expected to be saved by aliens on the specific date when the end of the world was predicted to happen. Of course, the date came and passed without a single flying saucer in sight, but instead of realizing that they had been duped, members of the sect concluded that the world had been saved by their piety and continued their efforts to recruit new members.21 Interestingly, the promise of accessing heaven via an alien spacecraft rescue is a recurring theme of cults, with multiple hilarious examples throughout history,22 but some tragic ones too—such as when the thirty-nine disciples of the Heaven’s Gate cult committed mass suicide, convinced by their “spiritual leaders” that leaving their bodies was the way to access a spacecraft hidden behind the Hale-Bopp comet when it reached its closest distance to earth.23 However, spacecraft are not required for wacko beliefs.
Nor are wackos needed. Some psychiatric studies report strong evidence that cult victims include “normal people” without any previous long-standing conflict that would have made them more vulnerable. While psychiatric and addictive disorders can be a factor leading some folks to join cults, many simply were searching for spirituality or personal development.24 As one former cult member testified, she came from a “loving, middle-class, Midwestern family,”25 but her brain was “rewired” by the cult leaders.26 Possibly the same kind of rewiring that made sixteen executives from Hydro Québec—a government-owned power utility company with more than twenty thousand employees—join the Order of the Solar Temple, after one of the company’s vice presidents invited the cult’s leader to give motivational “management” lectures on topics such as “Business and Chaos” and “The Real Meaning of Work.”27 Built from an eclectic mix of New Age philosophy, extraterrestrial salvation, and archaic Christian rituals, the legacy of the Order can be summarized as arson, mass suicides, and murders,28 including the killing of a baby identified to be the Antichrist29—supposedly saving the planet in the process.30 Having ordinary people fall for that stuff is a testament to the manipulation skills of cult leaders.31
All the above illustrates the challenges in convincing someone of hazards and associated risks when the mind is not predisposed to receiving this message. This mental block can be attributed to many causes, ranging from ignorance of reality to wishful interpretation of data. The “Nah! There’s never any earthquakes here” or “there’s never any hurricanes here” attitude is at one end of the misperception spectrum.
At the other end of the spectrum is the individual who has studied all the information available and has reached a conclusion that supports a desired outcome. After the storm surge from Hurricane Matthew flooded a Palm Coast neighborhood, the owners of a house where water had luckily stopped right at its doormat put it up for sale, hoping to move to a less flood-prone area. Two years after Hurricane Matthew, an interested buyer parked in front of the house and bragged to the neighbors that he had studied all of Florida’s historical data and determined that hurricanes had never happened in Palm Coast before, which is why he was house-hunting in that wonderful neighborhood. One must be uniquely cocky—or downright blinded by one’s own beliefs—to preach to those who had survived Hurricane Matthew two years before that hurricanes do not occur in that part of Florida. This kind of naive confidence is akin to putting stamps on a baby’s forehead and mailing it to grandma on the other side of town, on account of having read that the mail service is highly reliable—incidentally, mailing babies is a practice that was banned by the US postmaster general in 1913.32
Fortunately, over time, biases and prejudices can be overcome. First, as the saying goes, seeing is believing. That inescapably works for those surviving a disaster—albeit too late to have a positive impact for that event. Second, the brain’s resistance to changes in beliefs, which is presumably the result of an evolutionary mechanism to maintain group cohesion and preserve self-identity,33 can be abated over time—sometimes a long time, though. This may require instilling a culture of scientific thinking, where rigor and curiosity matter more than beliefs, and where discovery is recognized to be exhilarating rather than threatening—always an immensely uphill battle in an environment where beliefs are admired as virtues.
Eventually, at some point in time, which varies depending on the individual, all the facts are gathered and the risks are correctly identified and acknowledged. When that finally happens, if one is willing to assume the risk after having duly considered all that information—either because the foreseen benefits are deemed worth the gamble or because one is willing to transfer the risk to insurance coverage—then at least the brain has not been fooled. It can be deemed a rational and respectable decision—as long as it does not harm others.
The challenge lies in the significant and deliberate effort it takes to collect evidence and make such a rational decision with all the unalterable facts in hand—and even more effort when the evidence runs counter to expectations or wishes.
Effort?
Hmm . . .
This brings it all back to the tug-o-war between the “Yi” and the “Xin.”
THE MAÑANA SYNDROME
Compounding the problem is a natural tendency, ingrained to various degrees in human nature, to procrastinate—to unnecessarily postpone decisions or actions. This is sometimes called the “mañana syndrome,” where mañana, the Spanish word for “tomorrow,” has become a common expression meaning not only “later,” but “unpredictably later,” to the point where it may actually never get done.
Decades of psychological studies on the topic have identified many possible causes of procrastination,34 including some that are particularly relevant in the current context. At a fundamental level, tasks are usually performed in expectation of a reward, be it money or pleasure. When the reward is perceived to be too far in the future, it is hard to contextualize. This is called temporal discounting or delay discounting.35 Even when it comes to the “satisfaction of a job well done” or “the greater good,” it turns out that “a bird in the hand is worth two in the bush.” The further in the future the reward, the less appealing it is. The fact that most people will prefer a smaller reward today to a larger one in the future is a phenomenon called hyperbolic discounting.36 A lot can be said about the many different reasons why people procrastinate: that people are more likely to procrastinate when dealing with more abstract goals (for example, lose weight) than concrete ones (spend thirty minutes on the treadmill); that rewards in the future to individuals who perceive themselves as living in the present are strangely perceived as happening to someone else (as if the person spending the retirement fund in the future cannot possibly be the same one investing the dollars today); that some people have an optimistic view of their ability to do it all later when the deadline approaches (such as not studying at a steady pace but rather cramming the evening before the exam); that some procrastination is attributable to wishful thinking that the problems will automatically take care of themselves over time without the need for intervention; that some people suffer “paralysis by analysis” because they are unable to make decisions for fear of making the wrong choice; that people sometimes prefer to avoid decisions or actions when the task to be done appears overwhelming, creates anxiety about negative outcomes, or requires talking to unpleasant people; that perfectionists will delay doing something unless they are absolutely certain that it can be done flawlessly by unattainable standards; that people with low self-esteem and low self-confidence harbor such a fear of failure that they prefer not acting to receiving negative feedback and criticism on their accomplishments; that people who are driven to self-handicap or self-sabotage can use inaction to save face (for example, failing an exam on account of not studying does not carry the same judgment on intellectual fitness as failing because of inability to understand the material); that some people have no interest in seeing the task done (particularly when feeling they have been set up for failure by being assigned a task above their ability, or feel they will be criticized no matter the outcome and irrespective of their effort); that some people are simply lazy, plain and simple, and prefer watching TV, playing video games, texting, or whatever else provides instant gratification, distraction, or teenage-like rebellion; and much more.37 What leads to procrastination is a most interesting and wide-ranging topic worthy of more exploration.
Unfortunately, doing so is apparently an overwhelming task, likely to be criticized, with the risk of focusing too much or not enough on it, and might be best to do in a different context. At the same time, not expanding on this subtopic here may not be so critical, as it will probably take care of itself in the subsequent chapters—and there are many other important and more enjoyable things to do, so . . . mañana.
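As a brief aside before moving on: the hyperbolic discounting mentioned above is commonly written in a one-parameter form in the psychology literature, where the perceived value of a reward shrinks with delay as V = A / (1 + kD). The tiny sketch below plugs in an illustrative (made-up) value of k to show why a modest reward available now can beat a much larger one promised a year away.

# A minimal sketch of hyperbolic discounting, V = A / (1 + k * D), with an
# illustrative discounting constant k; actual values vary widely across people.
def discounted_value(amount, delay_days, k=0.05):
    """Perceived present value of `amount` received after `delay_days`."""
    return amount / (1 + k * delay_days)

# A $100 reward a year away feels worth only a few dollars today...
print(round(discounted_value(100, 365), 2))   # about 5.19
# ...so a smaller-but-immediate reward easily wins.
print(round(discounted_value(20, 0), 2))      # 20.0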
As a result of all the above wonderful dilemmas and conflicts at play in the cerebral cortex, things do not often get done to mitigate ahead of time the conditions that can lead to a disaster. When the inevitable happens, the brain shifts into a different gear.
BUYING GOOD INTENTIONS
One of the cornerstones of Christianity is the forgiveness of sins. Although the original “your sins are forgiven” statement in the gospel did not split hairs, the church spent much of its existence trying to determine which sins can easily be forgiven and which cannot38—including in that latter category the life-threatening offenses of blasphemy, worshiping false gods, and homosexuality. How the clergy kept busy sorting all of the things that humans do into venial sins, mortal sins, and all manner of categories is a whole story in itself.39
The important point is that those at the top of the hierarchy realized that some people, when becoming aware of their own wrongdoing—no matter how one defines wrong—feel remorse and need closure. Be it the soul or the brain, something in human nature calls for redemption of some sort to restore peace and return to equilibrium. Depending on whether the feeling of guilt is a burning pain or a mere annoyance, some people seeking redemption will “go all in” and seek a new life to have a positive impact—no matter how positive is defined (as sometimes it is purely imaginary or even harmful)—while others will prefer a quick-fix solution.
Given that the consequences of wrongdoing most often cannot be undone, the expedient solution is to forgive in exchange for something else, making for a convenient transaction. It can also become a lucrative one. Any system of confession and penance between a person and his/her conscience that involves an intermediary empowered to dictate the terms of the penance is fraught with risk.
Penances that consist of repeated prayers are gentle—even when they involve hundreds of repetitions—but money talks.
Monks tallied the penitential tariff to be paid for each sin in books to be used by confessors. By the eleventh century, it was already possible for the rich to reduce their required number of days fasting during Lent by offering donations to the church.
Throw a bit of corruption into the mix and it becomes a racket. For example, Pope Leo X demanded that all bishops and cardinals actively sell “indulgences”—like bonds that the nobility could buy at various prices in remission of a smorgasbord of sins—as a way of financing Rome’s extravagant expenses under his regime. In spite of this source of extra revenue, Leo’s lavish expenditures increased faster than his income, and he left the church deeply in debt when he died.40
Likewise, a modern way to soothe the brain following disasters is by contributing dollars to relief efforts. Cynics might see this as purchasing peace of mind, unconsciously done as it may be. Nonetheless, on the positive side, these dollars are directly spent on relieving the plight of the disaster victims. On the negative side, they are rarely directed to activities that would prevent repetitions of similar disasters in the future.
Benefit concerts are one popular expression of how individuals can help with money as a proxy for helping by actions. These are typically put together by megastars who can pull in big dollars from an adult crowd that has plenty of disposable income, or from a less wealthy but larger crowd of star-struck fans. The first of these in modern times was the 1971 Concert for Bangladesh,41 put together by George Harrison and Ravi Shankar, with a last-minute surprise participation by Bob Dylan. Music icons Harrison and Dylan took over the Madison Square Garden stage after a fifteen-minute repertoire of Indian compositions led by the sitar virtuoso Shankar that would have rapidly emptied the Garden had it been played at the end of the show instead.
The objective of the Concert for Bangladesh was to raise money for what was turning into a humanitarian crisis, while world governments were not doing much other than watching and commenting from the perspective of their geopolitical interests and ideological biases.42 In November 1970, Cyclone Bhola hit East Pakistan with massive rains and a storm surge that arrived at high tide, flooding many villages and destroying crops.43 Mismanagement of the relief effort by the central government gave ammunition to a secessionist movement that escalated into a civil war that displaced more than seven million people.44 The genocide of millions was followed by a rainy season that again flooded the region, and by the creation of Bangladesh a few months later.45
All of that was triggered by a cyclone—which goes to show that natural disasters should scare politicians.
Examples of other major benefit concerts following disasters include: “The Concert for New York City” in 2001, organized by Paul McCartney with a lineup of eighteen megastars, following the September 11, 2001, attack that destroyed the World Trade Center towers;46 the “Tsunami Relief Cardiff” charity concert to raise money for the victims of the tsunami that hit countries around the Indian Ocean in 2004;47 “Shelter from the Storm: A Concert for the Gulf Coast” following the 2005 Hurricane Katrina;48 “12-12-12: The Concert for Sandy Relief” following the 2012 Hurricane Sandy;49 “Hope for Haiti Now: A Global Benefit for Earthquake Relief” following the 2010 Haiti earthquake;50 “Hand in Hand: A Benefit for Hurricane Relief” in 2017 to provide relief to victims of Hurricanes Harvey and Irma;51 and many more—above and beyond the ones unrelated to specific disasters by name but raising money for charitable organizations that generally get involved in recovery activities following disasters.
By the way, short of a full concert, some projects focused on the release of special music albums and videos, such as the “Rock Aid Armenia” project to raise money for the victims of the 1988 Armenian earthquake. The highlight of this project was a rerecording of “Smoke on the Water” with members of Deep Purple, Pink Floyd, Led Zeppelin, Queen, Yes, Black Sabbath, ELP, Iron Maiden, and Asia playing this four-chord classic together—at the time, a CD for the true collector to find, but conveniently within reach on YouTube nowadays.52 Arguably, some have criticized all the above activities on the grounds that having a big party with lots of sing-along does little but stroke the egos of the participating artists, and might be a bit too giddy when the circumstances rather call for mourning, but such events generally do bring in the cash. As in all such things, the money raised did not always reach the victims it was intended to help. Benefits from the July 1990 show/CD/DVD of “The Wall—Live in Berlin” by Roger Waters and more than a dozen guests, intended to go to the Memorial Fund for Disaster Relief, actually turned out to be losses.53 Even George Harrison met hurdles when trying to forward the concert profits to Bangladesh, in part because his manager forgot to file for tax-exempt status with the US Internal Revenue Service, but eventually, a decade later, most of the funds found their way there through the “George Harrison Fund for UNICEF,”54 which is still active forty years later.
Nonetheless, on balance, charity is always positive and all the above certainly achieved much good and— addressing the topic at hand—provided disaster relief. Yet there apparently has been no benefit concert to provide third world citizens with money specifically targeted to the demolition of brittle buildings that are sure to collapse and kill all their occupants in future earthquakes and to fund the reconstruction in their place of safer buildings. Even less so for fixing buildings and infrastructure in wealthier countries.
Therefore, after an earthquake crushes bodies in the debris of collapsed buildings, the relief comes in the form of international aid, people recover in some fashion and bury their dead, and then, in many cases, rebuild the exact same way as before, setting the stage for a repeat disaster in the future.
To be fair, focusing on Bangladesh again, UNICEF and other charity organizations invest heavily in programs toward eliminating child marriage, child labor, malnutrition, and violence, and provide social protection and education to children and women to reduce their exploitation. These are important priorities and essential steps, because only a safe and educated population can start dealing with other risks.55 With respect to disasters specifically, UNICEF works to find measures to prevent the outbreak of waterborne diseases due to unsanitary conditions in refugee camps located in flood-prone lands when the cyclone season is looming.56 As always, the problems of the present must be urgently addressed if there is to be a future to be planned.
IT IS DEFINITIVELY SOMEONE ELSE’S FAULT
What do the cities and regions of Salò, Calabria, L’Aquila, Offida, Irpinia, Sicily, Perugia, Lazio, Emilia-Romagna, Ancona, Friuli, Umbria, Campania, Basilicata, Abruzzo, Porto San Giorgio, Asti, and Molise have in common? Sure, they are all located in Italy, and visiting each one of these would make for a grand tour of the country. More relevant to the topic at hand, though, is that these are all places in Italy where strong earthquakes have struck during the twentieth century alone. One and a half dozen memorable ones occurred in only one hundred years. To make matters worse, these quaint cities and regions are packed solid with stone and masonry buildings—the type of unreinforced construction known to be most vulnerable to earthquakes.
Hence, when each of these earthquakes struck, buildings were destroyed, people died, and many more were injured.
Each time, pictures of narrow streets choked with masonry rubble filled the front pages of Italian (and international) newspapers and magazines—frequently enough that it should be impossible to find an Italian unaware of the country’s severe seismic risk (short of running into a disciple of the 1970s Pinball Wizard). And, each time, the Italians picked up the rubble and rebuilt their homes nearly the same way as before, and most of the time at, or close to, the same location. Why? Maybe because there is no place to hide from earthquakes in Italy, or maybe because masonry buildings have always been the way of life in Italy. Tradition?
What is even more amazing in Italy’s history is that some cities are true “repeat customers” when it comes to taking a seismic beating. Like the city of L’Aquila, where it is estimated that two thousand died during the 1349 earthquake, possibly another five thousand during the 1703 one, six thousand again in 1786,57 and roughly thirty thousand in 1915.58 This is why the following story is somewhat surreal.
From November 2008 to April 2009, a swarm of small tremors centered around L’Aquila made the population edgy—particularly one of magnitude 4 on March 30. So, on the following day, the Italian Civil Protection Agency convened a meeting with its Major Risks Committee in an attempt to assess the situation. However, since nobody can predict earthquakes—as should be obvious by now—the committee concluded that there was no way to know if this swarm of small earthquakes could be the prelude to a major one.
Members of the committee left town, leaving it to the deputy head of the agency to call a press conference, during which he somehow translated the committee’s opinion into words that some construed as reassuring, to the point where citizens who had been planning to temporarily leave their homes elected instead to stay.
Unfortunately for them, a magnitude 6.3 earthquake hit a week later.
Thousands of stone masonry buildings suffered damage, many collapsed, 309 people died, more than 1,500 were injured, and more than 65,000 were left homeless.59 In the aftermath of this catastrophe, something unprecedented happened: six of Italy’s top seismologists and earthquake engineers, and one civil defense official, were indicted for manslaughter for not having warned all residents of the impending disaster, and for not having told people to leave town prior to the earthquake.60 A petition signed by more than four thousand of the world’s scientists was addressed to the Italian president, to tell him in no uncertain terms that nobody can predict earthquakes, and that therefore nobody (and especially not scientists) can be held responsible for their occurrence. That petition was ignored. In October 2012, the Italian court found all the accused guilty of manslaughter in connection with their assessment of the risk prior to the earthquake.61 The verdict was not an indictment of the inability of science to predict earthquakes, but rather of the responsibility of scientists to share their expertise. It was felt that the tragedy could have been averted if not for the lack of communication by the experts—who had left town without talking to the public—and the poor communication of the risk by the civil defense official, who was reported to have said: “The scientific community tells us there is no danger, because there is an ongoing discharge of energy. The situation looks favorable.” He also allegedly told people that they should go home and have a glass of wine. Two years later, the Italian appeals court reversed the verdict for the six scientists, but not for the civil defense official, who was condemned to spend two years in jail, pending any further appeals—a far cry from the stiffer penalties sought by the angry mob gathered outside the court.62
Now, irrespective of the outcome of the court case and its multiple appeals,63 it is worth pausing for a moment to ask why Italian citizens, many of them living in stone masonry buildings all over the country, would for a moment fault someone else for the losses incurred during an earthquake—as heartbreaking as these losses may be.
In all complex human activities, a large number of intertwined factors typically contribute to any specific outcome. It is often impossible to untangle all the cause-and-effect chains that are semi-randomly woven together, to truly understand how all the events unfolded and all the players collided on the road to that specific outcome, but simple explanations—right or wrong—are always easier to understand. Human nature loves it that way, and when the simplification consists of labeling one person as being responsible for it all, that pretty much settles it—at least for a while. When the outcome is a positive one, weasels of all kinds have mastered the art of having others “recognize their genius” and make them the reason for the successful outcome. Obviously, all these skillful manipulators make sure to vanish when the outcome is negative and the finger-pointing experts start to exercise their talent to find the perfect scapegoat.
Multiple simple theories to explain a complex chain of events may compete for a while, or not, depending on circumstances. Sometimes, people can find their “Yoko Ono” right away, which greatly simplifies everything and allows for the perfect explanation—circumventing any need to dissect the problem further. As such, The Beatles did not break up because they had been playing music together for over twelve years, in grueling conditions and penniless for the first five, in grueling conditions and wealthy for the next seven. Not because they met as teenagers fixated on girls and rock’n’roll and grew up to become adults with diverging interests in other important endeavors (in addition, of course, to sex, drugs, and rock’n’roll). Not because they became entangled in creating and running the money-bleeding Apple Records enterprise. Not because they wanted full license to exercise their creativity on their own terms, even if one or all of the other band members disapproved. No. Too complex. The Beatles broke up simply because John Lennon fell in love with a Japanese avant-garde artist named Yoko Ono, presumed to be a nutcase, who enticed him away from the Beatles with more sex than drugs and rock’n’roll—as if John Lennon could not be a nutcase in his own right. The Yoko scapegoat explanation is much simpler: Sold! The “Someone Else’s Fault” syndrome at its best. It always solves the problem.
Some prominent social and philosophical anthropologists, who study and theorize on the evolution of human behaviors to explain how civilization came about, believe that the use of the scapegoating mechanism goes at least as far back as the arrival of Homo sapiens. Ever since, when tensions have arisen within or between groups, the murder of a convenient scapegoat has allowed a release of the tensions, a temporary resolution of the crisis, and thus a return to social peace—an efficient mechanism that eventually became a major cultural characteristic of all societies, and notably of their religions.64 Typically, the scapegoat is someone in the wrong place at the wrong time, who is certainly not the source of whatever caused the conflict. It helps, when laying accusations, if that person conveniently looks or acts differently from the rest of the group in crisis, or enjoys envied powers or privileges.
Through some subconscious mechanism, the community deceives itself into believing that this targeted victim is a horrible person who has transgressed a prohibition, violated a sacred rule, or failed in a duty, and therefore deserves to be killed or severely punished—depending on what the term civilization means in the time and place. With the eradication of the victim comes the illusion that the problem is solved, and the community returns to flourishing—for a while, before the cycle repeats itself.
In other words, scapegoating is a psychological defense mechanism.65 It makes it possible for individuals or groups to cope with anger or frustrations created by problems of their own making by projecting their culpability onto other individuals, unwarrantedly blaming them instead for being the root cause of the problem.66 No matter the problem—unemployment, low income, poor sex life—when the emotional energy expended becomes too painful to bear, it is unconsciously transferred to someone else. Coming back to earthquakes, now that the root cause of the problem is known—namely someone else’s failure—it makes perfect sense to return to living in unreinforced stone masonry buildings, in Italy or elsewhere. After all, many unreinforced masonry buildings have survived earthquakes in the past—smaller earthquakes maybe, but survived nonetheless—so there is no proof that the one that is home sweet home will collapse during a future earthquake.
Furthermore, some unreinforced masonry buildings are more likely to survive earthquakes than others, so one’s cozy dwelling could survive just as well—not that anyone necessarily bothers to investigate the assessed seismic resistance of a building before signing the lease or the mortgage papers. That is what insurance is for, isn’t it? Finally, earthquakes never happen here, but even if they did, there is no point in worrying—there is a far greater chance of dying in a car accident, truly.
Besides, when things do turn for the worse, it will be “Someone Else’s Fault” and the best scapegoats will not be those ants who trained on their own nickel to become scientists and engineers and toiled days and nights to predict extreme events and to build a more resilient infrastructure. Rather, they will be those grasshoppers who are addicted to living at our expense, who play and sing all day: those we elect.
ON THE DISASTER TRAIL
Who Dominates the Yellow Pages: Where the Buck Stops
A Canadian friend who visited the University at Buffalo one day elected to stay in a hotel close to campus. When I saw him again the next morning, he asked me, “Do you happen to know which profession fills the largest number of pages in Buffalo’s Yellow Pages?” The University at Buffalo north campus is suburban, unlike many other universities in denser urban surroundings where tons of businesses and restaurants hug the campus boundaries; when someone has spent time counting pages in the Yellow Pages, you know there is no life around the University at Buffalo campus. For the benefit of the younger generation that has never owned or seen a landline phone in private homes, the Yellow Pages was a massive book that “Ma Bell”67 gave every customer and that contained the phone numbers of all businesses in the area, grouped by type of business. It was so thick that ripping the phone book in half was a stunt performed by some as a show of strength. The book went the way of the dinosaurs, but some can still be seen in the Smithsonian’s museum or in black-and-white gangster movies as a useful tool to beat suspects during police interrogations.
It turns out that my friend had counted the number of pages filled by medical doctors, engineers, car repair shops, insurance brokers, pharmacies, and every other group that occupied a sizeable number of pages, but the one group that had more pages than everybody else, and by a factor of two over the nearest contender, was lawyers. To top it all off, there were also ads for law firms on the back cover, on the inside of the front cover, on the inside of the back cover, and—as no space was wasted—on the thick spine of the book itself.
When the “Someone Else’s Fault” syndrome rules the day, there is apparently money to be made.