A Dangerous World of Failed Imaginations: Fukushima's Triple Disaster

Namie-Machi, Futaba-Gun, Japan. November 8, 2014. Courtesy of Masa Iwatsuki


Knowledge is power, but not in the way you think.

Since the first industrial revolution, technology has enabled us to live longer, healthier, and more fulfilling lives. We have reshaped our economic system, setting the stage for a world of automation, and adopted technologies that allow capital to be substituted for labor. We have mechanized agriculture, automated machinery in large factories, replaced production workers with industrial robots, and even substituted specialized software and artificial intelligence for white-collar jobs. In the span of roughly a century, we have turned whole categories of work into the click of a button.

Technologies today have evolved by promising value, both economic and social. Artificial intelligence is expected to generate between $3 trillion and $5 trillion in value across some 20 industries, and blockchain has shown promise in transforming humanitarian relief.

Humankind is slowly realizing that these same technologies bring undesirable side effects into our daily lives. They are already eroding social cohesion, increasing inequality, and transforming everything from global politics to personal identity. Even distinguished scholars such as Professor Stephen Hawking expressed concern about the risks of advanced technologies, rating the chance of a technologically induced global catastrophe in the next century at greater than 10%.

The problem is that no one fully foresaw or intended these consequences. Recent disputes over data collection on social media show how technologies can exploit our rights and vulnerabilities, becoming dangerous tools that harm us in ways their creators never intended.

The role of technology, and its relationship to ethics and values, needs to be critically readdressed, rethought, and redesigned in a way that supports responsible long-term development.

Namie-Machi, Futaba-Gun, Japan. November 8, 2014. Courtesy of Masa Iwatsuki


The Frankenstein Syndrome

I suspect most readers are familiar with Mary Shelley’s novel Frankenstein. The focal point, however, is not the infamous patched-up monster we are all so well acquainted with, but the man behind its creation, Dr. Victor Frankenstein. In the story, a scientist fuelled by knowledge and curiosity assembles a creature from the limbs of the dead and brings it to life, and the creature subsequently wreaks havoc on humanity. That, however, is the Hollywood version of the story, and like many stories, Hollywood completely misses the point...

The novel was not penned to convey the atrocities of a mindless monster, but to magnify the repercussions of one man single-handedly pushing the boundaries of science in pursuit of esoteric knowledge and newfound progress for mankind. Dr. Frankenstein’s creation evidently shows signs of intelligence, teaching itself to speak eloquently. But because of its grotesque appearance, the world shuns it, and the creature vows to torment its creator.

The moral of the story is that humanity’s drive to push the boundaries of knowledge can spiral out of control. The Frankenstein syndrome strikes a moral chord, giving voice to our doubts and fears about science and technology.

Bernard Rollin puts it simply:

“When and if the myth of knowledge becomes reified or transformed into or equated with reality, it conceals nuances, shades and subtleties of what it represents...The myth is either accepted as literal truth or categorically rejected as nonsense, with little thought for the possibilities in between, where the truth surely lies.”  

As a generation that thrives on research and development, we have taught ourselves to sort our creations and discoveries into two categories - black or white, success or failure. The grey area in between is rarely examined: the implications, consequences, and repercussions of our incredible breakthroughs.

Two hundred years later, Mary Shelley’s Frankenstein still provokes thought, warning us of the perils of tampering with science and nature and of ambition turned into obsession. Today, humanity possesses the ability to reshape industries and the global environment - artificial intelligence, genetically modified organisms, and technologies of destructive capability are continuously being researched in startups, large corporations, and academia.

Perhaps there are certain things humans are not meant to do.

The Failure of Imagination

In 1962, the British science writer and futurist Arthur C. Clarke examined the future of technological advancement in a book called Profiles of the Future. An early chapter, “Hazards of Prophecy”, introduces the reader to two traps of prediction: “The Failure of Nerve” and “The Failure of Imagination”.

“The Failure of Nerve”, as Clarke explains it, is the incapacity to accept a conclusion even when given all the relevant facts - the prediction seems so bizarre that the event is dismissed as unbelievable. Experts have repeatedly assumed that certain things could not be done, could not be invented, and could not be physically possible, only to be “surprised” again and again over the years.

Electric lighting, for example, was dismissed by many experts of the day as impossible. This refusal to face facts has been exposed time and again, with inventors, politicians, and scientists first derided as too visionary and imaginative. Clarke draws from “The Failure of Nerve” the lesson that anything theoretically possible will eventually be achieved in practice, no matter the technical difficulties, if it is desired greatly enough.

“The Failure of Imagination” arises when all the known facts are appreciated correctly, but vital facts remain undiscovered and the possibility of their existence is never admitted. Lord Rutherford, the physicist famous for his pioneering work in nuclear physics, consistently mocked those who believed that the energy locked up in matter could one day be harnessed. Only five years after his death in 1937, the first self-sustaining nuclear chain reaction started in Chicago. Rutherford had failed to imagine that a nuclear reaction could release more energy than is required to start it.

Much as with chemical combustion, scientists worked tirelessly to harness the fission of uranium, ultimately with success. Rutherford serves as an example of how even an expert in a field may be unable to offer a reliable vision of its future; his certainty blocked what fuels the imagination.

To embrace the unexpected is to keep an open and unprejudiced mind - something more easily said than done. The failure of imagination occurs when the known facts are in hand but other truths remain unknown; the possibility of the unknown is rarely admitted, and a large price is often paid when it is not confronted. The fallacies Arthur C. Clarke described in 1962 align all too well with what happened at the Fukushima Daiichi nuclear power plant: disregarded regulations (the known facts) and unanticipated external events (the unknown).


On March 11, 2011, at around 2:46 PM Japan Standard Time, a magnitude 9.0 earthquake struck off the northeast coast of Japan.

Unfortunately, the epicenter was not far from the Fukushima Daiichi power plant in Okuma, which housed three operating and three offline reactors. When the earthquake struck, operators immediately shut down the operating reactors to protect their cores.

Less than an hour later, residents along the Fukushima coast saw waves forming in the distance. A violent tsunami was speeding toward the coastline of Japan. Within minutes, thousands of homes and businesses were underwater or destroyed by the waves.

The events that unfolded at the Fukushima plant were no better. The tsunami ravaged the plant’s diesel generators, cutting its power completely, and the batteries running the steam-driven emergency pumps were soon exhausted. Even after shutdown, a reactor’s fuel continues to generate decay heat, and without cooling the three recently operating reactors overheated: the fuel’s protective cladding oxidized, producing hydrogen gas, and the radioactive cores melted. Hydrogen explosions then tore through the buildings housing units 1, 3, and 4. At the same time, spent fuel stored in cooling pools lost water, and radioactive material was eventually released into the environment.

Eight years later, Japan continues to struggle to recover from the devastation the natural disasters caused. The accident stands as the most substantial release of radioactivity since the Chernobyl disaster of 1986, placing it alongside Chernobyl and Three Mile Island (1979) among the worst nuclear accidents in history. Since the triple disaster, assessing and decommissioning the damaged reactors at Fukushima Daiichi has been an overwhelming task, demanding resources, innovation, commitment, and responsibility from government and corporate institutions alike.

The plant’s owner, the Tokyo Electric Power Company (TEPCO), and Japan’s nuclear regulator, the Nuclear and Industrial Safety Agency (NISA), were responsible for employing best practices to ensure the plant’s safety and resilience against large-scale tsunamis and earthquakes. Had international safety practices and preventative measures been applied, the plant could likely have withstood the effects of the natural disasters.

While finger-pointing may seem uncharitable, it is important to understand how the narrative of nuclear power shifted. The safety of nuclear power became an international controversy after the events of March 2011: American commentators spoke of abandoning the “nuclear renaissance”, Germany moved to phase out nuclear power, French politicians began to rethink their commitment to nuclear energy, and even China suspended approvals of new nuclear plants. Japan’s Prime Minister at the time, Naoto Kan, announced that he would have to rethink the future of Japan’s energy sector and reevaluate renewable energy.

The problem, however, lies with TEPCO and the International Atomic Energy Agency (IAEA). In 2008, the IAEA issued a controversial warning that Japan’s nuclear power plants were vulnerable to severe earthquakes. In short, both the IAEA and TEPCO were fully aware of the vulnerabilities and risks surrounding nuclear power, yet nothing was done.

The regulatory measures used at TEPCO and NISA were below international standards in three aspects:

  • Little to no attention was given to the tsunamis that had repeatedly struck the Japanese coast in the past.

  • Tsunami risk in the area was severely underestimated, with computer models and simulations that were inadequate and unrealistic about imminent threats.

  • Reviews by NISA and TEPCO were lacking and failed to drive the development of realistic simulation methods.

Although TEPCO was fully aware of these risks, it neglected the procedures that would have moved emergency power supplies to higher ground. There is no clear explanation for why TEPCO and NISA failed to follow international standards of nuclear safety. What is clear is that attention was diverted from imminent, large-scale events to isolated, low-risk threats, with speculation focused on seismic activity in the surrounding region rather than other hazards. Nuclear professionals failed to draw on local expert knowledge and often evaded advice from experts outside the field. Another cause was simply that such an event was considered virtually impossible.

Examined as an enviro-technical system failure, calling Fukushima an “accident” understates the harrowing accumulation of problems that produced it.

There is growing evidence that the plant’s design focused more on the reactors’ functionality and internal safety than on protection from external hazards such as the natural disasters that have struck Japan periodically over the years. The Fukushima disaster was entirely preventable, and terming it an “act of God” is not only false but utterly irresponsible.

Seyma Kokash