The 3rd Global Environmental Leaders International Symposium Hiroshima University
Woody Epstein
3/25/13

Ageo-shi, Japan, March 11, 2013

I would like to begin a conversation with you about risk and society. Some of you will find my words difficult to hear. I want to present personal ideas and observations so that each of you might think deeply about electricity generated by nuclear power. I have no answers. I hope that my words will help form a basis for honest relationships with the public and policy makers in the future.

We are gathered here today in a city which certainly understands nuclear power, although in another sense than electricity production. My giri no haha, my mother-in-law, and her family come from Hiroshima; she was here on that day in August, 1945. Perhaps through the lens of that horrible event, we can begin to understand our delicate position as residents of Japan, and the decisions we must make about continuing, or not, the use of nuclear fission to create electricity.

Because of the accident at Fukushima Daiichi, we are here today speaking about nuclear energy. More than Three Mile Island, more than Chernobyl, the accident of 3.11 has focused the attention of the world on whether it is possible to use nuclear energy in an acceptably safe way.

I also want you to know that I am neither “pro” nor “anti” nuclear power. I am, however, “pro” honesty and “anti” dishonesty. Fundamental to my beliefs is this: if those of us in the “nuclear village” cannot be honest about nuclear power, then, safe or not, we cannot be involved in the hard decisions we have before us.

I am a mathematician who has worked for 30 years with technologists who analyze risk. Two goals for a risk analyst are to provide reasoned evidence to decision makers and clear, down-to-earth explanations to the public.

Let me begin by explaining what we mean by “risk”. Simply put, risk (and therefore safety) is the answer to three questions:

  1. What can go wrong?
  2. How likely is it?
  3. What are the consequences?
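
For the more technically minded, this framing can be made concrete. The sketch below, in Python, is my own illustration of how a risk analyst might record answers to the three questions as scenario “triplets”; the scenarios and numbers are illustrative placeholders, not figures from any real assessment.

```python
# A minimal sketch of the "risk triplet" behind the three questions:
# each scenario records what can go wrong, how likely it is, and what
# the consequences are. All values below are illustrative placeholders.

from dataclasses import dataclass

@dataclass
class Scenario:
    what_can_go_wrong: str      # question 1
    likelihood_per_year: float  # question 2
    consequence: str            # question 3

risk_profile = [
    Scenario("loss of offsite power", 1e-2, "plant trips safely"),
    Scenario("cooling lost after the trip", 1e-5, "core damage"),
    Scenario("containment also fails", 1e-6, "large radiation release"),
]

for s in risk_profile:
    print(f"{s.what_can_go_wrong}: {s.likelihood_per_year:g}/yr -> {s.consequence}")
```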

When doing a risk assessment for a nuclear power plant, we build elaborate, large physical and mathematical models with the aim of quantifying the likelihood that an accident might happen and what its consequences may be. The best risk assessment professionals try to build good models, try to constantly revise the models, question scientific assumptions, and try to incorporate new data and findings. We are, after all, human beings and we live on the planet earth, also.

I am sure that you realize that nuclear power generation has, and always will have, risks. Nothing is 100% safe. Kurokawa Kiyoshi, the Chairman of the National Diet’s independent commission investigating the Fukushima accident, said, “The public is not stupid. Accidents happen, machines break, and humans make errors.”

Over the years, most regulators, and the IAEA, have endorsed a safety goal which says that we must ensure, for each reactor, that the likelihood of a core damage accident is no greater than once every 10,000 years of operation, and that the likelihood of a large release of radiation is no greater than once every 100,000 years.

Let me put this in perspective. As of March 10, 2011, there were 438 commercial nuclear power generating units in the world. If each unit were operating 70% of the time, and each unit met the safety goals I have just described, then the likelihood of having a core damage accident, somewhere in the world, was about 3 times every 100 years. To put this into very human terms, you should expect a core damage accident about 2-3 times during your lifetimes.

This, ladies and gentlemen, is the likelihood if we attain our safety goals: Somewhere in the world, 2-3 times during your lifetimes, you should expect a core damage accident.

Now, core damage accidents do not mean radiation releases; they do not mean Fukushima or Chernobyl accidents. Three Mile Island had a core damage accident, but there was no significant release of radiation. If we use the same reasoning for large releases of radiation, then we should expect an accident such as Fukushima about once every 330 years.
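
For those who would like to check this arithmetic, here is a minimal sketch in Python. It uses only the figures already stated above; the calculation itself is simple multiplication.

```python
# A back-of-the-envelope check of the fleet-wide figures above, using
# only numbers stated in the talk: 438 units, 70% availability, and the
# per-reactor goals of 1/10,000 years (core damage) and 1/100,000 years
# (large release).

units = 438
availability = 0.70

reactor_years_per_calendar_year = units * availability  # ~306.6

core_damage_per_year = reactor_years_per_calendar_year / 10_000     # ~0.031
large_release_per_year = reactor_years_per_calendar_year / 100_000  # ~0.0031

print(f"Core damage, worldwide: about {core_damage_per_year * 100:.1f} per 100 years")
print(f"Large release, worldwide: about once every {1 / large_release_per_year:.0f} years")
# -> about 3.1 per 100 years, and about once every 326 years
#    (the "3 times every 100 years" and "once every 330 years" above)
```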

So here is my question to you: can you accept this? Are you willing to say that you will accept nuclear power if the likelihood of a large radiation release, somewhere in the world, is about once every 330 years? Realize that this likelihood will grow as more nuclear power plants are built; realize also that we live in a global village: an accident in China will affect lives in Okinawa.

Please remember that I am only talking about the likelihood of an accident, not the consequence that you will die or become sick from this type of accident. You are much more at risk (in both likelihood and consequence) from being in a car crash, drinking too much alcohol, eating fatty foods, or smoking cigarettes.

Nuclear power has benefits: a smaller carbon footprint, reduced reliance on oil, an improved balance of trade, less pollution from burning fossil fuels. We can even build econometric models to help us achieve a sustainable energy balance.

Somehow these benefits have little impact on our thinking because we are talking about nuclear power. Even if no one dies or becomes sick from the radiation from the Fukushima accident, the human tragedies have their own half-life; they will never go away entirely. Somehow comparing risks, or comparing risks vs. benefits, seems beside the point. We are dealing with our human perception of risk.

The first use of nuclear power was in the form of atomic bombs destroying the people and cities of Hiroshima and Nagasaki, the ground-zero narrative of our risk perception of things nuclear in Japan. Scientists, utilities, and officials must respect the public’s reaction to this terrible legacy. It is only right and proper.

Nuclear accidents are silent. They move out of control invisibly. Radiation makes bad things grow inside you. Fears such as these must be listened to and not dismissed with comparisons to other risks or authoritarian claims of safety. The myth of 100% nuclear safety was accepted by almost everyone in Japan. Thus when the myth was shattered, we were shattered.

Technological consequences to the public will not go away if we abandon nuclear power. Look at all of the gas, oil, and LNG tanks on the coasts of Japan. We have done studies on the impacts of large earthquakes, tsunamis, and typhoons on the chemical, oil, and gas industries. Believe me, there are dangerous accident scenarios with environmental, social, and economic consequences which could approach those of Fukushima. But our perception of these risks does not approach our risk perception of nuclear power. Our policy makers avoid regulations which would mandate risk assessments for these industries.

Policy makers tend to borrow ideas from economics to create models for risk perception. One key assumption is that individuals and society behave in a rational manner. Society is considered to adapt to whatever risks are present. If we could characterize risks in a society, the public’s preferences would be revealed and policy makers could make enlightened decisions.

The implication is that if we could provide people with more or better information, everyone would make more logical, rational, and informed decisions regarding risk.

But people do not behave rationally. The outstanding work of Paul Slovic, and the Nobel Prize-winning work of Daniel Kahneman and Amos Tversky, are worth a colloquium, or at least a true symposium at a local izakaya. I certainly would participate.

Rationality is one part of decisions. Emotions are just as important. Those who ignore the need for emotions, like sympathy and empathy, in decision making have missed the point of the human condition.

How can technologists work more effectively with the public and government? We need to be better listeners. We need to understand the issues important to all of the people involved. We need to be better at giving easy-to-understand explanations.

Part of the challenge is that we, in the technical community, have disengaged from public discourse to a large extent. A hundred years ago, the latest scientific theories and discoveries – including such esoteric subjects as relativity and quantum mechanics – were discussed in leading newspapers in a way that an “educated person” could understand. These articles were followed with interest by the public. Professors often gave evening lectures to the community. There was a dialog with the public, and science was a subject of popular discussion.

What happened? We in the scientific community have become isolated, and, dare I say, arrogant; we have lost the art of conversation with the public. A Tweet is not a conversation. Many in the scientific community have no ability to talk straight with the people.

Soon after the Fukushima accident, I was at a town meeting in Tokai-mura. Tokai-mura experienced a nuclear accident, ironically, on March 11th, 1997; and another accident, more serious, on September 30th, 1999.

During the meeting, a very worried mother asked one of the sensei leading the meeting about the amount of radiation which could harm her children. And what did the learned sensei say? He told her to look it up on the internet.

This sensei did not listen to the mother’s heart. He did not understand what Kurokawa-san so directly said, “People are not stupid.”

It is far too easy for technologists to ignore the essential role of the study of risk perception and its influence on effective risk communication. Often the decision makers we support are not technical professionals or they must answer directly to the public in elections. How are we to understand the framework within which the evidence we develop is understood and applied?

Step one, as I have said, is learning to listen. As every married person quickly learns, communication begins with listening, not talking. When you begin to listen, you begin to build a bridge.

The second step is for everyone to ask each other hard questions. Like Socrates in the marketplace, asking questions is the road to knowledge; remember that the definition of risk seeks answers to three questions. Technologists must bring the spirit of questioning to the decision makers we support and, in the case of technologies with the potential to harm the public, to the people as well.

Step three is to realize that science is never exact or final. Science changes over the years, sometimes with unanticipated discoveries. All scientific understanding is made through the veils of uncertainty and ignorance. We must help you understand the uncertainty, how to “expect the unexpected”, and how to create resilient institutions which can flexibly respond to an accident when (not if) it happens.

After Fukushima, many of my Japanese friends asked me, “Woody, when is safe, safe enough?” I answered them, “Safe is safe enough when you say it is safe enough. How do you want to live? What risks are you willing to accept to live the way you want to live? Do you like the lights of the Ginza? Will you give up dependence on foreign oil? Do you want to enjoy the simple pleasures of the countryside?” We are caught in a Faustian bargain: What kind of a deal are you willing to make with technology to have the life you want?

There are no magic answers. There are only difficult choices, some of which have objective measures, and some of which have to be made with the heart. What is important is that we all take responsibility for our decisions, especially if the decisions go wrong as they did at Fukushima Daiichi. When you’re wrong, be the first to admit it, even if you’re the last one to know.

Risk is a perception, a feeling, and we as technologists must be honest with the public and the policy makers by saying truthfully what we know and what we do not. All of us must be in this together, in a continual conversation, trying to do the right thing.

Acknowledgements

I would like to thank Dr. David Johnson (ABS Consulting) and Dr. Bob Geller (Tokyo University) for our continual conversations and the joy of listening.
