cwatkin
Why can’t decay heat be harnessed and used as an energy source to safely power down/cool a nuclear reactor?
I have been wondering about this since the reactor incidents in Japan, as it appears a tremendous amount of energy must still be dissipated after the shutdown of a nuclear reactor in order to prevent damage to the reactor and a potential disaster. The Fukushima Daiichi reactors appear to have survived the fifth-largest recorded earthquake on Earth remarkably well and initiated normal shutdown procedures. It was the tsunami that later damaged the backup power system, which relied on diesel generators for cooling, resulting in a cascade of failures and a core meltdown in three of the reactors. I feel that nuclear energy is a clean source of power that can help solve our dependence on imported fossil fuels and promote national security, all while producing virtually no CO2 emissions. On the other hand, plants should be designed to withstand extreme events, even ones of very low probability. In the Japan case, ancient stone markers warned of tsunami risk at levels well above the Fukushima backup generators, and the plant designers/operators disregarded warnings left 500-1000 years ago. I know that facilities such as this are built for an operational life of 40-75 years, but the consequences of something going wrong at such a facility are simply too large to design for on an operational-life scale. Had a few minor circumstances unfolded differently, we could have been looking at the evacuation and resettlement of Tokyo for 100-300 years due to radioactive contamination.
Most large engineering/industrial disasters include "the human factor" as a key component of how they unfold. These include the Chernobyl disaster, the Deepwater Horizon oil spill, the Exxon Valdez, the Challenger disaster, the sinking of the Titanic, and many others. Had a few simple decisions been made differently, none of these would have happened. Unfortunately, it is very rare that the human factor is recognized as a main factor in preventing a major catastrophe. Perhaps the best-known recent example is US Airways Flight 1549, the "Miracle on the Hudson" incident. An experienced airline pilot made some fateful split-second decisions after realizing that all his engines had been disabled by a collision with a flock of geese. It was miracle enough that he safely ditched on the Hudson and no passengers died. The incident also occurred in one of the most populated areas on the planet, and it could easily have resulted in hundreds or thousands of deaths on the ground had the plane come down in a populated part of New York or New Jersey instead of being ditched successfully on the Hudson. The human factor likely also played a major role in preventing the Fukushima incident from being far worse.
As an engineer and a scientist, I dislike getting information on important topics through normal news outlets, which like to sensationalize and oversimplify stories. I understand that I am not a nuclear engineer, so maybe this is a dumb question, but I have dealt with many disasters, including Katrina, and know that an extended failure of the power grid could result in the loss of backup cooling as diesel fuel runs low. I understand that backup cooling systems at U.S. nuclear reactors are required to be able to operate for 72 hours. It seems something more robust and redundant should be used, as an earthquake or solar flare could disable enough of the power grid to prevent the safe shutdown of many nuclear reactors.
It is my understanding that a typical reactor will produce 5-7% of its rated output as decay heat from the radioactive decay of fission byproducts immediately after shutting down, decaying to around 1% within a day or so. I understand that the amount of heat generated depends on how long the fuel has been in use and undergoing fission, so older fuel will have a larger and longer-lasting decay heat concern. I understand the heat generation drops quite rapidly as the short-lived isotopes decay, but longer-lived isotopes continue to decay and generate a significant amount of heat, so cooling is required for a very long time (5-10 years) after the spent fuel is removed from service.
I looked up the operational ratings of several nuclear power plants in the U.S., and most tend to range between 1000-1200 megawatts (MW) of electrical output, which is quite a large number; the thermal output of the reactor itself is roughly three times that. When one of these shuts down, decay heat should be generated at a rate of around 50 MW (or more) immediately after shutdown, based on 5-7% of operational output, and figured against the thermal rating it would be larger still. 50 MW is an immense amount of power (enough to power about 50,000 average U.S. homes), and I would think it would well exceed the rated output of even the largest diesel generators (or banks of them) used to provide backup power to cool a shut-down nuclear reactor.
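As a rough sanity check on these magnitudes, a standard back-of-envelope model is the Way-Wigner approximation, which gives decay heat as a fraction of pre-shutdown thermal power. The sketch below assumes a plant of about 3300 MW thermal (roughly what a 1100 MW-electric unit produces) and one year of prior full-power operation; both figures are illustrative assumptions, not data for any particular plant.

```python
# Way-Wigner approximation for decay heat after shutdown:
#   P(t)/P0 ~= 0.0622 * (t**-0.2 - (t + T)**-0.2)
# where t = seconds since shutdown, T = seconds of prior
# full-power operation, and P0 = thermal power before shutdown.
# It underestimates the first seconds somewhat, but it captures
# the rapid early drop followed by the long slow tail.

def decay_heat_fraction(t_s: float, operated_s: float) -> float:
    """Fraction of pre-shutdown thermal power released as decay heat."""
    return 0.0622 * (t_s ** -0.2 - (t_s + operated_s) ** -0.2)

P0_MW_THERMAL = 3300.0          # assumed thermal rating (~1100 MW electric)
ONE_YEAR_S = 365 * 24 * 3600.0  # assumed prior full-power operating time

for label, t in [("10 s", 10.0), ("1 hour", 3600.0),
                 ("1 day", 86400.0), ("1 month", 30 * 86400.0)]:
    f = decay_heat_fraction(t, ONE_YEAR_S)
    print(f"{label:>7}: {100 * f:5.2f}% of full power = {f * P0_MW_THERMAL:6.1f} MW")
```

Even this simple model shows the shape of the problem: tens of megawatts in the first hours, dropping fast, but still megawatts of heat days and weeks later.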
My question is: why can't this tremendous amount of energy be harnessed and used to generate power to safely shut down and cool a nuclear reactor? There is clearly plenty of heat to cause a complete core meltdown and/or fire long after the primary fission reaction is shut down. Why can't this heat be used to generate power, whether electrical or mechanical, to run the pumps and other equipment needed to cool the reactor during shutdown? Why couldn't one of the steam turbines be run to generate power for the pumps? If the main turbines are too large to run on such a reduced output, could a smaller turbine be kept for backup purposes? How about driving the pumps directly and mechanically, with no electrical generation, via a turbine meant just for this purpose? I like to keep things simple, as there is less to go wrong, so a purely mechanical pump might be in order. How about a thermocouple system? I know that radioactive decay heat is used in this manner to power space probes such as Voyager 1 and 2, among others, but I don't know how it would work at such a large scale. Even if decay heat cannot produce enough power on its own, can it not provide some power and reduce dependence on batteries or diesel? If nothing else, it could reduce the rate at which batteries and/or diesel are depleted and buy time to solve the underlying problem.
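On the thermocouple idea specifically, a crude estimate is easy to sketch. RTGs like those on the Voyager probes convert heat to electricity at only about 5-7% efficiency, so applying that figure to the decay heat discussed above (every number here is an assumption for illustration, not plant data):

```python
# Hypothetical: tap decay heat with RTG-style thermoelectric
# converters. Thermocouples are inefficient (~5-7% heat-to-electric
# on Voyager-class RTGs), but the source heat is large.

decay_heat_mw = 50.0        # assumed decay heat shortly after shutdown
thermoelectric_eff = 0.06   # assumed conversion efficiency (~5-7%)

electric_mw = decay_heat_mw * thermoelectric_eff
print(f"~{electric_mw:.1f} MW electric from thermocouples")

# Even this crude figure is on the same order as a large emergency
# diesel generator (a few MW), which is why "buy time for the
# batteries and diesels" seems at least dimensionally plausible.
```

This says nothing about the engineering practicality of wrapping a core in thermoelectric material; it only suggests the energy budget is not obviously hopeless.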
As decay heat drops over time, the potential power generated from it also drops, but so would the cooling requirements. The pumps could not be run at their maximum rating, but is that a bad thing after most of the short-lived isotopes have decayed? I am not an expert, so maybe decay heat can remain dangerous even once it isn't enough to generate a meaningful amount of power. Is it like my electric stove? Sometimes I turn it off right before the food is done and let it finish cooking on residual heat. Eventually it cools to where it can no longer cook but would still be dangerous to touch. I know this is very simple, but is it a good comparison?
If decay heat cannot effectively be used to shut down a nuclear reactor, why can't the reactor be reduced to a "fail-safe" or "idle" mode where it generates just enough power to run the emergency cooling systems? It could be run this way indefinitely, letting some of the short-lived isotopes generated during full-power operation decay over a period of time before reducing power further, or shutting down completely once enough short-lived isotopes have decayed. Why is this not done in emergency shutdowns where the reactor infrastructure is left intact but external power is lost?
All it takes is one unforeseen disaster to knock out external power at a nuclear plant, and it seems this might be a solution, or at least part of a solution, to the decay heat issue. I have been reading about solar flares and their ability to fry the large electrical transformers that are key to large parts of the power grid. I understand that we are entering a very active solar cycle, and there is some concern that one of these flares could knock out a large part of the grid for an extended period; it might take months or longer to restore some parts of the grid due to the destroyed transformers. What would happen to a nuclear plant in such a situation, where external power is lost for a very extended period of time?
I have never heard this question asked, and extensive searching online on the subject turned up nothing. Engineers and scientists realized that the incredible amount of power contained within the atom could be used for peaceful and beneficial purposes even before the first atomic bomb was dropped. It seems the decay heat generated after a nuclear reactor is shut down could likewise be harnessed and used to cool the reactor, instead of being viewed only as a destructive form of energy that must be released in order to prevent a disaster.
Thank you.