Why can't we use fission products for electricity production?
As far as I know, the fission products in spent fuel from current nuclear power plants generate enough 'waste' heat to boil water, and that heat output decreases only slowly on the scale of a human lifetime. So why can't we design a reactor to use this energy?
Answer
Here are some "order-of-magnitude" arguments:
Quoting https://en.wikipedia.org/wiki/Decay_heat#Spent_fuel :
After one year, typical spent nuclear fuel generates about 10 kW of decay heat per tonne, decreasing to about 1 kW/t after ten years.
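To make those two data points easier to work with, here is a minimal sketch, assuming (purely for illustration) that the decay heat falls off roughly as 1/t between one and ten years, which matches the quoted figures:

```python
# Decay heat per tonne of spent fuel, interpolated between the two quoted
# data points (10 kW/t at 1 year, 1 kW/t at 10 years) with a 1/t power law.
# This is an illustrative assumption, not a substitute for real decay-heat data.

def decay_heat_kw_per_tonne(t_years: float) -> float:
    """Rough 1/t interpolation, valid only between 1 and 10 years."""
    assert 1.0 <= t_years <= 10.0, "interpolation only covers 1-10 years"
    return 10.0 / t_years  # 10 kW/t at t = 1 yr, 1 kW/t at t = 10 yr

for t in (1, 2, 5, 10):
    print(f"{t:>2} years: ~{decay_heat_kw_per_tonne(t):.1f} kW per tonne")
```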
Now, since this is heat, you can't convert it to electricity with 100% efficiency; the maximum possible efficiency is given by the Carnot efficiency $\eta$:
$$ \eta \le 1 - \dfrac{T_\mathrm{cold}}{T_\mathrm{hot}} $$
where $T_\mathrm{hot}$ would be the temperature of the spent fuel rods (in kelvin) and $T_\mathrm{cold}$ the temperature of a cold reservoir against which a generator would work. One would have to do a separate calculation to find a reasonable temperature for the fuel rods; in practice they currently seem to be kept at about 50 °C.
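To get a feel for the numbers, here is a short sketch; the 50 °C rod temperature from above and a 20 °C cold reservoir are assumptions for illustration:

```python
# Carnot limit for a heat engine running off spent-fuel decay heat.
# Temperatures are assumptions: rods kept near 50 °C, environment near 20 °C.

def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    """Upper bound on the fraction of heat convertible to work."""
    return 1.0 - t_cold_k / t_hot_k

t_hot = 50.0 + 273.15   # spent fuel rods at ~50 °C, in kelvin
t_cold = 20.0 + 273.15  # cold reservoir at ~20 °C, in kelvin

print(f"Carnot limit: {carnot_efficiency(t_hot, t_cold):.1%}")
# -> roughly 9%, before any real-world losses
```

So even before engineering losses, less than a tenth of the (already tiny) decay heat could be turned into electricity at these temperatures.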
With 'primary' fuel, a burnup of typically 55 gigawatt-days per tonne can be achieved, i.e. a 1-gigawatt power plant would use 365.25 / 55 ≈ 6.6 tonnes per year.
Even assuming you could convert the decay heat to electricity with 100% efficiency, and taking a crude average of 5 kilowatts per tonne over 10 years, this would yield about 18,000 kilowatt-days, or 0.018 gigawatt-days, per tonne: about 0.03% of the primary energy production.
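The same estimate as a sketch, so the unit conversions are easy to check (the flat 5 kW/t average is the crude assumption just made above):

```python
# Order-of-magnitude comparison: decay-heat energy vs. primary burnup,
# both per tonne of fuel. The flat 5 kW/t average over 10 years is the
# crude assumption made in the text above.

avg_decay_heat_kw_per_t = 5.0  # assumed average over years 0-10
days = 10 * 365.25             # ten years in days

decay_energy_gw_days = avg_decay_heat_kw_per_t * days / 1e6  # kW·d -> GW·d
primary_gw_days = 55.0         # typical burnup, GW·days per tonne

print(f"decay heat: {decay_energy_gw_days:.3f} GW·days per tonne")
print(f"primary:    {primary_gw_days:.0f} GW·days per tonne")
print(f"ratio:      {decay_energy_gw_days / primary_gw_days:.2%}")
# -> about 0.018 GW·days per tonne, i.e. roughly 0.03% of the primary output
```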
You'll also see from the Carnot efficiency above that higher temperatures allow a higher efficiency; so if one can spend some energy to extract the still-fissionable material from the spent fuel and use it in a reactor again, that is likely to be more efficient in terms of electricity generation.
It's true, on the other hand, that radioisotope thermoelectric generators (RTGs: radioactive sources combined with thermocouples) have been used on satellite missions.
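As a rough comparison, here is a sketch of an RTG's electrical output; the thermal power and thermocouple efficiency below are ballpark assumptions for a Pu-238 based unit, not the specs of any particular mission:

```python
# Rough electrical output of a radioisotope thermoelectric generator (RTG).
# Numbers are ballpark assumptions for a Pu-238 based unit: a few kW of
# thermal power and single-digit-percent thermocouple efficiency.

thermal_power_w = 4000.0        # assumed decay heat of the source, watts
thermocouple_efficiency = 0.06  # assumed thermocouple conversion efficiency

electrical_power_w = thermal_power_w * thermocouple_efficiency
print(f"~{electrical_power_w:.0f} W electrical")  # a few hundred watts
```

A few hundred watts is plenty for a spacecraft, but it underlines the same point: decay heat is nowhere near grid scale.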