The United States faces an energy crisis. America’s reliance on fossil fuels causes irreparable environmental degradation that will only grow in magnitude the longer we depend on them. However, we cannot yet fully commit to renewable energy sources such as solar and wind power because of their inherent intermittency. Once technology is developed to cheaply and efficiently harvest and store renewable energy, renewables can, should, and will become the main energy source on which the world runs. Until then, expanding the nuclear power sector is the best solution to this problem.

After nuclear fission was discovered in the late 1930s, the first self-sustaining nuclear reactor, named Chicago Pile-1, was developed at the University of Chicago in 1942. This technology was almost immediately weaponized by the United States via the Manhattan Project to create the nuclear bomb. However, additional research was conducted on the prospects of nuclear fission as an energy source. Eventually, the first reactor to generate electricity from nuclear fission operated successfully in 1951, and the first commercial nuclear power plant opened in 1957.

Over the following decades, nuclear power gained a steadily increasing share of the United States electricity market. By 1991, the United States was a global leader in nuclear energy: nuclear power plants provided 22% of American electricity, a high percentage compared to most of the rest of the world.

In the years since, however, hardly any new nuclear power plants have been built in the United States, and public perception of nuclear power has shifted dramatically, largely due to mismanaged nuclear accidents. The largest nuclear disaster in US history occurred when a reactor at Three Mile Island in Pennsylvania partially melted down in 1979. Seven years later, the deadliest and most destructive nuclear disaster in history took place at Chernobyl. More recently, in 2011, a magnitude-9.1 earthquake off the coast of Japan and the tsunami it triggered caused meltdowns and the release of radioactive material at the Fukushima Daiichi plant. These incidents have been scrutinized to a degree disproportionate to their actual human, financial, and environmental toll, and the prospect of a nuclear-dominated future has consequently become undesirable to many.

Furthermore, cheaper access to fossil fuels such as natural gas and an absence of political will to maintain existing nuclear power plants have led to several American nuclear power plants shutting down in the past decade, with several more set to close in the coming years. California, for example, previously had two operating nuclear power plants but shut down San Onofre in 2012. The sole remaining nuclear power plant in California, at Diablo Canyon, was set to shut down in 2025, but the state legislature recently extended its lifespan through 2030 because of its importance in providing clean energy to Californians.

The United States has fallen far behind other nations in the pursuit of nuclear power. Nuclear power supplies around 20% of electricity in the United States, compared with around 70% in France, and China is currently leading the charge in building new nuclear reactors. The US could increase its nuclear output, but an abundance of misinformation and a lack of political prioritization surrounding nuclear energy have left the country far behind.

Nuclear energy is comparable to wind and solar in terms of both greenhouse gas emissions and safety, and considerably better than all types of fossil fuels. Coal, oil, and natural gas emit 820, 720, and 490 tons of carbon dioxide per gigawatt-hour of electricity generated, respectively, while wind, solar, and nuclear energy emit 4, 5, and 3 tons. The death rates from accidents and air pollution per terawatt-hour for coal, oil, and natural gas are 24.6, 18.4, and 2.8, respectively, whereas for wind, solar, and nuclear energy the values are 0.04, 0.02, and 0.03. These data show that even when controlling for electricity output, nuclear and renewable energy sources are hundreds of times better for the environment and hundreds to over a thousand times better for the health of workers in those industries and surrounding inhabitants. 
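The scale of these differences can be checked directly from the figures just cited. A quick sketch (the dictionaries below simply restate the per-gigawatt-hour and per-terawatt-hour values from the paragraph above):

```python
# Lifecycle emissions (tons of CO2 per GWh) and death rates from
# accidents and air pollution (deaths per TWh), as cited above.
emissions = {"coal": 820, "oil": 720, "gas": 490, "wind": 4, "solar": 5, "nuclear": 3}
deaths = {"coal": 24.6, "oil": 18.4, "gas": 2.8, "wind": 0.04, "solar": 0.02, "nuclear": 0.03}

for fuel in ("coal", "oil", "gas"):
    co2_ratio = emissions[fuel] / emissions["nuclear"]
    death_ratio = deaths[fuel] / deaths["nuclear"]
    print(f"{fuel}: {co2_ratio:.0f}x the CO2 and {death_ratio:.0f}x the deaths of nuclear")
```

Coal, for instance, works out to roughly 270 times the emissions and roughly 820 times the death rate of nuclear per unit of electricity, which is where the "hundreds of times" comparison comes from.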

The statistics regarding safety do not even account for the long-term indirect health effects of climate change caused by greenhouse gas emissions. Climate change will bring rising temperatures, more extreme weather events, droughts, rising sea levels, and more, which in turn drive famine, displacement, and global instability. Many more people could die in the future because of emissions that occur today, which is why it is so urgent that the world move away from fossil fuels and toward clean sources of energy.

If the United States made a concerted effort to grow its nuclear sector, there would be significant short-term economic costs and long-term benefits. The transition to an electricity-dominated nation powered in large part by nuclear energy would be very expensive.

First, the US does not currently host the top nuclear experts; much of that expertise resides in countries that prioritize nuclear energy, such as France and China. Significant investment in nuclear research at universities and innovative companies is needed. The US would need to subsidize this R&D, because companies do not currently view spending on nuclear power innovation as profitable while nuclear energy appears to be on its way out. America would also need to become an attractive market for experienced nuclear scientists.

Keeping existing nuclear power plants from shutting down also requires additional government investment. Low natural gas prices have left nuclear plants, already struggling under an unjustly bad reputation, unable to compete on energy prices. This process has already begun: President Biden allocated $6 billion from the trillion-dollar infrastructure bill approved in November 2021 to bail out owners of nuclear reactors that would otherwise soon be forced to shut down.

Finally, there are significant expenses involved in building new nuclear power plants. Most of a nuclear plant’s costs are capital costs rather than operating, external, or other costs. Capital costs include engineering, procurement, construction, financing, licensing, and more. Paying off these costs is the primary reason nuclear power plants currently struggle to compete with fossil fuels.

However, nuclear energy is expected to have the lowest costs of any clean energy source for the immediate future. Additionally, as nuclear power plants prove to last longer than initially expected when properly managed and operated, plants running under long-term operation become the cheapest energy source, clean or otherwise. If the United States incentivizes nuclear development and commits to long-term nuclear utilization, electricity generated by nuclear power can become the cheapest available.

The most significant economic benefit of investing in nuclear energy will not be felt by those living today, or even by anyone they will ever know. The impacts of today’s emissions decisions will have far greater economic effects generations from now than they do today. If we reach the point where climate change becomes irreversible, the catastrophic economic consequences will far exceed any losses we might incur now from additional upfront investment in nuclear energy.


Disclaimer: The views published in this journal are those of the individual authors or speakers and do not necessarily reflect the position or policy of Berkeley Economic Review staff, the Undergraduate Economics Association, the UC Berkeley Economics Department and faculty, or the University of California, Berkeley in general.
