Right now I am reading the novel The Last Heroes by W.E.B. Griffin. It's about the beginnings of the O.S.S. in World War II. There are a couple of reasons I picked up this book: first, I used to know a man who was in the O.S.S. during the war; second, the thrust of the novel is obtaining uranium for atomic energy and weapons. Energy for the Navy, because fuel is the main obstacle for naval forces. Weaponry to defeat our enemies.
This is not meant to be a book review, but some thoughts on nuclear energy. In part, this is prompted by an article this week in the Christian Science Monitor on fusion reactors.
The article discusses the current state of research and what the near future looks like, and it looks bright.
One of the biggest challenges in harnessing nuclear fusion is containing and controlling an environment where temperatures reach 100 million degrees Celsius. That's hot! However, with the tremendous advances that have taken place in technology, computers in particular, it has become possible to model what such an environment would look like and to design equipment to control it.
For years, researchers worried that at the energy levels ITER was aiming for, the plasma would fail to remain stable or that the magnetic fields would fail to keep the plasma bottled up.
But since the mid-'90s, technological advances have yielded fresh insights into the way such reactors can operate. They include improved test equipment, new ways to tweak the reactions from outside the reactor vessel, and more-powerful computers that model the conditions in the reactors. "Now we know what we're looking at," Goldston says.
For example, when the plasma grows turbulent, it forms eddies and the plasma cools. Researchers had a difficult time figuring out what determined the size of the eddies and how to control them. With the added computational horsepower and the new instruments, they determined the factors that controlled their size. Just as important, they found that they could apply more push to the flowing plasma than the system would generate on its own, shearing off the eddies almost before they got started.
This, to me, is exciting stuff, as nuclear fusion may soon be moving from the realm of Science Fiction to reality. A limitless supply of clean electricity, enough to supply all the world's needs until the end of time.
This article also prompted me to look back at the current state of nuclear energy, the fission of uranium atoms.
Fission has remained unpopular in the United States for decades, and I have never understood the hysteria against nuclear energy. True, there have been a few accidents involving nuclear reactors, the most notable being the Chernobyl disaster in 1986. That particular accident was due to a very bad and risky design, which resulted in the reactor entering a runaway reaction that quite literally blew the lid off the place.
However, it seems to me that one of the main drivers is a fear of nuclear proliferation, which resulted, during the Carter Administration, in the U.S. abandoning the reprocessing of spent fuel rods. You see, it turns out that the by-products of reprocessing fuel rods, plutonium as well as residual uranium, can be used in the development of nuclear weapons. This was the driving reason for shutting down fuel reprocessing in the United States.
There is an additional side effect of abandoning the reprocessing of spent fuel rods: higher operating costs. Our nuclear power plants become more expensive to run because they must store ever more radioactive material on site, instead of transferring it to a facility where it can be safely reprocessed.
It is interesting to note that there are several European countries, including France, that have fuel reprocessing plants.
France generates most of its electricity with nuclear reactors, and Germany receives nearly a third of its electricity from nuclear power plants. Not to mention the United States Navy: most of its ships and submarines are nuclear powered.
Should the U.S. resume building nuclear power plants? Look at the increase in the cost of natural gas for heating your home (I have geothermal, so no gas here) and for cooking (for those with gas ranges). Prices have gone up, mainly because power plants have been switching from coal to natural gas to generate electricity. As the demand for natural gas escalates, especially during the winter months, so does its price.
This is a very good reason to revitalize the nuclear energy industry. Modern designs are far safer than those from the '50s and '60s. By building more plants, we can reduce, and eventually eliminate, the need for coal-burning plants, thus eliminating the "green-house" gases the environmentalists are always up in arms about (and the driver for the move from coal to natural gas). We would also eliminate the need for natural gas in power generation, causing a drop in the price of gas for winter heating.
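To see why nuclear can displace coal so readily, a rough back-of-envelope comparison helps. The figures below are commonly cited approximations (about 200 MeV released per U-235 fission, and roughly 24 MJ per kilogram for coal), not plant-level numbers, and the script simply works out the energy content per kilogram of fuel:

```python
# Back-of-envelope: energy content of U-235 fission vs. burning coal.
# All figures are commonly cited approximations, not plant-level data.

AVOGADRO = 6.022e23          # atoms per mole
MEV_TO_J = 1.602e-13         # joules per MeV

mev_per_fission = 200        # ~200 MeV released per U-235 fission
u235_molar_mass_kg = 0.235   # kilograms per mole of U-235

# Energy released if one kilogram of U-235 is fully fissioned (joules)
atoms_per_kg = AVOGADRO / u235_molar_mass_kg
j_per_kg_u235 = atoms_per_kg * mev_per_fission * MEV_TO_J

j_per_kg_coal = 24e6         # ~24 MJ/kg, a typical value for coal

ratio = j_per_kg_u235 / j_per_kg_coal
print(f"U-235 fission: {j_per_kg_u235:.2e} J/kg")
print(f"Burning coal:  {j_per_kg_coal:.2e} J/kg")
print(f"Ratio: roughly {ratio:.1e} to one")
```

On these rough numbers, a kilogram of fissionable fuel carries on the order of a few million times the energy of a kilogram of coal, which is why a reactor's fuel needs are so small compared with a coal plant's.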
Sure, the environmentalists will tell you to switch to solar and wind, but electricity from those technologies still costs more than electricity from other sources. And although the energy itself is "green," the process of manufacturing the parts is not, nor is the technology cheap or cost effective to implement.
As I mentioned above, I use geothermal to heat and cool my house. Neat technology. I have looked at photovoltaic (solar) and wind, and I may implement one or both of them as well, just to reduce my reliance on outside sources (going "off grid"). However, it definitely is not cheap.
Nuclear energy still remains the most cost-effective means of generating electricity. Nuclear waste can be better handled by reintroducing reprocessing plants in the United States, and a move to other, more modern power plant designs will increase safety and decrease the cost per kilowatt-hour.