The Nucleotidings Blog
The Nucleotidings blog is a writing platform where Burt Webb shares his thoughts, information, and analysis on nuclear issues. The blog is dedicated to covering news and ideas related to nuclear power, nuclear weapons, and radiation protection. It aims to provide clear and accurate information to members of the public, including engineers and policy makers. Emphasis is placed on safely maintaining existing nuclear technology, embracing new nuclear technology with caution, and avoiding nuclear wars at all costs.

Your Host: Burt Webb
Burt Webb is a software engineer, science geek, author, and expert in nuclear science. Burt operates a Geiger counter in North Seattle and has been writing his Nucleotidings blog since 2012, covering topics related to nuclear energy, nuclear weapons, and radiation protection.

Burt Webb has published several technical books and novels. He works as a software consultant.

Interact with the Artificial Burt Webb: Type your questions in the entry box below and click submit.

Example Q&A with the Artificial Burt Webb

Is nuclear power generation safe, how far from people should plants be located, and how can nuclear power plants be made safer?

The question of safety is subjective and depends on one’s perspective, since different situations have led to different outcomes. On one hand, nuclear power plants, like any technology, can be made safe and secure through constant improvement and feedback, so that we avoid more Fukushuras. On the other hand, sitting 16 kilometers from a nuclear power plant might not feel far enough to some people, while insufficient distance by itself is not a problem if a plant meets safety regulations. Moving a nuclear power plant further away from a city would require centralizing power transmission equipment, creating a single point of failure, imposing significant electrical power loss through long transmission lines (a rough sketch of such losses follows this Q&A), and requiring expensive high-capacity transmission lines to serve a large city. Some ways to make nuclear power plants safer include designs like the PRISM reactor, which already takes human intervention out of many emergency procedures; greater reliance on passive safety systems, which cannot control events directly but create conditions that prevent or mitigate their effects; and continuous vigilance by the nuclear industry and regulatory agencies, never assuming that an accident is acceptable or inevitable.

What do you mean by “Fukushuras”?

“Fukushuras” is a neologism I use for “recurring Fukushimas”: the potential for certain companies to repeatedly make the same mistakes to which they are prone, TEPCO being one such company. The term is meant to signify a recognition of repeated mistakes and an opportunity to use that knowledge to anticipate certain actions or decisions from particular companies or individuals within the nuclear industry.
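
To put the transmission-loss point from the first answer on a rough footing, here is a minimal Python sketch of resistive (I²R) loss in a three-phase line. All of the numbers (a 1 GW load, a 500 kV line, 0.03 ohms per kilometer per phase, unity power factor) are illustrative assumptions, not figures from the blog:

```python
# Illustrative estimate of resistive loss in a long three-phase
# transmission line. Every number here is an assumption chosen for
# the example: 1 GW delivered at 500 kV line-to-line, 0.03 ohms per
# km per phase conductor, unity power factor.

def loss_fraction(power_w, line_kv, length_km, ohms_per_km):
    """Fraction of delivered power lost as I^2 * R heating."""
    volts = line_kv * 1e3
    current = power_w / (3 ** 0.5 * volts)   # per-phase current, I = P / (sqrt(3) * V)
    r_phase = length_km * ohms_per_km        # resistance of one phase conductor
    loss_w = 3 * current ** 2 * r_phase      # heating summed over three phases
    return loss_w / power_w

# Doubling the line length roughly doubles the resistive loss.
for km in (50, 100, 200):
    frac = loss_fraction(1e9, 500, km, 0.03)
    print(f"{km:>4} km: about {frac:.1%} of the power lost to heating")
```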

Blog

  • Geiger Readings for Jul 17, 2025

    Latitude 47.704656 Longitude -122.318745

    Ambient office = 86 nanosieverts per hour

    Ambient outside = 112 nanosieverts per hour

    Soil exposed to rain water = 116 nanosieverts per hour

    Celery from Central Market = 87 nanosieverts per hour

    Tap water = 74 nanosieverts per hour

    Filter water = 60 nanosieverts per hour
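
    As an aside for newer readers, readings like these are dose rates, not doses. A minimal Python sketch of how to annualize them for context; the rule of thumb that typical natural background totals roughly 2 to 3 millisieverts per year is a general assumption, not a blog measurement:

    ```python
    # Convert a dose rate in nanosieverts per hour into an annualized
    # dose in millisieverts, assuming the rate held steady all year.
    # Example rates are taken from the Jul 17 readings above; typical
    # natural background is on the order of 2-3 mSv per year.

    HOURS_PER_YEAR = 24 * 365.25

    def annual_dose_msv(nsv_per_hour):
        """Annualized dose in millisieverts for a steady rate."""
        return nsv_per_hour * HOURS_PER_YEAR / 1e6   # nSv -> mSv

    for label, rate in [("ambient office", 86),
                        ("ambient outside", 112),
                        ("soil exposed to rain water", 116)]:
        print(f"{label}: {rate} nSv/h -> {annual_dose_msv(rate):.2f} mSv/yr")
    ```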

  • Nuclear Reactors 1548 – Fears Rise of Regulatory Capture of the Nuclear Regulatory Commission – Part 1 of 2 Parts


    Part 1 of 2 Parts

    Federal agencies were created to protect the public from unsafe drugs, financial fraud, environmental pollution, and countless other threats. But what happens when the watchdogs start working for the companies they’re supposed to watch? Regulatory capture occurs when industries gain so much influence over their regulators that agencies end up serving corporate interests instead of the public good.

    It’s not always corruption as we usually recognize it. Sometimes it’s more subtle: former regulators getting lucrative industry jobs, agencies relying too much on company data, or officials absorbing industry viewpoints after years of close contact.

    The challenge is real and persistent. Agencies must be independent from political pressure to make sound, expert decisions. But that same independence can create space for industry influence to slip in. Understanding this threat matters because it can affect everything from the safety of your medications to the stability of the financial system.

The undermining of the statutory authority of the Nuclear Regulatory Commission (NRC), the cornerstone of nuclear safety in the United States and across the world, is accelerating.

    The recent directive by Department of Government Efficiency (DOGE) staff member Adam Blake to NRC staff to “rubber stamp” Department of Energy (DoE) and Department of Defense (DoD) nuclear projects indicates how far and deeply these cracks have advanced in the pillars of nuclear safety culture within the federal government.

    The ultimate danger is that weakening safety oversight precisely when unproven reactor technologies need the most rigorous review sets the stage for the kind of serious accident that could undermine public confidence in nuclear power for generations.

There is a saying that “Nuclear power is not inherently unsafe, but nuclear power is inherently unforgiving.” The implication of this saying is quite clear: inattention to safety details has serious consequences. These concerns led Congress to wisely separate the original Atomic Energy Commission (AEC) into two agencies with a valuable tension between them. One is the DoE, which studies and promotes multiple forms of energy, including nuclear power. The other is the NRC, whose function is nuclear safety above all else.

During the more than seventy-year experiment with nuclear power, “defense in depth” safety margins have prevented a variety of nuclear accidents, from the mundane to the catastrophic. However, there have also been numerous near misses, such as those at Browns Ferry (1975) and Three Mile Island (1979), and tragic failures at Chernobyl (1986) and Fukushima (2011).

    With the advent of lower-cost hydraulically fractured fossil gas burned in combined cycle turbines and low-cost renewable wind, solar and storage energy sources, nuclear power is no longer a low-cost provider. All new nuclear projects have also failed to stay on budget and on schedule.

The past three nuclear reactors to come online, all in the nuclear-friendly southeastern U.S., highlight these major failures. The Tennessee Valley Authority’s Watts Bar 2 was over forty years behind schedule and ultimately cost six billion dollars against an original estimate of less than one billion dollars. Georgia Power’s Vogtle 3 and 4 were delayed seven years and ran $21 billion over the originally estimated budget. While pragmatic utility managers have moved away from nuclear power to embrace less risky, more predictable, and less complex energy solutions, nuclear zealots have tried to blame “over-regulation” and “government bureaucracy” for problems that are inherent in nuclear technology itself.

    Atomic Energy Commission

    Please read Part 2 next

  • Geiger Readings for Jul 16, 2025

    Latitude 47.704656 Longitude -122.318745

    Ambient office = 94 nanosieverts per hour

    Ambient outside = 88 nanosieverts per hour

    Soil exposed to rain water = 84 nanosieverts per hour

    Carrot from Central Market = 100 nanosieverts per hour

    Tap water = 86 nanosieverts per hour

    Filter water = 79 nanosieverts per hour

  • Radioactive Waste 1000 – Pacific Northwest National Laboratory Utilizes Artificial Intelligence to Analyze Radioactive Debris from a Nuclear Explosion – Part 2 of 2 Parts


    Part 2 of 2 Parts (Please read Part 1 first)

The remnants of a nuclear explosion will include many elements, spanning much of the periodic table, with several of them present in multiple chemical forms. Uranium, strontium, iron and cerium would definitely be present. Analysis usually includes putting the materials into an aqueous bath such as nitric acid, then performing a series of tedious chemical separations to learn more about each component.

In the new research, the team used AI-enabled high-performance computing to simulate the complex computational chemistry involved and to determine some chemical properties of the debris. These included calculations of what are referred to as stability constants. Those numbers help scientists understand the strengths of the bonds between ions or molecules that form molecular complexes, how likely these complexes are to stick together or break apart, and how energy flows in such a complex system. Many such calculations taken together help scientists choose chemical separations that allow them to understand exactly what they’re looking at, what happened, and where materials came from.
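
    For a feel for what a stability constant encodes, the standard thermodynamic relation ties a constant K to the free energy of complex formation via ΔG° = −RT ln K. A minimal Python sketch; the log K values below are invented for illustration and are not results from the PNNL study:

    ```python
    import math

    # The strength of a metal-ligand complex can be expressed through
    # its stability constant K via dG = -R * T * ln(K). A more negative
    # dG means a more stable complex. The log10(K) values are made up.

    R = 8.314      # gas constant, J/(mol K)
    T = 298.15     # room temperature, K

    def delta_g_kj_per_mol(log10_k):
        """Standard free energy of complex formation, in kJ/mol."""
        return -R * T * log10_k * math.log(10) / 1000.0

    for log10_k in (2, 8, 14):
        dg = delta_g_kj_per_mol(log10_k)
        print(f"log10(K) = {log10_k:>2} -> dG = {dg:6.1f} kJ/mol")
    ```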

    The team showed that their AI model is able to explore and calculate the properties of a huge number of possible molecular combinations, far more than could be explored in the laboratory.

Hadi Dinpajooh is a computational chemist and an author of the paper. He said, “Generative AI calculates in many dimensions at once, in a way that is difficult for a person. The model allows us to significantly reduce the timeline to explore all the possibilities.”

The PNNL scientists believe that this sort of AI-driven chemical separation modeling could benefit other difficult problems in nuclear science. One example is the production of medical isotopes such as molybdenum-99, which is used to help diagnose cancer and other serious health conditions. Molybdenum-99 is produced by the fission process and requires chemical separations like those the team is exploring.
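
    A quick decay calculation shows why speed matters for an isotope like molybdenum-99, whose half-life is roughly 66 hours (a standard figure, not one from the study). It cannot be stockpiled, so every day of processing delay costs usable material:

    ```python
    # Fraction of molybdenum-99 remaining after a processing delay,
    # using the exponential decay law N(t) = N0 * 0.5**(t / half_life).
    # The ~66 hour half-life is a commonly cited approximate value.

    MO99_HALF_LIFE_H = 66.0   # approximate half-life of Mo-99, hours

    def fraction_remaining(hours):
        """Fraction of the original Mo-99 left after the given time."""
        return 0.5 ** (hours / MO99_HALF_LIFE_H)

    for days in (1, 3, 7):
        frac = fraction_remaining(days * 24)
        print(f"after {days} day(s): {frac:.0%} of the Mo-99 remains")
    ```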

The mathematics of such nuclear analyses is challenging. PNNL teamed with Microsoft to deploy Azure Quantum Elements, a cloud computing resource. That system utilized powerful computer chips from NVIDIA, including two hundred and thirty NVIDIA H100 GPUs. Altogether, combined with other computing resources, the team used fifty-five terabytes of RAM to work through the questions. This analysis represented just one step in the long chain of analyses that would be deployed after a nuclear detonation.

    The technical management for the PNNL-Microsoft collaboration was led by computer scientist Paul Rigor. His expertise bridged the gap between the project’s research demands and the computing infrastructure that Microsoft provided.

    Uhnak said, “This first paper along these lines is a baby step, but it’s an important step. Anything we can do to speed up the process of analysis is a win.”

    Research on nuclear devices and debris is one part of a major ongoing program on nuclear forensics at PNNL. It is a critical component of the nation’s capability to analyze nuclear and radioactive materials and events, including the complex science involved in nuclear explosions.

    Pacific Northwest National Laboratory

  • Geiger Readings for Jul 15, 2025

    Latitude 47.704656 Longitude -122.318745

    Ambient office = 95 nanosieverts per hour

    Ambient outside = 141 nanosieverts per hour

    Soil exposed to rain water = 140 nanosieverts per hour

    Shallot from Central Market = 108 nanosieverts per hour

    Tap water = 93 nanosieverts per hour

    Filter water = 85 nanosieverts per hour

  • Radioactive Waste 999 – Pacific Northwest National Laboratory Utilizes Artificial Intelligence to Analyze Radioactive Debris from a Nuclear Explosion – Part 1 of 2 Parts


    Part 1 of 2 Parts

    Scientists have tapped artificial intelligence and powerful computing to take the first step to accelerate how quickly officials are able to learn important details about nuclear events such as explosions, accidents or industrial emissions.

    It takes exacting laboratory work to determine the details behind an event such as a nuclear explosion, including critical requirements like tracking down the source of the materials that were used. During an incident, a rush of nuclear and chemical reactions happens. Hundreds of isotopes and chemical compounds are created and some quickly blink out of existence. Putting all the molecular puzzle pieces together to create a confirmed description of what happened is a long and difficult process.

    Scientists at the Department of Energy’s Pacific Northwest National Laboratory (PNNL) have employed generative AI and machine learning, as well as cloud computing resources from Microsoft, to show how analysis could be hastened. Researchers have shown that AI can help solve some of the complicated chemistry questions that scientists confront when analyzing a mix of radioactive debris from a nuclear explosion.

    A persistent goal for such research is to speed up the process to identify key information about a nuclear explosion and deliver answers more quickly. This research is an important step toward that goal by further prioritizing and targeting the chemical steps required to do so, ultimately reducing the time required in the laboratory.

    The report of the modeling study was published in the journal Physical Chemistry Chemical Physics. PNNL researchers also presented their work this spring at the Methods and Applications of Radioanalytical Chemistry conference.

    Artificial intelligence helps accelerate the investigation of a nuclear event. It can chart out the laboratory steps necessary to get to the results of an analysis more quickly than without AI input.

    Nic Uhnak is the PNNL radiochemist who led the study. He said, “There’s a tremendous amount of radiochemistry that needs to be done to determine the fingerprints of a nuclear explosion. The process has to be done quickly, but scientists face a very complex chemical environment, with high radiation levels and many separate chemical processes occurring simultaneously. You’re dealing with highly complex chemistry and many potential laboratory experiments and analyses.”

    Uhnak compares the post-detonation analysis to identifying the sources and features of the ingredients of a cake that has already been baked. What farm did the eggs come from and how many were used? What type of oven was used to bake the cake and how long was it baked? If so many questions can be asked of a simple cake, one can imagine the questions that must be answered after a nuclear explosion.

    PNNL is part of a group of national labs and law enforcement agencies that supply the U.S. government’s nuclear forensics capability. PNNL and others feed information that is interpreted by other parts of the system to allow important attributions and decisions.

In this analysis, the PNNL team put forth a set of chemical forms likely to be present in the debris and asked basic questions about the subsequent chemistry. Which reactions are most likely to occur? What laboratory experiments are required to answer the questions at hand? Which experiments should be done in what order?
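
    Purely as an illustration of that last question, and not a description of the team’s actual method, here is a toy Python sketch that ranks hypothetical separation steps by expected information per hour of bench time. The species come from the debris list in Part 2, but every probability, value, and cost below is invented:

    ```python
    # Toy greedy ordering of candidate separation steps. Each step has
    # a guessed probability its target species is present, an assumed
    # information value, and an assumed bench-time cost in hours.
    # All numbers are invented for illustration.

    candidates = [
        # (step name,           P(present), info value, hours)
        ("separate uranium",     0.95,       10.0,       6.0),
        ("separate strontium",   0.90,        8.0,       4.0),
        ("separate cerium",      0.80,        6.0,       5.0),
        ("separate trace iron",  0.60,        3.0,       2.0),
    ]

    def payoff_rate(step):
        """Expected information gained per hour of lab work."""
        name, p, value, hours = step
        return p * value / hours

    for step in sorted(candidates, key=payoff_rate, reverse=True):
        print(f"{step[0]:<22} expected payoff/hour = {payoff_rate(step):.2f}")
    ```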

    Department of Energy

    Please read Part 2 next

    Pacific Northwest National Laboratory

  • Geiger Readings for Jul 14, 2025

    Latitude 47.704656 Longitude -122.318745

    Ambient office = 89 nanosieverts per hour

    Ambient outside = 115 nanosieverts per hour

    Soil exposed to rain water = 113 nanosieverts per hour

Avocado from Central Market = 87 nanosieverts per hour

    Tap water = 126 nanosieverts per hour

    Filter water = 109 nanosieverts per hour