Nuclear Power 1 - History 1 - Beginnings to 1970

In the early twentieth century, it was discovered that radioactive elements could release great amounts of energy according to Albert Einstein's famous equation, E = mc². This equation says that the energy contained in a quantity of matter is equal to its mass multiplied by the speed of light squared. There are approximately twenty-five million kilowatt hours of energy in one gram of matter. Another way to think of it is that the amount of energy in one gram of matter is equivalent to the amount of energy released by burning five hundred sixty-eight thousand gallons of gasoline.
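
As a rough check of the kilowatt-hour figure, the short Python sketch below works the arithmetic for one gram of matter; the speed of light and the joules-per-kilowatt-hour conversion used here are the standard values.

# Rough check of the mass-energy figure for one gram of matter, E = m * c^2.
m = 0.001                           # mass in kilograms (one gram)
c = 2.998e8                         # speed of light in metres per second
energy_joules = m * c ** 2          # about 9 x 10^13 joules
energy_kwh = energy_joules / 3.6e6  # one kilowatt hour is 3.6 million joules
print(f"{energy_kwh:,.0f} kWh")     # prints roughly 24,967,000 kWh

The result is about twenty-five million kilowatt hours, in line with the figure quoted above.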

            The early pioneers of radioactivity research did not believe that it would be possible to harness this vast store of energy as a power source. In the late 1930s, it was discovered that neutrons could induce a form of radioactive disintegration called nuclear fission, in which a uranium nucleus breaks into two smaller nuclei of roughly equal size. Because this reaction also released neutrons, it was possible to create a self-sustaining fission process called a chain reaction. The heat generated by this reaction could be used to boil water to generate steam and thus could be a source of electrical power.
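
            To illustrate why the extra neutrons make a self-sustaining reaction possible, the short sketch below tracks a neutron population across generations; the starting counts and multiplication factors are illustrative assumptions, not measured values.

# Illustrative sketch of chain-reaction behaviour, not a physical simulation.
# k is the average number of neutrons from each fission that go on
# to cause another fission.
def neutron_population(initial, k, generations):
    population = initial
    for _ in range(generations):
        population *= k  # each generation scales the neutron count by k
    return population

print(neutron_population(1, 2.0, 10))     # k > 1: rapid growth (about a thousandfold in ten generations)
print(neutron_population(1000, 1.0, 10))  # k = 1: steady and self-sustaining
print(neutron_population(1000, 0.9, 10))  # k < 1: the reaction dies out

            In a power reactor the aim is to hold the multiplication factor at one, so that the reaction runs at a steady rate and the heat can be drawn off continuously to raise steam.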

            In 1942, the first man-made nuclear reactor, Chicago Pile-1, was built as part of the Manhattan Project, which went on to enrich uranium and build reactors to breed plutonium for the atomic bombs dropped on Japan at the end of World War Two.

            After the war, the development of peaceful uses for nuclear power was advocated in the U.S. Some have argued that the U.S. government wanted to increase spending on nuclear weapons research and that the prospect of civilian nuclear power was simply an excuse to get more money out of Congress. The fear that such research would create materials usable in bombs encouraged national governments, including those of the United States and the Soviet Union, to keep all such research under strict state control. In the United States, the Atomic Energy Commission was created to oversee nuclear research.

            The first use of nuclear energy to generate electricity took place on December 20, 1951, at the EBR-1 experimental station near Arco, Idaho. The U.S. Navy dedicated research to developing a small nuclear reactor that could power naval vessels, and the USS Nautilus, the first nuclear-powered submarine, put to sea under nuclear power in 1955. In 1953, U.S. President Dwight Eisenhower gave a famous speech titled Atoms for Peace at the United Nations that encouraged the international development of peaceful uses for nuclear power. The 1954 amendments to the Atomic Energy Act allowed the declassification of U.S. reactor designs and encouraged the construction of reactors by private industry. On June 27, 1954, the U.S.S.R. started up the first nuclear reactor to feed electrical power into a power grid, at Obninsk.

            Advocates for civilian nuclear power claimed that nuclear reactors would soon be producing energy at a cost comparable to that of conventional sources. There were also extravagant claims that nuclear power would soon be "too cheap to meter", in other words so inexpensive that it would not be worth billing customers for their individual usage. There was some disappointment when this did not prove to be true.

            A United Nations conference on the peaceful uses of atomic energy was convened at Geneva in 1955. In 1957, EURATOM was created along with the European Economic Community, which would later become the European Union. The International Atomic Energy Agency was also launched in 1957.

            In 1956, England opened Calder Hall at Sellafield, the world's first commercial nuclear power plant. It started with a generating capacity of fifty million watts and was later expanded to two hundred million watts. The Shippingport Reactor in Pennsylvania was the first commercial nuclear power plant in the United States. It was brought online in 1957. The age of atomic energy had begun.