Tuesday, July 08, 2008

Part X: Nuclear



Today in the energy series we will look at one of my favorite "new" old technologies in Part X: Nuclear.
The science of atomic radiation, atomic change and nuclear fission was developed from 1895 to 1945, much of it in the last six of those years.
Over 1939-45, most development was focused on the atomic bomb.
From 1945 attention was given to harnessing this energy in a controlled fashion for naval propulsion and for making electricity.
Since 1956 the prime focus has been on the technological evolution of reliable nuclear power plants.
Uranium was discovered in 1789 by Martin Klaproth, a German chemist, and named after the planet Uranus.

Ionising radiation was discovered by Wilhelm Röntgen in 1895, by passing an electric current through an evacuated glass tube and producing continuous X-rays. Then in 1896 Henri Becquerel found that pitchblende (an ore containing radium and uranium) caused a photographic plate to darken. He went on to demonstrate that this was due to beta radiation (electrons) and alpha particles (helium nuclei) being emitted. Paul Villard later found a third type of radiation from pitchblende: gamma rays, which were much the same as X-rays. Pierre and Marie Curie gave the name 'radioactivity' to this phenomenon, and in 1898 isolated polonium and radium from the pitchblende. Radium was later used in medical treatment. In 1898 Samuel Prescott showed that radiation destroyed bacteria in food.
In 1902 Ernest Rutherford showed that radioactivity, as a spontaneous event emitting an alpha or beta particle from the nucleus, created a different element. He went on to develop a fuller understanding of atoms, and in 1919 he fired alpha particles from a radium source into nitrogen and found that nuclear rearrangement was occurring, with formation of oxygen. Niels Bohr was another scientist who advanced our understanding of the atom and of the way electrons are arranged around its nucleus, in work that continued through to the 1940s.
By 1911 Frederick Soddy discovered that naturally-radioactive elements had a number of different isotopes (radionuclides), with the same chemistry. Also in 1911, George de Hevesy showed that such radionuclides were invaluable as tracers, because minute amounts could readily be detected with simple instruments.
In 1932 James Chadwick discovered the neutron. Also in 1932 Cockcroft and Walton produced nuclear transformations by bombarding atoms with accelerated protons, then in 1934 Irene Curie and Frederic Joliot found that some such transformations created artificial radionuclides. The next year Enrico Fermi found that a much greater variety of artificial radionuclides could be formed when neutrons were used instead of protons.
Fermi continued his experiments, mostly producing heavier elements from his targets, but also, with uranium, some much lighter ones. In 1939 Otto Hahn and Fritz Strassmann in Berlin showed that the new lighter elements were barium and others which were about half the mass of uranium, thereby demonstrating that atomic fission had occurred. Lise Meitner and her nephew Otto Frisch, working under Niels Bohr, then explained this by suggesting that the neutron was captured by the nucleus, causing severe vibration that led to the nucleus splitting into two not quite equal parts. They calculated the energy release from this fission as about 200 million electron volts. Frisch then confirmed this figure experimentally.
This was the first experimental confirmation of Albert Einstein's paper putting forward the equivalence between mass and energy, which had been published in 1905.
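As a rough check of that figure (my arithmetic, not part of the historical account): the fragments of a fissioned uranium nucleus weigh about 0.2 atomic mass units less than the original nucleus plus its captured neutron, and Einstein's relation converts that mass defect into

E = \Delta m \, c^2 \approx 0.2\ \mathrm{u} \times 931.5\ \mathrm{MeV/u} \approx 190\ \mathrm{MeV}

which agrees well with the 200 million electron volts Meitner and Frisch calculated.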


These 1939 developments sparked activity in many laboratories. Hahn and Strassmann showed that fission not only released a lot of energy but also released additional neutrons which could cause fission in other uranium nuclei and possibly a self-sustaining chain reaction leading to an enormous release of energy. This suggestion was soon confirmed experimentally by Joliot and his co-workers in Paris, and by Leo Szilard working with Fermi in New York.
Bohr soon proposed that fission was much more likely to occur in the uranium-235 isotope than in U-238 and that fission would occur more effectively with slow-moving neutrons than with fast neutrons, the latter point being confirmed by Szilard and Fermi, who proposed using a 'moderator' to slow down the emitted neutrons. Bohr and Wheeler extended these ideas into what became the classical analysis of the fission process, and their paper was published only two days before war broke out in 1939.
Another important factor was that U-235 was then known to comprise only 0.7% of natural uranium, with the other 99.3% being U-238, which has essentially identical chemical properties. Hence separating the two to obtain pure U-235 would be difficult and would require the use of their very slightly different physical properties. This increase in the proportion of the U-235 isotope became known as 'enrichment'.
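A textbook illustration (not from the original account) of just how slight those physical differences are: in gaseous diffusion, an enrichment method later used at scale, uranium hexafluoride molecules pass through a porous barrier at rates set by their masses, giving an ideal per-stage separation factor of

\alpha = \sqrt{\frac{m(^{238}\mathrm{UF}_6)}{m(^{235}\mathrm{UF}_6)}} = \sqrt{\frac{352}{349}} \approx 1.0043

so each stage enriches by less than half a percent, and thousands of stages must be cascaded to reach high enrichment.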
The remaining piece of the fission/atomic bomb concept was provided in 1939 by Francis Perrin who introduced the concept of the critical mass of uranium required to produce a self-sustaining release of energy. His theories were extended by Professor Peierls at Birmingham University and the resulting calculations were of considerable importance in the development of the atomic bomb. Perrin's group in Paris continued their studies and demonstrated that a chain reaction could be sustained in a uranium-water mixture (the water being used to slow down the neutrons) provided external neutrons were injected into the system. They also demonstrated the idea of introducing neutron-absorbing material to limit the multiplication of neutrons and thus control the nuclear reaction (which is the basis for the operation of a nuclear power station).
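That control principle is easy to see numerically. Below is a minimal Python sketch (my illustration; the numbers are arbitrary, not historical) of a neutron population evolving over fission generations for different values of the multiplication factor k, the average number of fission neutrons each neutron goes on to produce. Neutron-absorbing material lowers k, and a power reactor is held at k = 1.

    # Toy chain-reaction model: each generation, every neutron yields
    # k new fission neutrons on average. k < 1 dies away, k = 1 holds
    # steady (a controlled reactor), k > 1 grows exponentially.
    def neutron_population(k: float, generations: int, n0: float = 1000.0) -> float:
        """Neutron count after the given number of generations."""
        n = n0
        for _ in range(generations):
            n *= k
        return n

    for k in (0.98, 1.00, 1.02):
        n = neutron_population(k, generations=200)
        print(f"k = {k:.2f}: population after 200 generations ~ {n:,.1f}")

Even a value of k just 2% above or below 1 changes the population roughly fifty-fold either way within 200 generations, which is why fine control of the neutron balance is the heart of reactor operation.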

British scientists had kept pressure on their government. The refugee physicists Peierls and Frisch (Frisch had stayed in England with Peierls after the outbreak of war) gave a major impetus to the concept of the atomic bomb in a three-page document known as the Frisch-Peierls Memorandum. In this they predicted that an amount of about 5 kg of pure U-235 could make a very powerful atomic bomb, equivalent to several thousand tonnes of dynamite. They also suggested how such a bomb could be detonated, how the U-235 could be produced, and what the radiation effects might be in addition to the explosive effects. They proposed thermal diffusion as a suitable method for separating the U-235 from the natural uranium. This memorandum stimulated a considerable response in Britain at a time when there was little interest in the USA.
A group of eminent scientists known as the MAUD Committee was set up in Britain and supervised research at the Universities of Birmingham, Bristol, Cambridge, Liverpool and Oxford. The chemical problems of producing gaseous compounds of uranium and pure uranium metal were studied at Birmingham University and Imperial Chemical Industries (ICI). Dr Philip Baxter at ICI made the first small batch of gaseous uranium hexafluoride for Professor James Chadwick in 1940. ICI received a formal contract later in 1940 to make 3kg of this vital material for the future work. Most of the other research was funded by the universities themselves.
Two important developments came from the work at Cambridge. The first was experimental proof that a chain reaction could be sustained with slow neutrons in a mixture of uranium oxide and heavy water, i.e. the output of neutrons was greater than the input. The second was by Bretscher and Feather, based on earlier work by Halban and Kowarski soon after they arrived in Britain from Paris. When U-235 and U-238 absorb slow neutrons, the probability of fission in U-235 is much greater than in U-238. The U-238 is more likely to form a new isotope, U-239, which rapidly emits an electron to become a new element with a mass of 239 and an atomic number of 93. This element in turn emits an electron and becomes a new element of mass 239 and atomic number 94, which has a much longer half-life. Bretscher and Feather argued on theoretical grounds that element 94 would be readily fissionable by both slow and fast neutrons, and had the added advantage that it was chemically different from uranium and therefore could easily be separated from it.
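Spelled out in modern notation (the approximate half-lives are my addition, not part of the wartime account), the capture-and-decay chain is

^{238}\mathrm{U} + n \rightarrow {}^{239}\mathrm{U} \xrightarrow{\beta^-,\ \sim 23\ \mathrm{min}} {}^{239}\mathrm{Np} \xrightarrow{\beta^-,\ \sim 2.4\ \mathrm{days}} {}^{239}\mathrm{Pu}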
This new development was also confirmed in independent work by McMillan and Abelson in the USA in 1940. Dr Kemmer of the Cambridge team proposed the names neptunium for the new element 93 and plutonium for element 94, by analogy with the outer planets Neptune and Pluto beyond Uranus (uranium, element 92). The Americans fortuitously suggested the same names, and the identification of plutonium in 1941 is generally credited to Glenn Seaborg.



By the end of 1940 remarkable progress had been made by the several groups of scientists coordinated by the MAUD Committee, and at the expense of a relatively small amount of money. All of this work was kept secret, whereas in the USA publications continued to appear in 1940 and there was little sense of urgency.
By March 1941 one of the most uncertain pieces of information was confirmed - the fission cross-section of U-235. Peierls and Frisch had initially predicted in 1940 that almost every collision of a neutron with a U-235 atom would result in fission, and that both slow and fast neutrons would be equally effective. It was later discerned that slow neutrons were very much more effective, which was of enormous significance for nuclear reactors but fairly academic in the bomb context. Peierls then stated that there was now no doubt that the whole scheme for a bomb was feasible provided highly enriched U-235 could be obtained. The predicted critical size for a sphere of U-235 metal was about 8kg, which might be reduced by use of an appropriate material for reflecting neutrons. However, direct measurements on U-235 were still necessary and the British pushed for urgent production of a few micrograms.
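The shape of that critical-size argument can be sketched with the standard one-group diffusion result (a textbook formula, not one quoted in the wartime papers): a bare sphere of fissile material becomes critical when its radius reaches roughly

R_c \approx \pi \sqrt{\frac{D}{\nu \Sigma_f - \Sigma_a}}

where D is the neutron diffusion coefficient, \nu \Sigma_f the rate of neutron production by fission and \Sigma_a the absorption rate. A surrounding reflector returns some of the neutrons that would otherwise leak out through the surface, which is why it reduces the critical size.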
The final outcome of the MAUD Committee was two summary reports in July 1941. One was on 'Use of Uranium for a Bomb' and the other was on 'Use of Uranium as a Source of Power'. The first report concluded that a bomb was feasible and that one containing some 12 kg of active material would be equivalent to 1,800 tons of TNT and would release large quantities of radioactive substances which would make places near the explosion site dangerous to humans for a long period. It estimated that a plant to produce 1 kg of U-235 per day would cost £5 million and would require a large skilled labour force that was also needed for other parts of the war effort. Suggesting that the Germans could also be working on the bomb, it recommended that the work should be continued with high priority in cooperation with the Americans, even though they seemed to be concentrating on the future use of uranium for power and naval propulsion.
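A back-of-envelope check of that yield estimate (my arithmetic, not the committee's): complete fission of 1 kg of U-235 releases about

\frac{1000}{235} \times 6.0 \times 10^{23} \times 200\ \mathrm{MeV} \approx 8 \times 10^{13}\ \mathrm{J} \approx 20\ \mathrm{kilotons\ of\ TNT}

so 12 kg fully fissioned would yield over 200 kilotons; the quoted 1,800 tons implicitly assumes that under 1% of the material fissions before the device blows itself apart.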
The second MAUD Report concluded that the controlled fission of uranium could be used to provide energy in the form of heat for use in machines, as well as providing large quantities of radioisotopes which could be used as substitutes for radium. It referred to the use of heavy water and possibly graphite as moderators for the fast neutrons, and that even ordinary water could be used if the uranium was enriched in the U-235 isotope. It concluded that the 'uranium boiler' had considerable promise for future peaceful uses but that it was not worth considering during the present war. The Committee recommended that Halban and Kowarski should move to the USA where there were plans to make heavy water on a large scale. The possibility that the new element plutonium might be more suitable than U-235 was mentioned, so that the work in this area by Bretscher and Feather should be continued in Britain.
The two reports led to a complete reorganisation of work on the bomb and the 'boiler'. It was claimed that the work of the committee had put the British in the lead and that "in its fifteen months' existence it had proved itself one of the most effective scientific committees that ever existed". The basic decision that the bomb project would be pursued urgently was taken by the Prime Minister, Winston Churchill, with the agreement of the Chiefs of Staff.
The reports also led to high-level reviews in the USA, particularly by a Committee of the National Academy of Sciences, initially concentrating on the nuclear power aspect. Little emphasis was given to the bomb concept until 7 December 1941, when the Japanese attacked Pearl Harbor and the Americans entered the war directly. The huge resources of the USA were then applied without reservation to developing atomic bombs.


The Americans increased their effort rapidly and soon outstripped the British. Research continued in each country with some exchange of information. Several of the key British scientists visited the USA early in 1942 and were given full access to all of the information available. The Americans were pursuing three enrichment processes in parallel: Professor Lawrence was studying electromagnetic separation at Berkeley (University of California), E. V. Murphree of Standard Oil was studying the centrifuge method developed by Professor Beams, and Professor Urey was coordinating the gaseous diffusion work at Columbia University. Responsibility for building a reactor to produce fissile plutonium was given to Arthur Compton at the University of Chicago. The British were only examining gaseous diffusion.
In June 1942 the US Army took over process development, engineering design, procurement of materials and site selection for pilot plants for four methods of making fissionable material (because none of the four had been shown to be clearly superior at that point) as well as the production of heavy water. With this change, information flow to Britain dried up. This was a major setback to the British and the Canadians who had been collaborating on heavy water production and on several aspects of the research program. Thereafter, Churchill sought information on the cost of building a diffusion plant, a heavy water plant and an atomic reactor in Britain.
After many months of negotiations an agreement was finally signed by Mr Churchill and President Roosevelt in Quebec in August 1943, according to which the British handed over all of their reports to the Americans and in return received copies of General Groves' progress reports to the President. The latter showed that the entire US program would cost over $1,000 million, all for the bomb, as no work was being done on other applications of nuclear energy.
Construction of production plants for electromagnetic separation (in calutrons) and gaseous diffusion was well under way. An experimental graphite pile constructed by Fermi had operated at the University of Chicago in December 1942: the first controlled nuclear chain reaction.
A full-scale production reactor for plutonium was being constructed at Argonne, with further ones at Oak Ridge and then Hanford, plus a reprocessing plant to extract the plutonium. Four plants for heavy water production were being built, one in Canada and three in the USA. A team under Robert Oppenheimer at Los Alamos in New Mexico was working on the design and construction of both U-235 and Pu-239 bombs. The outcome of the huge effort, with assistance from the British teams, was that sufficient Pu-239 and highly enriched U-235 (from calutrons and diffusion at Oak Ridge) was produced by mid-1945. The uranium mostly originated from the Belgian Congo.
The first atomic device was tested successfully at Alamogordo in New Mexico on 16 July 1945. It used plutonium made in a nuclear pile. The teams did not consider it necessary to test a simpler U-235 device. The first atomic bomb, which contained U-235, was dropped on Hiroshima on 6 August 1945. The second bomb, containing Pu-239, was dropped on Nagasaki on 9 August. That same day, the USSR declared war on Japan. On 10 August 1945 the Japanese Government surrendered.

By the end of World War II, the project predicted and described in detail only five and a half years before in the Frisch-Peierls Memorandum had been brought to partial fruition, and attention could turn to the peaceful and directly beneficial application of nuclear energy. Post-war, weapons development continued on both sides of the "iron curtain", but a new focus was on harnessing the great atomic power, now dramatically (if tragically) demonstrated, for making steam and electricity.
In the course of developing nuclear weapons the Soviet Union and the West had acquired a range of new technologies and scientists realised that the tremendous heat produced in the process could be tapped either for direct use or for generating electricity. It was also clear that this new form of energy would allow development of compact long-lasting power sources which could have various applications, not least for shipping, and especially in submarines.
The first nuclear reactor to produce electricity (albeit a trivial amount) was the small Experimental Breeder Reactor (EBR-1) in Idaho, in the USA, which started up in December 1951.
In 1953 President Eisenhower proposed his "Atoms for Peace" program, which reoriented significant research effort towards electricity generation and set the course for civil nuclear energy development in the USA.
In the Soviet Union, work was under way at various centres to refine existing reactor designs and develop new ones. The existing graphite-moderated channel-type plutonium production reactor was modified for heat and electricity generation and in 1954 the world's first nuclear powered electricity generator began operation in the then closed city of Obninsk at the Institute of Physics and Power Engineering (FEI). The AM-1 (Atom Mirny -- peaceful atom) reactor is water-cooled and graphite-moderated, with a design capacity of 30 MWt or 5 MWe. It was similar in principle to the plutonium production reactors in the closed military cities and served as a prototype for other graphite channel reactor designs including the Chernobyl-type RBMK (reaktor bolshoi moshchnosty kanalny -- high power channel reactor) reactors. AM-1 produced electricity until 1959 and was used until 2000 as a research facility and for the production of isotopes.
Also in the 1950s Obninsk was developing fast breeder reactors (FBRs). In 1955 the BR-1 (bystry reaktor -- fast reactor) fast neutron reactor began operating. It produced no power but led directly to the BR-5, which started up in 1959 with a capacity of 5 MWt and was used to do the basic research necessary for designing sodium-cooled FBRs. It was upgraded and modernised in 1973 and then underwent major reconstruction in 1983 to become the BR-10, with a capacity of 8 MWt, which is now used to investigate fuel endurance, to study materials and to produce isotopes.
The main US effort, under Admiral Hyman Rickover, developed the Pressurised Water Reactor (PWR) for naval (particularly submarine) use. The PWR used enriched uranium oxide fuel and was moderated and cooled by ordinary (light) water. The Mark 1 prototype naval reactor started up in March 1953 in Idaho, and the first nuclear-powered submarine, USS Nautilus, was launched in 1954. In 1959 both the USA and the USSR launched their first nuclear-powered surface vessels.
The Mark 1 reactor led to the US Atomic Energy Commission building the 90 MWe Shippingport demonstration PWR in Pennsylvania, which started up in 1957 and operated until 1982.
Since the USA had a virtual monopoly on uranium enrichment in the West, British development took a different tack and resulted in a series of reactors fuelled by natural uranium metal, moderated by graphite, and gas-cooled. The first of these 50 MWe Magnox types, Calder Hall-1, started up in 1956 and ran until 2003. However, after 1963 (and 26 units) no more were commenced. Britain next embraced the Advanced Gas-Cooled Reactor (using enriched oxide fuel) before conceding the pragmatic virtues of the PWR design.

In the USA, Westinghouse designed the first fully commercial PWR of 250 MWe, Yankee Rowe, which started up in 1960 and operated to 1992. Meanwhile the boiling water reactor (BWR) was developed by the Argonne National Laboratory; a prototype BWR, Vallecitos, ran from 1957 to 1963, and the first commercial unit, the 250 MWe Dresden-1 designed by General Electric, started up earlier in 1960. By the end of the 1960s, orders were being placed for PWR and BWR reactor units of more than 1000 MWe.
Canadian reactor development headed down a quite different track, using natural uranium fuel and heavy water as a moderator and coolant. The first unit started up in 1962. This CANDU design continues to be refined.
France started out with a gas-graphite design similar to Magnox and the first reactor started up in 1956. Commercial models operated from 1959. It then settled on three successive generations of standardised PWRs, which was a very cost-effective strategy.
In 1964 the first two Soviet nuclear power plants were commissioned. A 100 MW boiling water graphite channel reactor began operating in Beloyarsk (Urals). In Novovoronezh (Volga region) a new design was built -- a small (210 MW) pressurised water reactor (PWR) known as a VVER (vodo-vodyanoi energetichesky reaktor -- water-cooled, water-moderated power reactor).
The first large RBMK (1,000 MW - high-power channel reactor) started up at Sosnovy Bor near Leningrad in 1973 and the same year saw the commissioning of the first of four small (12 MW) boiling water channel-type units in the eastern Arctic town of Bilibino for the production of both power and heat.
In the Arctic northwest a slightly bigger VVER with a rated capacity of 440 MW began operating and this became a standard design.
In Shevchenko in Kazakhstan the world's first commercial prototype FBR, the BN-350 (bystry neutron -- fast neutron), started up in 1972, producing 120 MW of electricity plus heat to desalinate Caspian seawater. A prototype, the BOR-60, had started up at Dimitrovgrad in 1969, generating 12 MWe.
Around the world, with few exceptions, other countries have chosen light-water designs for their nuclear power programs, so that today 65% of the world capacity is PWR and 23% BWR.

From the late 1970s to about 2002 the nuclear power industry suffered some decline and stagnation. Few new reactors were ordered, and the number coming on line from the mid-1980s little more than matched retirements, though capacity increased by nearly one third and output increased 60% thanks to the added capacity and improved load factors. The share of nuclear in world electricity from the mid-1980s was fairly constant at 16-17%. Many reactor orders from the 1970s were cancelled. The uranium price dropped accordingly, and also because of an increase in secondary supplies. Oil companies which had entered the uranium field bailed out, and there was a consolidation of uranium producers.
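(A consistency check, my arithmetic rather than the source's: a capacity rise of about one third combined with a roughly 20% relative gain in load factors gives 1.33 × 1.2 ≈ 1.6, matching the reported 60% output growth.)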
However, by the late 1990s the first of the third-generation reactors was commissioned - Kashiwazaki-Kariwa 6 - a 1350 MWe Advanced BWR, in Japan. This was a sign of the recovery to come.

In the new century several factors have combined to revive the prospects for nuclear power. The first is realisation of the scale of projected increases in electricity demand worldwide, particularly in rapidly-developing countries; the second is awareness of the importance of energy security; and the third is the need to limit carbon emissions due to concern about global warming.
These factors coincide with the availability of a new generation of nuclear power reactors, and in 2004 the first of the late third-generation units was ordered for Finland - a 1600 MWe European PWR (EPR). A similar unit is planned for France as the first of a full fleet replacement there. In the USA the 2005 Energy Policy Act provided incentives for establishing new-generation power reactors there.

Link to the DOE history of nuclear energy
