
Tuesday, September 8, 2009

Nuclear reactor







The invention:
The first nuclear reactor to produce substantial quantities of plutonium, making it practical to produce usable amounts of energy from a chain reaction.



The people behind the invention:
Enrico Fermi (1901-1954), an American physicist
Martin D. Whitaker (1902-1960), the first director of Oak Ridge National Laboratory
Eugene Paul Wigner (1902-1995), the director of research and development at Oak Ridge









The Technology to End a War



The construction of the nuclear reactor at Oak Ridge National Laboratory in 1943 was a vital part of the Manhattan Project, the effort by the United States during World War II (1939-1945) to develop an atomic bomb. The successful operation of that reactor was a major achievement not only for the project itself but also for the general development and application of nuclear technology. The first director of the Oak Ridge National Laboratory was Martin D. Whitaker; the director of research and development was Eugene Paul Wigner.

The nucleus of an atom is made up of protons and neutrons. “Fission” is the process by which the nucleus of certain elements is split in two by a neutron from some material that emits an occasional neutron naturally. When an atom splits, two things happen: A tremendous amount of thermal energy is released, and two or three neutrons, on average, escape from the nucleus. If all the atoms in a kilogram of “uranium 235” were to fission, they would produce as much heat energy as the burning of 3 million kilograms of coal. The neutrons that are released are important, because if at least one of them hits another atom and causes it to fission (and thus to release more energy and more neutrons), the process will continue. It will become a self-sustaining chain reaction that will produce a continuing supply of heat.
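
As a rough check of that comparison, the sketch below estimates the heat released by fissioning one kilogram of uranium 235, using the standard textbook figure of about 200 MeV per fission and an assumed heating value for coal of roughly 29 MJ/kg (neither number comes from this article). The exact ratio depends on the coal, but it lands near the 3 million figure quoted above.

    # Rough estimate: heat from fissioning 1 kg of U-235 versus burning coal.
    # The 200 MeV/fission and 29 MJ/kg coal figures are typical textbook values,
    # not numbers taken from the article.
    AVOGADRO = 6.022e23                        # atoms per mole
    MOLAR_MASS_U235 = 0.235                    # kg per mole
    ENERGY_PER_FISSION_J = 200e6 * 1.602e-19   # 200 MeV expressed in joules
    COAL_HEAT_J_PER_KG = 29e6                  # approximate heating value of coal

    atoms_per_kg = AVOGADRO / MOLAR_MASS_U235
    fission_energy_j = atoms_per_kg * ENERGY_PER_FISSION_J
    equivalent_coal_kg = fission_energy_j / COAL_HEAT_J_PER_KG

    print(f"Energy from 1 kg of U-235: {fission_energy_j:.2e} J")
    print(f"Equivalent coal: {equivalent_coal_kg:.2e} kg")   # on the order of 3 million kg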

Inside a reactor, a nuclear chain reaction is controlled so that it proceeds relatively slowly. The most familiar use for the heat thus released is to boil water and make steam to turn the turbine generators that produce electricity to serve industrial, commercial, and residential needs. The fissioning process in a weapon, however, proceeds very rapidly, so that all the energy in the atoms is produced and released virtually at once. The first application of nuclear technology, which used a rapid chain reaction, was to produce the two atomic bombs that ended World War II.
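
The difference between a controlled reaction and an explosive one comes down to how many neutrons from each fission go on to cause another fission, often called the multiplication factor k. The illustrative sketch below (the values of k are made up for illustration, not taken from the article) shows how the neutron population barely grows when k is just above 1 but runs away within a few dozen generations when k is near 2.

    # Illustrative neutron-population growth per generation for two values of the
    # multiplication factor k (values chosen for illustration only).
    def neutron_population(k: float, generations: int, start: int = 1) -> list[float]:
        """Return the neutron count after each generation for a constant k."""
        counts = [float(start)]
        for _ in range(generations):
            counts.append(counts[-1] * k)
        return counts

    controlled = neutron_population(k=1.001, generations=50)   # reactor-like: slow growth
    explosive = neutron_population(k=2.0, generations=50)      # weapon-like: runaway growth

    print(f"Controlled (k=1.001) after 50 generations: {controlled[-1]:.3f}")
    print(f"Explosive  (k=2.0)   after 50 generations: {explosive[-1]:.3e}")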





Breeding Bomb Fuel



The work that began at Oak Ridge in 1943 was made possible by a major event that took place in 1942. At the University of Chicago, Enrico Fermi had demonstrated for the first time that it was possible to achieve a self-sustaining atomic chain reaction. More important, the reaction could be controlled: It could be started up, it could generate heat and sufficient neutrons to keep itself going, and it could be turned off. That first chain reaction was very slow, and it generated very little heat; but it demonstrated that controlled fission was possible.

Any heat-producing nuclear reaction is an energy conversion process that requires fuel. There is only one readily fissionable material that occurs naturally and can be used as fuel: a form of uranium called uranium 235. It makes up less than 1 percent of all naturally occurring uranium. The remainder is uranium 238, which does not fission readily. Natural uranium, however, must be enriched in uranium 235 before it can be used as fuel.

The process of enrichment increases the concentration of uranium 235 sufficiently for a chain reaction to occur. Enriched uranium is used to fuel the reactors used by electric utilities. Also, the much more plentiful uranium 238 can be converted into plutonium 239, a form of the human-made element plutonium, which does fission readily. That conversion process is the way fuel is produced for a nuclear weapon. Therefore, the major objective of the Oak Ridge effort was to develop a pilot operation for separating plutonium from the uranium in which it was produced. Large-scale plutonium production, which had never been attempted before, eventually would be done at the Hanford Engineer Works in Washington. First, however, plutonium had to be produced successfully on a small scale at Oak Ridge.
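
For a sense of what "increasing the concentration" involves, the sketch below applies the standard enrichment mass balance (total mass in equals total mass out, and likewise for the uranium 235 content) to hypothetical assay values; the specific percentages are illustrative assumptions, not figures from the article.

    # Standard two-equation enrichment mass balance:
    #   F = P + T                 (total mass: feed = product + tails)
    #   F*xf = P*xp + T*xt        (uranium 235 mass balance)
    # Solving for the feed F needed to make P kilograms of product.
    # All assay values below are illustrative assumptions.
    def feed_required(product_kg: float, xf: float, xp: float, xt: float) -> float:
        """Natural-uranium feed (kg) needed for product_kg of enriched uranium."""
        return product_kg * (xp - xt) / (xf - xt)

    xf = 0.0072   # uranium 235 fraction in natural uranium (about 0.72%)
    xp = 0.035    # hypothetical reactor-grade product assay (3.5%)
    xt = 0.0025   # hypothetical tails assay (0.25%)

    feed = feed_required(product_kg=1.0, xf=xf, xp=xp, xt=xt)
    print(f"Feed needed per kg of 3.5% enriched uranium: about {feed:.1f} kg")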

The reactor was started up on November 4, 1943. By March 1, 1944, the Oak Ridge laboratory had produced several grams of plutonium. The material was sent to the Los Alamos laboratory in New Mexico for testing. By July 1944, the reactor operated at four times its original power level. By the end of that year, however, plutonium production at Oak Ridge had ceased, and the reactor thereafter was used principally to produce radioisotopes for physical and biological research and for medical treatment. Ultimately, the Hanford Engineer Works’ reactors produced the plutonium for the bomb that was dropped on Nagasaki, Japan, on August 9, 1945.

The original objectives for which Oak Ridge had been built had been achieved, and subsequent activity at the facility was directed toward peacetime missions that included basic studies of the structure of matter.



Impact



The most immediate impact of the work done at Oak Ridge was its contribution to ending World War II. When the atomic bombs were dropped, the war ended, and the United States emerged intact. The immediate and long-range devastation to the people of Japan, however, opened the public’s eyes to the almost unimaginable death and destruction that could be caused by a nuclear war. Fears of such a war remain to this day, especially as more and more nations develop the technology to build nuclear weapons.

On the other hand, great contributions to human civilization have resulted from the development of nuclear energy. Electric power generation, nuclear medicine, spacecraft power, and ship propulsion have all profited from the pioneering efforts at the Oak Ridge National Laboratory. Currently, the primary use of nuclear energy is to produce electric power. Handled properly, nuclear energy may help to solve the pollution problems caused by the burning of fossil fuels.



See also: Breeder reactor; Compressed-air-accumulating power plant; Fuel cell; Geothermal power; Heat pump; Nuclear power plant; Solar thermal engine.




















Nuclear power plant







The invention:
The first full-scale commercial nuclear power plant, which gave birth to the nuclear power industry.







The people behind the invention:
Enrico Fermi (1901-1954), an Italian American physicist who won the 1938 Nobel Prize in Physics
Otto Hahn (1879-1968), a German physical chemist who won the 1944 Nobel Prize in Chemistry
Lise Meitner (1878-1968), an Austrian-Swedish physicist
Hyman G. Rickover (1898-1986), a Polish American naval officer









Discovering Fission



Nuclear fission involves the splitting of an atomic nucleus, leading to the release of large amounts of energy. Nuclear fission was discovered in Germany in 1938 by Otto Hahn after he had bombarded uranium with neutrons and observed traces of radioactive barium. When Hahn’s former associate, Lise Meitner, heard of this, she realized that the neutrons may have split the uranium nuclei (each of which holds 92 protons) into two smaller nuclei to produce barium (56 protons) and krypton (36 protons). Meitner and her nephew, Otto Robert Frisch, were able to calculate the enormous energy that would be released in this type of reaction. They published their results early in 1939.
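
An estimate of that energy can be reproduced with a short mass-defect calculation: the combined mass of the fission fragments and released neutrons is slightly less than that of the original uranium 235 nucleus plus the incoming neutron, and the missing mass appears as energy via E = mc². The sketch below uses one commonly cited fragment pair (barium 141 and krypton 92) with approximate atomic masses; it is an illustration of the method, not a claim about which fragments appear in any particular fission event.

    # Mass-defect estimate of the energy released in one fission event:
    #   n + U-235  ->  Ba-141 + Kr-92 + 3 n
    # Atomic masses below are approximate values in unified atomic mass units (u).
    U_235 = 235.04393
    NEUTRON = 1.00867
    BA_141 = 140.91441
    KR_92 = 91.92616
    MEV_PER_U = 931.494        # energy equivalent of 1 u in MeV

    mass_before = U_235 + NEUTRON
    mass_after = BA_141 + KR_92 + 3 * NEUTRON
    mass_defect = mass_before - mass_after

    energy_mev = mass_defect * MEV_PER_U
    print(f"Mass defect: {mass_defect:.5f} u")
    print(f"Energy released: about {energy_mev:.0f} MeV")   # prompt energy; total per fission is ~200 MeV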

Nuclear fission was quickly verified in several laboratories, and the Danish physicist Niels Bohr soon demonstrated that the rare uranium 235 (U-235) isotope is much more likely to fission than the common uranium 238 (U-238) isotope, which makes up 99.3 percent of natural uranium. It was also recognized that fission would produce additional neutrons that could cause new fissions, producing even more neutrons and thus creating a self-sustaining chain reaction. In this process, the fissioning of one gram of U-235 would release about as much energy as the burning of three tons of coal.

The first controlled chain reaction was demonstrated on December 2, 1942, in a nuclear reactor at the University of Chicago, under the leadership of Enrico Fermi. He used a graphite moderator to slow the neutrons by collisions with carbon atoms. “Critical mass” was achieved when the mass of graphite and uranium assembled was large enough that the number of neutrons not escaping from the pile would be sufficient to sustain a U-235 chain reaction. Cadmium control rods could be inserted to absorb neutrons and slow the reaction.
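
One way to see how the control rods work is through a simple neutron balance: criticality requires that, on average, each fission leads to exactly one further fission after leakage and absorption are accounted for. The sketch below uses entirely illustrative probabilities (not measured data for the Chicago pile) to show how the extra absorption from inserted cadmium rods pushes the effective multiplication factor below 1.

    # Toy neutron balance for a reactor: each fission releases about 2.4 neutrons,
    # and only a fraction of them survive leakage and absorption to cause another
    # fission. All probabilities here are illustrative, not measured pile data.
    NEUTRONS_PER_FISSION = 2.4

    def k_effective(non_leakage: float, absorbed_in_fuel: float,
                    causes_fission: float, rod_absorption: float) -> float:
        """Effective multiplication factor for a given control-rod absorption."""
        surviving = non_leakage * (1.0 - rod_absorption)
        return NEUTRONS_PER_FISSION * surviving * absorbed_in_fuel * causes_fission

    # Rods withdrawn: slightly supercritical, so the reaction grows slowly.
    print(k_effective(non_leakage=0.90, absorbed_in_fuel=0.60,
                      causes_fission=0.78, rod_absorption=0.00))   # about 1.01
    # Rods inserted: the extra absorption makes the pile subcritical, shutting it down.
    print(k_effective(non_leakage=0.90, absorbed_in_fuel=0.60,
                      causes_fission=0.78, rod_absorption=0.10))   # about 0.91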

It was also recognized that the U-238 in the reactor would absorb some of the neutrons to produce the new element plutonium, which is also fissionable. During World War II (1939-1945), large reactors were built to “breed” plutonium, which was easier to separate than U-235. An experimental breeder reactor at Arco, Idaho, was the first to use the energy of nuclear fission to produce a small amount of electricity (about 100 watts) on December 20, 1951.
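
The conversion of U-238 into plutonium proceeds through a short decay chain: U-238 captures a neutron to become U-239, which beta-decays (half-life about 24 minutes) to neptunium 239, which in turn beta-decays (half-life about 2.4 days) to plutonium 239. The sketch below simply encodes that chain and estimates how long irradiated fuel must sit before most of the neptunium has become plutonium; the half-lives are standard published values, and the 99 percent threshold is an arbitrary illustration.

    import math

    # Breeding chain: U-238 + n -> U-239 -> (beta) Np-239 -> (beta) Pu-239
    # Approximate half-lives of the intermediate steps.
    HALF_LIFE_U239_MIN = 23.5          # minutes
    HALF_LIFE_NP239_DAYS = 2.36        # days

    def fraction_remaining(half_life: float, elapsed: float) -> float:
        """Fraction of a radioactive species still undecayed after `elapsed` time."""
        return math.exp(-math.log(2) * elapsed / half_life)

    two_hours_in_minutes = 120.0
    print(f"U-239 remaining after two hours: "
          f"{fraction_remaining(HALF_LIFE_U239_MIN, two_hours_in_minutes):.1%}")

    # How long until less than 1 percent of the Np-239 remains (i.e. >99% is Pu-239)?
    days = 0.0
    while fraction_remaining(HALF_LIFE_NP239_DAYS, days) > 0.01:
        days += 0.1
    print(f"Roughly {days:.1f} days for Np-239 to decay almost completely to Pu-239")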





Nuclear Electricity



Power reactors designed to produce substantial amounts of electricity use the heat generated by fission to produce steam or hot gas to drive a turbine connected to an ordinary electric generator. The first power reactor design to be developed in the United States was the pressurized water reactor (PWR). In the PWR, water under high pressure is used both as the moderator and as the coolant. After circulating through the reactor core, the hot pressurized water flows through a heat exchanger to produce steam. Reactors moderated by “heavy water” (in which the hydrogen in the water is replaced with deuterium, which contains an extra neutron) can operate with natural uranium.
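
The amount of coolant a PWR must circulate follows from a simple heat balance: the thermal power carried away equals the mass flow rate times the specific heat of the water times its temperature rise across the core. The numbers below (thermal power, temperatures, and specific heat at PWR conditions) are rough, assumed values chosen only to illustrate the calculation, not data for any plant named in this article.

    # Simple primary-loop heat balance for a pressurized water reactor:
    #   Q = m_dot * c_p * (T_out - T_in)
    # All numbers are rough, assumed values for illustration.
    THERMAL_POWER_W = 300e6      # assumed reactor thermal power: 300 MW
    CP_WATER = 5500.0            # J/(kg*K), approximate for water at PWR pressure and temperature
    T_IN_C = 290.0               # coolant temperature entering the core, deg C
    T_OUT_C = 320.0              # coolant temperature leaving the core, deg C

    mass_flow = THERMAL_POWER_W / (CP_WATER * (T_OUT_C - T_IN_C))
    print(f"Required primary coolant flow: about {mass_flow:.0f} kg/s")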

The pressurized water system was used in the first reactor to produce substantial amounts of power, the experimental Mark I reactor. It was started up on May 31, 1953, at the Idaho National Engineering Laboratory. The Mark I became the prototype for the reactor used in the first nuclear-powered submarine. Under the leadership of Hyman G. Rickover, who was head of the Division of Naval Reactors of the Atomic Energy Commission (AEC), Westinghouse Electric Corporation was engaged to build a PWR system to power the submarine USS Nautilus. It began sea trials in January of 1955 and ran for two years before refueling.

In the meantime, the first experimental nuclear power plant for generating electricity was completed in the Soviet Union in June of 1954, under the direction of the Soviet physicist Igor Kurchatov. It produced 5 megawatts of electric power. The first full-scale nuclear power plant was built in England under the direction of the British nuclear engineer Sir Christopher Hinton. It began producing about 90 megawatts of electric power in October 1956.

On December 2, 1957, the fifteenth anniversary of the first controlled nuclear chain reaction, the Shippingport Atomic Power Station in Shippingport, Pennsylvania, became the first full-scale commercial nuclear power plant in the United States. It produced about 60 megawatts of electric power for the Duquesne Light Company until 1964, when its reactor core was replaced, increasing its power to 100 megawatts with a maximum capacity of 150 megawatts.





Consequences



The opening of the Shippingport Atomic Power Station marked the beginning of the nuclear power industry in the United States, with all of its glowing promise and eventual problems. It was predicted that electrical energy would become too cheap to meter. The AEC hoped to encourage the participation of industry, with government support limited to research and development. It encouraged a variety of reactor types in the hope of extending technical knowledge.

The Dresden Nuclear Power Station, completed by Commonwealth Edison in September 1959 at Morris, Illinois, near Chicago, was the first full-scale privately financed nuclear power station in the United States. By 1973, forty-two plants were in operation producing 26,000 megawatts, fifty more were under construction, and about one hundred were on order. Industry officials predicted that 50 percent of the nation’s electric power would be nuclear by the end of the twentieth century.

The promise of nuclear energy has not been completely fulfilled. Growing concerns about safety and waste disposal have led to increased efforts to delay or block the construction of new plants. The cost of nuclear plants rose as legal delays and inflation pushed expenses higher, so that many plants still in the planning stages could no longer be competitive. The 1979 Three Mile Island accident in Pennsylvania and the much more serious 1986 Chernobyl accident in the Soviet Union increased concerns about the safety of nuclear power. Nevertheless, by 1986, more than one hundred nuclear power plants were operating in the United States, producing about 60,000 megawatts of power. More than three hundred reactors in twenty-five countries provide about 200,000 megawatts of electric power worldwide. Many believe that, properly controlled, nuclear energy offers a clean-energy solution to the problem of environmental pollution.





See also: Breeder reactor; Compressed-air-accumulating power plant; Fuel cell; Geothermal power; Nuclear reactor; Solar thermal engine.




Further Reading: