
Tuesday, September 8, 2009

Nuclear reactor







The invention:

The first nuclear reactor to produce substantial quantities of plutonium, making it practical to produce usable amounts of energy from a chain reaction.



The people behind the invention:

Enrico Fermi (1901-1954), an Italian-born American physicist
Martin D. Whitaker (1902-1960), the first director of Oak Ridge National Laboratory
Eugene Paul Wigner (1902-1995), the director of research and development at Oak Ridge









The Technology to End a War



The construction of the nuclear reactor at Oak Ridge National Laboratory in 1943 was a vital part of the Manhattan Project, the effort by the United States during World War II (1939-1945) to develop an atomic bomb. The successful operation of that reactor was a major achievement not only for the project itself but also for the general development and application of nuclear technology. The first director of the Oak Ridge National Laboratory was Martin D. Whitaker; the director of research and development was Eugene Paul Wigner.

The nucleus of an atom is made up of protons and neutrons. “Fission” is the process by which the nucleus of certain elements is split in two by a neutron from some material that emits an occasional neutron naturally. When an atom splits, two things happen: a tremendous amount of thermal energy is released, and two or three neutrons, on average, escape from the nucleus. If all the atoms in a kilogram of uranium 235 were to fission, they would produce as much heat energy as the burning of 3 million kilograms of coal. The neutrons that are released are important, because if at least one of them hits another atom and causes it to fission (and thus to release more energy and more neutrons), the process will continue. It becomes a self-sustaining chain reaction that produces a continuing supply of heat.
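
As a rough check of the coal comparison above, the following sketch works out the arithmetic, assuming about 200 MeV released per fission and a typical coal heating value of roughly 24 MJ/kg (standard textbook figures, not values given in this article):

```python
# Rough check of the "1 kg of U-235 ~ 3 million kg of coal" comparison.
# Assumed values: ~200 MeV per fission, coal heating value ~24 MJ/kg.
AVOGADRO = 6.022e23          # atoms per mole
MOLAR_MASS_U235 = 235.0      # grams per mole
MEV_PER_FISSION = 200.0      # typical energy released per fission
JOULES_PER_MEV = 1.602e-13   # conversion factor
COAL_ENERGY_J_PER_KG = 24e6  # typical heating value of coal

atoms_per_kg = 1000.0 / MOLAR_MASS_U235 * AVOGADRO
energy_joules = atoms_per_kg * MEV_PER_FISSION * JOULES_PER_MEV
coal_equivalent_kg = energy_joules / COAL_ENERGY_J_PER_KG

print(f"Energy from fissioning 1 kg of U-235: {energy_joules:.2e} J")
print(f"Equivalent mass of coal burned:       {coal_equivalent_kg:.2e} kg")
# Prints roughly 8e13 J, or about 3.4 million kg of coal.
```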

Inside a reactor, a nuclear chain reaction is controlled so that it proceeds relatively slowly. The most familiar use for the heat thus released is to boil water and make steam to turn the turbine generators that produce electricity to serve industrial, commercial, and residential needs. The fissioning process in a weapon, however, proceeds very rapidly, so that all the energy in the atoms is produced and released virtually at once. The first application of nuclear technology, which used a rapid chain reaction, was to produce the two atomic bombs that ended World War II.





Breeding Bomb Fuel



The work that began at Oak Ridge in 1943 was made possible by a major event that took place in 1942. At the University of Chicago, Enrico Fermi had demonstrated for the first time that it was possible to achieve a self-sustaining atomic chain reaction. More important, the reaction could be controlled: it could be started up, it could generate heat and sufficient neutrons to keep itself going, and it could be turned off. That first chain reaction was very slow, and it generated very little heat, but it demonstrated that controlled fission was possible.

Any heat-producing nuclear reaction is an energy conversion process that requires fuel. There is only one readily fissionable element that occurs naturally and can be used as fuel: a form of uranium called uranium 235. It makes up less than 1 percent of all naturally occurring uranium. The remainder is uranium 238, which does not fission readily. Natural uranium, however, must be enriched before it can be used as fuel.

The process of enrichment increases the concentration of uranium 235 sufficiently for a chain reaction to occur. Enriched uranium is used to fuel the reactors used by electric utilities. Also, the much more plentiful uranium 238 can be converted into plutonium 239, a form of the human-made element plutonium, which does fission readily. That conversion process is the way fuel is produced for a nuclear weapon. Therefore, the major objective of the Oak Ridge effort was to develop a pilot operation for separating plutonium from the uranium in which it was produced. Large-scale plutonium production, which had never been attempted before, eventually would be done at the Hanford Engineer Works in Washington. First, however, plutonium had to be produced successfully on a small scale at Oak Ridge.
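
The conversion of uranium 238 into plutonium 239 mentioned above actually runs through two short-lived intermediate isotopes. The sketch below simply tabulates that chain; the half-lives are standard nuclear-data values added here for illustration and do not appear in the article:

```python
# Illustrative summary of how U-238 becomes Pu-239 inside a reactor.
# Half-lives are standard nuclear-data values (approximate).
breeding_chain = [
    ("U-238 + neutron -> U-239", "neutron capture", None),
    ("U-239 -> Np-239 + beta",   "beta decay", "half-life ~23.5 minutes"),
    ("Np-239 -> Pu-239 + beta",  "beta decay", "half-life ~2.36 days"),
]

for reaction, kind, half_life in breeding_chain:
    note = f" ({half_life})" if half_life else ""
    print(f"{reaction:28s} {kind}{note}")
```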

The reactor was started up on November 4, 1943. By March 1, 1944, the Oak Ridge laboratory had produced several grams of plutonium. The material was sent to the Los Alamos laboratory in New Mexico for testing. By July, 1944, the reactor operated at four times its original power level. By the end of that year, however, plutonium production at Oak Ridge had ceased, and the reactor thereafter was used principally to produce radioisotopes for physical and biological research and for medical treatment. Ultimately, the Hanford Engineer Works’ reactors produced the plutonium for the bomb that was dropped on Nagasaki, Japan, on August 9, 1945.

The original objectives for which Oak Ridge had been built had been achieved, and subsequent activity at the facility was directed toward peacetime missions that included basic studies of the structure of matter.



Impact



The most immediate impact of the work done at Oak Ridge was its contribution to ending World War II. When the atomic bombs were dropped, the war ended, and the United States emerged intact. The immediate and long-range devastation to the people of Japan, however, opened the public’s eyes to the almost unimaginable death and destruction that could be caused by a nuclear war. Fears of such a war remain to this day, especially as more and more nations develop the technology to build nuclear weapons.

On the other hand, great contributions to human civilization have resulted from the development of nuclear energy. Electric power generation, nuclear medicine, spacecraft power, and ship propulsion have all profited from the pioneering efforts at the Oak Ridge National Laboratory. Currently, the primary use of nuclear energy is to produce electric power. Handled properly, nuclear energy may help to solve the pollution problems caused by the burning of fossil fuels.



See also: Breeder reactor; Compressed-air-accumulating power plant; Fuel cell; Geothermal power; Heat pump; Nuclear power plant; Solar thermal engine.




















Nuclear power plant







The invention:

The first full-scale commercial nuclear power plant, which gave birth to the nuclear power industry.







The people behind the invention:

Enrico Fermi (1901-1954), an Italian American physicist who won the 1938 Nobel Prize in Physics
Otto Hahn (1879-1968), a German physical chemist who won the 1944 Nobel Prize in Chemistry
Lise Meitner (1878-1968), an Austrian-Swedish physicist
Hyman G. Rickover (1898-1986), a Polish American naval officer









Discovering Fission



Nuclear fission involves the splitting of an atomic nucleus, leading to the release of large amounts of energy. Nuclear fission was discovered in Germany in 1938 by Otto Hahn after he had bombarded uranium with neutrons and observed traces of radioactive barium. When Hahn’s former associate, Lise Meitner, heard of this, she realized that the neutrons may have split the uranium nuclei (each of which holds 92 protons) into two smaller nuclei to produce barium (56 protons) and krypton (36 protons). Meitner and her nephew, Otto Robert Frisch, were able to calculate the enormous energy that would be released in this type of reaction. They published their results early in 1939.
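
Their estimate can be reproduced from Einstein's E = mc². The sketch below assumes, as Frisch later recounted, that the fission fragments are lighter than the original uranium nucleus by roughly one-fifth of a proton mass; that assumption is background physics rather than something stated in this article:

```python
# Rough reconstruction of the Meitner-Frisch energy estimate (E = m * c^2).
# Assumption: the fission fragments are lighter than the original uranium
# nucleus by roughly one-fifth of a proton mass.
PROTON_MASS_KG = 1.6726e-27
SPEED_OF_LIGHT = 2.998e8        # meters per second
JOULES_PER_MEV = 1.602e-13

mass_lost_kg = PROTON_MASS_KG / 5.0
energy_joules = mass_lost_kg * SPEED_OF_LIGHT**2
energy_mev = energy_joules / JOULES_PER_MEV

print(f"Energy released per fission: ~{energy_mev:.0f} MeV")
# Prints about 190 MeV, in line with the ~200 MeV usually quoted per fission.
```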

Nuclear fission was quickly verified in several laboratories, and the Danish physicist Niels Bohr soon demonstrated that the rare uranium 235 (U-235) isotope is much more likely to fission than the common uranium 238 (U-238) isotope, which makes up 99.3 percent of natural uranium. It was also recognized that fission would produce additional neutrons that could cause new fissions, producing even more neutrons and thus creating a self-sustaining chain reaction. In this process, the fissioning of one gram of U-235 releases about as much energy as the burning of three tons of coal.

The first controlled chain reaction was demonstrated on December 2, 1942, in a nuclear reactor at the University of Chicago, under the leadership of Enrico Fermi. He used a graphite moderator to slow the neutrons by collisions with carbon atoms. “Critical mass” was achieved when the mass of graphite and uranium assembled was large enough that the number of neutrons not escaping from the pile was sufficient to sustain a U-235 chain reaction. Cadmium control rods could be inserted to absorb neutrons and slow the reaction.
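
The logic of a controlled chain reaction can be illustrated with a toy generation-by-generation model. The multiplication factors used below (neutrons produced per neutron in the previous generation) are illustrative numbers, not figures from the article:

```python
# Toy model of chain-reaction control: each generation of neutrons produces
# k times as many neutrons as the one before. k > 1 grows, k = 1 holds steady
# (critical), and inserting absorbing control rods pushes k below 1.
def run_generations(neutrons, k, generations):
    history = [neutrons]
    for _ in range(generations):
        neutrons *= k
        history.append(neutrons)
    return history

# Illustrative values only.
uncontrolled = run_generations(neutrons=1000, k=1.05, generations=10)
rods_inserted = run_generations(neutrons=1000, k=0.95, generations=10)

print("k = 1.05 (supercritical):", [round(n) for n in uncontrolled])
print("k = 0.95 (subcritical): ", [round(n) for n in rods_inserted])
```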

It was also recognized that the U-238 in the reactor would absorb some of the neutrons to produce the new element plutonium, which is also fissionable. During World War II (1939-1945), large reactors were built to “breed” plutonium, which was easier to separate than U-235. An experimental breeder reactor at Arco, Idaho, was the first to use the energy of nuclear fission to produce a small amount of electricity (about 100 watts) on December 20, 1951.





Nuclear Electricity



Power reactors designed to produce substantial amounts of electricity use the heat generated by fission to produce steam or hot gas to drive a turbine connected to an ordinary electric generator. The first power reactor design to be developed in the United States was the pressurized water reactor (PWR). In the PWR, water under high pressure is used both as the moderator and as the coolant. After circulating through the reactor core, the hot pressurized water flows through a heat exchanger to produce steam. Reactors moderated by “heavy water” (in which the hydrogen in the water is replaced with deuterium, which contains an extra neutron) can operate with natural uranium.
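
As a rough illustration of the energy flow in such a plant, the sketch below converts a core's thermal output into electrical output using an assumed steam-cycle efficiency; both numbers are placeholders chosen for illustration, not data from the article:

```python
# Illustrative PWR energy flow: fission heat -> pressurized primary loop ->
# heat exchanger (steam generator) -> steam turbine -> electric generator.
def electric_output_mw(thermal_mw, cycle_efficiency):
    """Electric power produced for a given core thermal power."""
    return thermal_mw * cycle_efficiency

# Placeholder figures: a core producing 230 MW of heat and a steam cycle
# converting roughly a quarter of that heat into electricity.
thermal_mw = 230.0
efficiency = 0.26
print(f"{electric_output_mw(thermal_mw, efficiency):.0f} MW of electric power")
# Prints about 60 MW, the order of magnitude of early plants such as Shippingport.
```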

The pressurized water system was used in the first reactor to produce substantial amounts of power, the experimental Mark I reactor. It was started up on May 31, 1953, at the Idaho National Engineering Laboratory. The Mark I became the prototype for the reactor used in the first nuclear-powered submarine. Under the leadership of Hyman G. Rickover, who was head of the Division of Naval Reactors of the Atomic Energy Commission (AEC), Westinghouse Electric Corporation was engaged to build a PWR system to power the submarine USS Nautilus. It began sea trials in January of 1955 and ran for two years before refueling.

In the meantime, the first experimental nuclear power plant for generating electricity was completed in the Soviet Union in June of 1954, under the direction of the Soviet physicist Igor Kurchatov. It produced 5 megawatts of electric power. The first full-scale nuclear power plant was built in England under the direction of the British nuclear engineer Sir Christopher Hinton. It began producing about 90 megawatts of electric power in October, 1956.

On December 2, 1957, on the fifteenth anniversary of the first controlled nuclear chain reaction, the Shippingport Atomic Power Station in Shippingport, Pennsylvania, became the first full-scale commercial nuclear power plant in the United States. It produced about 60 megawatts of electric power for the Duquesne Light Company until 1964, when its reactor core was replaced, increasing its power to 100 megawatts with a maximum capacity of 150 megawatts.





Consequences



The opening of the Shippingport Atomic Power Station marked the beginning of the nuclear power industry in the United States, with all of its glowing promise and eventual problems. It was predicted that electrical energy would become too cheap to meter. The AEC hoped to encourage the participation of industry, with government support limited to research and development, and it encouraged a variety of reactor types in the hope of extending technical knowledge.

The Dresden Nuclear Power Station, completed by Commonwealth Edison in September, 1959, at Morris, Illinois, near Chicago, was the first full-scale privately financed nuclear power station in the United States. By 1973, forty-two plants were in operation producing 26,000 megawatts, fifty more were under construction, and about one hundred were on order. Industry officials predicted that 50 percent of the nation’s electric power would be nuclear by the end of the twentieth century.

The promise of nuclear energy has not been completely fulfilled. Growing concerns about safety and waste disposal have led to increased efforts to delay or block the construction of new plants. The cost of nuclear plants rose as legal delays and inflation pushed costs higher, so that many plants in the planning stages could no longer be competitive. The 1979 Three Mile Island accident in Pennsylvania and the much more serious 1986 Chernobyl accident in the Soviet Union increased concerns about the safety of nuclear power. Nevertheless, by 1986, more than one hundred nuclear power plants were operating in the United States, producing about 60,000 megawatts of power. More than three hundred reactors in twenty-five countries provided about 200,000 megawatts of electric power worldwide.

Many believe that, properly controlled, nuclear energy offers a clean-energy solution to the problem of environmental pollution.





See also: Breeder reactor; Compressed-air-accumulating power plant; Fuel cell; Geothermal power; Nuclear reactor; Solar thermal engine.





















Friday, September 4, 2009

Nuclear magnetic resonance

The invention:

Procedure that uses hydrogen atoms in the human body, strong electromagnets, radio waves, and detection equipment to produce images of sections of the brain.

The people behind the invention:

Raymond Damadian (1936- ), an American physicist and inventor
Paul C. Lauterbur (1929- ), an American chemist
Peter Mansfield (1933- ), a scientist at the University of Nottingham, England

Peering into the Brain

Doctors have always wanted the ability to look into the skull and see the human brain without harming the patient who is being examined. Over the years, various attempts were made to achieve this ability. At one time, the use of X rays, which were first used by Wilhelm Conrad Röntgen in 1895, seemed to be an option, but it was found that X rays are absorbed by bone, so the skull made it impossible to use X-ray technology to view the brain. The relatively recent use of computed tomography (CT) scanning, a computer-assisted imaging technology, made it possible to view sections of the head and other areas of the body, but the technique requires that the part of the body being “imaged,” or viewed, be subjected to a small amount of radiation, thereby putting the patient at risk. Positron emission tomography (PET) could also be used, but it requires that small amounts of radioactive material be injected into the patient, which also puts the patient at risk.

Since the early 1940’s, however, a new technology had been developing. This technology, which appears to pose no risk to patients, is called “nuclear magnetic resonance spectroscopy.” It was first used to study the molecular structures of pure samples of chemicals. This method developed until it could be used to follow one chemical as it changed into another, and then another, in a living cell. By 1971, Raymond Damadian had proposed that body images that were more vivid and more useful than X rays could be produced by means of nuclear magnetic resonance spectroscopy. In 1978, he founded his own company, FONAR, which manufactured the scanners that are necessary for the technique.

Magnetic Resonance Images

The first nuclear magnetic resonance images (MRIs) were published by Paul Lauterbur in 1973. Although there seemed to be no possibility that MRI could be harmful to patients, everyone involved in MRI research was very cautious. In 1976, Peter Mansfield, at the University of Nottingham, England, obtained an MRI of his partner’s finger. The next year, Paul Bottomley, a member of Waldo Hinshaw’s research group at the same university, put his left wrist into an experimental machine that the group had developed. A vivid cross section that showed layers of skin, muscle, bone, muscle, and skin, in that order, appeared on the machine’s monitor. Studies with animals showed no apparent memory or other brain problems. In 1978, Electrical and Musical Industries (EMI), a British corporate pioneer in electronics that merged with Thorn in 1980, obtained the first MRI of the human head. It took six minutes.

An MRI of the brain, or any other part of the body, is made possible by the water content of the body. The gray matter of the brain contains more water than the white matter does. The blood vessels and the blood itself also have water contents that are different from those of other parts of the brain. Therefore, the different structures and areas of the brain can be seen clearly in an MRI. Bone contains very little water, so it does not appear on the monitor. This is why the skull and the backbone cause no interference when the brain or the spinal cord is viewed.

Every water molecule contains two hydrogen atoms and one oxygen atom. A strong electromagnetic field causes the hydrogen nuclei to line up like marchers in a parade. Radio waves can be used to change the orientation of these aligned nuclei. When the radio waves are discontinued, a small radio signal is produced as the nuclei return to their marching position. This distinct radio signal is the basis for the production of the image on a computer screen.
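
The radio frequencies involved follow from the Larmor relation, which ties the proton's resonance frequency to the strength of the magnetic field. The sketch below uses the standard gyromagnetic ratio for hydrogen, about 42.58 MHz per tesla; the magnet strengths shown are illustrative, not values from the article:

```python
# Larmor relation for the hydrogen proton: resonance frequency = gamma * B,
# where gamma is about 42.58 MHz per tesla for hydrogen nuclei.
GAMMA_HYDROGEN_MHZ_PER_T = 42.58

def larmor_frequency_mhz(field_tesla):
    """Radio frequency (MHz) at which protons resonate in a given field."""
    return GAMMA_HYDROGEN_MHZ_PER_T * field_tesla

# Illustrative magnet strengths.
for field in (0.5, 1.5, 3.0):
    print(f"{field:.1f} T magnet -> {larmor_frequency_mhz(field):.1f} MHz radio waves")
```
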
Hydrogen was selected for use in MRI work because it is very abundant in the human body, it is part of the water molecule, and it has the proper magnetic qualities. The nucleus of the hydrogen atom consists of a single proton, a particle with a positive charge. The signal from the hydrogen’s proton is comparatively strong.

There are several methods by which the radio signal from the hydrogen atom can be converted into an image. Each method uses a computer to create first a two-dimensional, then a three-dimensional, image. Peter Mansfield’s team at the University of Nottingham holds the patent for the slice-selection technique that makes it possible to excite and image selectively a specific cross section of the brain or any other part of the body. This is the key patent in MRI technology. Damadian was granted a patent that described the use of two coils, one to drive and one to pick up signals across selected portions of the human body. EMI, the company that introduced the X-ray scanner for CT images, developed a commercial prototype for the MRI. The British Technology Group, a state-owned company that helps to bring innovations to the marketplace, has sixteen separate MRI-related patents. Ten years after EMI produced the first image of the human brain, patents and royalties were still being sorted out.

Consequences

MRI technology has revolutionized medical diagnosis, especially in regard to the brain and the spinal cord. For example, in multiple sclerosis, the loss of the covering on nerve cells can be detected. Tumors can be identified accurately. The painless and noninvasive use of MRI has almost completely replaced the myelogram, which involves using a needle to inject dye into the spine. Although there is every indication that the use of MRI is very safe, there are some people who cannot benefit from this valuable tool. Those whose bodies contain metal cannot be placed into the MRI machine. No one instrument can meet everyone’s needs.

The development of MRI stands as an example of the interaction of achievements in various fields of science. Fundamental physics, biochemistry, physiology, electronic image reconstruction, advances in superconducting wires, the development of computers, and advancements in anatomy all contributed to the development of MRI. Its development is also the result of international efforts. Scientists and laboratories in England and the United States pioneered the technology, but contributions were also made by scientists in France, Switzerland, and Scotland. This kind of interaction and cooperation can only lead to greater understanding of the human brain.