Friday, April 25, 2014

Thermal cracking process







The invention: 



Process that increased the yield of refined gasoline

extracted from raw petroleum by using heat to convert complex

hydrocarbons into simpler gasoline hydrocarbons, thereby making

possible the development of the modern petroleum industry.



The people behind the invention:



William M. Burton (1865-1954), an American chemist

Robert E. Humphreys (1942- ), an American chemist










Gasoline, Motor Vehicles, and Thermal Cracking



Gasoline is a liquid mixture of hydrocarbons (chemicals made up

of only hydrogen and carbon) that is used primarily as a fuel for internal

combustion engines. It is produced by petroleum refineries

that obtain it by processing petroleum (crude oil), a naturally occurring

mixture of thousands of hydrocarbons, the molecules of which

can contain from one to sixty carbon atoms.

Gasoline production begins with the “fractional distillation” of

crude oil in a fractionation tower, where it is heated to about 400 degrees

Celsius at the tower’s base. This heating vaporizes most of the

hydrocarbons that are present, and the vapor rises in the tower,

cooling as it does so. At various levels of the tower, various portions

(fractions) of the vapor containing simple hydrocarbon mixtures become

liquid again, are collected, and are piped out as “petroleum

fractions.” Gasoline, the petroleum fraction that boils between 30

and 190 degrees Celsius, is mostly a mixture of hydrocarbons that

contain five to twelve carbon atoms.
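The sorting logic of the fractionation tower can be sketched in a few lines of code. The Python sketch below simply files hydrocarbons into fractions by boiling point; only the 30-to-190-degree-Celsius gasoline cut comes from the text, while the other cut-offs and the sample boiling points are rough illustrative figures, not refinery data.

# Sort hydrocarbons into refinery fractions by boiling point, the way a
# fractionation tower does: each compound condenses at the level whose
# temperature range brackets its boiling point.
FRACTIONS = [
    ("refinery gas", -200, 30),
    ("gasoline", 30, 190),            # the 30-190 degree Celsius cut named above
    ("kerosene", 190, 280),           # illustrative cut-off
    ("heavy fuel and residue", 280, 600),
]

def classify(name, boiling_point_c):
    for fraction, low, high in FRACTIONS:
        if low <= boiling_point_c < high:
            return name, fraction
    return name, "residue"

# Pentane and octane fall in the gasoline range; hexadecane does not.
for name, bp in [("pentane", 36), ("octane", 126), ("hexadecane", 287)]:
    print(classify(name, bp))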

Only about 25 percent of petroleum will become gasoline via

fractional distillation. This amount of “straight run” gasoline is not

sufficient to meet the world’s needs. Therefore, numerous methods

have been developed to produce the needed amounts of gasoline.

The first such method, “thermal cracking,” was developed in 1913

by William M. Burton of Standard Oil of Indiana. Burton’s cracking

process used heat to convert complex hydrocarbons (whose molecules

contain many carbon atoms) into simpler gasoline hydrocarbons

(whose molecules contain fewer carbon atoms), thereby increasing

the yield of gasoline from petroleum. Later advances in

petroleum technology, including both an improved Burton method

and other methods, increased the gasoline yield still further.
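In outline, cracking splits one long hydrocarbon chain into a shorter alkane in the gasoline range plus an alkene. The balanced equation below is a textbook-style illustration of such a reaction, not a specific reaction reported by Burton's group:

\mathrm{C_{16}H_{34}} \;\xrightarrow{\text{heat, pressure}}\; \mathrm{C_{8}H_{18}} + \mathrm{C_{8}H_{16}}

that is, hexadecane yielding octane (a gasoline-range alkane) and octene (an alkene); the carbon and hydrogen atoms balance on both sides.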





More Gasoline!



Starting in about 1900, gasoline became important as a fuel for

the internal combustion engines of the new vehicles called automobiles.

By 1910, half a million automobiles traveled American roads.

Soon, the great demand for gasoline—which was destined to grow

and grow—required both the discovery of new crude oil fields

around the world and improved methods for refining the petroleum

mined from these new sources. Efforts were made to increase

the yield of gasoline—at that time, about 15 percent—from petroleum.

The Burton method was the first such method.

At the time that the cracking process was developed, Burton was

the general superintendent of the Whiting refinery, owned by Standard

Oil of Indiana. The Burton process was developed in collaboration

with Robert E. Humphreys and F. M. Rogers. This three-person

research group began work knowing that heating petroleum

fractions that contained hydrocarbons more complex than those

present in gasoline—a process called “coking”—produced kerosene,

coke (a form of carbon), and a small amount of gasoline. The

process needed to be improved substantially, however, before it

could be used commercially.

Initially, Burton and his coworkers used the “heavy fuel” fraction

of petroleum (the 66 percent of petroleum that boils at a temperature

higher than the boiling temperature of kerosene). Soon, they

found that it was better to use only the part of the material that contained

its smaller hydrocarbons (those containing fewer carbon atoms),

all of which were still much larger than those present in gasoline.

The cracking procedure attempted first involved passing the

starting material through a hot tube. This hot-tube treatment vaporized

the material and broke down 20 to 30 percent of the larger hydrocarbons

into the hydrocarbons found in gasoline. Various tarry

products were also produced, however, that reduced the quality of

the gasoline that was obtained in this way.



Next, the investigators attempted to work at a higher temperature

by bubbling the starting material through molten lead. More

gasoline was made in this way, but it was so contaminated with

gummy material that it could not be used. Continued investigation

showed, however, that moderate temperatures (between those used

in the hot-tube experiments and that of molten lead) produced the

best yield of useful gasoline.

The Burton group then had the idea of using high pressure to

“keep starting materials still.” Although the theoretical basis for the

use of high pressure was later shown to be incorrect, the new

method worked quite well. In 1913, the Burton method was patented

and put into use. The first cracked gasoline, called Motor

Spirit, was not very popular, because it was yellowish and had a

somewhat unpleasant odor. The addition of some minor refining

procedures, however, soon made cracked gasoline indistinguishable

from straight run gasoline. Standard Oil of Indiana made huge

profits from cracked gasoline over the next ten years. Ultimately,

thermal cracking subjected the petroleum fractions that were

utilized to temperatures between 550 and 750 degrees Celsius, under

pressures between 250 and 750 pounds per square inch.



Impact



In addition to using thermal cracking to make gasoline for sale,

Standard Oil of Indiana also profited by licensing the process for use

by other gasoline producers. Soon, the method was used throughout

the oil industry. By 1920, it had been perfected as much as it

could be, and the gasoline yield from petroleum had been significantly

increased. The disadvantages of thermal cracking include a

relatively low yield of gasoline (compared to those of other methods),

the waste of hydrocarbons in fractions converted to tar and

coke, and the relatively high cost of the process.

A partial solution to these problems was found in “catalytic

cracking”—the next logical step from the Burton method—in which

petroleum fractions to be cracked are mixed with a catalyst (a substance

that causes a chemical reaction to proceed more quickly,

without being consumed itself). The most common catalysts used in such

cracking were minerals called “zeolites.” The wide use of catalytic

cracking soon enabled gasoline producers to work at lower temperatures

(450 to 550 degrees Celsius) and pressures (10 to 50 pounds

per square inch). This use decreased manufacturing costs because

catalytic cracking required relatively little energy, produced only

small quantities of undesirable side products, and produced high quality

gasoline.

Various other methods of producing gasoline have been developed—

among them catalytic reforming, hydrocracking, alkylation,

and catalytic isomerization—and now about 60 percent of the petroleum

starting material can be turned into gasoline. These methods,

and others still to come, are expected to ensure that the world’s

needs for gasoline will continue to be satisfied—as long as petroleum

remains available.



See also: Fuel cell; Gas-electric car; Geothermal power; Internal

combustion engine; Oil-well drill bit; Solar thermal engine.


Thursday, April 3, 2014

Tevatron accelerator







The invention: 



A particle accelerator that generated collisions between

beams of protons and antiprotons at the highest energies

ever recorded.



The people behind the invention:



Robert Rathbun Wilson (1914- ), an American physicist and

director of Fermilab from 1967 to 1978

John Peoples (1933- ), an American physicist and deputy

director of Fermilab from 1987








Putting Supermagnets to Use



The Tevatron is a particle accelerator, a large electromagnetic device

used by high-energy physicists to generate subatomic particles

at sufficiently high energies to explore the basic structure of matter.

The Tevatron is a circular, tubelike track 6.4 kilometers in circumference

that employs a series of superconducting magnets to accelerate

beams of protons, the positively charged particles found in atomic nuclei, and

antiprotons, the proton’s negatively charged equivalent, at energies

up to 1 trillion electron volts (equal to 1 teraelectronvolt, or 1 TeV;

hence the name Tevatron). An electronvolt is the amount of energy that

an electron gains when accelerated through a potential difference of 1 volt.
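To give a sense of scale, the standard conversion factor (a physical constant, not a figure from the text) is 1 eV ≈ 1.602 × 10^{-19} joule, so

1\ \mathrm{TeV} = 10^{12}\ \mathrm{eV} \approx 1.6 \times 10^{-7}\ \mathrm{J},

roughly the kinetic energy of a flying mosquito, here carried by a single proton.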

The Tevatron is located at the Fermi National Accelerator Laboratory,

which is also known as Fermilab. The laboratory was one of

several built in the United States during the 1960’s.

The heart of the original Fermilab was the 6.4-kilometer main accelerator

ring. This main ring was capable of accelerating protons to

energies approaching 500 billion electron volts, or 0.5 teraelectronvolt.

The idea to build the Tevatron grew out of a concern for the

millions of dollars spent annually on electricity to power the main

ring, the need for higher energies to explore the inner depths of the

atom and the consequences of new theories of both matter and energy,

and the growth of superconductor technology. Planning for a

second accelerator ring, the Tevatron, to be installed beneath the

main ring began in 1972.

Robert Rathbun Wilson, the director of Fermilab at that time, realized

that the only way the laboratory could achieve the higher energies

needed for future experiments without incurring intolerable

electricity costs was to design a second accelerator ring that employed

magnets made of superconducting material. Extremely powerful

magnets are the heart of any particle accelerator; charged particles

such as protons are given a “push” as they pass through an electromagnetic

field. Each successive push along the path of the circular

accelerator track gives the particle more and more energy. The enormous

magnetic fields required to accelerate massive particles such

as protons to energies approaching 1 trillion electronvolts would require

electricity expenditures far beyond Fermilab’s operating budget.

Wilson estimated that using superconducting materials, however,

which have virtually no resistance to electrical current, would

make it possible for the Tevatron to achieve double the main ring’s

magnetic field strength, doubling energy output without significantly

increasing energy costs.





Tevatron to the Rescue



The Tevatron was conceived in three phases. Most important,

however, were Tevatron I and Tevatron II, where the highest energies

were to be generated and where it was hoped new experimental findings

would emerge. Tevatron II experiments were designed to be

very similar to other proton beam experiments, except that in this

case, the protons would be accelerated to an energy of 1 trillion

electron volts. More important still are the proton-antiproton colliding

beam experiments of Tevatron I. In this phase, beams of protons

and antiprotons rotating in opposite directions are caused to collide

in the Tevatron, producing a combined, or center-of-mass, energy

approaching 2 trillion electron volts, nearly three times the energy

achievable at the largest accelerator at Centre Européen de Recherche

Nucléaire (the European Center for Nuclear Research, or CERN).
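The advantage of the colliding-beam arrangement follows from standard relativistic kinematics; the formulas below are textbook relations, not taken from the text. For two identical beams of energy E colliding head on, the center-of-mass energy is

\sqrt{s} \approx 2E,

so two beams of a little under 1 TeV each give the nearly 2 TeV quoted above, and the 1.6-trillion-electronvolt collisions of 1985 would correspond to roughly 0.8 TeV per beam. Striking a stationary proton (rest energy mc^{2} \approx 0.938\ \mathrm{GeV}) with a 1 TeV beam gives only

\sqrt{s} \approx \sqrt{2E\,mc^{2}} \approx 43\ \mathrm{GeV},

which is why the colliding-beam geometry repaid the difficulty of accumulating antiprotons.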

John Peoples was faced with the problem of generating a beam of

antiprotons of sufficient intensity to collide efficiently with a beam

of protons. Knowing that he had the use of a large proton accelerator—

the old main ring—Peoples employed the two-ring mode in

which 120 billion electron volt protons from the main ring are aimed

at a fixed tungsten target, generating antiprotons, which scatter

from the target. The antiprotons were extracted and accumulated in a

smaller storage ring, where they were held at relatively

low energies. After sufficient numbers of antiprotons were

collected, they were injected into the Tevatron, along with a beam of

protons for the colliding beam experiments. On October 13, 1985,

Fermilab scientists reported a proton-antiproton collision with a

center-of-mass energy measured at 1.6 trillion electron volts, the

highest energy ever recorded.





Consequences



The Tevatron’s success at generating high-energy proton antiproton

collisions affected future plans for accelerator development

in the United States and offered the potential for important

discoveries in high-energy physics at energy levels that no other accelerator

could achieve.

Physics recognizes four forces in nature: the electromagnetic

force, the gravitational force, the strong nuclear force, and the weak

nuclear force. A major goal of the physics community is to formulate

a theory that will explain all these forces: the so-called grand

unification theory. In 1967, one of the first of the so-called gauge theories

was developed that unified the weak nuclear force and the

electromagnetic force. One consequence of this theory was that the

weak force was carried by massive particles known as “bosons.”

The search for three of these particles—the intermediate vector bosons

W+, W-, and Z0—led to the rush to conduct colliding beam experiments

in the early 1970’s. Because the Tevatron was in the planning

phase at this time, these particles were discovered by a team of

international scientists based in Europe. In 1989, Tevatron physicists

reported the most accurate measure to date of the Z0 mass.

The Tevatron is thought to be the only particle accelerator in the

world with sufficient power to conduct further searches for the elusive

Higgs boson, a particle proposed by University

of Edinburgh physicist Peter Higgs in order to account for

the large masses of the intermediate vector bosons. In addition, the

Tevatron has the ability to search for the so-called top quark. Quarks

are believed to be the constituent particles of protons and neutrons.

Evidence has been gathered of five of the six quarks believed to exist.

Physicists have yet to detect evidence of the most massive quark,

the top quark.



See also:



Atomic bomb; Cyclotron; Electron microscope; Field ion

microscope; Geiger counter; Hydrogen bomb; Mass spectrograph;

Neutrino detector; Scanning tunneling microscope; Synchrocyclotron.

Monday, February 10, 2014

Television







The invention:



System that converts moving pictures and sounds

into electronic signals that can be broadcast at great distances.



The people behind the invention:



Vladimir Zworykin (1889-1982), a Russian-born American electronic engineer and

recipient of the National Medal of Science in 1967

Paul Gottlieb Nipkow (1860-1940), a German engineer and

inventor

Alan A. Campbell Swinton (1863-1930), a Scottish engineer and

Fellow of the Royal Society

Charles F. Jenkins (1867-1934), an American physicist, engineer,

and inventor










The Persistence of Vision



In 1894, an American inventor, Charles F. Jenkins, described a

scheme for electrically transmitting moving pictures. Jenkins’s idea,

however, was only one in an already long tradition of theoretical

television systems. In 1842, for example, the Scottish inventor Alexander

Bain had invented an automatic copying telegraph for sending

still pictures. Bain’s system scanned images line by line. Similarly,

the wide recognition of the persistence of vision—the mind’s

ability to retain a visual image for a short period of time after the image

has been removed—led to experiments with systems in which

the image to be projected was repeatedly scanned line by line. Rapid

scanning of images became the underlying principle of all television

systems, both electromechanical and all-electronic.

In 1884, a German inventor, Paul Gottlieb Nipkow, patented a

complete television system that utilized a mechanical sequential

scanning system and a photoelectric cell sensitized with selenium

for transmission. The selenium photoelectric cell converted the light

values of the image being scanned into electrical impulses to be

transmitted to a receiver where the process would be reversed. The

electrical impulses led to light of varying brightnesses being produced

and projected on to a rotating disk that was scanned to reproduce

the original image. If the system—that is, the transmitter and

the receiver—were in perfect synchronization and if the disk rotated

quickly enough, persistence of vision enabled the viewer to

see a complete image rather than a series of moving points of light.
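The arithmetic behind such scanning is simple enough to sketch in code. In the Python lines below, thirty lines matches the low end of the line counts mentioned later in this article, while the frame rate of sixteen per second is an assumed figure, chosen only because it is fast enough to exploit persistence of vision.

# Scanning arithmetic for a mechanical (Nipkow-disk) system. The disk
# carries one hole per scan line, so one full rotation traces one frame.
lines_per_frame = 30        # low end of early mechanical systems
frames_per_second = 16      # assumed rate, fast enough for persistence of vision

disk_speed_rpm = frames_per_second * 60             # 960 revolutions per minute
line_rate = lines_per_frame * frames_per_second     # 480 line scans per second

print("disk speed:", disk_speed_rpm, "rpm")
print("line rate:", line_rate, "lines per second")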

For a television image to be projected onto a screen of reasonable

size and retain good quality and high resolution, any system employing

only thirty to one hundred lines (as early mechanical systems

did) is inadequate. A few systems were developed that utilized

two hundred or more lines, but the difficulties these presented

made the possibility of an all-electronic system increasingly attractive.

These difficulties were not generally recognized until the early

1930’s, when television began to move out of the laboratory and into

commercial production.

Interest in all-electronic television paralleled interest in mechanical

systems, but solutions to technical problems proved harder to

achieve. In 1908, a Scottish engineer, Alan A. Campbell Swinton,

proposed what was essentially an all-electronic television system.

Swinton theorized that the use of magnetically deflected cathode-ray

tubes for both the transmitter and receiver in a system was possible.

In 1911, Swinton formally presented his idea to the Röntgen

Society in London, but the technology available did not allow for

practical experiments.





Zworykin’s Picture Tube









In 1923, Vladimir Zworykin, a Russian-born electronic engineer working

for the Westinghouse Electric Corporation, filed a patent application

for the “iconoscope,” or television transmission tube. On

March 17, 1924, Zworykin applied for a patent for a two-way system.

The first cathode-ray tube receiver had a cathode, a modulating

grid, an anode, and a fluorescent screen.

Zworykin later admitted that the results were very poor and the

system, as shown, was still far removed from a practical television

system. Zworykin’s employers were so unimpressed that they admonished

him to forget television and work on something more

useful. Zworykin’s interest in television was thereafter confined to

his nonworking hours, as he spent the next year working on photographic

sound recording.

It was not until the late 1920’s that he was able to devote his full

attention to television. Ironically, Westinghouse had by then resumed

research in television, but Zworykin was not part of the

team. After he returned from a trip to France, where in 1928 he had

witnessed an exciting demonstration of an electrostatic tube, Westinghouse

indicated that it was not interested. This lack of corporate

support in Pittsburgh led Zworykin to approach the Radio Corporation

of America (RCA). According to reports, Zworykin demonstrated

his system to the Institute of Radio Engineers at Rochester,

New York, on November 18, 1929, claiming to have developed a

working picture tube, a tube that would revolutionize television development.

Finally, RCA recognized the potential.



Impact





The picture tube, or “kinescope,” developed by Zworykin changed

the history of television. Within a few years, mechanical systems

disappeared and television technology began to utilize systems

similar to Zworykin’s by use of cathode-ray tubes at both ends of

the system. At the transmitter, the image is focused upon a mosaic

screen composed of light-sensitive cells. A stream of electrons sweeps

the image, and each cell sends off an electric current pulse as it is hit

by the electrons, the light and shade of the focused image regulating

the amount of current.

This string of electrical impulses, after amplification and modification

into ultrahigh frequency wavelengths, is broadcast by antenna

to be picked up by any attuned receiver, where it is retransformed

into a moving picture in the cathode-ray tube receiver. The

cathode-ray tubes contain no moving parts, as the electron stream is

guided entirely by electric attraction.

Although both the iconoscope and the kinescope were far from

perfect when Zworykin initially demonstrated them, they set the

stage for all future television development.





Vladimir Zworykin



Born in 1889, Vladimir Kosma Zworykin grew up in Murom,

a small town two hundred miles east of Moscow. His father ran

a riverboat service, and Zworykin sometimes helped him, but

his mind was on electricity, which he studied on his own while

aboard his father’s boats. In 1906, he entered the St. Petersburg

Institute of Technology, and there he became acquainted with

the idea of television through the work of Professor Boris von

Rosing.

Zworykin assisted Rosing in his attempts to transmit pictures

with a cathode-ray tube. He served with the Russian Signal

Corps during World War I, but then fled to the United States

after the Bolshevik Revolution. In 1920 he got a job at Westinghouse’s

research laboratory in Pittsburgh, helping develop radio

tubes and photoelectric cells. He became an American citizen

in 1924 and completed a doctorate at the University of

Pittsburgh in 1926. By then he had already demonstrated his

iconoscope and applied for a patent. Unable to interest Westinghouse

in his invention, he moved to the Radio Corporation

of America (RCA) in 1929, and later became director of its electronics

research laboratory. RCA’s president, David Sarnoff,

also a Russian immigrant, had faith in Zworykin and his ideas.

Before Zworykin retired in 1954, RCA had invested $50 million

in television.

Among the many awards Zworykin received for his culture-changing

invention was the National Medal of Science, presented

by President Lyndon Johnson in 1966. Zworykin died on

his birthday in 1982.





See also: Color television; Community antenna television; Communications

satellite; Fiber-optics; FM radio; Holography; Internet;

Radio; Talking motion pictures.

Thursday, January 23, 2014

Telephone switching











The invention: 



The first completely automatic electronic system

for switching telephone calls.



The people behind the invention:



Almon B. Strowger (1839-1902), an American inventor

Charles Wilson Hoover, Jr. (1925- ), supervisor of memory

system development

Wallace Andrew Depp (1914- ), director of Electronic

Switching

Merton Brown Purvis (1923- ), designer of switching

matrices











Electromechanical Switching Systems



The introduction of electronic switching technology into the telephone

network was motivated by the desire to improve the quality

of the telephone system, add new features, and reduce the cost of

switching technology. Telephone switching systems have three features:

signaling, control, and switching functions. There were several

generations of telephone switching equipment before the first

fully electronic switching “office” (device) was designed.

The first automatic electromechanical (partly electrical and partly

mechanical) switching office was the Strowger step-by-step switch.

Strowger switches relied upon the dial pulses generated by rotary

dial telephones to move their switching elements to the proper positions

to connect one telephone with another. In the step-by-step process,

the first digit dialed moved the first mechanical switch into position,

the second digit moved the second mechanical switch into

position, and so forth, until the proper telephone connection was established.

These Strowger switching offices were quite large, and

they lacked flexibility and calling features.
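The digit-by-digit character of step-by-step switching can be made concrete with a short sketch. The Python toy model below is illustrative only: the stage and digit handling is invented for the example, and real Strowger offices also required line finders, trunk hunting, and release machinery.

# Toy model of Strowger step-by-step selection: each dialed digit moves
# the next selector stage into position, so the path through the office
# is built up digit by digit.
def step_by_step_connect(dialed_digits):
    path = []
    for stage, digit in enumerate(dialed_digits, start=1):
        # The rotary-dial pulses for this digit step selector `stage`
        # to outlet `digit`.
        path.append((stage, digit))
        print("selector", stage, "steps to position", digit)
    return path

# Dialing 4-7-2-5 builds a four-stage path through the switch train.
step_by_step_connect([4, 7, 2, 5])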

The second generation of automatic electromechanical telephone

switching offices was of the “crossbar” type. Initially, crossbar

switches relied upon a specialized electromechanical controller called

a “marker” to establish call connections. Electromechanical telephone

switching offices had difficulty implementing additional features

and were unable to handle large numbers of incoming calls.







Electronic Switching Systems



In the early 1940’s, research into the programmed control of

switching offices began at the American Telephone and Telegraph

Company’s Bell Labs. This early research resulted in a trial office being

put into service in Morris, Illinois, in 1960. The Morris switch

used a unique memory called the “flying spot store.” It used a photographic

plate as a program memory, and the memory was accessed

optically. In order to change the memory, one had to scratch

out or cover parts of the photographic plate.

Before the development of the Morris switch, gas tubes had been

used to establish voice connections. This was accomplished by applying

a voltage difference across the end points of the conversation.

When this voltage difference was applied, the gas tubes would

conduct electricity, thus establishing the voice connection. The Morris

trial showed that gas tubes could not support the voltages that

the new technology required to make telephones ring or to operate

pay telephones.

The knowledge gained from the Morris trial led to the development

of the first full-scale, commercial, computer-controlled

electronic switch, the electronic switching system 1 (ESS-1). The

first ESS-1 went into service in New Jersey in 1965. In the ESS-1,

electromechanical switching elements, or relays, were controlled

by computer software. A centralized computer handled call processing.

Because the telephone service of an entire community

depends on the reliability of the telephone switching office, the

ESS-1 had two central processors, so that one would be available

if the other broke down. The switching system of the ESS-1 was

composed of electromechanical relays; the control of the switching

system was electronic, but the switching itself remained mechanical.

Bell Labs developed models to demonstrate the concept of integrating

digital transmission and switching systems. Unfortunately,

the solid state electronics necessary for such an undertaking had not

developed sufficiently at that time, so the commercial development

of digital switching was not pursued. New versions of the ESS continued

to employ electromechanical technology, although mechanical

switching elements can cause impulse noise in voice signals and

are larger and more difficult to maintain than electronic switching

elements. Ten years later, however, Bell Labs began to develop a digital

toll switch, the ESS-4, in which both switching and control functions

were electronic.

Although the ESS-1 was the first electronically controlled switching

system, it did not switch voices electronically. The ESS-1 used

computer control to move mechanical contacts in order to establish

a conversation. In a fully electronic switching system, the voices are

digitized before switching is performed. This technique, which is

called “digital switching,” is still used.

The advent of electronically controlled switching systems made

possible features such as call forwarding, call waiting, and detailed

billing for long-distance calls. Changing these services became a

matter of simply changing tables in computer programs. Telephone

maintenance personnel could communicate with the central processor

of the ESS-1 by using a teletype, and they could change numbers

simply by typing commands on the teletype. In electromechanically

controlled telephone switching systems, however, changing numbers

required rewiring.





Consequences





Electronic switching has greatly decreased the size of switching

offices. Digitization of the voice prior to transmission improves

voice quality. When telephone switches were electromechanical, a

large area was needed to house the many mechanical switches that

were required. In the era of electronic switching, voices are switched

digitally by computer. In this method, voice samples are read into a

computer memory and then read out of the memory when it is time

to connect a caller with a desired number. Basically, electronic telephone

systems are specialized computer systems that move digitized

voice samples between customers.
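A minimal sketch of this write-then-read style of switching, often called a time-slot interchange, is given below in Python. The function and variable names are assumptions made for the example and do not come from any actual ESS design.

# Minimal time-slot interchange: one frame of digitized voice samples is
# written into memory in arrival order, then read out in the order given
# by the connection map; the reordering is what connects callers.
def switch_frame(incoming_samples, connection_map):
    # incoming_samples[i] is the sample arriving on input time slot i;
    # connection_map[j] = i means output slot j listens to input slot i.
    memory = list(incoming_samples)            # write phase
    return [memory[connection_map[j]]          # read phase
            for j in range(len(connection_map))]

# Three callers: output slot 0 hears input 2, slot 1 hears input 0,
# slot 2 hears input 1.
frame = [0.12, -0.40, 0.88]
print(switch_frame(frame, {0: 2, 1: 0, 2: 1}))   # [0.88, 0.12, -0.4]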

Telephone networks are moving toward complete digitization.

Digitization was first applied to the transmission of voice signals.

This made it possible for a single pair of copper wires to be shared

by a number of telephone users. Currently, voices are digitized

upon their arrival at the switching office. If the final destination of

the telephone call is not connected to the particular switching office,

the voice is sent to the remote office by means of digital circuits.

Currently, analog voice signals are sent between the switching office and

homes or businesses. In the future, digitization of the voice signal

will occur in the telephone sets themselves. Digital voice signals

will be sent directly from one telephone to another. This will provide

homes with direct digital communication. A network that provides

such services is called the “integrated services digital network”

(ISDN).



See also: Cell phone; Long-distance telephone; Rotary dial telephone;




Thursday, October 24, 2013

Teflon

















The invention: 



A fluorocarbon polymer whose chemical inertness
and physical properties have made it useful for many applications,
from nonstick cookware coatings to suits for astronauts.


The person behind the invention:


Roy J. Plunkett (1910-1994), an American chemist










Nontoxic Refrigerant Sought


As the use of mechanical refrigeration increased in the late 1930’s,
manufacturers recognized the need for a material to replace sulfur
dioxide and ammonia, which, although they were the commonly
used refrigerants of the time, were less than ideal for the purpose.
The material sought had to be nontoxic, odorless, colorless, and not
flammable. Thomas Midgley, Jr., and Albert Henne of General Motors
Corporation’s Frigidaire Division concluded, from studying
published reports listing properties of a wide variety of chemicals,
that hydrocarbon-like materials with hydrogen atoms replaced by
chlorine and fluorine atoms would be appropriate.
Their conclusion led to the formation of a joint effort between the
General Motors Corporation’s Frigidaire Division and E. I. Du Pont
de Nemours to research and develop the chemistry of fluorocarbons.
In this research effort, a number of scientists began making
and studying the large number of individual chemicals in the general
class of compounds being investigated. It fell to Roy J. Plunkett
to do a detailed study of tetrafluoroethylene, a compound consisting
of two carbon atoms, each of which is attached to the other as
well as to two fluorine atoms.



The “Empty” Tank


Tetrafluoroethylene, at normal room temperature and pressure,
is a gas that is supplied to users in small pressurized cylinders. On
the morning of the day of the discovery, Plunkett attached such a
tank to his experimental apparatus and opened the tank’s valve. To

his great surprise, no gas flowed from the tank. Plunkett’s subsequent
actions transformed this event from an experiment gone
wrong into a historically significant discovery. Rather than replacing
the tank with another and going on with the work planned for
the day, Plunkett, who wanted to know what had happened, examined
the “empty” tank. When he weighed the tank, he discovered
that it was not empty; it did contain the chemical that was listed on
the label. Opening the valve and running a wire through the opening
proved that what had happened had not been caused by a malfunctioning
valve. Finally, Plunkett sawed the cylinder in half and
discovered what had happened. The chemical in the tank was no
longer a gas; instead, it was a waxy white powder.
Plunkett immediately recognized the meaning of the presence of
the solid. The six-atom molecules of the tetrafluoroethylene gas had
somehow linked with one another to form much larger molecules.
The gas had polymerized, becoming polytetrafluoroethylene, a solid
with a high molecular weight. Capitalizing on this occurrence,
Plunkett, along with other Du Pont chemists, performed a series of
experiments and soon learned to control the polymerization reaction
so that the product could be produced, its properties could be
studied, and applications for it could be developed.
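Schematically, what Plunkett found in the cylinder is the addition polymerization of tetrafluoroethylene. The equation below is the standard textbook way of writing such a reaction, not notation taken from Plunkett's records:

n\,\mathrm{CF_2{=}CF_2} \;\longrightarrow\; [\mathrm{-CF_2-CF_2-}]_n

that is, n molecules of the gas join end to end into a single chain of n repeat units, the waxy white solid polytetrafluoroethylene.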
The properties of the substance were remarkable indeed. It was
unaffected by strong acids and bases, withstood high temperatures
without reacting or melting, and was not dissolved by any solvent
that the scientists tried. In addition to this highly unusual behavior,
the polymer had surface properties that made it very slick. It was so
slippery that other materials placed on its surface slid off in much
the same way that beads of water slide off the surface of a newly
waxed automobile.

Although these properties were remarkable, no applications were
suggested immediately for the new material. The polymer might
have remained a laboratory curiosity if a conversation had not
taken place between Leslie R. Groves, the head of the Manhattan
Project (which engineered the construction of the first atomic bombs),
and a Du Pont chemist who described the polymer to him. The
Manhattan Project research team was hunting for an inert material
to use for gaskets to seal pumps and piping. The gaskets had to be
able to withstand the highly corrosive uranium hexafluoride with

which the team was working. This uranium compound is fundamental
to the process of upgrading uranium for use in explosive devices
and power reactors. Polytetrafluoroethylene proved to be just
the material that they needed, and Du Pont proceeded, throughout
World War II and after, to manufacture gaskets for use in uranium
enrichment plants.
The high level of secrecy of the Manhattan Project in particular
and atomic energy in general delayed the commercial introduction
of the polymer, which was called Teflon, until the late 1950’s. At that
time, the first Teflon-coated cooking utensils were introduced.



Impact


Plunkett’s thoroughness in following up a chance observation
gave the world a material that has found a wide variety of uses, ranging
from home kitchens to outer space. Some applications make use

of Teflon’s slipperiness, others make use of its inertness, and others take

advantage of both properties.
The best-known application of Teflon is as a nonstick coating for cookware.
Teflon’s very slippery surface initially was troublesome because it proved
difficult to bond to other materials. Early versions of Teflon-coated cookware

shed their surface coatings easily, even when care was taken to avoid scraping them off.

A suitable bonding process was soon developed, however, and the present coated

surfaces are very rugged and provide a noncontaminating coating that can be cleaned
easily.
Teflon has proved to be a useful material in making devices that
are implanted in the human body. It is easily formed into various
shapes and is one of the few materials that the human body does not
reject. Teflon has been used to make heart valves, pacemakers, bone
and tendon substitutes, artificial corneas, and dentures.
Teflon’s space applications have included its use as the outer skin
of the suits worn by astronauts, as insulating coating on wires and
cables in spacecraft that must resist high-energy cosmic radiation,
and as heat-resistant nose cones and heat shields on spacecraft.















Roy J. Plunkett






Roy J. Plunkett was born in 1910 in New Carlisle, Ohio. In
1932 he received a bachelor’s degree in chemistry from Manchester
College and transferred to Ohio State University for
graduate school, earning a master’s degree in 1933 and a doctorate
in 1936. The same year he went to work for E. I. Du Pont
de Nemours and Company as a research chemist at the Jackson
Laboratory in Deepwater, New Jersey. Less than two years later,
when he was only twenty-seven years old, he found the strange
polymer of tetrafluoroethylene (polytetrafluoroethylene), whose trade name became Teflon.
It would turn out to be among Du Pont’s most famous products.
In 1938 Du Pont appointed Plunkett the chemical supervisor
at its largest plant, the Chamber Works in Deepwater, which
produced tetraethyl lead. He held the position until 1952 and
afterward directed the company’s Freon Products Division. He
retired in 1975. In 1985 he was inducted into the Inventor’s Hall
of Fame, and after his death in 1994, Du Pont created the
Plunkett Award, presented to inventors who find new uses for
Teflon and Tefzel, a related fluoropolymer.



See also:



Buna rubber; Neoprene; Nylon; Plastic; Polystyrene;



Saturday, June 1, 2013

Talking motion pictures





The invention:



The first practical system for linking sound with

moving pictures.



The people behind the invention:



Harry Warner (1881-1958), the brother who used sound to

fashion a major filmmaking company

Albert Warner (1884-1967), the brother who persuaded theater

owners to show Warner films

Samuel Warner (1887-1927), the brother who adapted sound-recording

technology to filmmaking

Jack Warner (1892-1978), the brother who supervised the

making of Warner films











Taking the Lead



The silent films of the early twentieth century had live sound accompaniment

featuring music and sound effects. Neighborhood

theaters made do with a piano and violin; larger “picture palaces”

in major cities maintained resident orchestras of more than seventy

members. During the late 1920’s, Warner Bros. led the American

film industry in producing motion pictures with their own soundtracks,

which were first recorded on synchronized records and later

added on to the film beside the images.

The ideas that led to the addition of sound to film came from corporate-

sponsored research by American Telephone and Telegraph

Company (AT&T) and the Radio Corporation of America (RCA).

Both companies worked to improve sound recording and playback,

AT&T to help in the design of long-distance telephone equipment

and RCA as part of the creation of better radio sets. Yet neither company

could, or would, enter filmmaking. AT&T was willing to contract

its equipment out to Paramount or one of the other major Hollywood

studios of the day; such studios, however, did not want to

risk their sizable profit positions by junking silent films. The giants

of the film industry were doing fine with what they had and did not

want to switch to something that had not been proved.

In 1924, Warner Bros. was a prosperous, though small, corporation

that produced films with the help of outside financial backing. That

year, Harry Warner approached the important Wall Street investment

banking house of Goldman, Sachs and secured the help he needed.

As part of this initial wave of expansion, Warner Bros. acquired a

Los Angeles radio station in order to publicize its films. Through

this deal, the four Warner brothers learned of the new technology

that the radio and telephone industries had developed to record

sound, and they succeeded in securing the necessary equipment

from AT&T. During the spring of 1925, the brothers devised a plan

by which they could record the most popular musical artists on film

and then offer these “shorts” as added attractions to theaters that

booked its features. As a bonus, Warner Bros. could add recorded

orchestral music to its feature films and offer this music to theaters

that relied on small musical ensembles.





“Vitaphone”



On August 6, 1926, Warner Bros. premiered its new “Vitaphone”

technology. The first package consisted of a traditional silent film

(Don Juan) with a recorded musical accompaniment, plus six recordings

of musical talent highlighted by a performance from Giovanni

Martinelli, the most famous opera tenor of the day.

The first Vitaphone feature to include singing and spoken dialogue was The Jazz Singer, which premiered

in October, 1927. Much of the film was silent, but

as soon as Al Jolson, the star, broke into song, the new sound technology

came into play. The film was an immediate hit. The Jazz

Singer package, which included accompanying shorts with sound,

forced theaters in cities that rarely held films over for more than a

single week to ask to have the package stay for two, three, and

sometimes four straight weeks.

The Jazz Singer did well at the box office, but skeptics questioned

the staying power of talkies. If sound was so important, they wondered,

why hadn’t The Jazz Singer moved to the top of the all-time

box-office list? Such success, though, would come a year later with

The Singing Fool, also starring Jolson. From its opening day (September

20, 1928), it was the financial success of its time; produced for an

estimated $200,000, it took in $5 million. In New York City, The

Singing Fool registered the heaviest business in Broadway history,

with an advance sale that exceeded $100,000 (equivalent

to more than half a million dollars in 1990’s currency).





Impact



The coming of sound transformed filmmaking, ushering in what

became known as the golden age of Hollywood. By 1930, there were

more reporters stationed in the filmmaking capital of the world

than in any capital of Europe or Asia.

As a result of its foresight, Warner Bros. was the only small competitor

of the early 1920’s to succeed in joining the Hollywood elite, producing

successful films for consumption throughout the world.

After Warner Bros.’ innovation, the soundtrack became one of

the features that filmmakers controlled when making a film. Indeed,

sound became a vital part of the filmmaker’s art; music, in

particular, could make or break a film.

Finally, the coming of sound helped make films a dominant medium

of mass culture, both in the United States and throughout the

world. Innumerable fashions, expressions, and designs were soon created

or popularized by filmmakers. Many observers had not viewed

the silent cinema as especially significant; with the coming of the talkies,

however, there was no longer any question about the social and

cultural importance of films. As one clear consequence of the new

power of the movie industry, within a few years of the coming of

sound, the notorious Hays Code mandating prior restraint of film content

went into effect. The pairing of images and sound caused talking

films to be deemed simply too powerful for uncensored presentation

to audiences; although the Hays Code was gradually weakened and

eventually abandoned, less onerous “rating systems” would continue

to be imposed on filmmakers by various regulatory bodies.





The Warner Brothers


Businessmen rather than inventors, the four Warner brothers
were hustlers who knew a good thing when they saw it.
They started out running theaters in 1903, evolved into film distributors,
and began making their own films in 1909, in defiance
of the Patents Company, a trust established by Thomas A. Edison
to eliminate competition from independent filmmakers.
Harry Warner was the president of the company, Sam and Jack
were vice presidents in charge of production, and Abe (or Albert)
was the treasurer.
Theirs was a small concern. Their silent films and serials attracted
few audiences, and during World War I they made
training films for the government. In fact, their film about syphilis,
Open Your Eyes, was their first real success. In 1918, however,
they released My Four Years in Germany, a dramatized
documentary, and it was their first blockbuster. Although considered
gauche upstarts, they were suddenly taken seriously by
the movie industry.
When Sam first heard an actor talk on screen in an experimental
film at the Bell lab in New York in 1925, he recognized a
revolutionary opportunity. He soon convinced Jack that talking
movies would be a gold mine. However, Harry and Abe were
against the idea because of its costs—and because earlier attempts
at “talkies” had been dismal failures. Sam and Jack
tricked Harry into seeing an experimental film of an orchestra,
however, and he grew enthusiastic despite his misgivings. Within
a year, the brothers released the all-music Don Juan. The rave
notices from critics astounded Harry and Abe.
Still, they thought sound in movies was simply a novelty.
When Sam pointed out that they could make movies in which
the actors talked, as on stage, Harry, who detested actors, snorted,
“Who the hell wants to hear actors talk?” Sam and Jack pressed
for dramatic talkies, nonetheless, and prevailed upon Harry to
finance them. The silver screen has seldom been silent since.






See also:



Autochrome plate; Dolby noise reduction; Electronic synthesizer;














Friday, February 15, 2013

Syphilis test











The invention: 



The first simple test for detecting the presence of

the venereal disease syphilis led to better syphilis control and

other advances in immunology.



The people behind the invention:



Reuben Leon Kahn (1887-1974), a Soviet-born American

serologist and immunologist



August von Wassermann (1866-1925), a German physician and

bacteriologist









Columbus’s Discoveries



Syphilis is one of the chief venereal diseases, a group of diseases

whose name derives from Venus, the Roman goddess of love. The

term “venereal” arose from the idea that the diseases were transmitted

solely by sexual contact with an infected individual. Although

syphilis is almost always passed from one person to another in this

way, it occasionally arises after contact with objects used by infected

people in highly unclean surroundings, particularly in the underdeveloped

countries of the world.

It is believed by many that syphilis was introduced to Europe by

the members of Spanish explorer Christopher Columbus’s crew—

supposedly after they were infected by sexual contact with West Indian

women—during their voyages of exploration. Columbus is reported

to have died of heart and brain problems very similar to

symptoms produced by advanced syphilis. At that time, according

to many historians, syphilis spread rapidly over sixteenth century

Europe. The name “syphilis” was coined by the Italian physician

Girolamo Fracastoro in 1530 in an epic poem he wrote.

Modern syphilis is much milder than the original disease and relatively

uncommon. Yet, if it is not identified and treated appropriately,

syphilis can be devastating and even fatal. It can also be passed from

pregnant mothers to their unborn children. In these cases, the afflicted

children will develop serious health problems that can include

paralysis, insanity, and heart disease. Therefore, the understanding,

detection, and cure of syphilis are important worldwide.

Syphilis is caused by a spiral-shaped germ called a “spirochete.”

Spirochetes enter the body through breaks in the skin or through the

mucous membranes, regardless of how they are transmitted. Once

spirochetes enter the body, they spread rapidly. During the first four

to six weeks after infection, syphilis—said to be in its primary

phase—is very contagious. During this time, it is identified by the

appearance of a sore, or chancre, at the entry site of the infecting spirochetes.

The chancre disappears quickly, and within six to twenty-four

weeks, the disease shows itself as a skin rash, feelings of malaise,

and other flulike symptoms (secondary-phase syphilis). These problems

also disappear quickly in most cases, and here is where the real

trouble—latent syphilis—begins. In latent syphilis, now totally without

symptoms, spirochetes that have spread through the body may

lodge in the brain or the heart. When this happens, paralysis, mental

incapacitation, and death may follow.





Testing Before Marriage







Because of the danger to unborn children, Americans wishing to

marry must be certified as being free of the disease before a marriage

license is issued. The cure for syphilis is easily accomplished

through the use of penicillin or other types of antibiotics, though no

vaccine is yet available to prevent the disease. It is for this reason

that syphilis detection is particularly important.

The first viable test for syphilis was originated by August von

Wassermann in 1906. In this test, blood samples are taken and

treated in a medical laboratory. The treatment of the samples is

based on the fact that the blood of infected persons has formed antibodies

to fight the syphilis spirochete, and that these antibodies will

react with certain body chemicals to cause the blood sample to clot.

This indicates the person has the disease. After the syphilis has been

cured, the antibodies disappear, as does the clotting.

Although the Wassermann test was effective in 95 percent of all

infected persons, it was very time-consuming (requiring a two-day

incubation period) and complex. In 1923, Reuben Leon Kahn developed

a modified syphilis test, “the standard Kahn test,” that was

simpler and faster: The test was complete after only a few minutes.

By 1925, Kahn’s test had become the standard syphilis test of the

United States Navy and later became a worldwide test for the detection

of the disease.

Kahn soon realized that his test was not perfect and that in some

cases, the results were incorrect. This led him to a broader study of

the immune reactions at the center of the Kahn test. He investigated

the role of various tissues in immunity, as compared to the role of

white blood antibodies and white blood cells. Kahn showed, for example,

that different tissues of immunized or nonimmunized animals

possessed differing immunologic capabilities. Furthermore,

the immunologic capabilities of test animals varied with their

age, being very limited in newborns and increasing as they matured.

This effort led, by 1951, to Kahn’s “universal serological reaction,”

a precipitation reaction in which blood serum was tested

against a reagent composed of tissue lipids. Kahn viewed it as a potentially

helpful chemical indicator of how healthy or ill an individual

was. This effort is viewed as an important landmark in the development

of the science of immunology.



Impact



At the time that Kahn developed his standard Kahn test for syphilis,

the Wassermann test was used all over the world for the diagnosis

of syphilis. As has been noted, one of the great advantages of the

standard Kahn test was its speed, minutes versus days. For example,

in October, 1923, Kahn is reported to have tested forty serum

samples in fifteen minutes.

Kahn’s efforts have been important to immunology and to medicine.

Among the consequences of his endeavors was the stimulation

of other developments in the field, including the VDRL test (originated

by the Venereal Disease Research Laboratory), which has replaced

the Kahn test as one of the most often used screening tests for

syphilis. Even more specific syphilis tests developed later include a

fluorescent antibody test to detect the presence of the antibody to

the syphilis spirochete.





See also: Abortion pill; Antibacterial drugs; Birth control pill;

Mammography; Pap test; Penicillin;






