Sunday, November 30, 2008

Apple II computer



The invention:

The first commercially available, preassembled
personal computer, the Apple II helped move computers out of
the workplace and into the home.


The people behind the invention:


Stephen Wozniak (1950- ), cofounder of Apple and designer
of the Apple II computer
Steven Jobs (1955-2011), cofounder of Apple
Regis McKenna (1939- ), owner of the Silicon Valley public
relations and advertising company that handled the Apple
account
Chris Espinosa (1961- ), the high school student who wrote
the BASIC program shipped with the Apple II
Randy Wigginton (1960- ), a high school student and Apple
software programmer







 Inventing the Apple


As late as the 1960’s, not many people in the computer industry
believed that a small computer could be useful to the average person.
It was through the effort of two friends from the Silicon Valley—
the high-technology area between San Francisco and San Jose—
that the personal computer revolution was started.
Both Steven Jobs and Stephen Wozniak had attended Homestead
High School in Los Altos, California, and both developed early interests
in technology, especially computers. In 1971, Wozniak built
his first computer from spare parts. Shortly after this, he was introduced
to Jobs. Jobs had already developed an interest in electronics
(he once telephoned William Hewlett, cofounder of Hewlett-
Packard, to ask for parts), and he and Wozniak became friends.
Their first business together was the construction and sale of “blue
boxes,” illegal devices that allowed the user to make long-distance
telephone calls for free.
After attending college, the two took jobs within the electronics
industry. Wozniak began working at Hewlett-Packard, where he
studied calculator design, and Jobs took a job at Atari, the video
company. The friendship paid off again when Wozniak, at Jobs’s request,
designed the game “Breakout” for Atari, and the pair was
paid seven hundred dollars.
In 1975, the Altair computer, a personal computer in kit form,
was introduced by Micro Instrumentation and Telemetry Systems
(MITS). Shortly thereafter, the first personal computer club, the
Homebrew Computer Club, began meeting in Menlo Park, near
Stanford University. Wozniak and Jobs began attending the meetings
regularly. Wozniak eagerly examined the Altairs that others
brought. He thought that the design could be improved. In only a
few more weeks, he produced a circuit board and interfaces that
connected it to a keyboard and a video monitor. He showed the machine
at a Homebrew meeting and distributed photocopies of the
design.
In this new machine, which he named an “Apple,” Jobs saw a big
opportunity. He talked Wozniak into forming a partnership to develop
personal computers. Jobs sold his car, and Wozniak sold his
two Hewlett-Packard calculators; with the money, they ordered
printed circuit boards made. Their break came when Paul Terrell, a
retailer, was so impressed that he ordered fifty fully assembled Apples.
Within thirty days, the computers were completed, and they
sold for a fairly high price: $666.66.
During the summer of 1976, Wozniak kept improving the Apple.
The new computer would come with a keyboard, an internal power
supply, a built-in computer language called the Beginner’s All-
Purpose Symbolic Instruction Code (BASIC), hookups for adding
printers and other devices, and color graphics, all enclosed in a plastic
case. The output would be seen on a television screen. The machine
would sell for twelve hundred dollars.


Selling the Apple


Regis McKenna was the head of the Regis McKenna Public Relations
agency, the best of the public relations firms that served the
high-technology industries of the valley, and Jobs wanted it to handle
the Apple account. At first, McKenna rejected the offer, but
Jobs’s constant pleading finally convinced him. The agency’s first
contributions to Apple were the colorful striped Apple logo and a
color ad in Playboy magazine.
In February, 1977, the first Apple Computer office was opened in
Cupertino, California. By this time, two of Wozniak’s friends from
Homebrew, Randy Wigginton and Chris Espinosa—both high school
students—had joined the company. Their specialty was writing software.
Espinosa worked through his Christmas vacation so that BASIC
(the built-in computer language) could ship with the computer.

The team pushed ahead to complete the new Apple in time to
display it at the First West Coast Computer Faire in April, 1977. At
this time, the name “Apple II” was chosen for the new model. The
Apple II computer debuted at the convention and included many
innovations. The “motherboard” was far simpler and more elegantly
designed than that of any previous computer, and the ease of
connecting the Apple II to a television screen made it that much
more attractive to consumers.

Consequences
The introduction of the Apple II computer launched what was to
be a wave of new computers aimed at the home and small-business
markets. Within a few months of the Apple II’s introduction, Commodore
introduced its PET computer and Tandy Corporation/Radio
Shack brought out its TRS-80. Apple continued to increase the
types of things that its computers could do and worked out a distribution
deal with the new ComputerLand chain of stores.
In December, 1977, Wozniak began work on creating a floppy
disk system for the Apple II. (A floppy disk is a small, flexible plastic
disk coated with magnetic material. The magnetized surface enables
computer data to be stored on the disk.) The cassette tape storage
on which all personal computers then depended was slow and
unreliable. Floppy disks, which had been introduced for larger computers
by the International Business Machines (IBM) Corporation in
1970, were fast and reliable. As he did with everything that interested
him, Wozniak spent almost all of his time learning about and
designing a floppy disk drive. When the final drive shipped in June,
1978, it made possible the development of more powerful software for
the computer.
By 1980, Apple had sold 130,000 Apple II’s. That year, the company
went public, and Jobs and Wozniak, among others, became
wealthy. Three years later, Apple became the youngest company to
make the Fortune 500 list of the largest industrial companies. By
then, IBM had entered the personal computer field and had begun
to dominate it, but the Apple II’s earlier success ensured that personal
computers would not be a market fad. By the end of the
1980’s, 35 million personal computers would be in use.







                                                                     Steven Jobs










While IBM and other corporations were devoting massive
resources and talent to designing a small computer in 1975,
Steven Paul Jobs and Stephen Wozniak, members of the tiny
Homebrew Computer Club, put together the first truly user-friendly
personal computer in Wozniak’s home. Jobs admitted
later that “Woz” was the engineering brains. Jobs himself was
the brains of design and marketing. Both had to scrape together
money for the project from their small salaries as low-level electronics
workers. Within eight years, Jobs headed the most progressive
company in the new personal computer industry and
was worth an estimated $210 million.
Little in his background foretold such fast, large material
success. Jobs was born in 1955 and given up for adoption. Adopted
by Paul and Clara Jobs, he grew up in California towns near the
area that became known as Silicon Valley. He did not like school
much and was considered a loner, albeit one who always had a
distinctive way of thinking about things. Still in high school, he
impressed William Hewlett, cofounder of Hewlett-Packard in
Palo Alto, and won a summer job at the company, as well as
some free equipment for one of his school projects.
However, he dropped out of Reed College after one semester
and became a hippie. He studied philosophy and Chinese
and Indian mysticism. He became a vegetarian and practiced
meditation. He even shaved his head and traveled to India on a
spiritual pilgrimage. When he returned to America, however,
he also returned to his interest in electronics and computers.
Through various jobs at his original company, Apple, and elsewhere,
he remained in that field for the rest of his career.





See also: BINAC computer; Colossus computer; ENIAC computer; Floppy disk; Hard disk; IBM Model 1401 Computer; Personal computer; Wikipedia - Steven Jobs




Thursday, November 27, 2008

Antibacterial drugs



Mechanisms of genetic resistance to antimicrobial agents:

Bacteria have developed, or will develop, genetic resistance to all known antimicrobial agents now in the marketplace. The five main mechanisms that bacteria use to resist antibacterial drugs are the following.
a | The site of action (an enzyme, the ribosome, or a cell-wall precursor) can be altered. For example, acquiring a plasmid or transposon that codes for a resistant dihydrofolate reductase confers trimethoprim resistance.
b | The inhibited step can be bypassed.
c | Bacteria can reduce the intracellular concentration of the antimicrobial agent, either by reducing membrane permeability, as Pseudomonas aeruginosa does, or by actively pumping the agent out of the cell.
d | They can inactivate the drug. For example, some bacteria produce beta-lactamase, an enzyme that destroys the penicillin beta-lactam ring.
e | The target enzyme can be overproduced by the bacteria.



The invention:

Sulfonamides and other drugs that have proved effective
in combating many previously untreatable bacterial diseases.

The people behind the invention:

Gerhard Domagk (1895-1964), a German physician who was
awarded the 1939 Nobel Prize in Physiology or Medicine
Paul Ehrlich (1854-1915), a German chemist and bacteriologist
who was the cowinner of the 1908 Nobel Prize in Physiology
or Medicine.

The Search for Magic Bullets

Although quinine had been used to treat malaria long before the
twentieth century, Paul Ehrlich, who discovered a large number of
useful drugs, is usually considered the father of modern chemotherapy.
Ehrlich was familiar with the technique of using dyes to stain
microorganisms in order to make them visible under a microscope,
and he suspected that some of these dyes might be used to poison
the microorganisms responsible for certain diseases without hurting
the patient. Ehrlich thus began to search for dyes that could act
as “magic bullets” that would destroy microorganisms and cure
diseases. From 1906 to 1910, Ehrlich tested numerous compounds
that had been developed by the German dye industry. He eventually
found that a number of complex trypan dyes would inhibit the
protozoans that caused African sleeping sickness.
Ehrlich and his coworkers also synthesized hundreds of organic
compounds that contained arsenic. In 1910, he found that one of
these compounds, salvarsan, was useful in curing syphilis, a sexually
transmitted disease caused by the bacterium Treponema. This
was an important discovery, because syphilis killed thousands of
people each year. Salvarsan, however, was often toxic to patients,
because it had to be taken in large doses for as long as two years to
effect a cure. Ehrlich thus searched for and found a less toxic arsenic
compound, neosalvarsan, which replaced salvarsan in 1912.

In 1915, tartar emetic (a compound containing the metal antimony)
was found to be useful in treating kala-azar, which was
caused by a protozoan. Kala-azar affected millions of people in Africa,
India, and Asia, causing much suffering and many deaths each
year. Two years later, it was discovered that injection of tartar emetic
into the blood of persons suffering from bilharziasis killed the
flatworms infecting the bladder, liver, and spleen. In 1920, suramin,
a colorless compound developed from trypan red, was introduced
to treat African sleeping sickness. It was much less toxic to the patient
than any of the drugs Ehrlich had developed, and a single dose
would give protection for more than a month. From the dye methylene
blue, chemists made mepacrine, a drug that was effective
against the protozoans that cause malaria. This chemical was introduced
in 1933 and used during World War II; its principal drawback
was that it could cause a patient’s skin to become yellow.

Well Worth the Effort

Gerhard Domagk had been trained in medicine, but he turned to
research in an attempt to discover chemicals that would inhibit or
kill microorganisms. In 1927, he became director of experimental
pathology and bacteriology at the Elberfeld laboratories of the German
chemical firm I. G. Farbenindustrie. Ehrlich’s discovery that
trypan dyes selectively poisoned microorganisms suggested to Domagk
that he look for antimicrobials in a new group of chemicals
known as azo dyes. A number of these dyes were synthesized
from sulfonamides and purified by Fritz Mietzsch and Josef Klarer.
Domagk found that many of these dyes protected mice infected
with the bacteria Streptococcus pyogenes. In 1932, he discovered that
one of these dyes was much more effective than any tested previously.
This red azo dye containing a sulfonamide was named prontosil
rubrum.
From 1932 to 1935, Domagk began a rigorous testing program to
determine the effectiveness and dangers of prontosil use at different
doses in animals. Since all chemicals injected into animals or humans
are potentially dangerous, Domagk determined the doses that
harmed or killed. In addition, he worked out the lowest doses that
would eliminate the pathogen. The firm supplied samples of the drug to physicians to carry out clinical trials on humans. (Animal
experimentation can give only an indication of which chemicals
might be useful in humans and which doses are required.)
Domagk thus learned which doses were effective and safe. This
knowledge saved his daughter’s life. One day while knitting, Domagk’s
daughter punctured her finger with a needle and was infected
with virulent bacteria, which quickly multiplied and spread
from the wound into neighboring tissues. In an attempt to alleviate
the swelling, the infected area was lanced and allowed to drain, but
this did not stop the infection from spreading. The child became
critically ill with developing septicemia, or blood poisoning.
In those days, more than 75 percent of those who acquired blood
infections died. Domagk realized that the chances for his daughter’s
survival were poor. In desperation, he obtained some of the powdered
prontosil that had worked so well on infected animals. He extrapolated
from his animal experiments how much to give his
daughter so that the bacteria would be killed but his daughter
would not be poisoned. Within hours of the first treatment, her fever
dropped, and she recovered completely after repeated doses of
prontosil.

Impact

Directly and indirectly, Ehrlich’s and Domagk’s work served to
usher in a new medical age. Prior to the discovery that prontosil
could be used to treat bacterial infection and the subsequent development
of a series of sulfonamides, or “sulfa drugs,” there was no
chemical defense against this type of disease; as a result, illnesses
such as streptococcal infection, gonorrhea, and pneumonia held terrors
of which they have largely been shorn. A small injury could easily
lead to death.
By following the clues presented by the synthetic sulfa drugs and
how they worked to destroy bacteria, other scientists were able to
develop an even more powerful type of drug, the antibiotic. When
the American bacteriologist René Dubos discovered that natural organisms
could also be used to fight bacteria, interest was renewed in
an earlier discovery by the Scottish bacteriologist Sir Alexander Fleming: the
development of penicillin.
Antibiotics such as penicillin and streptomycin have become
some of the most important tools in fighting disease. Antibiotics
have replaced sulfa drugs for most uses, in part because they cause
fewer side effects, but sulfa drugs are still used for a handful of purposes.
Together, sulfonamides and antibiotics have offered the possibility
of a cure to millions of people who previously would have
had little chance of survival.

Sunday, November 23, 2008

Amniocentesis



The invention:



A technique for removing amniotic fluid from
pregnant women, amniocentesis became a life-saving tool for diagnosing
fetal maturity, health, and genetic defects.


The people behind the invention:


Douglas Bevis, an English physician
Aubrey Milunsky (1936- ), an American pediatrician









How Babies Grow


For thousands of years, the inability to see or touch a fetus in the
uterus was a staggering problem in obstetric care and in the diagnosis
of the future mental and physical health of human offspring. A
beginning to the solution of this problem occurred on February 23,
1952, when The Lancet published a study called “The Antenatal Prediction
of a Hemolytic Disease of the Newborn.” This study, carried
out by physician Douglas Bevis, described the use of amniocentesis
to assess the risk factors found in the fetuses of Rh-negative women
impregnated by Rh-positive men. The article is viewed by many as a
landmark in medicine that led to the wide use of amniocentesis as a
tool for diagnosing fetal maturity, fetal health, and fetal genetic
defects.
At the beginning of a human pregnancy (conception) an egg and
a sperm unite to produce the fertilized egg that will become a new
human being. After conception, the fertilized egg passes from the
oviduct into the uterus, while dividing and becoming an organized
cluster of cells capable of carrying out different tasks in the
nine-month-long series of events leading up to birth.
About a week after conception, the cluster of cells, now a “vesicle”
(a fluid-filled sac containing the new human cells), attaches
to the uterine lining, penetrates it, and becomes intimately intertwined
with uterine tissues. In time, the merger between the vesicle
and the uterus results in formation of a placenta that connects the
mother and the embryo, and an amniotic sac filled with the amniotic
fluid in which the embryo floats.

Eight weeks after conception, the embryo (now a fetus) is about 2.5 centimeters
long and possesses all the anatomic elements it will have when it is born.

At this time, about two and one-half months after her last menstruation,
the expectant mother typically visits a physician and finds out she is pregnant.
Also at this time, expecting mothers often begin to worry about possible birth
defects in the babies they carry.

Diabetic mothers and mothers older than thirty-five years have higher than usual
chances of delivering babies who have birth defects.

 Many other factors inferred from the medical history an expecting
mother provides to her physician can indicate the possible appearance
of birth defects. In some cases, knowledge of possible
physical problems in a fetus may allow their treatment in the uterus
and save the newborn from problems that could persist throughout
life or lead to death in early childhood. Information is obtained
through the examination of the amniotic fluid in which the fetus is
suspended throughout pregnancy. The process of obtaining this
fluid is called “amniocentesis.”





 Diagnosing Diseases Before Birth


Amniocentesis is carried out in several steps. First, the placenta
and the fetus are located by the use of ultrasound techniques. Next,
the expecting mother may be given a local anesthetic; a long needle
is then inserted carefully into the amniotic sac. As soon as amniotic
fluid is seen, a small sample (about four teaspoons) is drawn into a
hypodermic syringe and the syringe is removed. Amniocentesis is
nearly painless, and most patients feel only a little abdominal pressure
during the procedure.
The amniotic fluid of early pregnancy resembles blood serum.
As pregnancy continues, its content of substances from fetal urine
and other fetal secretions increases. The fluid also contains fetal cells
from skin and from the gastrointestinal, reproductive, and respiratory
tracts. Therefore, it is of great diagnostic use. Immediately after
the fluid is withdrawn from the amniotic sac, the fetal cells are separated out.
Then, the cells are used for genetic analysis and the amniotic fluid is
examined by means of various biochemical techniques.
One important use of the amniotic fluid from amniocentesis is
the determination of its lecithin and sphingomyelin content. Lecithins
and sphingomyelins are two types of body lipids (fatty molecules)
that are useful diagnostic tools. Lecithins are important because
they are essential components of the so-called pulmonary
surfactant of mature lungs. The pulmonary surfactant acts at lung
surfaces to prevent the collapse of the lung air sacs (alveoli) when a
person exhales.
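
In clinical practice this measurement is usually expressed as the
lecithin-to-sphingomyelin (L/S) ratio; the criterion below is the commonly
cited rule of thumb rather than a figure given in this account:

    \text{L/S ratio} = \frac{[\text{lecithin}]}{[\text{sphingomyelin}]},
    \qquad \text{L/S} \geq 2 \ \text{suggests mature fetal lungs}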

 Subnormal lecithin production in a fetus indicates that it most
likely will exhibit respiratory distress syndrome or a disease called
“hyaline membrane disease” after birth. Both diseases can be fatal,
so it is valuable to determine whether fetal lecithin levels are adequate
for appropriate lung function in the newborn baby. This is
particularly important in fetuses being carried by diabetic mothers,
who frequently produce newborns with such problems. Often, when
the risk of respiratory distress syndrome is identified through amniocentesis,
the fetus in question is injected with hormones that help it
produce mature lungs. This effect is then confirmed by the repeated
use of amniocentesis. Many other problems can also be identified by
the use of amniocentesis and corrected before the baby is born.





Consequences


In the years that have followed Bevis’s original observation, many
improvements in the methodology of amniocentesis and in the techniques
used in gathering and analyzing the genetic and biochemical
information obtained have led to good results. Hundreds of debilitating
hereditary diseases can be diagnosed, and some ameliorated, by
the examination of amniotic fluid and fetal cells isolated by amniocentesis.
For many parents who have had a child afflicted by some hereditary
disease, the use of the technique has become a major consideration
in family planning. Furthermore, many physicians recommend strongly
that all mothers over the age of thirty-four be tested by amniocentesis
to assist in the diagnosis of Down syndrome, a congenital but nonhereditary
form of mental deficiency.
There remains the question of whether such solutions are morally
appropriate, but parents—and society—now have a choice resulting
from the techniques that have developed since Bevis’s 1952
observation. It is also hoped that these techniques will lead to
means for correcting and preventing diseases and preclude the need
for considering the therapeutic termination of any pregnancy.





See also: Abortion pill; Birth control pill; CAT scanner; Electrocardiogram; Electroencephalogram; Mammography; Nuclear magnetic resonance; Pap test





Wednesday, November 19, 2008

Ammonia



The invention:



The first successful method for converting nitrogen
from the atmosphere and combining it with hydrogen to synthesize
ammonia, a valuable compound used as a fertilizer.


The person behind the invention:


Fritz Haber (1868-1934), a German chemist who won the 1918
Nobel Prize in Chemistry









The Need for Nitrogen


The nitrogen content of the soil, essential to plant growth, is
maintained normally by the deposition and decay of old vegetation
and by nitrates in rainfall. If, however, the soil is used extensively
for agricultural purposes, more intensive methods must be used to
maintain soil nutrients such as nitrogen. One such method is crop
rotation, in which successive divisions of a farm are planted in rotation
with clover, corn, or wheat, for example, or allowed to lie fallow
for a year or so. The clover is able to absorb nitrogen from the air and
deposit it in the soil through its roots. As population has increased,
however, farming has become more intensive, and the use of artificial
fertilizers—some containing nitrogen—has become almost universal.
Nitrogen-bearing compounds, such as potassium nitrate and
ammonium chloride, have been used for many years as artificial fertilizers.
Much of the nitrate used, mainly potassium nitrate, came
from Chilean saltpeter, of which a yearly amount of half a million
tons was imported at the beginning of the twentieth century into
Europe and the United States for use in agriculture. Ammonia was
produced by dry distillation of bituminous coal and other low-grade
fuel materials. Originally, coke ovens discharged this valuable
material into the atmosphere, but more economical methods
were found later to collect and condense these ammonia-bearing
vapors.
At the beginning of the twentieth century, Germany had practically
no source of fertilizer-grade nitrogen; almost all of its supply
came from the deserts of northern Chile. As demand for nitrates increased,
it became apparent that the supply from these vast deposits
would not be enough. Other sources needed to be found, and the almost
unlimited supply of nitrogen in the atmosphere (80 percent nitrogen)
was an obvious source.





Temperature and Pressure
When Fritz Haber and his coworkers began their experiments on ammonia
production in 1904, Haber decided to repeat the experiments
of the British chemists Sir William Ramsay and Sydney Young, who
in 1884 had studied the decomposition of ammonia at about 800 degrees
Celsius. They had found that a certain amount of ammonia
was always left undecomposed. In other words, the reaction between
ammonia and its constituent elements—nitrogen and hydrogen—
had reached a state of equilibrium.
Haber decided to determine the point at which this equilibrium
took place at temperatures near 1,000 degrees Celsius. He tried several
approaches, reacting pure hydrogen with pure nitrogen, and
starting with pure ammonia gas and using iron filings as a catalyst.
(Catalytic agents speed up a reaction without otherwise affecting it.)
Having determined the point of equilibrium, he next tried different
catalysts and found nickel to be as effective as iron, and calcium
and manganese even better. At 1,000 degrees Celsius, the rate of reaction
was enough to produce practical amounts of ammonia continuously.
Further work by Haber showed that increasing the pressure also
increased the percentage of ammonia at equilibrium. For example,
at 300 degrees Celsius, the percentage of ammonia at equilibrium at
1 atmosphere of pressure was very small, but at 200 atmospheres,
the percentage of ammonia at equilibrium was far greater. A pilot
plant was constructed and was successful enough to impress a
chemical company, Badische Anilin-und Soda-Fabrik (BASF). BASF
agreed to study Haber’s process and to investigate different catalysts
on a large scale. Soon thereafter, the process became a commercial
success.
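
In standard textbook notation (a sketch added for clarity; the reaction
equation itself is not spelled out in this account), the equilibrium Haber
studied is

    \mathrm{N_2 + 3\,H_2 \;\rightleftharpoons\; 2\,NH_3} \qquad (\text{exothermic})

Because four molecules of gas combine to form only two, raising the pressure
shifts the equilibrium toward ammonia, which is why the yield at 200 atmospheres
was so much greater than at 1 atmosphere; and because the reaction releases heat,
lower temperatures favor ammonia but slow the reaction, which is why an effective
catalyst was essential for practical production.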





Impact


With the beginning of World War I, nitrates were needed more
urgently for use in explosives than in agriculture. After the fall of
Antwerp, 50,000 tons of Chilean saltpeter were discovered in the
harbor and fell into German hands. Because the ammonia from
Haber’s process could be converted readily into nitrates, it became
an important war resource. Haber’s other contribution to the German
war effort was his development of poison gas, which was used
for the chlorine gas attack on Allied troops at Ypres in 1915. He also
directed research on gas masks and other protective devices.
At the end of the war, the 1918 Nobel Prize in Chemistry was
awarded to Haber for his development of the process for making
synthetic ammonia. Because the war was still fresh in everyone’s
memory, it became one of the most controversial Nobel awards ever
made. A headline in The New York Times for January 26, 1920, stated:
“French Attack Swedes for Nobel Prize Award: Chemistry Honor
Given to Dr. Haber, Inventor of German Asphyxiating Gas.” In a letter
to the Times on January 28, 1920, the Swedish legation in Washington,
D.C., defended the award.
Haber left Germany in 1933 under duress from the anti-Semitic
policies of the Nazi authorities. He was invited to accept a position
with the University of Cambridge, England, and died on a trip to
Basel, Switzerland, a few months later, a great man whose spirit had
been crushed by the actions of an evil regime.







Fritz Haber





Fritz Haber’s career is a warning to inventors: Beware of
what you create, even if your intentions are honorable.
Considered a leading chemist of his age, Haber was born in
Breslau (now Wroclaw, Poland) in 1868. A brilliant student, he
earned a doctorate quickly, specializing in organic chemistry,
and briefly worked as an industrial chemist. Although he soon
took an academic job, throughout his career Haber believed
that science must benefit society—new theoretical discoveries
must find practical applications.

Beginning in 1904, he applied new chemical techniques
to fix atmospheric nitrogen in the form of ammonia.
Nitrogen in the form of nitrates was urgently
sought because nitrates were necessary to fertilize
crops and natural sources were becoming rare. Only
artificial nitrates could sustain the amount of agriculture
needed to feed expanding populations.

 In 1908 Haber succeeded in finding an efficient, cheap process
to make ammonia and convert it to nitrates, and
by 1910 German manufacturers had built large plants
to exploit his techniques.

He was lauded as a great benefactor to humanity.
However, his efforts to help Germany during World War I,
even though he hated war, turned his life into a nightmare. His
wife committed suicide because of his chlorine gas research,
which also poisoned his international reputation and tainted
his 1918 Nobel Prize in Chemistry. After the war he redirected
his energies to helping Germany rebuild its economy. Eight
years of experiments in extracting gold from seawater ended in
failure, but he did raise the Kaiser Wilhelm Institute for Physical
Chemistry, which he directed, to international prominence.
Nonetheless, Haber had to flee Adolf Hitler’s Nazi regime in
1933 and died a year later, better known for his war research
than for his fundamental service to agriculture and industry.





See also: Fuel cell; Refrigerant gas; Silicones

Monday, November 17, 2008

Alkaline storage battery





The invention:



The nickel-iron alkaline battery was a lightweight,
inexpensive portable power source for vehicles with electric motors.



The people behind the invention:


Thomas Alva Edison (1847-1931), American chemist, inventor,
and industrialist
Henry Ford (1863-1947), American inventor and industrialist
Charles F. Kettering (1876-1958), American engineer and
inventor







A Three-Way Race


The earliest automobiles were little more than pairs of bicycles
harnessed together within a rigid frame, and there was little agreement
at first regarding the best power source for such contraptions.
The steam engine, which was well established for railroad and ship
transportation, required an external combustion area and a boiler.
Internal combustion engines required hand cranking, which could
cause injury if the motor backfired. Electric motors were attractive
because they did not require the burning of fuel, but they required
batteries that could store a considerable amount of energy and
could be repeatedly recharged. Ninety percent of the motorcabs in
use in New York City in 1899 were electrically powered.
The first practical storage battery, which was invented by the
French physicist Gaston Planté in 1859, employed electrodes (conductors
that bring electricity into and out of a conducting medium)
of lead and lead oxide and a sulfuric acid electrolyte (a solution
that conducts electricity). In somewhat improved form, this
remained the only practical rechargeable battery at the beginning
of the twentieth century. Edison considered the lead acid cell (battery)
unsuitable as a power source for electric vehicles because using
lead, one of the densest metals known, resulted in a heavy
battery that added substantially to the power requirements of a
motorcar. In addition, the use of an acid electrolyte required that
the battery container be either nonmetallic or coated with a nonmetal,
which made it less dependable than a steel container.





The Edison Battery


In 1900, Edison began experiments aimed at developing a rechargeable
battery with inexpensive and lightweight metal electrodes and an
alkaline electrolyte so that a metal container could be used. He had already
been involved in manufacturing the nonrechargeable battery
known as the Lalande cell, which had zinc and copper oxide electrodes
and a highly alkaline sodium hydroxide electrolyte. Zinc electrodes
could not be used in a rechargeable cell because the zinc would
dissolve in the electrolyte. The copper electrode also turned out to be
unsatisfactory. After much further experimentation, Edison settled
on the nickel-iron system for his new storage battery. In this system,
the power-producing reaction involved the conversion of nickel oxide
to nickel hydroxide together with the oxidation of iron metal to
iron oxide, with both materials in contact with a potassium hydroxide
solution. When the battery was recharged, the nickel hydroxide
was converted into oxide and the iron oxide was converted back to
the pure metal. Although the basic ingredients of the Edison cell were
inexpensive, they could not readily be obtained in adequate purity
for battery use.
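
In its usual textbook form (a sketch for clarity; this account loosely
describes the charged iron product as an oxide, whereas the conventional
discharge products are written as hydroxides), the overall cell reaction is

    \mathrm{Fe + 2\,NiO(OH) + 2\,H_2O \;\longrightarrow\; Fe(OH)_2 + 2\,Ni(OH)_2} \qquad (\text{discharge})

Charging drives the reaction in the reverse direction, and because the
potassium hydroxide electrolyte is alkaline rather than acidic, a steel
container can be used.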

Edison set up a new chemical works to prepare the needed materials.
He purchased impure nickel alloy, which was then dissolved
in acid, purified, and converted to the hydroxide. He prepared
pure iron powder by using a multiple-step process. For use
in the battery, the reactant powders had to be packed in pockets
made of nickel-plated steel that had been perforated to allow
the iron and nickel powders to come into contact with the electrolyte.
Because the nickel compounds were poor electrical conductors,
a flaky type of graphite was mixed with the nickel hydroxide at
this stage.
Sales of the new Edison storage battery began in 1904, but within
six months it became apparent that the battery was subject to losses
in power and a variety of other defects. Edison took the battery off

the market in 1905 and offered full-price refunds for the defective
batteries. Not a man to abandon an invention, however, he spent the
next five years examining the failed batteries and refining his design.
He discovered that the repeated charging and discharging of
the battery caused a shift in the distribution of the graphite in the
nickel hydroxide electrode. By using a different type of graphite, he
was able to eliminate this problem and produce a very dependable
power source.
The Ford Motor Company, founded by Henry Ford, a former
Edison employee, began the large-scale production of gasoline-powered
automobiles in 1903 and introduced the inexpensive, easy-to-drive
Model T in 1908. The introduction of the improved Edison
battery in 1910 gave a boost to electric car manufacturers, but their
new position in the market would be short-lived. In 1911, Charles
Kettering invented an electric starter for gasoline-powered vehicles
that eliminated the need for troublesome and risky hand cranking.
By 1915, this device was available on all gasoline-powered automobiles,
and public interest in electrically powered cars rapidly diminished.
Although the Kettering starter required a battery, it required
much less capacity than an electric motor would have and was almost
ideally suited to the six-volt lead-acid battery.





Impact


Edison lost the race to produce an electrical power source that
would meet the needs of automotive transportation. Instead, the internal
combustion engine developed by Henry Ford became the standard.
Interest in electrically powered transportation diminished as
immense reserves of crude oil, from which gasoline could be obtained,
were discovered first in the southwestern United States and
then on the Arabian peninsula. Nevertheless, the Edison cell found
a variety of uses and has been manufactured continuously throughout
most of the twentieth century much as Edison designed it.
Electrically powered trucks proved to be well suited for local deliveries,
and some department stores maintained fleets of such
trucks into the mid-1920’s. Electrical power is still preferable to internal
combustion for indoor use, where exhaust fumes are a significant
problem, so forklifts in factories and passenger transport vehicles
at airports still make use of the Edison-type power source. The
Edison battery also continues to be used in mines, in railway signals,
in some communications equipment, and as a highly reliable
source of standby emergency power.





                                                                Thomas Alva Edison








Thomas Alva Edison (1847-1931) was America’s most famous
and prolific inventor. His astonishing success story, rising
from a home-schooled child who worked as a newsboy to
a leader in American industry, was celebrated in children’s
books, biographies, and movies. Corporations still bear his
name, and his inventions and improvements of others’ inventions—
such as the light bulb, phonograph, and motion picture—
shaped the way Americans live, work, and entertain
themselves. The U.S. Patent Office issued Edison 1,093 patents
during his lifetime, the most granted to one person.
Hailed as a genius, Edison himself emphasized the value of
plain determination. “Genius is one percent inspiration and 99
percent perspiration,” he insisted. He also understood the value
of working with others. In fact, one of his greatest contributions
to American technology involved organized research. At age
twenty-three he sold the rights to his first major invention,
an improved ticker-tape machine for Wall Street brokers, for
$40,000. He invested the money in building an industrial research
laboratory, the first ever. It led to his large facilities at
Menlo Park, New Jersey, and, later, labs in other locations. At
times as many as one hundred people worked for him, some of
whom, such as Nikola Tesla and Reginald Fessenden, became
celebrated inventors in their own right.
At his labs Edison not only developed electrical items, such
as the light bulb and storage battery; he also produced an efficient
mimeograph and worked on innovations in metallurgy,
organic chemistry, photography and motion pictures, and phonography.
The phonograph, he once said, was his favorite invention.
Edison never stopped working. He was still receiving patents
the year he died.

Saturday, November 15, 2008

Airplane





 The invention:



The first heavier-than-air craft to fly, the airplane
revolutionized transportation and symbolized the technological
advances of the twentieth century.


The people behind the invention:


Wilbur Wright (1867-1912), an American inventor
Orville Wright (1871-1948), an American inventor
Octave Chanute (1832-1910), a French-born American civil
engineer









 A Careful Search


Although people have dreamed about flying since the time of the
ancient Greeks, it was not until the late eighteenth century that hot-air
balloons and gliders made human flight possible. It was not until
the late nineteenth century that enough experiments had been done
with kites and gliders that people could begin to think seriously
about powered, heavier-than-air flight. Two of these people were
Wilbur and Orville Wright.

The Wright brothers were more than just tinkerers who accidentally
found out how to build a flying machine. In 1899, Wilbur wrote
the Smithsonian Institution for a list of books to help them learn
about flying. They used the research of people such as George
Cayley, Octave Chanute, Samuel Langley, and Otto Lilienthal to
help them plan their own experiments with birds, kites, and gliders.
They even built their own wind tunnel. They never fully trusted the
results of other people’s research, so they repeated the experiments
of others and drew their own conclusions. They shared these results
with Octave Chanute, who was able to offer them lots of good advice.
They were continuing a tradition of excellence in engineering
that began with careful research and avoided dangerous trial and
error.



Slow Success


Before the brothers had set their minds to flying, they had built
and repaired bicycles. This was a great help to them when they put
their research into practice and actually built an airplane. From
building bicycles, they knew how to work with wood and metal to
make a lightweight but sturdy machine. Just as important, from riding
bicycles, they got ideas about how an airplane needed to work.
They could see that both bicycles and airplanes needed to be fast
and light. They could also see that airplanes, like bicycles, needed to
be kept under constant control to stay balanced, and that this control
would probably take practice. This was a unique idea. Instead
of building something solid that was controlled by levers and wheels
like a car, the Wright brothers built a flexible airplane that was controlled
partly by the movement of the pilot, like a bicycle.
The result was the 1903 Wright Flyer. The Flyer had two sets of
wings, one above the other, which were about 12 meters from tip to
tip. They made their own 12-horsepower engine, as well as the two
propellers the engine spun. The craft had skids instead of wheels.
On December 14, 1903, the Wright brothers took the Wright Flyer to
the shores of Kitty Hawk, North Carolina, where Wilbur Wright
made the first attempt to fly the airplane.
The first thing Wilbur found was that flying an airplane was not
as easy as riding a bicycle. One wrong move sent him tumbling into
the sand only moments after takeoff. Wilbur was not seriously hurt,
but a few more days were needed to repair the Wright Flyer.
On December 17, 1903, at 10:35 a.m., after eight years of research
and planning, Orville Wright took to the air for a historic twelve
seconds. He covered 37 meters of ground and 152 meters of air space.
Both brothers took two flights that morning. On the fourth flight,
Wilbur flew for fifty-nine seconds over 260 meters of ground and
through more than 800 meters of air space. After he had landed, a
sudden gust of wind struck the plane, damaging it beyond repair.
Yet no one was able to beat their record for three years.



 Impact


Those first flights in 1903 got little publicity. Only a few people,
such as Octave Chanute, understood the significance of the Wright
brothers’ achievement. For the next two years, they continued to
work on their design, and by 1905 they had built the Wright Flyer III.
Although Chanute tried to get them to enter flying contests, the
brothers decided to be cautious and try to get their machine patented
first, so that no one would be able to steal their ideas.
News of their success spread slowly through the United States
and Europe, giving hope to others who were working on airplanes
of their own. When the Wright brothers finally went public with the
Wright Flyer III, they inspired many new advances. By 1910, when
the brothers started flying in air shows and contests, their feats were
matched by another American, Glenn Hammond Curtiss. The age of
the airplane had arrived.
Later in the decade, the Wright brothers began to think of military
uses for their airplanes. They signed a contract with the U.S.
Army Signal Corps and agreed to train military pilots.
Aside from these achievements, the brothers from Dayton, Ohio,
set the standard for careful research and practical experimentation.
They taught the world not only how to fly but also how to design
airplanes. Indeed, their methods of purposeful, meaningful, and
highly organized research had an impact not only on airplane design
but also on the field of aviation science in general.





The Wright Brothers











Orville and his older brother Wilbur first got interested in
aircraft when their father gave them a toy helicopter in 1878.
Theirs was a large, supportive family. Their father, a minister,
and their mother, a college graduate and inventor of household
gadgets, encouraged all five of the children to be creative. Although Wilbur,
born in 1867, was four years older than Orville,
they were close as children. While in high school, they put out a
weekly newspaper together, West Side News, and they opened
their bicycle shop in 1892. Orville was the mechanically adept
member of the team, the tinkerer; Wilbur was the deliberative
one, the planner and designer.
Since the bicycle business was seasonal, they had time to
pursue their interest in aircraft, puzzling out the technical problems
and studying the successes and failures of others. They
started with gliders, flying their first, which had a five-foot
wing span, in 1899. They developed their own technique to control
the gliders, the “wing-warping technique,” after watching
how birds fly. They attached wires to the trailing edges of the
wings and pulled the wires to deform the wings’ shape. They
built a sixteen-foot glider in 1900 and spent a vacation in North
Carolina gaining flying experience. Further designs and many
more tests followed, including more than two hundred shapes
of wing studied in their home-built wind tunnel, before their
first successful engine-powered flight in 1903.
Neither man ever married. After Wilbur died of typhoid in
1912, Orville was stricken by the loss of his brother but continued
to run their business until 1915. He last piloted an airplane
himself in 1918 and died thirty years later.
Their first powered airplane, the Wright Flyer, lives on at the
National Air and Space Museum in Washington, D.C. Small
parts from the aircraft were taken to the Moon by Neil Armstrong
and Edwin Aldrin when they made the first landing
there in 1969.



See also here!

Friday, November 14, 2008

Abortion pill






The invention:

RU-486 was the first commercially available drug
that prevented fertilized eggs from implanting themselves in the
walls of women’s uteruses.

The people behind the invention:

Étienne-Émile Baulieu (1926- ), a French biochemist and endocrinologist
Georges Teutsch, a French chemist

Alain Bélanger, a French chemist

Daniel Philibert, a French physicist and pharmacologist







Developing and Testing



In 1980, Alain Bélanger, a research chemist, was working with
Georges Teutsch at Roussel Uclaf, a French pharmaceutical company.
Teutsch and Bélanger were interested in understanding how
changes in steroids affect the chemicals’ ability to bind to their steroid
receptors. (Receptors are molecules on cells that can bind with
certain chemical substances such as hormones. Receptors therefore
act as connecting links to promote or prevent specific bodily activities
or processes.) Bélanger synthesized several steroids that bonded
to steroid receptors. Among these steroids was a compound that
came to be called “RU-486.”

Another member of the research project, Daniel Philibert, found
that RU-486 blocked the activities of progesterone by binding tightly
to the progesterone receptor. Progesterone is a naturally occurring
steroid hormone that prepares the wall of the uterus to accept a fertilized
egg. Once this is done, the egg can become implanted and
can begin to develop. The hormone also prevents the muscles of the
uterus from contracting, which might cause the uterus to reject the
egg. Therefore RU-486, by acting as a kind of shield between hormone
and receptor, essentially stopped the progesterone from doing
its job.

At the time, Teutsch’s group did not consider that RU-486 might
be useful for deliberately interrupting human pregnancy. It was
Étienne-Émile Baulieu, a biochemist and endocrinologist and a consultant
for Roussel Uclaf, who made this connection. He persuaded
the company to test RU-486 for its effects on fertility control.

Many tests were performed on rabbits, rats, and monkeys; they
showed that, even in the presence of progesterone, RU-486 could
prevent secretory tissue from forming in the uterus, could change
the timing of the menstrual cycle, and could terminate a pregnancy—
that is, cause an abortion. The compound also seemed to be
nontoxic, even in high doses.

In October of 1981, Baulieu began testing the drug with human
volunteers. By 1985, major tests of RU-486 were being done in
France, Great Britain, The Netherlands, Sweden, and China. When a
relatively low dose of RU-486 was given orally, there was an 85 percent
success rate in ending pregnancy; the woman’s body expelled
the embryo and all the endometrial surface. Researchers found that
if a low dose of a prostaglandin (a hormonelike substance that
causes the smooth muscles of the uterus to contract, thereby expelling
the embryo) was given two days later, the success rate rose to 96
percent. There were few side effects, and the low doses of RU-486
did not interfere with the actions of other steroid hormones that are
necessary to keep the body working.

In the March, 1990, issue of The New England Journal of Medicine,
Baulieu and his coworkers reported that with one dose of RU-486,
followed in thirty-six to forty-eight hours with a low dose of prostaglandin,
96 percent of the 2,040 women they studied had a complete
abortion with few side effects. The women were monitored after receiving
the prostaglandin to watch for side effects, which included
nausea, vomiting, abdominal pain, and diarrhea. When they returned
for a later checkup, fewer than 2 percent of the women complained
of side effects. The researchers used two different prostaglandins;
they found that one caused a quicker abortion but also
brought about more pain and a longer period of bleeding.



Using the Drug



In September, 1988, the French government approved the distribution
of RU-486 for use in government-controlled clinics. The next
month, however, Roussel Uclaf stopped selling the drug because
people opposed to abortion did not want RU-486 to be available and
were threatening to boycott the company.

Then, however, there were threats and pressure from the other
side. For example, members of the World Congress of Obstetrics
and Gynecology announced that they might boycott Roussel Uclaf
if it did not make RU-486 available. The French government, which
controlled a 36 percent interest in Roussel Uclaf, ordered the company
to start distributing the drug once more.

By the fall of 1989, more than one-fourth of all early abortions in
France were being done with RU-486 and a prostaglandin. The French
government began helping to pay the cost of using RU-486 in 1990.

Testing for approval of RU-486 was completed in Great Britain
and The Netherlands, but Roussel Uclaf’s parent company, Hoechst
AG, did not try to market the drug there or in any other country outside
France. (In the United States, government regulations did not
allow RU-486 to be tested using government funds.)

Medical researchers believe that RU-486 may be useful not only
for abortions but also in other ways. For example, it may help in
treating certain breast cancers and other tumors. RU-486 is also being
investigated as a possible treatment for glaucoma—to lower
pressure in the eye that may be caused by a high level of steroid hormone.
It may be useful in promoting the healing of skin wounds
and softening the cervix at birth, easing delivery. Researchers hope
as well that some form of RU-486 may prove useful as a contraceptive—
that is, not to prevent a fertilized egg from implanting itself in
the mother’s uterus but to prevent ovulation in the first place.



Impact



Groups opposed to abortion rights have spoken out against RU-486,
while those who favor the right to abortion have urged its acceptance.
The drug has been approved for use in China as well as in
France. In the United States, however, the government has avoided
giving its approval to the drug. Officials of the World Health Organization
(WHO) have argued that RU-486 could prevent the deaths
of women who undergo botched abortions. Under international
law, WHO has the right to take control of the drug and make it available
in poor countries at low cost. Because of the controversy surrounding
the drug, however, WHO called for more testing to ensure
that RU-486 is quite safe for women.



                                             Étienne-Émile Baulieu







 Étienne-Émile Baulieu was born in Strasbourg, France, in
1926. He moved to Paris for his advanced studies at the Faculty
of Medicine and Faculty of Science of Pasteur College. He was
an Intern of Paris from 1951 until he received a medical degree
in 1955. He passed examinations qualifying him to become a
teacher at state schools in 1958 and during the 1961-1962 academic
year was a visiting scientist in Columbia University’s
Department of Biochemistry.
In 1963 Baulieu was made a Doctor of Science and appointed
director of a research unit at France’s National Institute of
Health and Medical Science, a position he held until he retired
in 1997. He also served as Head of Service of Hormonal Biochemistry
of the Hospital of Bicêtre (1970-1997), professor of
biochemistry at University of Paris-South (1970-1993), and consultant
for Roussel Uclaf (1963-1997).
Among his many honors are the Gregory Pincus Memorial
Award (1978), awards from the National Academy of Medicine,
the Christopher Columbus Discovery Award in Biomedical Research
(1992), the Joseph Bolivar DeLee Humanitarian Award
(1994), and Commander of the Legion of Honor (1990). Although
busy with research and teaching duties, Baulieu was on
the editorial board of several French and international newspapers,
a member of scientific councils, and a participant in the
Special Program in Human Reproduction of the World Health
Organization.



See also here!