Something of great importance for everyone who visits this blog.
The medical discoveries of the past 1,000 years have saved countless lives and have doubled the life expectancy of people in many parts of the world. In this October 1999 Encarta Yearbook article, physicians Meyer Friedman and Gerald W. Friedland explore ten discoveries that fundamentally changed the way scientists and physicians were able to improve human health.
The Greatest Medical Discoveries of the Millennium
By Meyer Friedman and Gerald W. Friedland
During the last 1,000 years, a series of monumental discoveries revolutionized the practice of medicine. These discoveries have saved millions of lives and brought about remarkable improvements in the health of entire populations.
One thousand years ago the average person in Europe could expect to live about 30 years. Of 100 children born alive, 40 would die before their first birthday. Disease and infections, largely the result of squalid living conditions, claimed untold numbers of lives. The average life span throughout the industrialized West has now more than doubled the figure of the year 1000, reaching about 76.5 years for babies born in 1997 in the United States. Infant mortality rates are now a mere fraction of what they were 1,000 years ago.
Medical breakthroughs during the next millennium will probably bring even longer, healthier lives. Advances in genetics, for example, offer hope of new treatments to cure serious diseases such as cancer, eliminate genetic defects from families, and possibly even slow the aging process. And recent developments in transplantation technology suggest that soon it may be possible to grow replacement organs in the laboratory. But none of these anticipated breakthroughs would be possible—even thinkable—without the pioneering medical discoveries of the last 1,000 years.
Sorting through literally thousands of medical discoveries made during the past millennium to determine the ten most important is a challenge. How is the significance of a discovery measured? One factor is paramount: The most fundamental breakthroughs led to multiple other discoveries that eventually reshaped medicine and affected millions, even billions, of people. To be sure, changes in medical practices often lagged far behind the initial discovery; effective therapies sometimes emerged only after many years. These discoveries have attained the status of millennial importance not because they quickly helped physicians save lives, but because they fundamentally shifted the way scientists and physicians thought about human health. In so doing, the discoveries opened vast new fields of research that would revolutionize medicine and save the lives of incalculable numbers of people.
These seminal breakthroughs are, in the order of their discovery, human anatomy, circulation of blood, bacteria, vaccination, surgical anesthesia, X rays, blood typing, tissue culture, antibiotics, and the structure of deoxyribonucleic acid (DNA).
Setting the Stage
The practice of medicine is as old as civilization itself, but the Greeks are generally credited with inventing the science of medicine—using observation and experience rather than appeals to supernatural forces to treat disease. Although Greek medical knowledge was passed on to the conquering Romans, it fell into obscurity as the Roman Empire collapsed in the early Middle Ages.
A period of stagnation in the sciences, combined with sporadic epidemics of the bubonic plague, smallpox, and other diseases, reinforced the turn toward superstition and magical treatments in medieval Europe. Only fragments of the ancient medical learning survived. Many people viewed disease as a form of punishment for sins or as the result of demonic forces. Prayer was a standard form of treatment.
Western medicine received a major boost when the Italian universities of Salerno, Bologna, and Padua established medical faculties in the 9th and 10th centuries. By the 12th century, the University of Paris in France and Oxford University in England had also founded faculties of medicine. These institutions provided facilities for research, set examination requirements for graduating physicians, and laid the foundations for the extraordinary revival of Western medicine in the 16th and 17th centuries—a revival that has continued to this day.
Human Anatomy
Before modern medical science could emerge, medical practitioners needed an accurate understanding of human anatomy. Without clear descriptions of the structure of the human body, it was impossible to learn what different bodily parts actually do. Once researchers understood how parts of the body worked, they were better able to devise medical therapies to restore proper functions.
Amazingly, no one knew much about human anatomy until 1543, when the Belgian anatomist Andreas Vesalius wrote De Humani Corporis Fabrica, Libri Septem (On the Structure of the Human Body, in Seven Books). For centuries, exploration of the anatomy of human corpses was forbidden. In medieval Europe, knowledge about anatomy was based largely on the teachings of the Roman physician Galen (129-199?). Galen’s anatomical descriptions were based on dissections of animals, which differ in many ways from humans. But contradicting Galen was dangerous because the powerful Roman Catholic Church accepted his findings as gospel. A few brave souls had tried to correct some of Galen’s errors, but their work was lost for centuries.
Ambitious, driven, and ruthless, 23-year-old Vesalius received his medical degree in 1537 from the University of Padua and was immediately appointed head of surgery and anatomy there. As a student, and later as a scientist, he recovered human corpses from cemeteries late at night. He even encouraged his students to note patients who were at death’s door so that he could steal their bodies for dissection before they were buried. Vesalius slept, night after night, with corpses in his own bedroom, and he hired Italy’s greatest artists to draw what he found.
In 1543 Vesalius completed his seven-book masterpiece, richly illustrated with more than 200 magnificent drawings. Many consider it one of the greatest medical books ever published. This monumental work gave medicine a precious gift: For the first time, human anatomy was based on careful dissection and observation rather than on a rigid orthodoxy rooted in ancient texts.
Circulation of Blood
English physician William Harvey’s discovery of what the heart does and how the blood circulates is widely regarded as the single greatest medical achievement of all time: It established the principle of doing experiments in medicine to learn how the body’s organs and tissues function. Published in 1628, Harvey’s groundbreaking book Exercitatio Anatomica de Motu Cordis et Sanguinis in Animalibus (Anatomical Essay on the Motion of the Heart and Blood in Animals) spurred research into the mechanical functions of many bodily processes, including respiration, digestion, metabolism, and reproduction.
Harvey received his medical degree from the University of Padua, where he learned one very important fact: Veins have valves that permit blood to travel in only one direction. However, the exact role of the valves was unclear.
Realizing that it was still dangerous to contradict Galen, who had claimed that the liver not only makes the body’s blood but also pumps it through the body, Harvey decided to study blood flow by operating on live animals. For a period of 12 years Harvey conducted his experiments before members of the influential Royal College of Physicians in London, England. He wanted their support for his book, which praised Galen while challenging many of his ideas.
In the 8th chapter of his 17-chapter book, Harvey carefully introduced the revolutionary idea that blood goes in a circle in the body, traveling from the heart to the arteries to the veins and back to the heart. The next 9 chapters proved, in wonderfully clear English, that he was right.
In a series of brilliant experiments in animals and humans, Harvey demonstrated how blood circulates in the body. When an artery was blocked, the veins draining this artery collapsed. When a vein was blocked, it swelled below the blockage and collapsed above it, but the swelling disappeared when the blockage was removed. He also showed that the valves in veins allow blood to flow only in the direction of the heart. Together, these discoveries proved that blood moves in a circle in the body—that is, there is a ‘circulation.’
Bacteria
After the momentous medical breakthroughs of Vesalius and Harvey came the 17th-century discovery of one of the human body’s greatest enemies: bacteria. This discovery eventually led to the realization that exposure to certain microorganisms could cause disease. It also prompted new theories of antiseptics that sharply lowered mortality rates from surgery.
Antoni van Leeuwenhoek, a part-time janitor and haberdasher working in Delft, Holland, discovered bacteria and other microorganisms using a microscope that he built himself. Through the influence of a friend, a Dutch physician, he was invited to write letters to the Royal Society of London—a group dedicated to the advancement of science. These letters were translated from Dutch into English and published in the society’s journal Philosophical Transactions.
Leeuwenhoek’s most famous letter was published on March 16, 1677. In this letter he described looking at a drop of rainwater through his microscope. The drop was taken from a tub where it had been allowed to stand for several days. To his amazement, he saw exceedingly tiny animals, known today as protozoa, swimming in the water. He also observed other equally small animals that did not move at all, now known as bacteria. No one at the Royal Society knew anything about these little creatures, which Leeuwenhoek called animalcules. At the request of the stunned members of the Royal Society, several of the most respected citizens of Delft were asked to verify Leeuwenhoek’s microscopic findings. They did so, and in 1680 Leeuwenhoek was elected a fellow of the prestigious Royal Society.
Later discoveries extended the significance of Leeuwenhoek’s work, especially the superb finding by the German scientist Robert Koch in 1876. Koch found that the microscopic anthrax bacillus could actually cause a fatal human disease. Until Koch’s discovery, many scientists thought it absurd that microscopic creatures could harm much larger animals, such as humans. In 1882 Koch showed that another kind of bacterium, the tubercle bacillus, caused tuberculosis, a discovery for which he won the Nobel Prize in 1905.
Unlike Koch, who was a country physician when he made his epochal discoveries, the French chemist and biologist Louis Pasteur disliked physicians so much that he would not have them as workers in his laboratory. Despite his disdain for physicians, he was deeply fascinated by diseases of various kinds. Pasteur discovered that putrefaction (a decomposition of organic substances) is caused by microorganisms that float in the air. Pasteur learned that he could prevent putrefaction by subjecting organic substances to moderate but not extreme heat, a process known as pasteurization.
In 1865 Joseph Lister, an English surgeon, read of Pasteur’s research on putrefaction. Lister recalled that whereas simple bone fractures invariably healed, compound fractures (bone fractures bursting the skin) almost always began to putrefy. Lister was certain that this dangerous, infectious process was caused by the wretched microorganisms that Pasteur had described. To test his theory, Lister covered his patients’ compound fractures, previously exposed to the air, with linen strips soaked in carbolic acid. He believed the carbolic acid might exterminate the airborne microorganisms.
Lister treated compound fractures and open surgical wounds with carbolic acid for nine months. During this time, he did not observe a single infection in his surgical ward. The results of his experiments, published in 1867, gave rise to antiseptic surgery. Although Lister’s antiseptic technique initially encountered resistance from other physicians, it soon became widely accepted, and deaths due to infections in the operating room plummeted.
Vaccination
Smallpox, once a common viral infection that claimed millions of lives in periodic epidemics around the globe, is now considered fully eradicated. Much of the credit for this medical triumph belongs to the English physician Edward Jenner, who in 1796 developed the first effective vaccine against smallpox. Jenner’s discovery laid the foundations for the science of immunology. Vaccines are now used to control and prevent diphtheria, hepatitis, influenza, meningitis, polio, tetanus, typhoid fever, whooping cough, and many other diseases that once plagued humankind.
In Jenner’s day a procedure called variolation was used to protect people against smallpox. The procedure involved scratching a bit of the substance from smallpox blisters—obtained from a person with a mild case of the disease—into a healthy person’s arm. The hope was that a mild case of the disease would develop and pass, but the procedure was often deadly.
Jenner, orphaned at the age of five, was born and raised in the tiny English village of Berkeley, near Bristol. At the age of 13, Jenner was apprenticed to a country surgeon. Shortly after, milkmaids told him that after they contracted cowpox, a harmless disease confined usually to their hands and arms, they never got smallpox.
Following his training with the famous surgeon John Hunter in London, Jenner returned to Berkeley and devised an experiment to learn whether cowpox could protect against smallpox. On May 14, 1796, Jenner made two small scratches on the arm of an eight-year-old boy named James Phipps. On those scratches he rubbed fluid from a milkmaid’s cowpox blister. Eight days later, Phipps developed small cowpox blisters on the scratches. On July 1 Jenner variolated Phipps with fluid from a smallpox blister. Phipps never got even a mild case of smallpox.
Jenner had made two important discoveries: Cowpox protects against smallpox, and cowpox could be transmitted from person to person. He subsequently vaccinated another eight children, including his own son, experimenting further with his new technique. In 1798 Jenner submitted his findings to Philosophical Transactions, but his work was rejected. After further experiments, he published his results himself, paying for the printing.
Vaccination was initially viewed as unnatural, and the technique encountered significant opposition for decades. More than 80 years passed before Pasteur, drawing on Jenner’s work, opened the way for the development of modern preventive vaccines. In the end, however, Jenner received an honorary degree from Oxford University for his groundbreaking work.
Surgical Anesthesia
Until the discovery of anesthesia by Crawford Williamson Long in 1842, surgery was an excruciating ordeal, usually attempted only in cases of dire injury or illness. Some patients used alcohol or opium to lessen the pain; others recited verse. Passing out was a blessing. Surgeons worked at breakneck speed to get in and out of the patient’s body as quickly as possible. Anesthesia changed all this, permitting surgeons to work at a slower, more careful pace.
The Spaniard Lullius discovered ether, an organic solvent, in 1275, but its anesthetic properties were unknown. In the early 1800s people inhaled ether at parties to make themselves high. Long, a physician in Jefferson, Georgia, frequently prepared ether at the request of his friends. One evening he used it himself at a so-called ether frolic and badly bruised himself while he was high. Yet, Long noticed, he felt no pain.
On March 30, 1842, Long convinced James Venable, who had two cysts in his neck and was terrified at the prospect of surgery, to try ether. Venable did, and the ether made him unconscious. The operation was a success, and Venable, amazed, felt no pain at all. During the next four years, Long successfully used ether as anesthesia on eight patients.
But in 1842 the physician Charles Jackson and a dentist, William Morton, learned what Long was doing with ether and stole his secret, possibly after visiting Jefferson. Morton used ether to anesthetize two patients at Boston’s Massachusetts General Hospital on October 16, 1846, in front of an audience of famous surgeons. The results were published, and anesthesia was soon used around the world. Long received no credit for the discovery.
The claims of Jackson, Morton, Long, and others created bitter quarrels about who actually invented surgical anesthesia. The Congress of the United States even took up the matter, debating the issue for 16 years without ever deciding who first introduced ether as an anesthetic procedure. But the use of anesthesia developed rapidly. Later, scientists found new anesthetic agents, developed better methods of administering anesthetic gases, and eventually discovered the use of local anesthesia.
X Rays
The development of X-ray photography, or radiology, was an enormous leap into the future: For the first time, physicians could see inside the body without opening it. With X rays surgeons could quickly diagnose fractures, tumors, and other ailments and plan more intricate operations. As a result, surgery rapidly grew in sophistication.
Early in 1895 German physicist Wilhelm Conrad Roentgen was experimenting with a Crookes tube—a pear-shaped glass tube emptied of air with electrodes (metal wires) sealed into opposite ends of the tube. When the negative electrode, or cathode, received a high-voltage electric current, it glowed white-hot and emitted a stream of invisible, electrically charged particles called cathode rays. These rays moved towards the positive electrode, or anode, of the tube. If only a little air remained inside the Crookes tube, the cathode rays striking the glass at the other end of the tube would produce a yellow-green fluorescence.
Roentgen’s experiments had confirmed another physicist’s observation that cathode rays could pass through an aluminum-covered window in the wall of a Crookes tube. To discover whether cathode rays could also go through the glass wall of a Crookes tube, he placed a piece of paper coated with a barium salt near the tube’s anode. Such paper was known to fluoresce when hit with cathode rays. Expecting the fluorescence to be faint, he covered the tube with black cardboard to block the tube’s own glow, and he darkened his laboratory.
While testing the tube to see whether any fluorescence was visible through the cardboard, Roentgen noticed a strange glow some distance away. Lighting a match, he discovered that the glow was coming from another piece of coated paper about a meter away from the Crookes tube. Repeatedly turning his tube on and off, he learned that the paper glowed only when the tube was on. The paper still fluoresced when he moved it even farther away, and even when he shielded it, first with a pack of cards and then with a book.
Roentgen knew that cathode rays were not strong enough to cause this distant fluorescence. There could only be one explanation: The Crookes tube was producing a previously unknown kind of electromagnetic radiation, which he later called X rays. Further experiments showed that these new X rays would not pass through lead and would pass only partly through other metals.
In December 1895 he X-rayed his own fingers holding a small lead pipe. To his astonishment, the developed pictures revealed not only the shadow of the pipe but also the bones of two of his fingers. He later X-rayed the left hand of his wife, Bertha, who had two gold rings on her fourth finger, and showed the terrified woman a picture of her own bones, complete with rings.
Roentgen’s preliminary report, entitled On a New Kind of Rays, was published only a few days after he submitted it, a record for rapid publication in science. In 1901 Roentgen became the first scientist ever to receive the Nobel Prize in physics.
X rays were used immediately for medical diagnosis but did not become a routine procedure until the 1920s. In the decades that followed, technological advances allowed X rays to outline individual organs and organ systems, as well as arteries and veins. In 1972 researchers developed the computerized axial tomography (CAT) scan, a sophisticated X-ray technology that produces computer-generated cross-sectional views of the body.
Blood Typing
At the dawn of the 20th century, the Austrian physician Karl Landsteiner made the extraordinary discovery that human blood could be grouped into several different types. This discovery made possible the transfer of blood from one human to another—a medical breakthrough that has saved countless lives.
Prior to Landsteiner’s pioneering work, there were few reports on the transfer of blood from one human to another. In 1668 Jean Baptiste Denis, the French physician to King Louis XIV, dared to transfuse a man with sheep’s blood. The man eventually died and Denis was arrested for murder. Transfusions were quickly banned in France and England. Other attempted transfusions using human blood were frequently unsuccessful, and patients often died due to blood incompatibility.
In 1900 Landsteiner made the brilliant observation that human blood contains what he called isoagglutinins. These proteins are capable of agglutinating, or clumping, the red blood cells of blood samples containing isoagglutinins different from their own. He thus was able to divide blood into three types: A, B, and O. A rare fourth type, AB, was later discovered.
Landsteiner showed that the sera of two blood samples containing the same isoagglutinins would not agglutinate the red cells of either blood. This discovery permitted the development of a system for safe blood transfusions. For this gift to humankind, Landsteiner received the Nobel Prize in physiology or medicine in 1930.
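Landsteiner's rule can be sketched as a small compatibility check. The following is a deliberately simplified model of the ABO system only (it ignores the Rh factor and other antigens discovered later; the function and variable names are illustrative, not from any medical library):

```python
# Simplified ABO model (illustrative; ignores Rh and other antigen systems).
# Red cells carry antigens; the serum carries isoagglutinins (antibodies)
# against whichever of the A/B antigens the person's own cells lack.
# A transfusion agglutinates when donor red-cell antigens meet matching
# isoagglutinins in the recipient's serum.

ANTIGENS = {"A": {"A"}, "B": {"B"}, "AB": {"A", "B"}, "O": set()}

def isoagglutinins(blood_type):
    """Antibodies present in the serum of a person with this blood type."""
    return {"A", "B"} - ANTIGENS[blood_type]

def compatible(donor, recipient):
    """True if donor red cells carry no antigen the recipient's serum attacks."""
    return not (ANTIGENS[donor] & isoagglutinins(recipient))

# Type O cells carry no A/B antigens ("universal donor"); type AB serum
# carries no isoagglutinins ("universal recipient").
print(compatible("O", "AB"))   # True
print(compatible("A", "B"))    # False
```

This captures why early human-to-human transfusions so often failed: without typing, roughly half of random donor-recipient pairings mix incompatible antigens and antibodies.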
Tissue Culture
In 1907 American biologist Ross Granville Harrison stumbled upon the amazing discovery that living tissues could be cultured, or grown, outside the body. Although Harrison could not have known it at the time, his discovery would become one of the most valued techniques in medicine. Tissue culture has opened up new ways to study the development of genes (the basic units of heredity), embryos, tumors, toxins, and the pathogens that cause numerous diseases. The technique is also used to produce medicines, vaccines, and replacement tissues, as well as to clone animals, such as Dolly, the famous sheep.
In late summer of 1906 Harrison, an expert on embryos, wanted to solve what was then an important problem in biology. Harrison set out to determine whether nerve fibers grow in local tissues of the body or originate in nerve cells of the brain. He had to devise a new way to study the problem because all the available living specimens contained both nerves and surrounding tissue.
Harrison decided to study living nerves in the absence of any surrounding tissue. To do this, he isolated a portion of the hindbrain of a tiny living frog embryo. To keep his specimen alive, he immersed it in fresh frog lymph (a diluted blood plasma carried by the lymphatic system) and placed it under a cover slip so he could examine it with a microscope. The frog lymph quickly clotted, like blood, and he sealed it with wax to prevent evaporation or contamination of the specimen.
Using a microscope, Harrison discovered that the nerve fiber did actually come from the brain, not the surrounding tissue. Harrison noticed something else: The frog’s brain cells were still growing, even though they were no longer in the frog’s body. Harrison found the answer to the question he was asking and, at the same time, invented what would become the science of tissue culture. Harrison reported his result in May 1907. Since then tissue culture has allowed researchers to learn more about the basic mechanisms of disease than had been learned in the previous 500 years.
Antibiotics
The discovery of antibiotics opened a whole new front in the war against disease. Armed with antibiotics, which act by killing bacteria or inhibiting their growth, scientists have mounted a major assault on cholera, pneumonia, tetanus, tuberculosis, and many other deadly bacterial infections that had previously struck people down relentlessly.
Some of the most important breakthroughs in science occur unexpectedly, and the discovery of penicillin—perhaps the world’s most widely used antibiotic—is one such example. The British bacteriologist Sir Alexander Fleming is credited with discovering penicillin, although other scientists before him had noticed that the mold Penicillium notatum prevented the growth of some types of bacteria. So Fleming’s discovery was actually a rediscovery.
In September 1928 Fleming was preparing to take a short vacation with his family, when a series of almost unbelievably lucky events occurred. Just before leaving, Fleming decided to cultivate staphylococci to study when he returned. This was the first piece of luck. He could have picked any bacterium to study, but he happened to pick one that would turn out to be susceptible to penicillin.
Fleming opened a petri dish for a few seconds to put the staphylococci inside. Ordinarily, no mold spores would have a chance to get in the dish, but two floors below his laboratory, another scientist was studying the mold Penicillium notatum. Millions of very light mold spores floated in the air, up the staircase and the elevator shaft, through the always-open doors of Fleming’s laboratory, and into the open dish where, luckily, he was just putting the staphylococci.
Fleming, preoccupied with his vacation, left the petri dish on the laboratory bench instead of putting it in a warm incubator. This was lucky, too, because the bacteria and Penicillium notatum usually grow at different temperatures. Staphylococci multiply at relatively high temperatures, while Penicillium multiplies at lower temperatures. While Fleming was away the temperature turned out to be perfect for Penicillium, but not so good for the staphylococci, which grew slowly. The Penicillium mold thrived and secreted penicillin, which oozed across the dish, preventing the growth of staphylococci and leaving a clear zone separating the mold from the small bacterial colonies in the dish.
Fleming, upon his return, immediately realized what had happened, and he conducted other tests to learn what other bacteria this mysterious mold stuff could kill. He also tried to make pure penicillin, but did not succeed. Fleming believed that the mold substance, which he named penicillin, could be rubbed onto a cut or a scrape to prevent an infection. A few years later, however, Fleming gave up studying the mold.
As a consequence, penicillin was nearly forgotten until the beginning of World War II (1939-1945). Scientists at Oxford University in England showed that penicillin could prevent bacterial infections in animals and humans, and they devised a technique to mass-produce pure penicillin. The scientists encouraged companies in the United States to manufacture penicillin in vast quantities, and the new drug was credited with saving thousands of lives during the war. In 1945 Fleming and two of the Oxford scientists, Sir Howard Florey and Ernst B. Chain, received the Nobel Prize in physiology or medicine.
The Structure of DNA
Perhaps the greatest medical breakthrough of the 20th century is the discovery of the structure of deoxyribonucleic acid (DNA—the molecular basis of heredity). Knowledge of DNA’s chemical structure allowed scientists to understand for the first time how DNA replicates itself and passes information from one generation to the next. This monumental discovery has already revolutionized many aspects of medicine, permitting the development of a vast range of genetically engineered drugs, hormones, and other useful substances. Even more radical changes are afoot. In the new millennium, scientists are expected to have access to a complete map of the human genetic code, which should help them trace the genetic causes of all inherited diseases and search out possible cures.
The Swiss physician Friedrich Miescher isolated DNA for the first time in 1869, but the function of the chemical, which is found only in the nucleus of cells, was unknown. As the years passed, scientists learned that DNA contained phosphate, a sugar called deoxyribose, and four different compounds called nucleotide bases.
In 1944 the Canadian-born American physician and bacteriologist Oswald T. Avery and his colleagues showed, in a series of experiments on bacteria, that DNA transmitted genetic information. Prior to Avery’s groundbreaking work, many biochemists believed that proteins were the source of genetic information.
By 1950 two groups of scientists were in hot pursuit of the structure of DNA. One of the groups was at Cavendish Laboratory in Cambridge, England. The other group, at King’s College, London, consisted of Maurice Wilkins, a physicist, and Raymond Gosling, a graduate student. They were joined in 1951 by Rosalind Franklin, an expert in X-ray crystallography (a technique that uses a tiny beam of X rays to create images of the structural relationships between atoms and molecules of chemical substances). Photographs of images produced by X-ray crystallography are called X-ray diffraction photographs.
In 1950 Wilkins received a uniquely pure sample of DNA from a Swiss physicist. From this sample, he was able to pick out single DNA fibers with a glass rod. Wilkins and Gosling X-rayed these fibers in 1950, as did Franklin when she joined the laboratory in 1951.
However, a misunderstanding caused Wilkins and Franklin each to think they were in charge of X-ray crystallography, and they did not cooperate. When Franklin left the team in 1952, she was ordered to submit all of the X-ray diffraction photographs to Wilkins. One of these photographs showed that the DNA molecule had the shape of a double helix, a structure resembling a twisted ladder.
In the meantime the American biologist James Watson attended a meeting in Naples, Italy, in 1950, in which he saw one of Wilkins’s X-ray diffraction photographs. Watson immediately thought the molecule might be a double helix. In the fall of 1951 he joined the team of scientists at Cavendish Laboratory, where he convinced a British biophysicist, Francis Crick, that a combination of model building—using plastic balls, wires and steel plates—and X-ray crystallography could lead them to the structure of DNA.
The double helix itself, however, was not the only secret of the DNA molecule; its entire chemistry needed to be explained. Watson, unbeknownst to Wilkins, was now convinced that DNA had a helical structure and was working feverishly with Crick on their increasingly complex model of the molecule, which they finished during the second week of 1953. This model incorporated all the known chemical components of DNA and closely matched the diffraction pattern observed in Wilkins’s photograph. Watson and Crick accurately deduced that the two strands of the double helix separated before cellular division, providing templates, or patterns, for the creation of two new DNA molecules identical to the original.
Watson and Crick sent Wilkins a copy of their manuscript, which of course took advantage of the work that Wilkins and Franklin considered their own. After reading the manuscript, Wilkins sent them a letter that began, ‘I think you are a couple of old rogues.…’
On April 25, 1953, the journal Nature published one article from the Cambridge laboratory and two from King’s College in London on the molecular structure of DNA. Many felt that the key to life itself had been revealed. Wilkins, Watson, and Crick shared the 1962 Nobel Prize in physiology or medicine.
The discovery of the structure of DNA—like the discovery of X rays at the end of the 19th century or the detection of bacteria more than two centuries before that—has radically altered medicine and opened previously unknown frontiers. Equipped with a map of the human genome (the complete genetic code), researchers in the next millennium hope to root out the genetic causes of a wide range of inherited diseases, from schizophrenia to cystic fibrosis to hemophilia to many types of cancer. Of perhaps greater significance, many scientists believe that advances in molecular genetics are setting the course for fundamental changes in the diagnosis and treatment of disease. Instead of merely treating the symptoms of disease, as they do now, physicians of the next millennium may develop the ability to routinely identify and correct the causes of disease before symptoms appear.
And yet, despite Western medicine’s stunning success in fighting disease and extending human life, the health of much of the developing world is worsening. Modern vaccines and antibiotics would save tens of millions of lives each year throughout the developing world, where people are continuously struck down by malaria, tuberculosis, polio, pneumonia, and other easily treatable disorders. And the vast majority of people with acquired immune deficiency syndrome (AIDS) now live in the developing world, where access to costly life-extending medications is beyond the reach of most of those infected. Finding ways to extend the magnificent contributions of Western medicine to those who need it most therefore constitutes one of the greatest medical challenges of the coming millennium.
Filed under: National Interest