Medicine


April 30, 2021

Medicine (Latin medicus, “physician”), the science and art of diagnosing, treating, and preventing disease and injury. Its goals are to help people live longer, happier, more active lives with less suffering and disability. Medicine goes beyond the bedside of patients. Medical scientists engage in a constant search for new drugs, effective treatments, and more advanced technology. In addition, medicine is a business. It is part of the health care industry, one of the largest industries in the United States, and among the leading employers in most communities.
Disease has been one of humanity’s greatest enemies. Only during the last 100 years has medicine developed weapons to fight disease effectively. Vaccines, better drugs and surgical procedures, new instruments, and understanding of sanitation and nutrition have had a huge impact on human well-being. Like detectives, physicians and other health care professionals use clues to identify, or diagnose, a specific disease or injury. They check the patient’s medical history for past symptoms or diseases, perform a physical examination, and check the results of various tests. After making a diagnosis, physicians pick the best treatment. Some treatments cure a disease. Others are palliative—that is, they relieve symptoms but do not reverse the underlying disease. Sometimes no treatment is needed because the disease will get better by itself.
While diagnosing disease and choosing the best treatment certainly require scientific knowledge and technical skills, health care professionals must apply these abilities in imaginative ways. The same disease may present very different symptoms in two patients, and a treatment that cures one patient may not work on another.
At the turn of the 20th century, many men and women were feeble by age 40. The average American born in 1900 had a life expectancy of 47.3 years. Effective treatments for disease were so scarce that doctors could carry all their drugs and instruments in a small black bag. By the end of the 20th century, medical advances had increased life expectancy to 76 years. Modern health care practitioners can prevent, control, or cure hundreds of diseases, and many people today remain independent and physically active into their 80s and 90s. The fastest-growing age group in the population now consists of people aged 85 and over.
This medical progress has been expensive. In 1998 Americans spent $1.1 trillion on health care, an average of $4,094 per person. In the same year, health care accounted for about 13.5 percent of the gross domestic product (GDP), about one-seventh of the country’s total output. Spending grew rapidly over the course of the century. In 1940, for instance, the United States spent $4 billion on health care.
Some 11.6 million people work in health care in the United States. They include about 778,000 physicians, 2.1 million registered nurses, and 160,000 dentists. Most of them work in health care services, which involve diagnosing and treating patients. Others work mainly in research, teaching, or administration of medical facilities.
A Physicians
Physicians diagnose diseases and injuries, administer treatment, and advise patients on good diet and other ways to stay healthy. The United States has two kinds of physicians, the Doctor of Medicine (MD) and the Doctor of Osteopathy (DO). Both use medicines, surgery, and other standard methods of treating disease. DOs place special emphasis on problems involving the musculoskeletal system, which includes muscles, ligaments, bones, and joints.
Patients receive medical care from primary care doctors and specialists. Primary care doctors include general practitioners, family physicians, general internists, and general pediatricians. Many women also use obstetrician-gynecologists as primary care doctors. Patients usually consult a primary care doctor when they first become ill or injured. Primary care physicians can treat most common disorders and provide comprehensive, lifelong care for individuals and families.
But medical knowledge has advanced so far that no physician can master an entire field of medicine. Primary care doctors may refer patients with unusually complicated problems to specialists with advanced training in a particular disease or field of medicine. Specialists may even concentrate in one particular area and become subspecialists. Each specialist in internal medicine, for instance, is an expert in the diagnosis and nonsurgical treatment of adult diseases. But some internists take advanced training to become subspecialists in areas such as adolescent medicine, heart disease, geriatrics, cancer, or arthritis. For more information about the areas that specialists treat, see the table on Medical Specialties.
B Medical Education
Preparation for a career as a physician is long and demanding. It usually takes 11 years of study after high school to become a physician. The training typically includes four years of undergraduate or premedical study at a college or university; four years of medical school; and three years of advanced training in a residency. The exact length of study varies. Some colleges have a combined undergraduate and medical school program that lasts six years.
Premedical students usually major in a science or at least take courses in biology, chemistry, biochemistry, mathematics, and physics. Medicine also demands well-rounded individuals with knowledge of the humanities and social sciences, so courses in English, history, literature, art, music, sociology, and other fields are important. Many premedical students gain practical experience by taking summer jobs or volunteer positions in hospitals, clinics, or research laboratories.
Acceptance into a medical school requires excellent college grades, high scores on the Medical College Admission Test (MCAT), good letters of recommendation, and a personal interview with school officials. The United States has 144 medical schools. Of those, 125 award a Doctor of Medicine degree and 19 award a Doctor of Osteopathy degree. Students face very tough competition for admission to medical school. In 2000, more than 37,000 people applied for admission to medical school, but only 16,303 were accepted.
Education of a physician does not end with medical school graduation. New physicians must pass an examination for a state license to practice. Many then go into postgraduate education. MDs take a residency that lasts from one to seven years. DOs take an internship, which may be followed by a residency. During postgraduate education, physicians pursue advanced training and practical experience treating patients under the supervision of more experienced doctors. This postgraduate training usually takes place in a hospital or clinic.
To be recognized as a specialist in a particular field, a physician must pass a special examination and become board-certified. Physicians earn a certificate from one of the member boards of the American Board of Medical Specialties, the organization that oversees the certification process.
Physicians who plan to go into research may obtain a doctoral degree in genetics, immunology, biochemistry, or another field. Some obtain still more training as postdoctoral fellows on the research teams of established scientists. Physicians who plan to specialize in public health may study for a master’s degree in that field.
After completing postgraduate education, physicians begin a lifetime of learning to keep current with new advances. They regularly read medical journals, take continuing education courses, and attend medical conferences.
C Other Health Professionals
Medicine is not restricted to physicians. A wide variety of health care practitioners work in this exciting field. By far the largest professional group is nurses. Registered nurses help physicians during examinations, treatment, and surgery. They observe, evaluate, and record patients’ symptoms, administer medications, and provide other care (see Nursing). Nurse practitioners perform basic duties once reserved for physicians, such as diagnosing and treating common illnesses and prescribing medication. Certified nurse-midwives care for mothers during pregnancy and deliver babies (see Midwifery). Nurse-anesthetists administer anesthesia to patients during surgery. Licensed practical nurses provide basic bedside care for sick patients under the supervision of registered nurses and physicians.
Physician assistants deliver basic health services under the general supervision of a physician. They examine patients, order X rays and laboratory tests, and prescribe drugs or other treatment. In some rural areas, physician assistants provide all basic health care for patients, consulting with a supervising physician by telephone or electronic mail.
Dentists diagnose, treat, and help prevent diseases of the teeth, gums, and other tissues in the mouth and jaws (see Dentistry). Most are general practitioners, but many specialize in a particular area of dental health. Orthodontists treat teeth that are poorly aligned; oral surgeons operate on the jaw and mouth; periodontists specialize in gum disease; pediatric dentists care for children; endodontists perform root canals; prosthodontists make and insert artificial teeth and dentures. Other dental professionals include dental hygienists, who clean teeth, provide fluoride treatments, and advise patients on proper oral hygiene techniques to prevent tooth and gum disease.
For more information about other health care practitioners, see the table on Allied Health Professionals.
A sick or injured person can obtain medical care in several different places. These include provider practices such as medical offices and clinics, hospitals, nursing homes, and home care.
There are about 200,000 medical offices, clinics, and other provider practices in the United States. Earlier in the 20th century most physicians were solo practitioners working in their own offices or in partnership with another doctor. Patients visited the office, received an examination or other service, and paid a fee. This traditional solo, fee-for-service medicine has been declining. Many physicians now practice in groups where they share the same offices and equipment with other doctors. Group practices may combine primary care physicians, several kinds of specialists, laboratories, and equipment for diagnosing disease. Physicians who practice in a group reduce their own expenses and provide patients with a wider range of services.
Many doctors are joining with hospitals, insurance companies, and industrial employers to provide managed care for groups of patients. Physicians may work as employees of health maintenance organizations (HMOs) or other health care alliances. These plans oversee, or manage, care for patients, to avoid unnecessary services and reduce costs. Rather than taking a fee from each patient, managed care physicians may receive an annual salary from the HMO or a fixed sum for each patient.
Patients who are too sick for care in a doctor’s office go to a hospital. Hospitals offer patients 24-hour care from a staff of health professionals. They provide services not available elsewhere, such as major surgery, childbirth, and intensive care for the critically ill. The United States has about 6,020 hospitals with more than 1 million beds. Several kinds of hospitals exist, including general hospitals, specialized hospitals that care for specific diseases, small community hospitals, and large academic medical centers that train new doctors. Hospitals also provide many outpatient services to patients being treated in doctors’ offices and clinics. These include laboratory tests, computerized imaging scans, X rays, and other diagnostic tests for people who do not require admission to the hospital.
Hospital care is the most expensive form of health care. Efforts to control health care costs have emphasized reducing the number of patients admitted to hospitals and their length of stay. During the 1980s and 1990s, these efforts led to the closing of more than 600 hospitals, which eliminated almost 200,000 beds. Physicians also try to treat more people on a nonhospital, or outpatient, basis, and these cost-control efforts have led to fast growth in outpatient treatment centers. These include ambulatory surgery centers, where patients undergo operations once available only in hospitals and return home the same day.
Patients who need long-term medical care because of advanced age or chronic illness may stay in a nursing home. The United States has about 17,000 nursing homes with about 1.8 million patients. The number of nursing homes has doubled since 1960 because there are more older people in the population. Changing lifestyles, in which adult children and parents often live far apart, also contributed to the need for more nursing homes. About 85 percent of nursing home patients are age 65 and over. Many stay for a few weeks while recovering from an acute illness. They receive medical care and help with everyday activities like eating, bathing, and using the bathroom. Then they return home and care for themselves, often with the help of family or other caregivers. Other patients stay longer.
Some patients need regular medical care and other assistance, but are not sick enough for a hospital or nursing home. Home health care allows them to receive skilled nursing and other care in their own homes. Home care services are the fastest-growing sector of the health care industry, increasing about 30 percent per year in the 1990s. This growth is largely because home care is less expensive than hospital or nursing home care. Home care also is very popular with patients because most people prefer staying at home, rather than entering a hospital or nursing home. About 15,000 home health agencies provide most home care services in the United States. Many agencies are privately owned. Hospitals, public health departments, and other organizations also offer home care.
Hospices are special health care facilities that provide care for dying patients in the final stages of a terminal illness. Hospice staff focus on keeping patients comfortable and free of pain during their last days. Many patients choose to receive hospice services in their own homes.
The quality of care in other developed countries is comparable with that in the United States—patients have access to similar drugs, diagnostic tests, and other technology for preventing, diagnosing, and treating disease. Patients in Canada, the United Kingdom, Western Europe, and Japan use primary care physicians for most health problems. Patients are sent to specialists for more serious conditions, and may receive care in hospitals and nursing homes.
The major difference is in the way other developed countries pay for health care. Private health insurance pays for most care in the United States. About 90 percent of Americans have private health insurance. Employers usually pay a portion of the premium, or cost, as part of the benefits provided to employees besides their salaries.
Most European countries have a national health insurance plan that provides care free of charge, paid for through taxes. In Canada, the central government and the provinces share costs for medical care. Individuals usually contribute a certain amount through payroll deductions. The central government does not own most health care facilities in these countries.
China and other countries have a completely socialized health care system. The government owns all health care facilities, and physicians and other health care personnel are government employees. The former Soviet Union established the world’s first socialized medical system in the 1920s. But Russia and the other independent republics, formed when the Soviet Union broke up in 1991, are experimenting with private health insurance and other financing methods.
Billions of people in developing countries suffer greatly because medical care is not readily available and is often poor in quality. Governments in many poor countries in sub-Saharan Africa and Asia spend only a few dollars per person on health care each year. The trained personnel, equipment, and medicines needed to provide even the most basic medical care are in critically short supply. Families in these countries typically earn only a few hundred dollars each year. They must rely on the government, international aid organizations, missions, or charities for health care.
Health care personnel and facilities are not evenly distributed among the world’s population. Wealthy industrialized countries have more physicians and hospital beds per person than poorer developing countries. In the mid-1990s, the United States had one physician for every 400 people and Canada one for every 454. In comparison, the African country of Malawi had one physician per 45,736 people; Nigeria had one per 5,207 people; and India had one per 2,459 people. Hospital facilities are also distributed unequally. The United States had one hospital bed for every 244 people, compared with one per 196 in Canada, one per 949 in Honduras, one per 1,252 in Haiti, and one per 1,270 in India. Major imbalances in the amount of money spent on health care also exist. The poorest developing countries spent less than $10 per person per year on health, compared with several thousand dollars per person in developed countries.
Research is one of the most important fields of medicine. It provides health care professionals with new knowledge and technology for better diagnosis, treatment, and prevention of disease.
Medical research often combines medicine with related fields of biology, and is called biomedical research. Research can be basic or applied. Basic, or fundamental, research has no immediate practical application. Basic cancer research, for instance, may try to identify gene mutations that turn a healthy cell malignant. While this information does not have immediate clinical value, it generates knowledge that often leads to better care for patients. Applied research has a specific practical goal, such as development of a better drug for breast cancer. The early stages of biomedical research usually occur in a laboratory. As scientists gain more knowledge in a particular area, they begin studies on humans. These studies often take place in hospitals or clinics and are called clinical research.
Clinical research usually is performed by multidisciplinary teams, rather than by individual scientists working alone. These groups of men and women have knowledge and skills in different areas, or disciplines, of science. A multidisciplinary biomedical research team may include biochemists, geneticists, physiologists, and physicians. Each team member approaches the problem from a different side and shares knowledge with the group. This multidisciplinary approach increases the chances of solving a problem or developing a new treatment.
A Clinical Trials
One of the greatest advances in medicine was the introduction in the mid-1950s of a new research technique, the controlled clinical trial, which is used to determine whether new drugs and other treatments are safe and effective. In a controlled clinical trial, one group of patients, the treatment group, receives the new drug or treatment. Another group, the control group, is given an inactive pill (a placebo) or the best standard treatment. Researchers then compare the two groups over a period of time. The collected data are analyzed with rigorous statistical techniques to determine whether the new treatment is safer and more effective than standard therapy or no treatment.
Most clinical trials are conducted on a blind or double-blind basis. In a blind trial, patients do not know whether they receive the new drug or a placebo. In a double-blind trial, neither patients nor physicians know who is receiving the new treatment. This secrecy is important because patients who know they are taking a powerful new drug may expect to feel better and report improvement to doctors. Researchers who know that a patient is receiving the test treatment may also see improvements that really do not exist.
Clinical trials usually are randomized: researchers assign patients to the treatment group or the control group at random. This helps to ensure that neither group contains an excess of patients with severe disease. A drug might appear more effective than it really is if the treatment group were packed with patients who had only mild symptoms.
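The random-assignment step can be sketched in a few lines of Python. This is only an illustration of the principle, not a real trial protocol: the patient IDs are hypothetical, and the fixed random seed is used here solely so the example is reproducible.

```python
import random

def randomize(patients, seed=42):
    """Shuffle a list of patient IDs and split it into two halves:
    a treatment group and a control group. Because assignment depends
    only on chance, neither group can be 'packed' with mild cases."""
    rng = random.Random(seed)   # fixed seed: illustrative, for reproducibility
    shuffled = list(patients)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]

# Ten hypothetical patient IDs
patients = [f"patient-{i:02d}" for i in range(10)]
treatment, control = randomize(patients)

# Every patient lands in exactly one group
print(len(treatment), len(control))  # prints: 5 5
```

In a real trial the assignment list would be generated before enrollment and concealed from the treating physicians, which is what makes double-blinding possible.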
The results of clinical trials are subjected to peer review. Researchers publish their results in scientific journals or present them to an audience of other scientists, who are their peers. This gives scientists not involved in the research a chance to spot potential errors.
B Research Funding
Until World War II (1939-1945), most money for medical research in the United States was donated by wealthy individuals, industry, and universities. Scientists resisted government funding because they feared losing the intellectual freedom to study what they chose. Since the 1940s, however, the federal government has taken a major role in funding biomedical research.
The National Institutes of Health (NIH) in Bethesda, Maryland, is the biggest government source of research funds. NIH is an agency within the U.S. Department of Health and Human Services. In 2001, the NIH planned to spend about $20.3 billion on biomedical research, much of it distributed to scientists at colleges and universities to conduct specific research projects.
The pharmaceutical industry spent about $26 billion on research in 2000. The next largest source of funds is the Howard Hughes Medical Institute (HHMI), which spends about $554 million annually. Other major funding sources are private foundations and voluntary health organizations. Private foundations are organizations established by wealthy individuals. Among those active in biomedical research are the Charles A. Dana Foundation, the Lucille P. Markey Foundation, and the Whittaker Foundation. Voluntary health organizations are charities supported by contributions from members and the public. Major voluntary health organizations include the American Cancer Society, the American Heart Association, and the American Diabetes Association.
C Research Costs
Research is expensive. During the late 1990s the NIH often spent more than $130,000 per year to fund an average research project. Drug manufacturers estimate that they spend an average of $359 million to develop one new drug.
The availability of funding often determines what medical research is conducted. Voluntary health organizations and other groups act as advocates in urging or lobbying the government to spend more on their own particular disease. Governments in developed countries usually spend most heavily on diseases that affect their own citizens, and these diseases are typically different from those commonly found in developing countries. Pharmaceutical companies also emphasize development of the most profitable new drugs, usually for diseases that occur in developed countries.
As a result, little research is done on diseases that kill millions of people in developing nations. In 1998, for instance, the NIH planned to spend only $116 million on malaria and other tropical diseases. While rare in industrialized nations with developed health care programs, malaria kills 1.5 million to 2.7 million people in developing countries each year.
Our understanding of prehistoric medical practice comes from the study of ancient pictographs that show medical procedures and from surgical tools uncovered at archaeological sites of ancient societies.
Serious diseases were of primary interest to early humans, although they were not able to treat them effectively. Many diseases were attributed to the influence of malevolent demons who were believed to project an alien spirit, a stone, or a worm into the body of the unsuspecting patient. These diseases were warded off by incantations, dancing, magic charms and talismans, and various other measures. If the demon managed to enter the body of its victim, either in the absence of such precautions or despite them, efforts were made to make the body uninhabitable to the demon by beating, torturing, and starving the patient. The alien spirit could also be expelled by potions that caused violent vomiting, or could be driven out through a hole cut in the skull. This procedure, called trepanning, was also a remedy for insanity, epilepsy, and headache.
Surgical procedures practiced in ancient societies included cleaning and treating wounds by cautery (burning or searing tissue), poultices, and sutures, resetting dislocations and fractures, and using splints to support or immobilize broken bones. Additional therapy included laxatives and enemas to treat constipation and other digestive ills. Perhaps the greatest success was achieved by the discovery of the narcotic and stimulating properties of certain plant extracts. So successful were these that many continue to be used today, including digitalis, a heart stimulant extracted from foxglove.
Several systems of medicine, based primarily on magic, folk remedies, and elementary surgery, existed in various diverse societies before the coming of the more advanced Greek medicine about the 6th century bc.
A Egyptian
Egyptian medicine was marked by a mystical approach to healing, as well as a more empirical or rational approach that was based on experience and observation. Common diseases of the eyes and skin were usually treated rationally by the physician because of their accessible location; internal disorders continued to be treated by the spells and incantations of the priest-magician.
The physician emerged around 2600 bc as an early form of scientist, a type distinct from the sorcerer and priest. The earliest physician whose name has survived is Imhotep (lived about 2600 bc), renowned for his studies of pathology and physiology as well as his expertise as a pyramid builder and an astrologer. The Egyptian physician normally spent years of arduous training at temple schools in the arts of interrogation, inspection, and palpation (examining the body by touch). Prescriptions contained some drugs that have continued in use through the centuries. Favorite laxatives were figs, dates, and castor oil. Tannic acid, derived principally from the acacia nut, was valued in the treatment of burns.
Although Egyptians practiced embalming to preserve bodies after death, their knowledge of anatomy was minimal. As a result, they attempted only minor surgical procedures, with the exception of trepanning. According to reports of the Greek historian Herodotus, the ancient Egyptians recognized dentistry as an important surgical specialty.
B Mesopotamian
Medicine in Assyria and Babylonia was influenced by demonology and magical practices. Surprisingly accurate terra-cotta models of the liver, then considered the seat of the soul, indicate the importance attached to the study of that organ in determining the intentions of the gods. Dreams also were studied to learn the gods’ intentions.
While magic played a role in healing, surviving cuneiform tablets indicate a surprisingly empirical approach to some diseases. The tablets present an extensive series of medical case histories, indicating that a large number of medical remedies were used in Mesopotamia, including more than 500 drugs made from plants, trees, roots, seeds, and minerals. Emollient enemas were given to reduce inflammation; massage was performed to ease gastric pain; the need for rest and quiet was stressed for some diseases; and some attention was paid to diet. Water was regarded as particularly important, since it was the sacred element of the god Ea, chief among the numerous healing gods. The serpent Sachan was also venerated as a medical deity.
C Palestinian
Hebrew medicine was mostly influenced by contact with Mesopotamian medicine during the Assyrian and Babylonian captivities. Disease was considered evidence of the wrath of God. The priesthood acquired the responsibility for compiling hygienic regulations, and the status of the midwife as an assistant in childbirth was clearly defined. Although the Old Testament contains a few references to diseases caused by the intrusion of spirits, the tone of biblical medicine is modern in its marked emphasis on preventing disease. The Book of Leviticus includes precise instructions on such varied subjects as feminine hygiene, segregation of the sick, and cleaning of materials capable of harboring and transmitting disease. Although circumcision, the surgical removal of the foreskin of the penis, is the only surgical procedure clearly described in the Bible, common medical practices included dressing wounds with oil, wine, and balsam. The leprosy so frequently mentioned in the Bible is now believed to have embraced many skin diseases, including psoriasis.
D Indian
The practices of ancient Hindu, or Vedantic, medicine (1500-1000 bc) are described in the works of two later physicians, Charaka (lived about 2nd century ad) and Susruta (lived about 4th century ad). Susruta gave recognizable descriptions of malaria, tuberculosis, and diabetes. He also wrote about Indian hemp, or Cannabis, and henbane for inducing anesthesia, and included specific antidotes and highly skilled treatments for bites of venomous snakes. An ancient Hindu drug derived from the root of the Indian plant Rauwolfia serpentina was the source of the first modern tranquilizer. In the field of surgery, the Hindus are acknowledged to have attained the highest skill in all antiquity. They were probably the first to perform successful skin grafting and plastic surgery for the nose.
With the rise of Buddhism the study of anatomy was prohibited, and with the Muslim conquest of India, beginning around AD 1000, the field of medicine further declined and ultimately stagnated. Nevertheless, much valuable knowledge concerning hygiene, diet, and surgery was passed to the West through the writings of Indian physicians.
E Chinese
Chinese physicians believed that diseases result from imbalances in two life forces, Yin and Yang, that flow through the body. Drugs and other treatments were intended to restore this balance. Hundreds of ancient herbal medicines, including iron for anemia, mercury for syphilis, arsenic for skin diseases, and opium, are still used in traditional Chinese medicine. Other Chinese medicines and techniques, including acupuncture, are now commonly used in Western medicine. Most Chinese medicine was based on a famous textbook, the Nei Ching, traditionally attributed to Emperor Huang Ti and probably compiled between 479 and 300 bc. Chinese physicians specialized in treating wounds, fractured bones, allergies, and other ailments. They diagnosed patients by asking questions about symptoms, diet, and previous illnesses, and by checking the patient’s pulse.
F Greek
Greek culture, renowned for its masterpieces of art, poetry, drama, and philosophy, also made great advances in medicine. The earliest Greek medicine still depended on magic and spells. Homer considered Apollo the god of healing. Homer’s Iliad, however, reveals a considerable knowledge of the treatment of wounds and other injuries by surgery, already recognized as a specialty distinct from internal medicine.
By the 6th century bc, Greek medicine had left the magic and religious realm, instead stressing clinical observation and experience. In the Greek colony of Crotona the biologist Alcmaeon (lived about 6th century bc) identified the brain as the physiological seat of the senses. The Greek philosopher Empedocles elaborated the concept that disease is primarily an expression of a disturbance in the perfect harmony of the four elements—fire, air, water, and earth—and formulated a rudimentary theory of evolution.
Kos and Cnidus were the most famous of the Greek medical schools that flourished in the 5th century bc. Students of both schools probably contributed to the Corpus Hippocraticum (Hippocratic Collection), an anthology of the writings of several authors, although it is popularly attributed to Hippocrates, the greatest physician of antiquity and the man known as the father of medicine. Hippocrates convinced physicians that disease had identifiable causes and was not due to the supernatural, and his writings were used in medical textbooks well into the 19th century. Greek physicians introduced such modern ideas as prognosis, or the predicted outcome of disease, and the use of case histories of actual patients to teach students. The highest ethical standards were imposed on physicians, who took the celebrated oath usually attributed to Hippocrates and still used in modified form today (see Hippocratic Oath).
Although not a practicing physician, the Greek philosopher Aristotle contributed greatly to the development of medicine by his dissections of numerous animals. He is known as the founder of comparative anatomy. Further progress in understanding anatomy flourished by the 3rd century BC in Alexandria, Egypt, which was firmly established as the center of Greek medical science. In Alexandria the anatomist Herophilus performed the first recorded public dissection, and the physiologist Erasistratus did important work on the anatomy of the brain, nerves, veins, and arteries. The followers of these men divided into many contending sects. The most notable were the empiricists, who based their doctrine on experience gained by trial and error. The empiricists excelled in surgery and pharmacology; a royal student of empiricism, Mithridates VI Eupator, king of Pontus, developed the concept of inducing tolerance of poisons by the administration of gradually increased dosages.
G Greco-Roman
Alexandrian Greek medicine influenced conquering Rome despite initial resistance from the Romans. Asclepiades of Bithynia was important in establishing Greek medicine in Rome in the 1st century BC. Asclepiades taught that the body was composed of disconnected particles, or atoms, separated by pores. Disease was caused by restriction of the orderly motion of the atoms or by the blocking of the pores, which he attempted to cure by exercise, bathing, and variations in diet, rather than by drugs. This theory was revived periodically and in various forms as late as the 18th century.
Galen of Pergamum, also a Greek, was the most important physician of this period and is second only to Hippocrates in the medical history of antiquity. His view of medicine remained undisputed into the Middle Ages (5th century to 15th century). Galen described the four classic symptoms of inflammation and added much to the knowledge of infectious disease and pharmacology. His most important work, however, was in the field of the form and function of muscles and the function of the areas of the spinal cord. He also excelled in diagnosis and prognosis. Some of Galen’s teachings tended to hold back medical progress, however, such as his theory that the blood carried the pneuma, or life spirit, which gave it its red color. This theory, coupled with the erroneous notion that the blood passed through a porous wall between the ventricles of the heart, delayed the understanding of circulation and did much to discourage research in physiology. The importance of Galen’s work cannot be overestimated, however, for through his writings knowledge of Greek medicine was subsequently passed to the Western world by the Arabs.
While the Romans learned most of their medical knowledge from Egypt, Greece, and other countries that they conquered, their own contributions involved sanitation and public health. Roman engineers built aqueducts to carry pure water to residents of Rome, a sewage system to dispose of human wastes, and public baths. These measures helped to prevent infectious diseases transmitted by contaminated water.
The gradual infiltration of the Roman world by a succession of barbarian tribes was followed by a period of stagnation in the sciences. These invasions destroyed the great medical library in Alexandria (see Alexandria, Library of), and many of its books and medical manuscripts were lost. Western medicine in the Middle Ages consisted of tribal folklore mingled with poorly understood remnants of classical learning. Even in sophisticated Constantinople (now İstanbul), a series of epidemics served only to initiate a revival of magical practices, superstition, and intellectual stagnation.
H Arabic
In the 7th century AD a vast portion of the Eastern world was overrun by Arab conquerors. In Persia (now Iran), the Arabs learned of Greek medicine at the schools of the Nestorian Christians (see Nestorianism), a sect in exile from the Byzantine Empire. These schools had preserved many texts lost in the destruction of the Alexandria Library. Translations from Greek were instrumental in the development of an Arabic system of medicine throughout the Arabic-speaking world. Followers of the system, known as Arabists, did much to elevate professional standards by insisting on examinations for physicians before licensure. They introduced numerous therapeutic chemical substances and excelled in the fields of ophthalmology and public hygiene.
Important among Arabist physicians was al-Razi, who was the first to identify smallpox and measles and to suggest blood as the cause of infectious diseases. Avenzoar was the first to describe the parasite causing the skin disease scabies and was among the earliest to question the authority of Galen. Maimonides wrote extensively on diet, hygiene, and toxicology, the study of chemicals and their effect on the body. Al-Quarashi, also known as Ibn al-Nafīs, wrote commentaries on the writings of Hippocrates and treatises on diet and eye diseases. He was the first to determine the pathway of blood, from the right to the left ventricle via the lungs.
I European
In early medieval Europe, religious groups established hospitals and infirmaries in monasteries and later developed charitable institutions designed to care for the victims of vast epidemics of bubonic plague, leprosy, smallpox, and other diseases that swept Europe during the Middle Ages. The Benedictines were especially active in this work, collecting and studying ancient medical texts in their library at Monte Cassino near Salerno, Italy. St. Benedict of Nursia, the founder of the order, obligated its members to study the sciences, especially medicine. The abbot of Monte Cassino, Bertharius, was himself a famous physician.
During the 9th and 10th centuries Salerno became Europe’s center for medical care and education and was the site of the first Western school of medicine. By the 12th century other medical schools were established at the universities of Bologna and Padua in Italy, the University of Paris in France, and Oxford University in England.
In the 13th century, medical licensure by examination was endorsed and strict measures were instituted for the control of public hygiene. Representative scientists of this period include the German scholastic St. Albertus Magnus, who engaged in biological research, and the English philosopher Roger Bacon, who undertook research in optics and refraction and was the first scholar to suggest that medicine should rely on remedies provided by chemistry. Bacon, often regarded as an original thinker and pioneer in experimental science, was strongly influenced by the authority of Greek and Arabic medicine.
The period of the Renaissance, which began at the end of the 14th century and lasted for about 200 years, was one of the most revolutionary and stimulating in the history of mankind. Invention of printing and gunpowder, discovery of America, the new cosmology of Copernicus, the Reformation, the great voyages of discovery—all these new forces were working to free science and medicine from the shackles of medieval stagnation. The fall of Constantinople in 1453 scattered the Greek scholars, with their precious manuscripts, all over Europe.
The revival of learning in Western civilizations brought great advances in human anatomy. Some resulted from the work of artists, including the Italian Leonardo da Vinci, who dissected human corpses to portray muscles and other structures more accurately. Andreas Vesalius, a Belgian anatomist, clearly demonstrated hundreds of anatomical errors introduced by Galen centuries earlier. Gabriel Fallopius discovered the uterine tubes named after him (see Fallopian Tube), diagnosed ear diseases with an ear speculum, and described in detail the muscles of the eye and the tear ducts. Italian physician Girolamo Fracastoro recognized that infectious diseases are spread by invisible so-called seeds that can reproduce themselves. He founded modern epidemiology, the study of how diseases spread. The term syphilis, applied to the virulent disease then devastating Europe, was derived from his famous poem “Syphilis sive Morbus Gallicus” (Syphilis, or the French Disease, 1530). Ambroise Paré introduced new surgical techniques and helped to found modern surgery.
The event that dominated 17th-century medicine and marked the beginning of a new epoch in medical science was the discovery of how the blood circulates in the body by the English physician and anatomist William Harvey. Harvey’s “Essay on the Motion of the Heart and the Blood” (1628) established that the heart pumps the blood in continuous circulation. The Italian anatomist Marcello Malpighi advanced Harvey’s work by his discovery of tiny blood vessels called capillaries, and the Italian anatomist Gasparo Aselli provided the first description of the lacteals, the lymphatic vessels of the intestine. In England the physician Thomas Willis investigated the anatomy of the brain and the nervous system and was the first to describe diabetes mellitus. The English physician Francis Glisson advanced the knowledge of the anatomy of the liver, described the nutritional disorder rickets (sometimes called Glisson’s disease), and was the first to show that muscles do not increase in volume when they contract. The English physician Richard Lower studied the anatomy of the heart, showed how blood interacts with air, and performed one of the first blood transfusions.
The French mathematician and philosopher René Descartes, who also made anatomical dissections and investigated the anatomy of the eye and the mechanism of vision, maintained that the body functioned as a machine. This view was adopted by the so-called iatrophysicists, such as Italian physician Sanctorius, who investigated metabolism, and the Italian mathematician and physicist Giovanni Alfonso Borelli, who worked in the area of physiology. Opponents of this view were the iatrochemists, who regarded life as a series of chemical processes, including Jan Baptista van Helmont, a Flemish physician and chemist, and Prussian anatomist Franciscus Sylvius, who studied the chemistry of digestion and emphasized the treatment of disease by drugs.
The English physician Thomas Sydenham, called the English Hippocrates, and later the Dutch physician Hermann Boerhaave reestablished the significance of bedside instruction in their emphasis on the clinical approach to medicine. Sydenham carried out extensive studies on malaria and championed treatment with quinine, obtained from cinchona bark, which had been introduced into Europe in 1632. After the invention of the first compound microscope about 1590, microscopy advanced rapidly; in 1676 Dutch scientist Antoni van Leeuwenhoek used a powerful single-lens microscope of his own making to identify organisms later called bacteria. This was the first step toward the recognition that microbes cause infectious disease.
A 18th-Century Medicine
The 18th century continued to be marked by unsupported theories. The German physician and chemist Georg Ernst Stahl believed that the soul is the vital principle and that it controls organic development; in contrast, the German physician Friedrich Hoffmann considered the body a machine and life a mechanical process. These opposing theories of the vitalists and the mechanists were influential in 18th-century medicine. The British physician William Cullen attributed disease to the excess or deficiency of nervous energy; and the physician John Brown of Edinburgh taught that disease was caused by weakness or inadequate stimulation of the organism. According to his theories, known as the Brunonian system, stimulation should be increased by treatment with irritants and large dosages of drugs. In opposition to this system, the German physician Samuel Hahnemann developed the system of homeopathy late in the 18th century, which emphasized small dosages of drugs to cure disease.
Other unusual medical practices developed toward the end of the 18th century included phrenology, a theory formulated by the German physician Franz Joseph Gall, who believed that examination of the skull of an individual would reveal information about mental functions. The theory of animal magnetism developed by the Austrian physician Franz Mesmer was based on the supposed existence of a magnetic force having a powerful influence on the human body.
Despite these unorthodox medical practices, the end of the 18th century was marked by many true medical innovations. British physicians William Smellie and William Hunter made advances in obstetrics that established this field as a separate branch of medicine. The British social reformer John Howard furthered humane treatment for hospital patients and prison inmates throughout Europe. In 1796 British physician Edward Jenner introduced vaccination to prevent smallpox. His efforts controlled this dreaded disease and established the science of immunization.
B 19th-Century Medicine
Many discoveries made in the 19th century led to great advances in diagnosis and treatment of disease and in surgical methods. Medicine’s single most important diagnostic tool, the stethoscope, an instrument used to detect sounds in the body such as a heartbeat, was invented in 1819 by French physician René-Théophile-Hyacinthe Laënnec. A number of brilliant British clinicians studied and described diseases that today bear their names. British physician Thomas Addison discovered the disorder of the adrenal glands now known as Addison’s disease; Richard Bright described the kidney disorder now called Bright’s disease; British physician Thomas Hodgkin described a cancer of lymphatic tissue now known as Hodgkin’s disease; British surgeon and paleontologist James Parkinson described the chronic nervous system disorder called Parkinson’s disease; and the Irish physician Robert James Graves diagnosed the thyroid disorder exophthalmic goiter, sometimes called Graves’ disease.
Medicine, like all other sciences, is subject to influences from other fields of study. This was particularly true during the 19th century, renowned for its great scientific innovations. For instance, the evolutionary theory proposed by Charles Darwin in On the Origin of Species by Means of Natural Selection (1859) revived interest in the science of comparative anatomy and physiology. And the plant-breeding experiments of the Austrian biologist Gregor Johann Mendel in 1866, although initially overlooked, eventually had a similar effect in stimulating studies in human genetics (see Heredity).
German pathologist Rudolf Virchow pioneered development of pathology, the scientific study of disease. Virchow showed that all diseases result from disorders in cells, the basic units of body tissue. His doctrine that the cell is the seat of disease remains the cornerstone of modern medical science. In France, physiologist Claude Bernard performed important research on the pancreas, liver, and nervous system. His scientific studies, which emphasized that an experiment should be objective and prove or disprove a hypothesis, were the basis for the scientific method used today. Bernard’s work on the interaction of the digestive system and the vasomotor system, which controls the size of blood vessels, was developed further by the Russian physiologist Ivan Petrovich Pavlov, who developed the theory of the conditioned reflex, the basis of human behaviorism.
A milestone in medical history occurred in the 1870s when French chemist Louis Pasteur and German physician Robert Koch separately established the germ theory of disease. Important in the development of this theory was the pioneering work of the American physician and author Oliver Wendell Holmes and of the Hungarian obstetrician Ignaz Philipp Semmelweis, who showed that the high rate of mortality in women after childbirth was attributable to infectious agents transmitted by unwashed hands (see Puerperal Fever).
Soon after the germ theory was recognized, the microbes that cause such age-old scourges as anthrax, diphtheria, tuberculosis, leprosy, and plague were isolated. In 1885 Pasteur developed a vaccine to prevent rabies. In the last decade of the 19th century, German physician Emil von Behring and German bacteriologist Paul Ehrlich developed techniques for immunizing against diphtheria and tetanus.
New understanding of infectious diseases made surgery safer. Until the 1800s, surgeons operated in their street clothes, often without even washing their hands. Operating rooms, like other parts of hospitals, were filthy. About half of all patients who survived an operation died of infections that developed afterward. The era of aseptic surgery, in which physicians used sterilized instruments and techniques to avoid infecting patients, was heralded by British surgeon and biologist Joseph Lister. With his introduction of an effective antiseptic, carbolic acid, Lister greatly reduced mortality from wound infection (see Antiseptics). Rubber gloves were first worn during surgery in 1890, and gauze masks in 1896.
Another great advance in surgery came with the discovery of anesthesia. Until the 19th century, doctors used alcohol, opium, and other drugs to relieve pain during surgery. These medications could sometimes dull pain but never completely mask it, and patients often went into shock and died during surgery. In the United States, physician Crawford Long discovered the anesthetic effects of ether in 1842, and the dentist William Morton used ether in a tooth extraction in 1846. Ether and other anesthetics reduced surgical mortality and enabled surgeons to perform longer, more complicated operations.
A new tool for diagnosing internal diseases became available in 1895 when German scientist Wilhelm Roentgen discovered X rays. The Danish physician Niels Ryberg Finsen developed an ultraviolet-ray lamp, which led to an improved prognosis for some skin diseases (see Ultraviolet Radiation). In 1898 in France, Marie and Pierre Curie discovered radium, which was later used to treat cancer.
In 1898 British physician Ronald Ross proved that the mosquito carries the parasite that causes malaria, a disease that has been widespread and often fatal throughout most of human history. In 1900 United States Army physician Walter Reed and his colleagues, acting on a suggestion made by the Cuban biologist Carlos Juan Finlay, demonstrated that the mosquito is the carrier of yellow fever. This finding led to better sanitation and mosquito control, resulting in the virtual elimination of the disease from Cuba and other areas.
Medicine’s most revolutionary advances have occurred since 1900. By the end of the 20th century, medical advances helped to increase the average person’s life expectancy by almost 30 years. As people lived longer, new medical challenges emerged. Heart disease, cancer, stroke, and other conditions often associated with aging replaced infectious diseases as the leading causes of death. Physicians began to devote greater attention to preventing disease and keeping patients healthy into advanced age. Biomedical research also shifted focus to the most basic causes of diseases, including defects in individual genes.
A Infectious Diseases
Infectious diseases that historically killed millions of people each year were largely conquered during the 20th century by improved sanitation, antibiotics, and vaccines.
German physician Paul Ehrlich showed around 1910 that a chemical compound, arsphenamine, could treat syphilis. He opened the era of chemotherapy, in which physicians use chemical compounds that act selectively to target specific diseases.
In the early 1930s, German and French scientists showed that sulfonamide compounds were effective in treating streptococcal infections. This discovery led to the first family of so-called wonder drugs, the sulfa drugs. In 1938 Australian pathologist Howard Florey and German-born British biochemist Ernst Chain began the work that purified penicillin, the bacteria-destroying compound that Alexander Fleming had observed in mold ten years earlier. Streptomycin, the first antibiotic effective against tuberculosis, was discovered in 1944 by American microbiologist Selman Waksman. Dozens of other antibiotics followed, many of them effective against a broader range of bacteria.
Scientists learned more about how the body’s immune system protects itself from infections, resulting in new tests for diagnosing infectious diseases and new vaccines to prevent them. The Wassermann blood test for syphilis was developed in 1906, and the tuberculin skin test for tuberculosis appeared in 1908. By the 1930s new techniques for growing viruses in the laboratory led to vaccines against viral diseases. These included a yellow fever vaccine in the late 1930s and the first effective influenza vaccine in the 1940s. The American physician Jonas E. Salk developed a polio vaccine in 1954. Later, virologist Albert B. Sabin developed a safer oral polio vaccine, which was in wide use by the 1960s. Vaccines for other childhood diseases followed, including measles, German measles, mumps, and chicken pox.
Infectious diseases, once thought conquered by antibiotics, became a major concern again in the 1990s. New forms of tuberculosis and other diseases resistant to antibiotics spread. Concerns also arose over new or newly recognized microbes, such as human immunodeficiency virus (HIV), the cause of acquired immunodeficiency syndrome (AIDS), which was first recognized in 1981. As human populations grow and expand into wilderness areas, humans and animals come into closer contact. A number of diseases transmitted from animals have become problematic in recent years, including the hemorrhagic fevers caused by the Ebola and Marburg viruses, hantavirus pulmonary syndrome, and Lyme disease. Physicians also recognized that an easily curable bacterial infection causes most peptic ulcers, a disease once blamed on stress and diet.
B Nutrition
Polish-born American biochemist Casimir Funk introduced the term vitamine, later shortened to vitamin, in 1912. Researchers later identified the vitamins needed by the body to prevent deficiency diseases such as beriberi, rickets, scurvy, and pellagra. As nutrition and the quality of life improved, these diseases almost disappeared from industrialized countries (see Human Nutrition). But by the end of the 20th century, other nutritional disorders had emerged. Studies conducted in the United States in the 1990s showed that more than 97 million Americans were overweight and risked health problems, such as heart disease and diabetes mellitus, commonly associated with obesity.
C Surgery
Operations that people once regarded as impossible became routine in the 20th century. Many of these surgical advances resulted from improved drugs or medical technology. Better drugs to prevent rejection of transplanted organs made it possible to transplant hearts, kidneys, livers, lungs, and other organs from donors. Patients were kept alive with artificial kidneys and temporary artificial hearts while awaiting a transplant (see Medical Transplantation). The heart-lung machine made it possible to stop and restart the heart during coronary bypass surgery. Small fiber-optic instruments called endoscopes led to the new field of minimally invasive surgery. These new tools made it possible to remove a diseased gallbladder or appendix, for example, through small slits rather than large incisions, greatly reducing the amount of anesthesia required during the surgery and lessening recovery time. Transfusions of blood and plasma and infusions of saline solutions, which went into use in the 1930s, helped prevent deaths from shock in surgery patients. In the 1990s, physicians even began performing surgery to repair defects in unborn infants.
D Radiology
New methods for viewing diseased structures inside the body improved diagnosis of disease beginning in the 1970s (see Radiology). A gamma camera detects radioactive medication that attaches to certain forms of cancer cells. Computed tomography (CT) scanners use X rays to produce lifelike three-dimensional images of body structures. Magnetic resonance imaging (MRI) scanners produce highly detailed images without X rays. Positron emission tomography (PET) detects very early warning signs of disease. Sonograms, or ultrasound images, made with high-frequency sound waves, help diagnose disease and monitor the progress of pregnancies. X rays and high-energy particles emitted by linear accelerators are also used to treat cancer. Lithotripsy uses high-frequency sound waves to destroy some kidney stones and gallstones, conditions that once required surgery.
E Mental Illness
Even in the early part of the 20th century, mental illness was almost a sentence of doom, and mentally ill persons were subjected to cruel confinement with little medical aid. In the latter half of the century, successful therapies for some mental illnesses greatly improved the prognosis for these diseases and partly removed their stigma.
The theories advanced by Austrian physician Sigmund Freud were among the first attempts to understand malfunctioning of the mind, but the methods of psychoanalysis advocated by Freud and modified by his followers proved ineffective for treating certain serious mental illnesses. Two early attempts to treat psychotic illness were the destruction of parts of the brain in a procedure called lobotomy, introduced in 1935, and electroconvulsive therapy, devised in 1938. Lobotomy and less severe forms of psychosurgery are now used only rarely, and electroconvulsive therapy is primarily a treatment for depressive illness that has not responded to drug therapy.
A new era in the treatment of schizophrenia, a severe form of mental illness, began in the early 1950s with the introduction of phenothiazine drugs. These drugs led to a new trend, deinstitutionalization, in which patients were released from mental hospitals and treated in the community. Valium (see Diazepam) and other benzodiazepine drugs went into wide use in the 1970s for treating anxiety and other emotional illnesses. Late in the century, there was growing awareness of the importance of diagnosing and treating clinical depression, a leading cause of suicide. Advanced imaging techniques that show structural and functional differences in the brains of people with certain mental illnesses have opened the door to new treatment options.
F Genetics and Biotechnology
The discovery of genes and their role in heredity and disease was one of the most important medical advances in history (see Genetics). In 1953 British biophysicist Francis Crick and American biologist James Watson identified the double-helix structure of deoxyribonucleic acid (DNA). This discovery helped to explain how DNA carries genetic information. In the 1960s American biochemist Marshall Nirenberg added key details about how DNA determines the structure of proteins.
Indian-born American biochemist Har Gobind Khorana was the first to synthesize a gene in the laboratory, in 1970, paving the way for scientists to develop methods to isolate, alter, and clone, or copy, genes. They applied these genetic engineering techniques to the diagnosis and treatment of diseases. Researchers identified genes associated with cancer, heart disease, mental illness, and obesity, and with the genes identified, they worked on ways of modifying them to treat disease. Gene therapy emerged as an experimental field in which physicians introduce normal or modified genes into a patient’s cells to treat disease. In 2003 scientists completed the sequence of the human genome, identifying all the genes needed to make a human being (see Human Genome Project).
Genetic engineering techniques enabled production of scarce human hormones and other materials for use as drugs. A new biotechnology industry started producing these materials for medical use. Scientists also began genetically modifying sheep and other animals to produce drugs in their milk.
G Endocrinology
In 1905, British scientist Ernest H. Starling introduced the word hormone to describe substances secreted by the endocrine glands that regulate body functions (see Endocrine System). The discovery of adrenaline, or epinephrine, in 1901 led to the identification and isolation of other hormones. One of the most important advances was the discovery of insulin by Canadian scientists Frederick Banting and Charles H. Best and Scottish physiologist John J. Macleod in 1921. For years people with diabetes mellitus used insulin extracted from animal pancreases. In 1981, human insulin produced using biotechnology became available. American physicians made another major advance in endocrinology in 1949, when they discovered that cortisone, an adrenal gland hormone, relieves inflammation. New discoveries about human sex hormones later led to the first birth control pills.
H Pregnancy and Childbirth
Great advances were made in birth control with the improvement of intrauterine devices in the 1950s and the development of the birth control pill in 1960 by the American biologist Gregory Pincus. By the 1990s long-lasting hormonal implants and contraceptive injections such as Depo-Provera had been developed. These options gave women more control in deciding whether to become pregnant. Voluntary sterilization, involving vasectomies in men and tubal sterilization in women, emerged as a popular method of permanent birth control. Unwanted pregnancies, however, remained a serious problem in the late 1990s, and researchers continued to seek more convenient and safer methods of birth control, including a male birth control pill.
By 1975 physicians were able to diagnose some congenital or inherited diseases before birth (see Birth Defects). Doctors take samples of placental cells (see Chorionic Villus Sampling) or of the amniotic fluid around the fetus (see Amniocentesis) to determine whether hereditary blood diseases, Down syndrome, defects of the spine, or other congenital diseases are present. Even the sex of a fetus can be determined in advance.
In addition to advances in early diagnosis, progress occurred in identifying the causes of some birth defects. Excess alcohol consumption during pregnancy was linked to fetal alcohol syndrome, and inadequate intake of the vitamin folic acid was linked to spina bifida and other neural tube defects.
Advances in treating infertility, which prevents couples from having children, began with the birth of the world’s first so-called test-tube baby in 1978 through in vitro fertilization. Other forms of assisted reproduction soon became available. In 1997 researchers cloned a lamb from cells taken from an adult ewe, a feat that led to speculation that human cloning could become another option in human reproduction.
I Heart Disease
Heart disease emerged as one of the leading causes of death in Western countries by the end of the 20th century. Great advances occurred in diagnosis, treatment, and prevention of this widespread disease.
Diagnosis improved with the widespread use of cardiac catheterization in the 1950s. This procedure involves threading a slender tube into the heart to take measurements and identify blocked arteries. Less invasive diagnostic methods, such as thallium scans in which a special imaging camera detects the movement of thallium in heart muscle, provided additional diagnostic improvements.
These techniques led to a new era in the surgical treatment of coronary heart disease, the artery blockages that cause most heart attacks. Physicians began treating blocked coronary arteries with a variety of new techniques. The first bypass operation, performed in 1967, created a new route for blood to reach blood-starved heart muscle. In balloon angioplasty, developed in 1977, a deflated balloon is inserted into a narrowed artery and then inflated at the site of the narrowing to widen it. Other surgical advances included replacement of diseased heart valves with artificial valves; implantation of pacemakers that maintain normal heart rhythm; use of temporary artificial hearts; and better methods for correcting hereditary defects in the heart.
New drugs were developed to treat angina pectoris, the chest pain of heart disease; high blood pressure; dangerous abnormalities in heart rhythm; and high blood cholesterol levels. Studies showed that drug treatment could reduce the risk of a heart attack or stroke. In the 1980s, aspirin went into wide use to prevent blood clots that cause many heart attacks. Emergency medical personnel also began using drugs that dissolve clots and stop a heart attack if given soon after symptoms develop.
Advances were also made in the prevention of heart disease. Studies identified risk factors such as high blood pressure, high blood cholesterol, cigarette smoking, diabetes, obesity, and lack of exercise. Government health agencies and public health groups began public education programs to help people reduce heart disease risks. These preventive methods appear to be working: according to the American Heart Association, the death rate from coronary heart disease declined 26.3 percent between 1988 and 1998.
J Cancer
Early detection and better treatment have resulted in major improvements in survival of patients with cancer. By 2000, 59 percent of people diagnosed with cancer were alive five years later, compared with only 25 percent in 1940. New drugs, surgical procedures, and ways of treating cancer with X rays and radioactive isotope radiation contributed to the improvement. In the 1990s, physicians used new knowledge about the human immune system to develop immunotherapy for some kinds of cancer. In one form of immunotherapy, the immune system is stimulated to produce antibodies against specific invaders. Another form is the use of monoclonal antibodies, genetically engineered antibodies that target specific cancer cells.
Screening tests for early detection of cancers of the cervix, prostate, breast, and colon and rectum (see Colorectal Cancer) became widely available. Researchers also made progress in identifying cancer genes that are associated with an increased risk of the disease and developed screening tests for some cancer genes. Advances in gene therapy also offered promise for new cancer treatments.
Health groups placed great emphasis in the second half of the century on cancer prevention through avoidance of smoking and through a diet rich in fresh fruits and vegetables. Despite these advances, the percentage of deaths from cancer increased from about 2 percent in 1900 to about 20 percent in 2000. Much of the rise, however, resulted from an increased proportion of older people, who are more vulnerable to cancer, and from cigarette smoking.
K Telemedicine
Advances in computer and Internet technologies created new possibilities for doctors and their patients in the early 1990s. Using computers to send live video, sound, and high-resolution images between two distant locations, doctors could easily examine patients in offices thousands of miles away. Rural patients no longer had to make long trips to urban centers to consult specialists.
In telemedicine, a computer fitted with special software and a video camera turns a live video image of a patient into a digital signal. This signal is transmitted over high-speed telephone lines to similar equipment at the doctor’s office, where it is converted back into a format that can be viewed live on a television screen. Telemedicine also includes machines specially designed to measure and record a patient’s vital signs at home, then transmit the information directly to a hospital nursing station. This electronic remote home care enables health care professionals to monitor a patient’s heart rate, temperature, blood pressure, pulse, blood-oxygen levels, and weight several times a day, without the patient ever having to leave home.
In addition to providing a vehicle for doctors and patients in remote locations to interact, telemedicine also enabled doctors in distant locations to share information. Patient charts, X rays, and other diagnostic materials can be transmitted between doctors’ offices. Moreover, doctors in rural areas of the world can observe state-of-the-art medical procedures that they would otherwise have had to travel thousands of miles to witness. Still in its infancy in the late 1990s, telemedicine may one day alleviate some of the regional inequalities inherent in modern medicine, not just between regions of North America, but also between developing countries and urban medical centers in the industrialized world.
A Medical Ethics
New medical, reproductive, and genetic technology in the second half of the 20th century led to increased concern about moral issues in medical treatment and research. By the 1990s, medical ethics, or bioethics, emerged as a recognized discipline that involved physicians, nurses, attorneys, theologians, philosophers, and sociologists.
Many bioethics issues involve the possible misuse of genetic engineering technology. The Human Genome Project led to identification of genes that raise an individual’s risk of developing cancer, heart disease, mental illness, alcoholism, violent behavior, and other conditions. Tests to detect some of these disease-susceptibility genes became available in the 1990s.
These discoveries led to debate over whether genetic tests should be performed and how the results should be used. Should parents use such tests to screen their unborn infants? If a fetus tested positive, should it be aborted? If a woman tested positive for a breast cancer susceptibility gene, should the information be made available to insurance companies? Do insurers have a right to deny coverage to people with a genetic high risk for serious diseases? Do employers have a right to demand genetic screening tests before hiring people?
Genetic technology also offers the potential of eventually replacing defective genes with normal copies in human sperm and eggs. Some fear it will lead to mandatory eugenics programs, attempts to improve the hereditary traits of individuals or even entire races. Others argue that advances in genetic technology could eliminate defective genes and hereditary diseases from future generations.
An intense discussion about bioethics occurred in 1997 and 1998, after researchers in Scotland cloned the lamb Dolly from udder cells taken from an adult ewe. The experiment showed that it was possible to clone, or produce an exact genetic copy of, an adult mammal. Medical ethicists debated whether cloning of human beings should be permitted and what effects it might have on society.
Although abortion became legal in the United States in 1973, it still causes heated debate over the rights of the fetus and the pregnant woman, as well as the question of when a fetus becomes a human being. The availability of RU-486, also known as mifepristone, an inexpensive drug that induces abortion, led to concern that more people would use abortion for birth control. Ethical discussions centered on whether tissue from aborted fetuses should be used in medical research, treatment of disease, and organ transplants.
The right of terminally ill people to receive assistance in dying raised other ethical dilemmas. Physician-assisted suicide came to national attention largely through the efforts of Jack Kevorkian, a Michigan physician who helped people with terminal illnesses commit suicide. Opponents claim it is unethical for physicians to help patients commit suicide. Supporters counter that terminally ill patients have a right to determine the time and manner of their death. While the U.S. Supreme Court in 1997 ruled that states can ban physician-assisted suicide, that same year Oregon voters rejected an effort to repeal their law, the nation’s first to legalize physician-assisted suicide.
B Preventive Medicine
In the 1960s and 1970s, physicians and medical educators began to recognize a basic flaw in the health care system. Medicine traditionally was concerned with treating disease after symptoms appeared, resulting in treatment that was often very expensive. About 600,000 coronary bypass operations were performed annually in the United States in the 1990s, at a cost of $44,000 each. Medical officials recognized the advantage of preventing disease in the first place, rather than just treating it.
Medical schools began teaching students the importance of disease prevention. Some physicians specialized in a new field, preventive medicine, which emphasized keeping patients healthy. Practicing physicians spent more time counseling patients about smoking, excessive drinking, and other unhealthy practices, encouraging them to avoid risk factors for disease, undergo periodic screening tests that detect disease early, and control high blood pressure.
Yet by the late 1990s, many people still failed to use preventive services. Studies in 1997 estimated that 30,000 deaths per year could have been prevented if more people were immunized against influenza, pneumococcal pneumonia, and hepatitis B. Likewise, smoking, the leading preventable cause of death in the industrialized world, causes more than 4 million deaths worldwide each year.
Another dramatic change in medicine involved the idea that individuals have an important role in preventing diseases caused by an unhealthy lifestyle. Health care consumers grew more knowledgeable about medicine. Medical pages became a regular feature of major newspapers, news magazines, and television news programs. Some people subscribed to magazines and newsletters devoted entirely to health. Laypeople consulted books, such as the Physicians’ Desk Reference and The Merck Manual, once used only by professionals. They also tapped health information available on the Internet’s World Wide Web (WWW). With this knowledge, consumers sought to become partners with their physicians in deciding the best ways of preventing, diagnosing, and treating disease.
C Nontraditional Medical Practices
A resurgence of interest developed in the 1990s in medical treatments not fully accepted by conventional medicine, or biomedicine, which requires stringent scientific proof of safety and effectiveness before accepting a treatment. Such evidence is lacking for many approaches used in the medical systems and treatments known as alternative medicine in the United States. In Europe, these same approaches often are called complementary medicine. Growing public interest in nontraditional treatments led the NIH to open the Office of Alternative Medicine in 1992, later expanded into the National Center for Complementary and Alternative Medicine, which encourages research on alternative medicine. The percentage of Americans using an alternative therapy rose from 33 percent in 1990 to more than 42 percent in 1997.
Alternative medicine emphasizes improving the quality of life for people with chronic illness; disease prevention; and treatments for conditions that conventional medicine cannot adequately control, such as arthritis, chronic pain, allergies, cancer, heart disease, and depression. A cornerstone of alternative medicine is the idea that the mind influences the health of the body.
Alternative medical systems include chiropractic, holistic medicine, and homeopathy. Chiropractors treat disease with spinal manipulation, massage, diet, and many other techniques. Holistic healers emphasize treatment of the whole person, including body, mind, emotions, spirit, and interactions with the family and environment. Homeopathic healers use substances that cause the very symptoms being treated. When treating a headache or nausea, for example, homeopathic healers administer herbs that in large doses cause headache or nausea. But they use very small doses that cause the patient no discomfort.
Specific alternative medical treatments include aromatherapy, inhaling oils from aromatic plants; massage techniques, including Rolfing and reflexology; biofeedback; iridology, in which the eye is used to diagnose certain diseases; and acupuncture. Some approaches, including chiropractic manipulation and acupuncture, have gained greater acceptance in conventional medicine. Some conventional biomedical studies have concluded that chiropractic manipulation is effective for low-back pain. A 1997 NIH report gave acupuncture limited endorsement for certain medical uses.
Organizations that educate the public about health fraud and quackery expressed concern about growing interest in some alternative medicine treatments. They emphasized the importance of receiving a conventional medical diagnosis, and exploring standard treatment options, before turning to alternative medicine.
D Cost of Medical Care
The United States spends more on health care than any other country in the world. Spending in 1998 averaged $4,094 per person, compared to $2,689 in 1990, $1,052 in 1980, $341 in 1970, and $141 in 1960. The only countries that approached the United States in per capita spending were Switzerland ($2,412), Germany ($2,222), Luxembourg ($2,206), and Canada ($2,002). In the United States, total spending on health care exceeded $1.1 trillion in 1998, up from $699.4 billion in 1990, $247.3 billion in 1980, $73.2 billion in 1970, and $26.9 billion in 1960.
Yet millions of Americans still do not have adequate access to health care because they lack insurance coverage. An estimated 44.2 million people had no health insurance in 1998. Access is a greater problem in the United States because most other industrialized countries have national health insurance systems that cover medical expenses. Beginning in the 1960s, the United States Congress established and expanded programs to improve access to care. Medicare, the major program, covered about 38 million people over age 65 and people with disabilities in 1997. Another was Medicaid, a federal-state program that covers low-income people. During the 1990s, Congress considered and rejected proposals to establish a national health insurance system or extend government health care benefits to more people. The high costs of such a program were among the reasons for rejection.
Reviewed By:
Robert Sikorski
Richard Peters