Malignant disease
While progress was the hallmark of medicine in the 20th century, malignant disease, or cancer, continued to pose major challenges. In the second half of the century cancer became the second most common cause of death in most Western countries, exceeded only by heart disease. Nonetheless, some progress was achieved. Although the causes of many types of malignancy remained unknown, methods became available for attacking the problem. Surgery remained the principal therapeutic standby, but radiation therapy and chemotherapy were increasingly used.
Soon after the discovery of radium was announced in 1898, its potentialities in treating cancer were realized. In due course it assumed an important role in therapy. Simultaneously, deep X-ray therapy was developed, and with the atomic age came the use of radioactive isotopes. (A radioactive isotope is an unstable variant of a substance that has a stable form. During the process of breaking down, the unstable form emits radiation.) High-voltage X-ray therapy and radioactive isotopes have largely replaced radium. Whereas irradiation long depended upon X-rays generated at 250 kilovolts, machines capable of producing X-rays at 8,000 kilovolts, as well as betatrons of up to 22,000,000 electron volts (22 MeV), came into clinical use.
The most effective of the isotopes was radioactive cobalt. Telecobalt machines (those that hold the cobalt at a distance from the body) became available that contained 2,000 curies or more of the isotope, an amount equivalent to 3,000 grams of radium, and that sent out a beam equivalent to that from a 3,000-kilovolt X-ray machine.
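That equivalence can be roughly checked with a back-of-the-envelope calculation. The figures in the sketch below are standard textbook values, not figures given in the text itself: one curie was originally defined as the activity of one gram of radium-226, and cobalt-60 delivers roughly 1.6 times as much gamma-ray exposure per curie as platinum-filtered radium.

\[
2000\ \text{Ci of cobalt-60} \times \frac{\Gamma_{\text{Co}}}{\Gamma_{\text{Ra}}}
\approx 2000 \times \frac{1.32}{0.825}
\approx 3200\ \text{Ci of radium equivalent}
\approx 3200\ \text{g of radium},
\]

where \(\Gamma_{\text{Co}} \approx 1.32\) and \(\Gamma_{\text{Ra}} \approx 0.825\ \text{R}\cdot\text{m}^{2}\cdot\text{Ci}^{-1}\cdot\text{h}^{-1}\) are assumed specific gamma-ray constants and one curie corresponds, by the original definition, to the activity of one gram of radium. The result, about 3,200 gram-equivalents, is in reasonable agreement with the 3,000-gram figure quoted above.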
Of even more significance were developments in the chemotherapy of cancer. In certain forms of malignant disease, such as leukemia, which cannot be treated by surgery, palliative effects were achieved that prolonged life and allowed the patient in many instances to lead a comparatively normal existence.
Fundamentally, however, perhaps the most important advance of all in this field was the increasing appreciation of the importance of prevention. The discovery of the relationship between cigarette smoking and lung cancer remains a classic example. Less publicized, but of equal import, was the continuing supervision of new techniques in industry and food manufacture in an attempt to ensure that they do not involve the use of cancer-causing substances.
Tropical medicine
The first half of the 20th century witnessed the virtual conquest of three of the major diseases of the tropics: malaria, yellow fever, and leprosy. At the turn of the century, as for the preceding two centuries, quinine was the only known drug to have any appreciable effect on malaria. With the increasing development of tropical countries and rising standards of public health, it became obvious that quinine was not completely satisfactory. Intensive research between World Wars I and II indicated that several synthetic compounds were more effective. The first of these to become available, in 1934, was quinacrine (known as mepacrine, Atabrine, or Atebrin). In World War II it amply fulfilled the highest expectations and helped to reduce disease among Allied troops in Africa, Southeast Asia, and the Far East. A number of other effective antimalarial drugs subsequently became available.
An even brighter prospect—the virtual eradication of malaria—was opened up by the introduction, during World War II, of the insecticide DDT (1,1,1-trichloro-2,2-bis[p-chlorophenyl]ethane, or dichlorodiphenyltrichloroethane). It had long been realized that the only effective way of controlling malaria was to eradicate the anopheline mosquitoes that transmit the disease. Older methods of mosquito control, however, were cumbersome and expensive. The lethal effect of DDT on the mosquito, its relative cheapness, and its ease of use on a widespread scale provided the answer. An intensive worldwide campaign, sponsored by the World Health Organization, was planned and went far toward bringing malaria under control.
The major problem encountered with respect to effectiveness was that the mosquitoes were able to develop resistance to DDT, but the introduction of other insecticides, such as dieldrin and lindane (BHC), helped to overcome that difficulty. The use of these and other insecticides was strongly criticized by ecologists, however.
Yellow fever is another mosquito-transmitted disease, and the prophylactic value of modern insecticides in its control was almost as great as in the case of malaria. The forest reservoirs of the virus present a more difficult problem, but the combined use of immunization and insecticides did much to bring this disease under control.
Until the 1940s the only drugs available for treating leprosy were the chaulmoogra oils and their derivatives. These, though helpful, were far from satisfactory. In the 1940s the group of drugs known as the sulfones appeared, and it soon became apparent that they were infinitely better than any other group of drugs in the treatment of leprosy. Several other drugs later proved promising. Although there was no known cure—in the strict sense of the term—for leprosy, the outlook had changed: the age-old scourge could be brought under control and the victims of the disease saved from the mutilations that had given leprosy such a fearsome reputation throughout the ages.
William Archibald Robson Thomson
Philip Rhodes
Surgery in the 20th century
The opening phase
Three seemingly insuperable obstacles beset the surgeon in the years before the mid-19th century: pain, infection, and shock. Once these were overcome, the surgeon believed that he could burst the bonds of centuries and become the master of his craft. There was more, however, to anesthesia than putting the patient to sleep. Infection, despite first antisepsis (destruction of microorganisms present) and later asepsis (avoidance of contamination), was an ever-present menace. And shock continued to perplex physicians. But in the 20th century surgery progressed farther, faster, and more dramatically than in all preceding ages.
The situation encountered
The shape of surgery that entered the 20th century was clearly recognizable as the forerunner of modern surgery. The operating theatre still retained an aura of the past, when the surgeon played to his audience and the patient was little more than a stage prop. In most hospitals it was a high room lit by a skylight, with tiers of benches rising above the narrow wooden operating table. The instruments, kept in glazed or wooden cupboards around the walls, were of forged steel, unplated, and with handles of wood or ivory.
The means to combat infection hovered between antisepsis and asepsis. Instruments and dressings were mostly sterilized by soaking them in dilute carbolic acid (or other antiseptic), and the surgeon often endured a gown freshly wrung out in the same solution. Asepsis gained ground fast, however. It had been born in the Berlin clinic of Ernst von Bergmann, where in 1886 steam sterilization had been introduced. Gradually, this led to the complete aseptic ritual, which has as its basis the bacterial cleanliness of everything that comes in contact with the wound. Hermann Kümmell of Hamburg devised the routine of “scrubbing up.” In 1890 William Stewart Halsted of Johns Hopkins University had rubber gloves specially made for operating, and in 1896 Polish surgeon Johannes von Mikulicz-Radecki, working at Breslau, Germany, invented the gauze mask.
Many surgeons, brought up in a confused misunderstanding of the antiseptic principle—believing that carbolic acid would cover a multitude of surgical mistakes, many of which they were ignorant of committing—failed to grasp the concept of asepsis. Thomas Annandale, for example, blew through his catheters to make sure that they were clear, and many an instrument, dropped accidentally, was simply given a quick wipe and returned to use. Tradition died hard, and asepsis had an uphill struggle before it was fully accepted. “I believe firmly that more patients have died from the use of gloves than have ever been saved from infection by their use,” wrote American surgeon W.P. Carr in 1911. Over the years, however, a sound technique was evolved as the foundation for the growth of modern surgery.
Anesthesia, at the beginning of the 20th century, progressed slowly. Few physicians made a career of the subject, and frequently the patient was rendered unconscious by a student, a nurse, or a porter wielding a rag and a bottle. Chloroform was overwhelmingly more popular than ether, on account of its ease of administration, despite the fact that it was liable to kill by stopping the heart.
Although, by the end of the first decade, nitrous oxide (laughing gas) combined with ether had displaced—but by no means entirely—the use of chloroform, the surgical problems were far from ended. For years to come, the abdominal surgeon besought the anesthetist to deepen the level of anesthesia and thus relax the abdominal muscles; the anesthetist responded to the best of his or her ability, acutely aware that the more anesthetic administered, the closer the patient was to death. When other anesthetic agents were discovered, anesthesiology came into its own as a field, and many advances in spheres such as brain and heart surgery would have been impossible without the skill of the trained anesthesiologist.
The third obstacle, shock, was perhaps the most complex and the most difficult to define satisfactorily. The only major cause properly appreciated at the start of the 20th century was loss of blood, and, once that had occurred, nothing, in those days, could be done. And so, the study of shock—its causes, its effects on human physiology, and its prevention and treatment—became all-important to the progress of surgery.
In the latter part of the 19th century, then, surgeons had been liberated from the age-old issues of pain, pus, and hospital gangrene. Hitherto, operations had been restricted to amputations, cutting for stone in the bladder, tying off arterial aneurysms (bulging and thinning of artery walls), repairing hernias, and a variety of procedures that could be done without going too deeply beneath the skin. But the anatomical knowledge, a crude skill derived from practice on cadavers, and the enthusiasm for surgical practice were there waiting. Largely ignoring the mass of problems they uncovered, surgeons launched forth into an exploration of the human body.
They acquired a reputation for showmanship, but much of their surgery, though speedy and spectacular, was rough and ready. There were a few who developed supreme skill and dexterity and could have undertaken a modern operation with but little practice. Indeed, some devised the very operations still in use today. One such was Theodor Billroth, head of the surgical clinic at Vienna, who collected a formidable list of successful “first” operations. He represented the best of his generation—a surgical genius, an accomplished musician, and a kind, gentle man who brought the breath of humanity to his work. Moreover, the men he trained, including von Mikulicz, Vincenz Czerny, and Anton von Eiselsberg, consolidated the brilliant start that he had given to abdominal surgery in Europe.
Changes before World War I
The opening decade of the 20th century was a period of transition. Flamboyant exhibitionism was falling from favour as surgeons, through experience, learned the merits of painstaking, conscientious operation—treating the tissues gently and carefully controlling every bleeding point. The individualist was not submerged, however, and for many years the development of the various branches of surgery rested on the shoulders of a few clearly identifiable men. Teamwork on a large scale arrived only after World War II. The surgeon, at first, was undisputed master in his own wards and theatre. But as time went on and he found he could not solve his problems alone, he called for help from specialists in other fields of medicine and, even more significantly, from colleagues in other scientific disciplines.
The increasing scope of surgery led to specialization. Most general surgeons had a special interest, and for a long time there had been an element of specialization in such fields as ophthalmology, orthopedics, and obstetrics and gynecology. Before long, however, it became apparent that, to achieve progress in certain areas, surgeons had to concentrate their attention on that particular subject.
Abdominal surgery
By the start of the 20th century, abdominal surgery, which provided the general surgeon with the bulk of his work, had grown beyond infancy, thanks largely to Billroth. In 1881 he had performed the first successful removal of part of the stomach for cancer. His next two cases were failures, and he was stoned in the streets of Vienna. Yet, he persisted and by 1891 had carried out 41 more of these operations with 16 deaths—a remarkable achievement for that era.
Peptic ulcers (gastric and duodenal) appeared on the surgical scene (perhaps as a new disease, but more probably because they had not been diagnosed previously), and in 1881 Ludwig Rydygier cured a young woman of her gastric ulcer by removing it. Bypass operations—gastroenterostomies—soon became more popular, however, and enjoyed a vogue that lasted into the 1930s, even though fresh ulcers at the site of the juncture were not uncommon.
The other end of the alimentary tract was also subjected to surgical intervention; cancers were removed from the large bowel and rectum with mortality rates that gradually fell from 80 to 60 to 20 to 12 percent as the surgeons developed their skill. In 1908 British surgeon Ernest Miles carried out the first abdominoperineal resection for cancer of the rectum; that is, the cancer was attacked both from the abdomen and from below through the perineum (the area between the anus and the genitals), either by one surgeon, who actually did two operations, or by two working together. This technique formed the basis for all future developments.
Much of the new surgery in the abdomen was for cancer, but not all. Appendectomy became the accepted treatment for appendicitis (in appropriate cases) in the United States before the close of the 19th century, but in Great Britain surgeons were reluctant to remove the organ until 1902, when King Edward VII’s coronation was dramatically postponed on account of his appendicitis. The publicity attached to his operation caused the disease and its surgical treatment to become fashionable despite the fact that the royal appendix remained in the king’s abdomen; the surgeon, Frederic Treves, had merely drained the abscess.
Neurosurgery
Though probably the most demanding of all the surgical specialties, neurosurgery was nevertheless one of the first to emerge. The techniques and principles of general surgery were inadequate for work in such a delicate field. William Macewen, a Scottish general surgeon of outstanding versatility, and Victor Alexander Haden Horsley, the first British neurosurgeon, showed that the surgeon had much to offer in the treatment of disease of the brain and spinal cord. Macewen, in 1893, recorded 19 patients operated on for brain abscess, 18 of whom were cured; at that time most other surgeons had 100 percent mortality rates for the condition. His achievement remained unequaled until the discovery of penicillin.
American surgeon Harvey Williams Cushing almost by himself consolidated neurosurgery as a specialty. From 1905 on, he advanced neurosurgery through a series of operations and through his writings. Tumours, epilepsy, trigeminal neuralgia, and pituitary disorders were among the conditions he treated successfully.
Radiology
In 1895 a development at the University of Würzburg had far-reaching effects on medicine and surgery, opening up an entirely fresh field of the diagnosis and study of disease and leading to a new form of treatment, radiation therapy. This was the discovery of X-rays by Wilhelm Conrad Röntgen, a professor of physics. Within months of the discovery there was an extensive literature on the subject: Robert Jones, a British surgeon, had localized a bullet in a boy’s wrist before operating; stones in the urinary bladder and gallbladder had been demonstrated; and bone fractures had been displayed.
Experiments began on introducing substances that are opaque to X-rays into the body to reveal organs and formations, both normal and abnormal. Walter Bradford Cannon, a Boston physiologist, used X-rays in 1898 in his studies of the alimentary tract. Friedrich Voelcker of Heidelberg devised retrograde pyelography (introduction of the radiopaque medium into the kidney pelvis by way of the ureter) for the study of the urinary tract in 1905; in Paris in 1921 Jean Sicard X-rayed the spinal canal with the help of an oily iodine substance, and the next year he did the same for the bronchial tree; and in 1924 Evarts Graham of St. Louis used a radiopaque contrast medium to view the gallbladder. Air was also used to provide contrast; in 1918 at Johns Hopkins, Walter Dandy injected air into the ventricles (liquid-filled cavities) of the brain.
The problems of injecting contrast media into the blood vessels took longer to solve, and it was not until 1927 that António Moniz of Lisbon succeeded in obtaining pictures of the arteries of the brain. Eleven years later George Robb and Israel Steinberg of New York overcame some of the difficulties of cardiac catheterization (introduction of a small tube into the heart by way of veins or arteries) and were able to visualize the chambers of the heart on X-ray film. After much research, a further refinement came in 1962, when Frank Sones and Earl K. Shirey of Cleveland showed how to introduce the contrast medium into the coronary arteries.
World War I
The battlefields of the 20th century stimulated the progress of surgery and taught the surgeon innumerable lessons, which were subsequently applied in civilian practice. Regrettably, though, the principles of military surgery and casualty evacuation, which can be traced back to the Napoleonic Wars, had to be learned over again.
World War I broke, quite dramatically, the existing surgical hierarchy and rule of tradition. No longer did the European surgeon have to waste his best years in apprenticeship before seating himself in his master’s chair. Suddenly, young surgeons in the armed forces began confronting problems that would have daunted their elders. Furthermore, their training had been in “clean” surgery performed under aseptic conditions. Now they found themselves faced with the need to treat large numbers of grossly contaminated wounds in improvised theatres. They rediscovered debridement (the surgical excision of dead and dying tissue and the removal of foreign matter).
The older surgeons cried “back to Lister,” but antiseptics, no matter how strong, were no match for putrefaction and gangrene. One method of antiseptic irrigation, devised by Alexis Carrel and Henry Dakin and called the Carrel–Dakin treatment, was beneficial, however, though only after the wound had been adequately debrided. The scourges of tetanus and gas gangrene were controlled to a large extent by antitoxin and antiserum injections, yet surgical treatment of the wound remained an essential requirement.
Abdominal casualties fared badly for the first year of the war, because experience in the utterly different circumstances of the South African War had led to a belief that these men were better left alone surgically. Fortunately, the error of continuing with such a policy 15 years later was soon appreciated, and every effort was made to deliver the wounded men to a suitable surgical unit with all speed. Little progress was made with chest wounds beyond opening up the wound even further to drain pus from the pleural cavity between the chest wall and the lungs.
Perhaps the most worthwhile and enduring benefit to flow from World War I was rehabilitation. For almost the first time, surgeons realized that their work did not end with a healed wound. In 1915 Robert Jones set up special facilities for orthopedic patients, and at about the same time Harold Gillies founded British plastic surgery in a hut at Sidcup, Kent. In 1917 Gillies popularized the pedicle type of skin graft (the type of graft in which skin and subcutaneous tissue are left temporarily attached for nourishment to the site from which the graft was taken). Since then plastic surgery has given many techniques and principles to other branches of surgery.
Between the World Wars
The years between the two World Wars may conveniently be regarded as the time when surgery consolidated its position. A surprising number of surgical firsts and an amazing amount of fundamental research had been achieved even in the late 19th century, but the knowledge and experience could not be converted to practical use because the human body could not survive the onslaught. In the years between World Wars I and II, it was realized that physiology—in its widest sense, including biochemistry and fluid and electrolyte balance—was of major importance along with anatomy, pathology, and surgical technique.