Effects on the incidence of cancer
Atomic-bomb survivors, certain groups of patients exposed to radiation for medical purposes, and some groups of radiation workers have shown dose-dependent increases in the incidence of certain types of cancer. The induced cancers have not appeared until years after exposure, however, and they have shown no distinguishing features by which they can be identified individually as having resulted from radiation, as opposed to some other cause. With few exceptions, moreover, the incidence of cancer has not been increased detectably by doses of less than 0.01 Sv.
Because the carcinogenic effects of radiation have not been documented over a wide enough range of doses and dose rates to define the shape of the dose-incidence curve precisely, the risk of radiation-induced cancer at low levels of exposure can be estimated only by extrapolation from observations at higher dose levels, based on assumptions about the relation between cancer incidence and dose. For most types of cancer, information about the dose-incidence relationship is rather meagre. The most extensive data available are for leukemia and cancer of the female breast.
The overall incidence of all forms of leukemia other than the chronic lymphatic type has been observed to increase roughly in proportion to dose during the first 25 years after irradiation. Different types of leukemia, however, vary in the magnitude of the radiation-induced increase for a given dose, the age at which irradiation occurs, and the time after exposure. The total excess of all types besides chronic lymphatic leukemia, averaged over all ages, amounts to approximately one to three additional cases of leukemia per year per 10,000 persons at risk per sievert to the bone marrow.
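The arithmetic behind such a risk coefficient can be made concrete. The following sketch uses an entirely hypothetical cohort and dose to show how an excess-case estimate of this kind is computed; the coefficient range is the one quoted above.

```python
# Illustrative only: expected excess leukemia cases in a hypothetical cohort,
# using the coefficient quoted above (roughly 1-3 additional cases per year
# per 10,000 persons at risk per sievert to the bone marrow).

persons_at_risk = 10_000   # hypothetical cohort size
marrow_dose_sv = 0.5       # hypothetical average bone-marrow dose, in sieverts
years_elevated = 25        # approximate period of elevated incidence

for coeff in (1, 3):       # excess cases per year per 10,000 persons per Sv
    excess = coeff * (persons_at_risk / 10_000) * marrow_dose_sv * years_elevated
    print(f"coefficient {coeff}: about {excess:.0f} excess cases over {years_elevated} years")
```

Under these assumptions the cohort would be expected to show roughly 12 to 38 radiation-induced cases over the 25-year period, in addition to those arising naturally.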
Cancer of the female breast also appears to increase in incidence in proportion to the radiation dose. Furthermore, the magnitude of the increase for a given dose appears to be essentially the same in women whose breasts were irradiated in a single, brief exposure (e.g., atomic-bomb survivors), as in those who were irradiated over a period of years (e.g., patients subjected to multiple fluoroscopic examinations of the chest or workers assigned to coating watch and clock dials with paint containing radium), implying that even small exposures widely separated in time exert carcinogenic effects on the breast that are fully additive and cumulative. Although susceptibility decreases sharply with age at the time of irradiation, the excess of breast cancer averaged over all ages amounts to three to six cases per 10,000 women per sievert each year.
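What "fully additive and cumulative" implies can be stated as a one-line rule: the expected excess depends only on the total accumulated dose, not on how that dose is divided in time. The sketch below, with a hypothetical fractionation schedule, makes the point; the coefficient is the lower bound quoted above.

```python
# Illustrative only: under full additivity, 100 small exposures carry the same
# expected risk as one brief exposure delivering the same total dose.

single_dose_sv = 1.0            # one brief exposure (hypothetical)
fractions_sv = [0.01] * 100     # 100 small exposures spread over years (hypothetical)

coeff = 3                       # excess cases per year per 10,000 women per Sv (lower bound)
women_at_risk = 10_000

excess_single = coeff * single_dose_sv * women_at_risk / 10_000
excess_fractionated = coeff * sum(fractions_sv) * women_at_risk / 10_000

assert abs(excess_single - excess_fractionated) < 1e-9   # only the total dose matters
print(f"expected excess per year in either case: {excess_single:.1f}")
```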
Additional evidence that carcinogenic effects can be produced by a relatively small dose of radiation is provided by the increase in the incidence of thyroid tumours that has been observed to result from a dose of 0.06–2.0 Gy of X rays delivered to the thyroid gland during infancy or childhood, and by the association between prenatal diagnostic X irradiation and childhood leukemia. The latter association implies that exposure to as little as 10–50 mGy of X radiation during intrauterine development may increase the subsequent risk of leukemia in the exposed child by as much as 40–50 percent.
Although some, but not all, other types of cancer have been observed to occur with greater frequency in irradiated populations (Table 12), the data do not suffice to indicate whether the risks extend to low doses. It is apparent, however, that the dose-incidence relationship varies from one type of cancer to another. From the existing evidence, the overall excess of all types of cancer combined may be inferred to approximate 0.6–1.8 cases per 1,000 persons per sievert per year when the whole body is exposed to radiation, beginning two to 10 years after irradiation. This increase corresponds to a cumulative lifetime excess of roughly 20–100 additional cases of cancer per 1,000 persons per sievert, or to an 8–40 percent per sievert increase in the natural lifetime risk of cancer.
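The internal consistency of these figures can be checked with simple arithmetic. The sketch below assumes a natural lifetime cancer risk of about 25 percent, a common round figure that the text itself does not state.

```python
# Illustrative check: relating the cumulative lifetime excess quoted above
# (20-100 cases per 1,000 persons per sievert) to the stated 8-40 percent
# increase in the natural lifetime risk of cancer.

natural_lifetime_risk = 0.25          # assumed ~25 percent natural lifetime risk

for excess_per_1000 in (20, 100):     # lifetime excess cancers per 1,000 person-Sv
    absolute_risk = excess_per_1000 / 1_000
    relative_increase = absolute_risk / natural_lifetime_risk
    print(f"{excess_per_1000}/1,000 per Sv -> {absolute_risk:.0%} absolute, "
          f"a {relative_increase:.0%} increase in natural risk")
```

With that assumed baseline, the 20 and 100 case figures reproduce the 8 and 40 percent bounds quoted above.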
Estimated lifetime cancer risks attributed to low-level irradiation

site irradiated | cancers per 10,000 person-Sv*
bone marrow (leukemia) | 15–20
thyroid | 25–120
breast (women only) | 40–200
lung | 25–140
stomach, liver, colon | 5–60 (each)
bone, esophagus, small intestine, urinary bladder, pancreas, lymphatic tissue | 5–30 (each)
skin | 10–20
total (both sexes) | 125–1,000
*The unit person-Sv represents the product of the average dose per person times the number of people exposed (1 sievert to each of 10,000 persons = 10,000 person-Sv); all values provided here are rounded. Source: National Academy of Sciences Advisory Committee on the Biological Effects of Ionizing Radiation, The Effects on Populations of Exposure to Low Levels of Ionizing Radiation (1972, 1980); United Nations Scientific Committee on the Effects of Atomic Radiation, Sources and Effects of Ionizing Radiation (1977 report to the General Assembly, with annexes). |
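The footnote's definition of the person-sievert lends itself to a short worked example of how the table is applied. The population and average dose below are hypothetical; the risk coefficients are the bone-marrow entry from the table.

```python
# Illustrative only: applying the table's units. Collective dose in person-Sv
# is the average individual dose multiplied by the number of people exposed.

people_exposed = 50_000
average_dose_sv = 0.02
collective_dose = people_exposed * average_dose_sv      # 1,000 person-Sv

for coeff in (15, 20):   # bone marrow (leukemia): 15-20 cases per 10,000 person-Sv
    cases = coeff * collective_dose / 10_000
    print(f"about {cases:.1f} lifetime excess leukemia cases at {coeff} per 10,000 person-Sv")
```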
The above-cited risk estimates imply that no more than 1–3 percent of all cancers in the general population result from natural background ionizing radiation. At the same time, however, the data suggest that up to 20 percent of lung cancers in nonsmokers may be attributable to inhalation of radon and other naturally occurring radionuclides present in air.
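The "no more than 1–3 percent" estimate can be roughly reconstructed from the figures given earlier. The background dose rate, lifetime, and baseline cancer risk used below are common round values assumed for illustration; none is taken from the text.

```python
# Rough, illustrative reconstruction of the estimate that natural background
# radiation accounts for no more than a few percent of all cancers.

background_sv_per_year = 0.001    # assumed ~1 mSv/yr whole-body background (radon excluded)
lifetime_years = 70               # assumed lifetime
lifetime_dose_sv = background_sv_per_year * lifetime_years    # ~0.07 Sv

natural_lifetime_risk = 0.25      # assumed ~25 percent natural lifetime cancer risk

for excess_per_1000_sv in (20, 100):    # lifetime excess cancers per 1,000 person-Sv
    radiation_risk = (excess_per_1000_sv / 1_000) * lifetime_dose_sv
    attributable = radiation_risk / natural_lifetime_risk
    print(f"{excess_per_1000_sv}/1,000 per Sv -> about {attributable:.1%} of all cancers")
```

These assumptions yield roughly 0.6 to 3 percent, consistent with the bound stated above.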
Shortening of the life span
Laboratory animals whose entire bodies are exposed to radiation in the first half of life suffer a reduction in longevity that increases in magnitude with increasing dose. This effect was mistakenly interpreted by early investigators as a manifestation of accelerated or premature aging. The shortening of life in irradiated animals, however, has since been observed to be attributable largely, if not entirely, to the induction of benign and malignant growths. In keeping with this observation is the finding that mortality from diseases other than cancer has not been increased detectably by irradiation among atomic-bomb survivors.
Protection against external radiation
A growing number of substances have been found to provide some protection against radiation injury when administered prior to irradiation (Table 13). Many of them apparently act by producing anoxia or by competing for oxygen with normal cell constituents and radiation-produced radicals. All of the protective compounds tried thus far, however, are toxic, and anoxia itself is hazardous. As a consequence, their administration to humans is not yet practical.
Some chemicals that exert radioprotective effects in laboratory animals

class | specific chemical | effective dose (milligrams per kilogram of tissue)
sulfur compounds | glutathione | 1,000
sulfur compounds | cysteine | 1,000
sulfur compounds | cysteamine | 150
sulfur compounds | AET* | 350
hormones | estradiol benzoate | 12
hormones | ACTH | 25 for 7 days
enzyme inhibitors | sodium cyanide | 5
enzyme inhibitors | carbon monoxide | by inhalation
enzyme inhibitors | mercaptoethylamine (MEA) | 235
enzyme inhibitors | para-aminopropiophenone (PAPP) | 30
metabolites | formic acid | 90
vasoconstrictors | serotonin | 50
nervous system drugs | amphetamine | 1
nervous system drugs | chlorpromazine | 20

*Aminoethylisothiuronium bromide hydrobromide.
Diurnal changes in the radiosensitivity of rodents indicate that the factors responsible for daily biologic rhythms may also alter the responses of tissues to radiation. Such factors include the hormone thyroxine, a normal secretion of the thyroid gland. Other sensitizers at the cellular level include nucleic-acid analogues (e.g., 5-fluorouracil) as well as certain compounds that selectively radiosensitize hypoxic cells such as metronidazole.
Radiosensitivity is also under genetic control to some degree, susceptibility varying among different inbred mouse strains and increasing in the presence of inherited deficiencies in capacity for repairing radiation-induced damage to DNA. Germ-free mice, which spend their entire lives in a sterile environment, also exhibit greater resistance to radiation than do animals in a normal microbial environment owing to elimination of the risk of infection.
For many years it was thought that radiation disease was irreversible once a lethal dose had been received. It has since been found that bone-marrow cells administered soon after irradiation may enable an individual to survive an otherwise lethal dose of X rays, because these cells migrate to the marrow of the irradiated recipient, where they proliferate and repopulate the blood-forming tissues. Under these conditions bone-marrow transplantation is feasible even between histo-incompatible individuals, because the irradiated recipient has lost the ability to develop antibodies against the injected “foreign” cells. After a period of some months, however, the transplanted tissue may eventually be rejected, or it may develop an immune reaction against the irradiated host, which also can be fatal. The transplantation of bone-marrow cells has been helpful in preventing radiation deaths among the victims of reactor accidents, as, for example, those injured in 1986 at the Chernobyl nuclear power plant in Ukraine, then in the Soviet Union. It should be noted, however, that cultured or stored marrow cells cannot yet be used for this purpose.
Control of radiation risks
Because radiation is assumed to be capable of contributing to mutagenesis and carcinogenesis even at low doses, any procedure involving radiation exposure is considered to entail some degree of risk. At the same time, however, the radiation-induced risks associated with many activities are negligibly small in comparison with other risks commonly encountered in daily life. Nevertheless, such risks are not necessarily acceptable if they can easily be avoided or if no measurable benefit is to be gained from the activities with which they are associated. Consequently, systematic efforts are made to avoid unnecessary exposure to ionizing radiation in medicine, science, and industry. Toward this end, limits have been placed on the amounts of radioactivity (Tables 9 and 12) and on the radiation doses that the different tissues of the body are permitted to accumulate in radiation workers and in members of the public at large.
Although most activities involving exposure to radiation for medical purposes are highly beneficial, the benefits cannot be assumed to outweigh the risks in situations where radiation is used to screen large segments of the population for the purpose of detecting an occasional person with an asymptomatic disease. Examples of such applications include the “annual” chest X-ray examination and routine mammography. Each use of radiation in medicine (and dentistry) is now evaluated for its merits on a case-by-case basis.
Other activities involving radiation also are assessed with care in order to assure that unnecessary exposure is avoided and that their presumed benefits outweigh their calculated risks. In operating nuclear power plants, for example, much care is taken to minimize the risk to surrounding populations. Because of such precautions, the total impact on health of generating a given amount of electricity from nuclear power is usually estimated to be smaller than that resulting from the use of coal for the same purpose, even after allowances for severe reactor accidents such as the one at Chernobyl.
Cornelius A. Tobias
Arthur Canfield Upton

Biologic effects of non-ionizing radiation
Effects of Hertzian waves and infrared rays
Hertzian waves
The effects of Hertzian waves (electromagnetic waves in the radar and radio range) and of infrared rays usually are regarded as equivalent to the effect produced by heating. The longer radio waves induce chiefly thermal agitation of molecules and excitation of molecular rotations, while infrared rays excite vibrational modes of large molecules and release fluorescent emission as well as heat. Both of these types of radiation are preferentially absorbed by fats containing unsaturated carbon chains.
That bombardment of tissue with high-frequency alternating current (wavelengths somewhat longer than the longest radio waves) produces heat was discovered in 1891, and the possibility of exploiting this effect for medical purposes, under the term diathermy, was realized in 1909. This method of internal heating is beneficial for relieving muscle soreness and sprains (see also below). Diathermy can be harmful, however, if so much internal heat is delivered that normal cells of the body suffer irreversible damage. Because humans have heat receptors primarily in the skin, they receive no forewarning pain when diathermy inflicts a deep burn. Regions with reduced blood circulation, which carry heat away poorly, are especially easily damaged. Cataracts of the eye lens have been produced in animals by microwave radiation applied at intensities sufficient to cause thermal denaturation of the lens protein.
Microwave ovens have found widespread use in commercial kitchens and private homes. They heat and cook very rapidly and, if used properly, pose no hazard to operators. In the radio and television industry and in military radar work, however, persons are sometimes exposed to high densities of microwave radiation. The hazard is particularly pronounced near masers (e.g., carbon dioxide masers), which are capable of generating very high intensities of microwaves. The biologic effects depend on the absorbency of the tissues involved. At frequencies above 150 megahertz significant absorption takes place; the lens of the human eye is most susceptible to frequencies around 3,000 megahertz, which can produce cataracts. At still higher frequencies, microwaves interact with superficial tissues and skin in much the same manner as infrared rays.
Acute effects of microwaves become significant only if a considerable temperature rise occurs; cells and tissues die at temperatures of about 43° C. Microwave heating is minimized when the heat resulting from energy absorption is dissipated by radiation, evaporation, and heat conduction. Normally about one-hundredth of a watt (10 milliwatts) can be dissipated in this way, and this power level has generally been set as the permissible limit. Studies with animals indicate that, below this level, effects on the various organ systems are negligible. Microwaves, like direct heat, strongly decrease the viability of sperm when applied to the testes; this effect, however, is not significant at the “safe” level.
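A crude heat-balance sketch shows why the 10-milliwatt figure is comfortably safe: even with no dissipation at all, absorbed power at that level warms tissue extremely slowly. The tissue mass and specific heat used below are assumptions for illustration only.

```python
# Crude, illustrative heat balance: time for undissipated absorbed power to
# warm tissue from body temperature to the ~43 deg C damage threshold quoted
# above. All tissue parameters are rough assumptions.

specific_heat_j_per_kg_k = 3500.0   # approximate value for soft tissue (assumption)
tissue_mass_kg = 1.0                # 1 kg of exposed tissue (assumption)
temperature_rise_k = 43.0 - 37.0    # body temperature to damage threshold

for absorbed_watts in (0.01, 1.0, 10.0):
    # with no heat loss, time = m * c * dT / P
    seconds = tissue_mass_kg * specific_heat_j_per_kg_k * temperature_rise_k / absorbed_watts
    print(f"{absorbed_watts:5.2f} W absorbed -> about {seconds/3600:.1f} h to reach 43 deg C")
```

At the permissible level the warming is negligible on any realistic time scale even before dissipation is considered, which is the rationale for the limit.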
In the late 1980s, investigators in the Soviet Union documented a variety of nonthermal effects of microwaves and recommended safe occupational exposure levels roughly 1,000 times lower than those in force in the United States. Most prominent among the nonthermal effects appear to be those on the nervous system: persons handling high-frequency radio equipment have reported unusual fatigue, excitability, and insomnia, and nonthermal effects have been observed on the electroencephalograms of rabbits. Such effects may be due to changes in the properties of neural membranes or to denaturation of macromolecules.
Infrared rays
A significant part of solar energy reaches the Earth in the form of infrared rays. Absorption and emission by the human body of these rays play an important part in temperature exchange and regulation of the body. The principles of infrared emission and absorption must be considered in the design of air conditioning and clothing.
Overdosage of infrared radiation, usually resulting from direct exposure to a hot object (including heat lamps) or to flame, can cause severe burns. Although infrared exposure is a hazard near any fire, it is particularly dangerous in the course of nuclear chain reactions. A nuclear detonation produces a brief but very intense emission of infrared from the fireball, together with visible and ultraviolet light, causing so-called flash burns. As much as one-third of the total energy of a nuclear explosion may take the form of thermal radiation, which travels at the velocity of light and therefore arrives almost instantaneously even at points several kilometres from the source. Smoke or fog can effectively scatter or absorb the infrared components, and even thin clothing can greatly reduce the severity of burn effects.
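The near-instantaneous arrival of the thermal pulse follows directly from the speed of light; a one-line computation with a hypothetical distance makes the point.

```python
# Illustrative: the thermal pulse from a fireball travels at the speed of
# light, so it arrives essentially instantaneously even kilometres away.

speed_of_light_m_per_s = 3.0e8
distance_m = 3_000      # hypothetical distance of 3 km from the fireball
arrival_s = distance_m / speed_of_light_m_per_s
print(f"arrival time at 3 km: {arrival_s * 1e6:.0f} microseconds")
```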