
CHAPTER 5 Radiation-Induced Meningiomas

HISTORICAL PERSPECTIVE

One evening in November 1895, Wilhelm Conrad Roentgen was surprised to notice an unexplained glow on a fluorescent screen in his laboratory during cathode-ray tube experiments. For weeks, Roentgen worked intently to explain the mystery, and on December 28 he submitted his report of a new form of energy,1 which he named x-rays after the mathematical designation for an unknown quantity. During 1896, a tremendous number of scientific and news articles about the new “Roentgen rays” appeared in newspapers and books, as well as in leading scientific and medical journals such as The Lancet, the British Medical Journal, Nature, and Science.2 Roentgen’s announcement, including the famous x-ray image of his wife’s hand, heralded one of the defining scientific discoveries of the modern era and earned him the first Nobel Prize in Physics in 1901. His was the first of more than 20 Nobel Prizes awarded during the 20th century for research related to radioactivity.

Roentgen’s discovery greatly influenced important research in other laboratories. Within months, Henri Becquerel described the radiation-emitting properties of uranium, and in 1898 Marie Curie, then a young scientist conducting research for her doctoral thesis, discovered two additional radioactive elements, polonium and radium. The members of Madame Curie’s committee reported that her research was possibly the most important scientific contribution ever made in a doctoral thesis.

Marie and her husband Pierre Curie isolated small quantities of the new elements from 100 kg of waste pitchblende and characterized their atomic properties. They refused suggestions to patent their isolation process, believing that research should be carried out for its own sake, with no profit motive, even though radium was soon being produced at the very high price of $100,000 per gram.3 For their work, the Curies shared with Becquerel the 1903 Nobel Prize in Physics.4 After her husband’s death, Marie Curie earned a second Nobel Prize in 1911, in Chemistry, for her work in radioactivity; she conducted active research until her death in 1934 (Fig. 5-1).

Potential diagnostic and therapeutic applications for radiation were quickly proposed. Roentgen’s first images so excited physicians around the world with the new ray’s ability to see inside the human body that x-rays were used to diagnose bone fractures and locate embedded bullets within weeks of his discovery.5 During a 1903 lecture and demonstration at the Royal Institution in London, Pierre Curie suggested that radium might be used to treat cancer, and described a burn-like skin injury on his forearm caused by 10 hours of exposure to a sample of radium; by the 1920s, radiation was indeed in routine therapeutic use. While presenting, his hands trembling from substantial cumulative exposure to radioactive materials, Curie spilled a small amount of radium on the podium; 50 years later, radioactivity was still detectable in the hall, and some surfaces required decontamination.4

Adverse effects of x-rays in the form of slow-healing skin lesions on the hands of radiologists and technicians were noticed early, but the full extent of dangers from exposure to x-rays was poorly understood for decades. Patients, technicians, physicians, and researchers were repeatedly exposed to large doses of ionizing radiation with no shielding. Fluoroscopists calibrated their equipment by placing their hands directly in the x-ray beam; many lost fingers as a result (Fig. 5-2). There were also more serious problems. American inventor Thomas Edison, who designed the first commercially available fluoroscope, suffered damage to his eyesight, while his assistant, Clarence Dally, succumbed to metastatic radiation-induced skin cancer. Edison halted work with x-rays in his laboratory because of their ill effects.6

X-rays also captured the public imagination. Radium was widely thought to have curative powers. A radium potion that “bathed the stomach in sunshine” was thought to cure stomach cancer, and Radithor, a radium-laced medicinal drink, was sold over the counter until 1931. Belts to be strapped onto limbs, hearing aids, toothpaste, face cream, and hair tonic, all containing radium, were sold into the 1930s,7 and shoe-fitting fluoroscopy remained available as a customer service in many shoe stores into the 1950s (Fig. 5-3).8,9

Broader public awareness of the dangers of radiation exposure began to develop late in 1927, when journalist Walter Lippmann, then editor of the New York World newspaper, exposed the fate of young women employed by the U.S. Radium Corporation to paint watch dials with radioactive materials. The women, who had worked in large rooms with no shielding and had been instructed to point their brushes with their lips, lost their teeth and developed serious bone decay in their mouths, necks, and backs. As the young women were dying, Lippmann railed against delays in the courts that blocked settlements against U.S. Radium, which had known of the danger from chronic exposure but provided no protection for its workers.10 The case is a classic, now used to train journalists on the role of investigative reporting in societal change. Scholarly articles describing dangers from radiation also appeared before World War II, particularly in German and French medical journals.11,12 X-ray exposure guidelines were established in Germany in 1913,13 in the United Kingdom in 1921,14 and in the United States during the 1930s.13 The trend has been toward more rigorous protection in the decades since these early limits were established.15

Important additional evidence of risk appeared in 1927, when Hermann Joseph Muller, a founding figure in genetic research, published evidence of a 150-fold increase in the natural mutation rate of fruit flies (Drosophila melanogaster) exposed to x-rays. Muller showed that x-rays broke genes apart and rearranged them.16 He earned the Nobel Prize in Physiology or Medicine for his work, but only in 1946, when concern over the genetic consequences of exposure to low levels of radiation became widespread after the world saw the devastation wrought by the atomic bombings of Hiroshima and Nagasaki in 1945.17

Research to define the optimal parameters of radiation therapy for the treatment of brain tumors and to assess treatment risk began in the years after World War I. In 1938, Davidoff and colleagues11 documented profound histologic and morphologic changes, especially marked in glial and nerve tissues, in the brains and spinal cords of monkeys after exposure to 10 to 50 grays (Gy) in a single exposure, or 48 to 72 Gy in two fractions. Davidoff concluded that the intensity of change was determined primarily by x-ray dose, with time from radiation exposure to autopsy contributing to a lesser extent. Wachowski12 and others18,19 also showed that exposure to ionizing radiation had degenerative effects on neural tissue.

In spite of this work, and the mounting evidence of wide-ranging risks from exposure to ionizing radiation, neural tissue was long considered resistant to direct damage. The authors of a case report published in 1953, describing a patient who had received superficial x-ray therapy for a basal cell carcinoma, stated, “Brain and neural tissue are usually resistant to direct damage by x-ray radiation.”20 The patient had sustained a skin dose of approximately 25 Gy and a dose to the temporal lobe of 12.87 Gy at a depth of 2 cm. The authors considered her development of an “underlying, expanding intracranial nontumorous mass” to be a rare event, and not the result of radiation exposure.

Radiation is now known to induce a wide range of changes in neural tissue, including visual deterioration, hearing loss, hormonal disturbances, vasculopathy, brain and bone necrosis, atrophy, demyelination, calcification, fatty replacement of bone marrow, and induction of central nervous system (CNS) neoplasms.21 Many of these changes have been shown to be dose dependent.14,22–26

In the more than 60 years since the atomic bombings, scientific evidence based on extensive research among these survivors and other populations exposed to ionizing radiation supports the hypothesis that there is a linear dose–response relationship between exposure to ionizing radiation and the development of solid cancer in humans. Excess lifetime risk of disease and death for all solid cancers and leukemia has been estimated based on a wide range of doses from 0.005 up to greater than 2 sieverts (Sv). A statistically significant dose–response relationship has also been shown for heart disease, stroke, and diseases of the digestive, respiratory, and hematopoietic systems, although noncancer risks at very low doses are uncertain.17

Although we have not yet achieved a full understanding of the mechanisms of carcinogenesis after exposure to radiation, research has elucidated some of the diverse responses in complex biologic systems. Ionizing radiation overcomes the binding energy of electrons orbiting atoms and molecules, resulting in a variety of directly and indirectly induced DNA lesions, including DNA base alterations, DNA–DNA and DNA–protein crosslinks, and single- and double-strand DNA breaks.27 Cells can repair some radiation-induced damage; however, some damage may overwhelm their intrinsic repair capacity. Genetic factors may also modify cellular repair mechanisms and increase susceptibility to the development of tumors after irradiation in some individuals.28 Occasional misrepair can result in point mutations, chromosomal translocations, and gene fusions, all with the potential to induce neoplasms. Radiation may also produce more subtle modifications that can alter gene expression, affect the intracellular oxidative state, lead to the formation of free radicals, influence signal transduction systems and transcription factor networks, and directly or indirectly affect metabolic pathways.29

In summary, the diagnostic and therapeutic benefits from effective use of ionizing radiation in the first century after Roentgen’s discovery have contributed greatly to a revolution in medical care. However, the potential for long-term damage has been repeatedly underestimated, often by physicians and scientists who were genuinely motivated to provide good care for patients. As protocols for optimum use of both diagnostic and therapeutic procedures continue to evolve, it remains important to carefully consider the power of Roentgen’s mysterious rays.

RADIATION-INDUCED MENINGIOMAS

After publication of small case series describing a suspected link between meningioma and exposure to ionizing radiation,30–32 the causal association between irradiation and meningioma was recognized in a 1974 analytical epidemiologic study by Modan and colleagues.33 In this cohort study, an elevated incidence of meningiomas and other tumors of the head and neck was shown in individuals irradiated as children for the treatment of tinea capitis, compared with matched nonirradiated population and sibling controls. Radiation-induced meningioma (RIM) is now considered the most common brain neoplasm known to be caused by exposure to ionizing radiation.34–36

In 1991, Harrison and colleagues35 categorized RIM according to level of exposure. Doses of less than 10 Gy, such as those used for the treatment of tinea capitis between 1909 and 1960, were defined as low; doses of 10 to 20 Gy, typical of irradiation for the treatment of head and neck tumors or vascular nevi, were defined as intermediate; and doses greater than 20 Gy, used for the treatment of primary or secondary brain tumors, were defined as high. Harrison’s categorization is frequently cited in the neurosurgical literature, although others have considered exposure levels greater than 10 Gy to be high.37,38 In contrast, the National Academy of Sciences, in its seventh report on the Biological Effects of Ionizing Radiation (BEIR VII), defined exposures of 0.1 Gy or lower as low dose, greater than 0.1 Gy to 1 Gy as medium dose, and 1 Gy or higher as high dose.17
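To make the difference between these conventions concrete, the following minimal sketch in Python encodes both schemes; the function names and code structure are ours, for illustration only, not from the cited sources.

# Illustrative encoding of the two dose-categorization schemes described
# above; function names and structure are ours, not from the cited sources.

def harrison_category(dose_gy):
    """Classify a radiation dose (Gy) per Harrison and colleagues (1991)."""
    if dose_gy < 10:
        return "low"            # e.g., tinea capitis treatment, 1909-1960
    elif dose_gy <= 20:
        return "intermediate"   # e.g., head and neck tumors, vascular nevi
    else:
        return "high"           # e.g., primary or secondary brain tumors

def beir_vii_category(dose_gy):
    """Classify a radiation dose (Gy) per the BEIR VII convention."""
    if dose_gy <= 0.1:
        return "low"
    elif dose_gy < 1.0:
        return "medium"
    else:
        return "high"

# The same 1.5-Gy brain dose, typical of tinea capitis irradiation, is
# "low" under Harrison's scheme but "high" under BEIR VII:
print(harrison_category(1.5), beir_vii_category(1.5))  # -> low high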

RIM After Exposure to High Doses of Therapeutic Radiation

Mann and colleagues,32 writing in 1953, are generally credited with the first report of a meningioma ascribed to previous irradiation. The patient was a 4-year-old girl who had been treated as an infant with 65 Gy for an optic nerve glioma.

Numerous case reports and small patient series describing meningiomas that developed after high-dose radiation therapy for primary brain tumors have been published since that initial article appeared. Several authors have summarized their own experience together with reviews of up to 126 cases reported in the literature.35–41 Radiation doses ranged from 22 to 87 Gy in these series. The majority of high-dose RIM patients were irradiated as children, adolescents, or young adults; however, meningiomas secondary to irradiation for primary brain neoplasms in middle-aged and older adults have also been described.37,39,41

Latency periods from irradiation to meningioma detection ranging from 2 years36,42 to 59 years39 and even 63 years43 have been reported, with average latency between 10 and 20 years in most series. Latency is shorter with increasing radiation dose and with younger patient age at irradiation.35,37,40,44 Descriptive studies of series of patients who developed RIM, such as those cited above, may understate the true mean latency for tumor development; a more accurate estimate would require a cohort study of a large population of patients irradiated for primary brain tumors, followed for the maximum period possible.

In 2006, Neglia and colleagues23 published a multicenter nested case-control study of new primary CNS neoplasms in 14,361 survivors of childhood cancer, as part of the Childhood Cancer Survivor Study (CCSS). Individuals were eligible for the cohort if they were diagnosed and treated between January 1, 1970 and December 31, 1986; had received a primary diagnosis of leukemia, CNS cancer, Hodgkin lymphoma, non-Hodgkin lymphoma, kidney tumor, neuroblastoma, soft tissue sarcoma, or bone sarcoma; were younger than age 21 at diagnosis; and had survived for at least 5 years. Data analysis for second primary CNS neoplasms closed on December 31, 2001. The study design included analysis of the administration of 28 specific chemotherapeutic agents, surgical procedures, imaging reports, and site-specific dosimetry from radiation therapy. Second primary CNS neoplasms, including 40 gliomas and 66 meningiomas, were diagnosed in 116 case patients of the CCSS cohort; three meningiomas were malignant at first diagnosis. Each case patient was matched, by age at primary cancer, gender, and time since original cancer diagnosis, with four other cohort members who had not developed a CNS neoplasm. New primary CNS tumors were diagnosed from 5 to 28 years after the original primary tumor diagnosis. Gliomas were diagnosed at a median of 9 years after diagnosis of the primary cancer, with 52.5% diagnosed within 5 years of first cancer diagnosis. Meningiomas showed a much longer latency, with a median of 17 years from first cancer diagnosis and 71.2% diagnosed 15 years or more later.23 Follow-up ranged from 15 to 31 years, so no data are available for latency beyond 31 years.

Exposure to therapeutic radiation delivered for treatment of the original cancer was the most important risk factor for occurrence of a secondary CNS neoplasm. Any exposure to radiation therapy was associated with increased risk of glioma (odds ratio [OR]: 6.78; 95% confidence interval [CI]: 1.54–29.7) and meningioma (OR: 9.94; 95% CI: 2.17–45.6).23 The CCSS study is unique among assessments of high-dose RIM because of the large number of case patients, the detailed review of medical records for chemotherapy treatment and radiotherapy dosimetry, the length of follow-up (15–31 years), and the size and structure of the study. Taken together, the findings provide compelling evidence of increased risk for secondary CNS neoplasms, including meningioma, following exposure to therapeutic radiation for treatment of primary cancer during childhood.
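As a reminder of what these figures express, the sketch below computes an odds ratio and its Wald 95% confidence interval from an unmatched 2x2 table. The counts are hypothetical, chosen only for illustration; matched nested case-control analyses such as the CCSS’s are typically fit with conditional regression on the matched sets rather than this simple formula.

import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from an unmatched 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - z * se_log)
    upper = math.exp(math.log(or_) + z * se_log)
    return or_, (lower, upper)

# Hypothetical counts for illustration only. Small unexposed-case counts
# produce the wide confidence intervals seen in the reported estimates:
print(odds_ratio_ci(60, 6, 200, 200))  # OR = 10.0 with a wide CI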

The incidence of high-dose RIM is expected to increase as a larger proportion of patients who receive radiation therapy for primary tumors survive for extended periods.37,44,45 Continued close follow-up is warranted after high-dose cranial irradiation, particularly when it is administered to children.23,37,46,47

RIM After Exposure to Low to Moderate Doses of Radiation

Increased risk of meningioma has been reported in individuals who were irradiated for tinea capitis during childhood,30,31,33,48–54 irradiated for the treatment of skin hemangioma during infancy,22 exposed to radiation after the explosions of the atomic bombs in Hiroshima and Nagasaki,24,25,55–57 and exposed to a series of dental radiographic studies.58–61

RIM After Irradiation for Tinea Capitis

From 1909 to 1960, the international standard for the treatment of tinea capitis was scalp irradiation via the Kienböck–Adamson technique, which was designed to irradiate the entire scalp uniformly through five overlapping treatment fields.62 In phantom dosimetry studies, conscientious use of the technique resulted in radiation doses of 5 to 8 Gy to the scalp, 1.4 to 1.5 Gy to the surface of the brain, and 0.7 Gy to the skull base.63,64 Tinea capitis was epidemic in some areas (Fig. 5-4), and irradiation was considered the standard of care in such situations before the introduction of griseofulvin circa 1960.65–68

The first evidence of negative consequences after treatment using this protocol appeared in 1929 with a report of somnolence lasting 4 to 14 days in 30 of approximately 1100 children (ages 5–12 years) treated for tinea capitis.69 In 1932 and 1935, new reports describing children irradiated for tinea capitis added atrophic and telangiectatic changes in the scalp, epilepsy, hemiparesis, emotional changes, and dilatation of the ventricles to the list of symptoms and complaints.70,71

In 1966, evidence of long-term side effects increased with a report from the New York University Medical Center comparing 1908 patients with tinea capitis treated with irradiation between 1940 and 1958 against 1801 patients who were not irradiated. In the irradiated population there were nine neoplasms (three cases of leukemia and six solid tumors, including two brain tumors), compared with one case of Hodgkin lymphoma in the nonirradiated group.63 A later report on these patients described additional effects, including increased rates of psychiatric hospitalization, long-term electroencephalographic changes, and permanent functional changes to the nervous system.64

Additional reports of RIM from Israel, the United States, Europe, and the former Soviet Union followed.30,31,35,48,49,54,72 In 1983, Soffer and colleagues54 described unique histopathologic characteristics in a series of 42 Israeli RIM patients; these findings are discussed in detail below. From 1936 to 1960, about 20,000 Jewish children age 1 to 15 years (mean 7 years) and an unknown number of non-Jewish children were irradiated in Israel for the treatment of tinea capitis, with an estimated 50,000 additional children irradiated abroad, before immigration (personal communication from S. Sadetzki, April 15, 2008).

In 1974, Modan and colleagues33 published the first results of the Israeli tinea capitis cohort study, showing that irradiation was associated with significantly increased risk of meningioma and other benign and malignant head and neck tumors in this population. The cohort includes 10,834 irradiated individuals and two nonirradiated control groups, one drawn from the general population (n = 10,834) and the second from untreated siblings (n = 5392). The nonexposed groups were matched to the irradiated group by age, gender, country of origin, and immigration period.

Research using the Israeli cohort continues. In 1988, Ron and colleagues52 showed a 9.5-fold increase in meningioma incidence following mean radiation exposure to the brain of 1.5 Gy (range: 1.0–6.0 Gy). More recently, Sadetzki and colleagues26 reported an excess relative risk per Gy (ERR/Gy) of 4.63 (95% CI: 2.43–9.12) for benign meningiomas and 1.98 (95% CI: 0.73–4.69) for malignant brain tumors after a median 40-year follow-up of the cohort. The risk of developing both benign and malignant tumors was positively associated with dose. For meningioma, a linear-quadratic model gave a better fit than the linear model, but the two models were very similar up to 2.6 Gy, a dose that encompasses 95% of observations; the ERR of meningioma reached 18.82 (95% CI: 5.45–32.19) when the level of exposure was greater than 2.6 Gy. While the ERR/Gy for malignant brain tumors decreased with increasing age at irradiation, no trend with age was observed for benign meningiomas.
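The two dose-response forms at issue can be written compactly: the linear model is ERR(d) = beta * d, and the linear-quadratic model is ERR(d) = alpha * d + gamma * d^2. The minimal sketch below uses the reported linear ERR/Gy for benign meningioma; the linear-quadratic coefficients are placeholders, since the study’s fitted alpha and gamma are not given here.

# Minimal sketch of the two excess-relative-risk (ERR) dose-response models
# discussed above. BETA is the fitted ERR/Gy for benign meningioma reported
# by Sadetzki and colleagues; alpha and gamma in the second function are
# illustrative placeholders, not the study's fitted values.

BETA = 4.63    # ERR per Gy (95% CI: 2.43-9.12), benign meningioma

def err_linear(dose_gy, beta=BETA):
    """Linear model: ERR(d) = beta * d."""
    return beta * dose_gy

def err_linear_quadratic(dose_gy, alpha, gamma):
    """Linear-quadratic model: ERR(d) = alpha * d + gamma * d**2."""
    return alpha * dose_gy + gamma * dose_gy ** 2

# Relative risk is RR(d) = 1 + ERR(d). At the ~1.5-Gy mean brain dose
# typical of tinea capitis irradiation, the linear model gives:
dose = 1.5
print(f"RR at {dose} Gy: {1 + err_linear(dose):.2f}")  # about 7.9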

For both malignant and benign brain tumors, risk remained elevated after a latent period of 30 years or more. While the great majority (74.6%) of benign meningiomas were diagnosed 30 years or more after exposure, only 54.8% of malignant brain tumors had a latency over 30 years.26 In a separate descriptive study comparing the demographic and clinical characteristics of 253 RIM and 41 non-RIM cases, Sadetzki and colleagues53 showed that RIM cases had a lower age at diagnosis (P = .0001), a higher prevalence of calvarial tumors (P = .011), higher rates of tumor multiplicity (P < .05), and higher rates of recurrence (the last not statistically significant in this study).

RIM After Irradiation for Skin Hemangioma in Infancy

Karlsson and colleagues22 performed a pooled analysis of the incidence of intracranial tumors in two Swedish cohorts (the Gothenburg and Stockholm cohorts) comprising 26,949 individuals irradiated for the treatment of hemangioma during infancy for whom follow-up data were available. Treatment was administered between 1920 and 1965, with some differences in dates between the two cohorts. In individuals who developed meningiomas, the mean dose to the brain was 0.031 Gy (range 0–2.26 Gy) and the mean age at exposure was 7 months (range 2–30 months). In individuals who developed gliomas, the mean dose to the brain was 1.02 Gy (range 0–10.0 Gy) and the mean age at exposure was 4 months (range 1–15 months). Both cohorts were followed for intracranial tumors reported in the Swedish Tumor Registry from 1958 to 1993, with a mean of 33 years from exposure when the study closed.

There were 83 intracranial tumors in the irradiated group, including 33 gliomas and 20 meningiomas, yielding a standardized incidence ratio (SIR) of 1.43 (95% CI: 1.14–1.78) compared to the general Swedish population. A larger sample would have been required to permit separate calculations of SIR for meningiomas and gliomas. Meningiomas constituted 23% of all tumors, but 43% of tumors among individuals who received 0.10 Gy or more of radiation (P = .005). The authors found a linear dose–response relationship between absorbed dose in the brain and development of an intracranial tumor, and higher risk of tumor in those exposed at earlier ages.
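For readers unfamiliar with the measure, an SIR is simply the observed tumor count divided by the count expected from reference-population rates, with an exact Poisson confidence interval. In the minimal sketch below, the expected count of about 58 is back-calculated from the reported SIR of 1.43 for 83 observed tumors; it is our assumption, not a figure taken from the paper.

# SIR = observed / expected, with an exact Poisson CI via the chi-square
# method. The expected count (~58) is back-calculated from the reported
# SIR of 1.43 and 83 observed tumors; it is an assumption.
from scipy.stats import chi2

def sir_exact_ci(observed, expected, alpha=0.05):
    """Standardized incidence ratio with exact Poisson confidence interval."""
    sir = observed / expected
    lower = chi2.ppf(alpha / 2, 2 * observed) / (2 * expected)
    upper = chi2.ppf(1 - alpha / 2, 2 * (observed + 1)) / (2 * expected)
    return sir, (lower, upper)

print(sir_exact_ci(83, 58.0))  # ~ (1.43, (1.14, 1.78)), matching the report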

RIM in Survivors of the 1945 Atomic Bombing of Hiroshima and Nagasaki

An estimated 120,000 individuals survived atomic bombings in Hiroshima and Nagasaki in 1945, and slightly less than half of this population was still alive in 2000.17

The Life Span Study (LSS),73 an ongoing cohort study, includes data for approximately 86,500 survivors who were within 2.5 km of a hypocenter, as well as a sample of survivors who were 3 to 10 km from ground zero and thus had only negligible exposure to radiation. The LSS continues to serve as a major source of information for experts evaluating health risks from exposure to ionizing radiation. The population is large; was not selected because of disease or occupation; has a long follow-up period (1950 and ongoing); and includes both sexes and all ages at exposure, allowing many comparisons of risks by these factors. Extensive data regarding illness and mortality are available for survivors who remained in Japan. Radiation doses for survivors resulted from whole-body exposure, and individual doses have been reasonably well characterized, based on consideration of differences in the biological effectiveness of the bombs, distance from the hypocenter, and specific location at the time of the explosion. For example, exposure for individuals who were inside a typical Japanese home was reduced by nearly 50%.17

Within 10 years of the bombings, an excess incidence of leukemia was found in survivors and linked to radiation exposure; however, elevated risk for solid tumors, and a direct relationship between solid tumor incidence and proximity to the hypocenter, was shown only in 1960.74 A Japanese review of postmortem studies of primary brain tumors in Nagasaki survivors for the years 1946–1977 found only five cases of meningioma, and the authors concluded that there was no evidence of increased meningioma incidence in this population 32 years after the bombings.75

The first report demonstrating excess risk of meningioma in atomic bomb survivors was published only in 1994, when Shibata and colleagues24 reported a significantly higher meningioma incidence among Nagasaki survivors beginning in 1981, 36 years after the bombings, and increasing annually. This report was confirmed by further research in Nagasaki55 and in Hiroshima.56,57 In 1996, Sadamori and colleagues55 reported meningioma latency ranging from 36 to 47 years in Nagasaki survivors (mean 42.5 years). Shintani and colleagues56 reported a meningioma incidence of 3.0 per 100,000 persons per year in a population that was distant from the hypocenter and thus received very low levels of exposure, versus incidences of 6.3, 7.6, and 20.0 meningiomas per 100,000 persons per year in survivors who had been 1.5 to 2.0, 1.0 to 1.5, and less than 1.0 km from the hypocenter, respectively. Overall incidence in survivors of the bombs, as well as incidence in each of the three groups, differed significantly from incidence in the population with very low exposure. The authors interpreted this gradient as evidence of dose dependence, because dose increased with proximity to the hypocenter.
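As a small illustration of how such rates are computed, the sketch below derives incidence per 100,000 person-years and a crude rate ratio. The case counts and person-year denominators are hypothetical, chosen only to reproduce rates of the magnitude quoted above.

# Incidence rates per 100,000 person-years and a crude rate ratio. All
# counts and denominators below are hypothetical, chosen only to yield
# rates of the magnitude reported by Shintani and colleagues.
PER = 100_000

def incidence_rate(cases, person_years):
    """Crude incidence per 100,000 person-years."""
    return cases / person_years * PER

distant = incidence_rate(6, 200_000)   # ~3.0 per 100,000 person-years
proximal = incidence_rate(12, 60_000)  # ~20.0 per 100,000 person-years
print(f"rate ratio, <1.0 km vs. distant group: {proximal / distant:.1f}")  # ~6.7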

Preston and colleagues25
