Article: 991 of sgi.talk.ratical
From: dave@sgi.com (dave "who can do? ratmandu!" ratcliffe)
Subject: Book Intro: Killing Our Own by Wasserman and Solomon (1982)
Summary: chronicling the factual public health costs in America of nuclear tech
Keywords: secrecy, deceit, classification of the public's medical history
Organization: Silicon Graphics, Inc.
Date: Tue, 2 Mar 1993 15:39:59 GMT
Lines: 2038

All of this has long since convinced me that we cannot trust these people [US officials] and, more important still, that nuclear power is too dangerous to have around. But it is clear that our government is so deeply committed to nuclear weapons and nuclear power that it will ignore damning evidence, deny the truth, mislead our people, jeopardize health and even life itself, and try to blacken the reputation of scientists who disagree with its policies. . . .

I earnestly believe that as soon as there is a definite suspicion of harm from any source as malignant as radiation, it is time to make every effort to eliminate it. I feel particularly strongly about radiation because children are much more vulnerable than adults--not only in regard to the likelihood of developing leukemia and cancer, but also of being born with physical or mental defects. And once mutations have been produced in genes, they will be passed down forever.

-- Dr. Benjamin Spock, from the Introduction


. . . anti-H-bomb scientists were in effect muzzled from openly expressing their viewpoints at critical junctures, held back by security-clearance status. Thus in the crucial months before Truman proclaimed his decision on H-bomb development, the public was allowed little information about a decision that could potentially result in millions of deaths and change the course of human history. . . .

[Edward] Teller went out of his way to tell Bulletin of the Atomic Scientists readers at the time: "The scientist is not responsible for the laws of nature. It is his job to find out how these laws operate. It is the scientist's job to find the ways in which these laws can serve the human will. However, it is not the scientist's job to determine whether a hydrogen bomb should be constructed, whether it should be used, or how it should be used. This responsibility rests with the American people and with their chosen representatives."[150] But in the real world--as Teller well knew--secrecy restrictions prevented the American people from participating in the deliberative process until the basic decisions had already been made at governmental top levels, by men very much like himself.

-- from chapter 2: 300,000 GIs Under the Mushroom Clouds





KILLING OUR OWN

The Disaster of America's Experience with Atomic Radiation


_____________________________________________________

Harvey Wasserman & Norman Solomon
with Robert Alvarez & Eleanor Walters

A Delta Book
1982


This is the introduction to the complete text of the 1982 book, Killing Our Own: The Disaster of America's Experience with Atomic Radiation, by Harvey Wasserman & Norman Solomon with Robert Alvarez & Eleanor Walters, which will follow in 18 parts. This book provides a great deal of information about how U.S. federal authorities consistently conducted all fundamental deliberations about the development of nuclear technology in secret, allowing no public debate or education about the costs to human health, as well as about what many people personally experienced as their own health was compromised and ruined. The authors compiled a great deal of documentation from interviews with people, both military and civilian, who suffered from the effects of exposure to ionizing radiation created by bomb blasts and tests; with workers who handled radioactive isotopes and materials as part of their jobs; and with people who had the misfortune of living downwind of nuclear bomb test sites, reactors, uranium mining, milling, and enrichment plants, and nuclear waste sites. They also obtained documents of government hearings and minutes from closed-door meetings held by the Atomic Energy Commission (AEC) and, after it was scandalized out of existence in 1974, by its successors--the Nuclear Regulatory Commission (NRC) and the Energy Research and Development Administration (ERDA), later transformed into the Department of Energy (DOE)--which continue to play the same game of denial and unaccountability for the adverse health effects caused by the government's pursuit of nuclear technology development.

Published in 1982, Killing Our Own was the first book to chronicle the true costs of nuclear technology to public health in America since 1945. From the time the first Marines landed at Nagasaki and Hiroshima in September of 1945, U.S. government officials both in and out of the military lied about the threat posed by radiation exposure to military personnel--and, most tragically, to the children those men would later try to raise. The two bombs dropped on the civilian cities in Japan are the first two entries in the official list of "Announced United States Nuclear Tests" maintained by the DOE. But these "tests" didn't just examine how Japanese civilians died and suffered under the detonation of a unique new brand of firepower; they also examined how healthy young men in the U.S. military responded to the kind of exposure they received as "cleanup crews." Tragically, these "experiments" were only the beginning.

Now that the "Communist threat" is no more, we are still being told that we just can't do without thousands of nuclear bombs, and "tests" to guarantee their continued effectiveness--to say nothing of how the commercial nuclear industry is presenting a "new generation" of "safe," "clean," and "reliable" power plant designs to provide "the energy America needs," along with enriched uranium for foreign commercial reactors, constituting a heretofore unheard-of international commerce in man-made radioactive materials, the most potent biological weapon on the planet. The people promoting further use of this technology still benefit from the secrecy and lies that have been the consistent "trademark" of the nuclear industry since its inception during World War II. If the costs of such technology are ever mentioned, they are confined to economic ones; the costs to human health are virtually never articulated by the press. Killing Our Own does a phenomenal job of elucidating the most fundamental costs of all, both to ourselves and to all our descendants for all time.

I have elected to include selected excerpts from the text for the remainder of this "introduction" so that those who may not be inclined to read the complete book can nevertheless gain a better understanding of the underside of this unaccountable and lethal technology:

It is my hope that the book itself will be a useful electronic tool and reference to researchers and activists alike. This out-of-print book is reprinted with the express permission of the authors who own the rights to it.

As always, for those interested, a PostScript version of this book is also available to generate a "prettified," "page-defined" hardcopy form. (See the section at the bottom of the entry point for a downloadable PostScript copy of this book.)

-- ratitor

. . . the number of children and grandchildren with cancer in their bones, with leukemia in their blood, or with poison in their lungs might seem statistically small to some, in comparison with natural health hazards, but this is not a natural health hazard--and it is not a statistical issue. The loss of even one human life, or the malformation of even one baby--who may be born long after we are gone--should be of concern to us all. Our children and grandchildren are not merely statistics toward which we can be indifferent.

-- President Kennedy, June, 1963




from chapter 1: The First Atomic Veterans


A Hollow Triumph

Entering Nagasaki six weeks after the nuclear bombing, about one thousand Marines and a smaller detachment of Navy Seabees were billeted in the demolished core area around the blast center. Assigned cleanup duties, they arrived as U.S. military-command press releases announced that scientists had found no lingering radiation worth worrying about in Nagasaki. Two weeks later, in less extensive operations, U.S. Army troops moved into the Hiroshima area.[19]

What they endured in ensuing decades closely resembles the ordeals of a wide range of American radiation victims, consistently ignored and denied at every turn by the very institutions responsible for causing their problems. Accorded no place in official histories, many of these U.S. veterans suffered privately, with debilitating and often rare health afflictions as they reached middle age. Some developed terminal illnesses affecting bone marrow and blood production--the kind of biological problems long associated with radiation exposure. Others found that at unusually early ages they were plagued by heart attacks, severe lung difficulties, pain in their bones and joints, chronic fatigue, and odd skin disorders.

A Legacy Comes Home

In the autumn of 1978 Lyman Quigley received visitors at his house in northeast Portland. Pain-racked but determined, he sat next to a kitchen table piled high with correspondence from the Defense Department, Veterans Administration, and nongovernmental scientists. Thirty-three years after going ashore in Nagasaki, for Quigley, atomic and personal histories had become inextricably meshed.

He was a quintessential American man, raised in the Depression era, proud of his military service. His political views were mainstream; his favorite magazine, Reader's Digest. What set him apart was his belief that an unreported part of history had been telescoped into his own body, his organs and cells--and, he feared, perhaps into the genetic heritage passed on to his children, Ron and Linda, now in their twenties.

Government Response

On Capitol Hill, few members of Congress were willing to step forward. When Junior Hodge, for instance, sought help from his representative, Al Gore, Jr., the ex-Marine veteran of Nagasaki bulldozer assignments got no help as he lay ailing in eastern Tennessee. An aide to Congressman Gore noted that the Tennessee Valley Authority's nuclear power plants carry enormous political clout back home. "I know nuclear weapons fallout isn't exactly the same thing," the aide told us, "but it's close enough to nuclear power that we'd rather stay away from it publicly."[70]





from chapter 2: 300,000 GIs Under the Mushroom Clouds


Living with Nuclear Weapons

Supporters of civilian nuclear control soon began to realize they had won a hollow victory. The AEC was effectively interwoven with U.S. military authority--which was, after all, the prime user of the atom.[121]

Those eager for nuclear proliferation American-style found that in many respects they could enjoy the best of both worlds: the appearance of civilian control, with the military still calling the shots.[122] In the face of Pentagon expertise and clout, the legislative branch quickly accepted a junior role in nuclear matters. When the 1950s began, members of the congressional Joint Committee on Atomic Energy still were not privy to the number of bombs in the U.S. nuclear stockpile.[123]

The American military, meanwhile, rapidly became the primary source of funds for scientists in numerous fields. And those who paid the pipers composed the tunes. By autumn 1946 the trend was becoming painfully obvious to many atomic scientists, including Philip Morrison. Speaking at an annual public-affairs forum sponsored by the New York Herald Tribune, Morrison commented on this evolving relationship: "At the last Berkeley meeting of the American Physical Society just half the delivered papers . . . were `supported in whole or in part' by one of the [Armed] Services . . . some schools derive 90 percent of their research support from Navy funds . . . the Navy contracts are catholic. . . . The now amicable contracts will tighten up and the fine print will start to contain talk about results and specific weapon problems. And science itself will have been bought by war on the installment plan.

"The physicist knows the situation is a wrong and dangerous one. He is impelled to go along because he really needs the money."[124]

The nation's major universities grew steadily entangled in the atomic funding net. In spring 1947 prime academic institutional involvement came from the University of California--operating Los Alamos in New Mexico and the Radiation Laboratory in Berkeley--and from the University of Chicago, main operator of the Argonne National Laboratory along with dozens of other colleges acting as copartners. By the end of the decade scores more large universities were under large atomic contracts from the government.

Less than seven months after the AEC came into existence, President Truman issued a "loyalty order" authorizing police investigations into the moral fiber and political fidelity of federal employees.[125] Atomic researchers with government grants were also subject to such inquiries. Robert Jungk characterized the results as an "unhealthy climate of suspicion, accusations and time-wasting defense against false charges."[126]

"From 1947 on," he added, "the atmosphere in which the Western scientists lived became more and more oppressive every year." Throughout the U.S., England, and France scientists faced "loyalty committees," firings, interference with international travel, and general harassment--so that "in the laboratories of the Western world people started whispering to one another, anxiously on the watch for the State's long ears, as had hitherto been the case only in totalitarian countries."[127]

The fear ran from the lowest lab intern to the most esteemed scientific pioneer. Attending the University of California, physics student Theodore Taylor and a few other pupils devised a proposal for a general strike by American physicists. They approached J. Robert Oppenheimer, then at the height of his considerable national power in nuclear policy circles. Taylor always remembered Oppenheimer's words. After he read over the written proposal, Oppenheimer said, "Take this paper. Burn it. Never recall it. Anyone who knew of this would label you a Communist and you would have no end of trouble the rest of your life."[128]

The H-Bomb

Albert Einstein was among those in 1950 who viewed current events with trepidation. Within the U.S. he warned of "concentration of tremendous financial power in the hands of the military, militarization of the youth, close supervision of the loyalty of the citizens, in particular, of the civil servants by a police force growing more conspicuous every day. Intimidation of people of independent political thinking. Indoctrination of the public by radio, press, school. Growing restriction of the range of public information under the pressure of military secrecy."[143]

It was in this atmosphere that deliberations over whether to proceed with H-bomb research reached their climax. That secretive process is important to understand "because it is one of the relatively few cases where those who explicitly tried to moderate the nuclear arms race came within shouting distance of doing so," according to Herbert York, the first director of the Lawrence Livermore Laboratory where much of the hydrogen bomb R and D subsequently took place. Behind the scenes there was, in York's words, "a brief, intense, highly secret debate."[144]

Under federal law a key source of recommendations for the Atomic Energy Commission was its General Advisory Committee. Called upon by the AEC to take up the question of prospective H-bomb development, the Advisory Committee--chaired by J. Robert Oppenheimer and including such luminaries of nuclear physics as Enrico Fermi and I. I. Rabi--met in late October 1949. While urging continued efforts to magnify the power of atomic weaponry, the Advisory Committee urged that the United States not plunge ahead with developing the H-bomb, also known as the "super bomb."[145]

The panel presented arguments in terms of military strategies, technical aspects, and optimum use of present nuclear resources, concluding that the H-bomb was not needed for U.S. national security. The report also depicted the H-bomb choice as a profound moral issue: "It is clear that the use of this weapon would bring about the destruction of innumerable human lives; it is not a weapon which can be used exclusively for the destruction of material installations of military or semi-military purposes. Its use therefore carries much further than the atomic bomb itself the policy of exterminating civilian populations."[146]

An addendum to the Advisory Committee report, written by James B. Conant--later president of Harvard University--and signed by five other committee members including Oppenheimer, underscored the moral moment of the H-bomb decision: "Let it be clearly realized that this is a super weapon; it is in a totally different category from an atomic bomb. . . . Its use would involve a decision to slaughter a vast number of civilians. We are alarmed as to the possible global effects of the radioactivity generated by the explosion of a few super bombs of conceivable magnitude. If super bombs will work at all, there is no inherent limit on the destructive power that may be attained with them. Therefore, a super bomb might become a weapon of genocide."[147]

These and other anti-H-bomb scientists were in effect muzzled from openly expressing their viewpoints at critical junctures, held back by security-clearance status. Thus in the crucial months before Truman proclaimed his decision on H-bomb development, the public was allowed little information about a decision that could potentially result in millions of deaths and change the course of human history.

In top-secret circles the debate was fierce. Senator Brien McMahon, chairman of the Joint Committee on Atomic Energy, confided in Edward Teller that the anti-H-bomb Advisory Committee report "just makes me sick."[148] For their part McMahon and a constellation of atomic scientists, including Teller and University of California Radiation Laboratory director Ernest Lawrence, were determined to bring about development of the H-bomb as soon as possible, believing it to be the best possible response to Soviet possession of the atom bomb.[149]

Teller went out of his way to tell Bulletin of the Atomic Scientists readers at the time: "The scientist is not responsible for the laws of nature. It is his job to find out how these laws operate. It is the scientist's job to find the ways in which these laws can serve the human will. However, it is not the scientist's job to determine whether a hydrogen bomb should be constructed, whether it should be used, or how it should be used. This responsibility rests with the American people and with their chosen representatives."[150] But in the real world--as Teller well knew--secrecy restrictions prevented the American people from participating in the deliberative process until the basic decisions had already been made at governmental top levels, by men very much like himself.

Atomic Escalation

The 180 American atmospheric nuclear bomb detonations between 1950 and 1960 carried with them great political power. Senators Millard Tydings and Glen Taylor were object lessons.

Tydings, an aristocratically mannered parliamentarian from Maryland, was chairman of the Senate Armed Services Committee. Taylor had been elected to the Senate from Idaho after a barnstorming career as a Western vaudevillian earned him the sobriquet "the handsome cowboy singer." Both men had become vocal foes of unbridled nuclear weapons development and indiscriminate disloyalty charges against dissenters from the cold war.[158] And, in 1950, both Tydings and Taylor were up for reelection.

At the same time Senator Joseph McCarthy was in the midst of launching to new depths his crusade to depict a wide array of citizens and organizations as un-American and pro-Communist--a drive that was to put the word McCarthyism into the political lexicon as a synonym for unsubstantiated, scurrilous smear tactics. Only ten days after Truman's directive favoring the H-bomb, McCarthy delivered a famous speech in Wheeling, West Virginia, claiming that there were many Communists in the U.S. State Department. McCarthy's witch-hunting star was on the rise, with nuclear weapons enthusiasm and anti-Communist hysteria dovetailing nicely for him and his backers.[159]

But, in 1950, Senator Millard Tydings unrepentantly advocated comprehensive disarmament talks to halt and reverse the nuclear arms race. He was one of McCarthy's prime targets. That autumn, running for reelection, Tydings went down to defeat in a campaign filled with charges that he had amiable relations with Communists and was not in favor of vigorously combating reds.[160]

Glen Taylor, elected to the Senate in 1944, was given to committing serious breaches of contemporary political etiquette. In 1948 Taylor ran as the vice-presidential candidate on the Progressive Party's national ticket headed by Henry Wallace. Taylor's decision to run for vice-president came after a meeting with Truman, who expressed views favoring military confrontation with the Soviet Union--an approach that Taylor found appalling in the atomic age. The Progressive Party involvement clearly jeopardized Taylor's Senate career, and even his future ability to support his children and send them through school. "Well hell, honey, if there's an atomic war, it won't matter none if the kids are educated or not," Taylor told his wife.[161]

During his unsuccessful campaign for reelection to the Senate in 1950 Taylor was called to account for his staunch opposition to nuclear boosterism; he was branded disloyal and worse. The sort of conduct that had made him a target was epitomized in a Senate debate two days after Truman's announcement that the U.S. was going ahead with the H-bomb.

"I feel that we have handicaps to overcome," Taylor told the Senate. "The fact that the evil influence of Dillon, Read & Co. was largely responsible for shaping our foreign policy and creating mistrust in many areas of the world, has placed us at a disadvantage."[162] Taylor had committed a severe indiscretion.[163] He had raised the issue of corporate control over U.S. nuclear policies.


The leading Wall Street banking firm of Dillon, Read & Co. was, in fact, well represented in the top echelons of the federal administration that brought the nuclear industry over the billion-dollar-a-year mark in 1950. Truman's secretary of defense, James V. Forrestal, was formerly president of Dillon, Read & Co.; William H. Draper, a high-ranking executive of the same firm, became undersecretary of defense.[164]

Truman's appointee as the AEC's research director, Dr. James B. Fisk, was a former executive of Bell Telephone Laboratories. The AEC commissioners included Sumner Pike, who had been a Republican member of the Securities and Exchange Commission, and Lewis Strauss--a rear admiral and New York banker.[165]

To astute financiers the late 1940s signaled prospects for huge profits to be made from nuclear investments.[166] Fairchild, General Electric, and Monsanto Chemical were taking the lead in postwar corporate nuclear involvements.[167] By the start of 1949 the list of postwar corporate investors had lengthened to include such major companies as Du Pont, Westinghouse, Standard Oil Development Co., Union Carbide, Kellex Corp., Blaw-Knox, and Dow Chemical.[168] A cornucopia of government contracts was anticipated.

"ATOM BECOMES BIG BUSINESS AT BILLION DOLLARS A YEAR," blared a 1950 headline in U.S. News & World Report. "All across the country, research installations and industrial projects are to be built or expanded as part of the rapid growth of the atom into a big business. Hydrogen-bomb development will be fitted into this pattern."[169]

There was talk, too, of developing nuclear power for electricity--a prospect that would evolve into the "Atoms for Peace" program a few years later. More certain to investors as the 1950s began, however, was the lure of nuclear weaponry.[170]

"To What Extent Can We Trust Ourselves?"

With the twentieth century at its midpoint the United States geared up for a quantum leap in the magnitude and frequency of atomic bomb tests. Wrapped in the flag, the testing package grew bigger, costlier, and deadlier.

Even before the first of hundreds of U.S. nuclear test explosions took place in the 1950s, some nuclear scholars warned about the biological implications of large-scale atomic blasts. One of the first was Hans Bethe, a Nobel laureate credited with discovering energy mechanisms present within the sun--knowledge that proved integral to H-bomb development.

Bethe had served as director of theoretical physics at the Los Alamos laboratory during World War II. A professor at Cornell University, he and eleven other prominent physicists expressed deep concern about the H-bomb in a public statement issued at a Columbia University meeting of the American Physical Society, a few days after Truman's directive approving the new weapon.[171]

In late February 1950 Bethe appeared on an NBC radio round-table discussion that provoked national controversy. When the moderator raised the question of radiation dangers from thermonuclear weapons, Bethe responded: "You are certainly right when you emphasize the radioactivity. In the H-bomb, neutrons are produced in large numbers. These neutrons will go into the air; and in the air they will make radioactive Carbon-14, which is well known to science. This isotope of carbon has a life of 5,000 years. So if H-bombs are exploded in some number, then the air will be poisoned by this Carbon-14 for 5,000 years. It may well be that the number of H-bombs will be so large that this will make life impossible."[172]

Another panelist on the NBC program was Leo Szilard, a University of Chicago professor of biophysics who had been influential in getting the U.S. to embark on atomic development for military purposes at the start of World War II. A physics pioneer whose work on uranium's neutron emissions had made it possible to sustain chain reactions, Szilard posed a profound overview for the national radio audience to ponder. Said Szilard:

In 1939 when we tried to persuade the Government to take up the development of atomic energy, American public opinion was undivided on the issue that it is morally wrong and reprehensible to bomb cities and to kill women and children. During the war, almost imperceptibly, we started to use giant gasoline bombs against Japan, killing millions of women and children; finally we used the A-bomb. I believe there is a general uneasiness among the scientists. It is easy for them to agree that we cannot trust Russia, but they also ask themselves: To what extent can we trust ourselves?[173]

Such talk from impeccably credentialed individuals, if widely disseminated, could have been a roadblock to the nuclear weapons testing program. David E. Lilienthal, who had just retired from his post as chairman of the Atomic Energy Commission, promptly denounced the scientists who had appeared on the NBC round-table radio show as "oracles of annihilation." Lilienthal, speaking at a Town Hall forum in New York City, warned that the "new cult of doom" was liable to bring about "hopelessness and helplessness. . . . And hopelessness and helplessness are the very opposite of what we need. These are emotions that play right into the hands of destructive Communist forces."[174]

If physicists of Bethe's and Szilard's stature could be taken to task for warning the public about perils of radiation, less secure critics had better watch their step. Those running the nuclear machinery were anxious to make clear that they would employ derision and innuendo to fight anyone opposing atomic proliferation. Such pressure would be felt for decades to follow as scientists attempted to investigate the full implications of radiation effects on human health.

Dr. Szilard's unpleasant question, however, would prove prophetic for many thousands of Americans whose lives were forever altered by the mushroom clouds that followed his broadcast words: To what extent can we trust ourselves?





from chapter 3: Bringing the Bombs Home


AEC Denials

As early as 1959 a study disclosed higher radioactive strontium 90 levels in young children living downwind of the atomic tests.[37] In 1965 another suppressed study--this one by U.S. Public Health Service researcher Dr. Edward Weiss--correlated radioactive fallout with an inordinately high leukemia rate among downwind Utah residents. Weiss's report concluded: "An examination of leukemia death records in southwestern Utah" during the years of heavy fallout "shows an apparently excessive number of deaths."[38]

A joint AEC-White House meeting about the Weiss report took place in early September 1965; AEC representatives criticized the study. A week later the AEC's assistant general manager told AEC commissioners that researching such topics as downwind leukemia rates would "pose potential problems to the commission: adverse public relations, lawsuits and jeopardizing the programs of the Nevada Test Site."[39] Although atmospheric testing had been banned by then, underground tests were still releasing radioactivity into the air. And the AEC was gearing up for the civilian nuclear power program, predicated on the contention that low levels of officially permitted radiation were harmless.

The White House shelved the Weiss report in 1965, and blocked any follow-up research.[40] In fact there were many nuclear-testing-related documents and AEC meeting minutes that remained secret until 1979, when they were made public by journalists or Senator Edward Kennedy.[41] For the Weiss study that meant staying locked up in federal vaults for a full thirteen years.[42]

In 1979, however, University of Utah epidemiology director Dr. Joseph L. Lyon independently confirmed the validity of the Weiss report. In an article published in the New England Journal of Medicine, Dr. Lyon and associates documented that children growing up in southern Utah during the aboveground atomic weapons tests suffered a leukemia rate two and a half times higher than for children before the testing began and after it ended.[43]

Fallout on Livestock

But the AEC commissioners proved more concerned with publicity than health problems of either sheep or humans.[115] At a July 7 meeting Commissioner Henry Smyth observed that public concern could be allayed by comparing bomb fallout "to radiation incurred in the normal medical use of X-rays."[116] It was a public-relations angle that proved to be a favorite for the AEC, the Nuclear Regulatory Commission, and utilities operating nuclear power plants across the nation in future decades.

But the analogy--comparing X rays with radioactivity from nuclear fission--is highly misleading. An atomic bomb, or a nuclear reactor, produces radioactive alpha and beta particles that can be deadly if inhaled or swallowed even in minute quantities; the alpha and beta "internal emitters" are not present in the penetrating X rays used for medical purposes. The comparison with X rays also falsely assumes that bomb fallout or emissions from nuclear plants are evenly distributed in the population. A number of factors--including weather conditions and radioactive contamination of the ecological food chain[117]--can subject some animals or people to higher amounts of radioactivity.


In his role as county agricultural agent in southwest Utah, Dr. Brower accompanied sheep rancher Doug Clark to talk with federal administrators. "Doug raised some questions with the team of scientists, one of whom was a colonel," Dr. Brower remembered many years later. The colonel "seemed to be the leading spokesman to kind of press this issue that it couldn't have been radiation. Doug asked him some fairly technical questions about the effects of radiation on internal organs that he'd gotten from other veterinarians."[122]

In response the colonel called Doug Clark a "dumb sheepman" and told him he was "stupid--he couldn't understand the answer if it was given to him, and for just 10 or 15 minutes, just kind of berated him rather than answer the question."[123]

A week after the Atomic Energy Commission's unequivocal public denial that sheep had been harmed by atomic test fallout, AEC officials faced angry livestock owners in a conference room of the Cedar City firehouse. The January 13, 1954, meeting included a dozen or so federal officials and a roughly equal number of area livestock owners.

"We know that practically all the sheep that range in that area had these effects," said a local rancher. "We fed these sheep corn and tried to keep them up. I couldn't keep my sheep up where they were able to raise a lamb. I had never seen it before."[124]

"We would like to have an answer for you," responded AEC biological medicine chief Dr. Paul Pearson. "We don't have any explanation for it. There have been instances of disease coming in that caused different effects, we don't know what happened."[125]

"There is very little protein in corn and they could be low in protein," interjected Leo K. Bustad, a General Electric Company envoy from the AEC-controlled Hanford Nuclear Reservation, prime production center for weapons-grade plutonium. "How was their flesh?"[126]

Refusing to be drawn into a discussion about his sheep's flesh with the GE representative, the rancher said that his sheep got all the protein they needed from grazing. "Range is white sage and black sage. . . . Sage is very high in protein."[127]

And so it went. "The body dose radiation that these sheep got is around five roentgens," explained GE's Bustad midway through the meeting. "You can get more roentgens from a fluoroscope or an X-ray machine than these sheep got through body radiation." Bustad failed to note that the sheep ingested radioactive particles into their bodies, which does not occur during an X ray. Nor did he mention that five roentgens is a hazardous dose in either case.[128]





from chapter 4: Test Fallout, Political Fallout


The Lucky Dragon

Reflecting on the Lucky Dragon crew members three years after their encounter with radioactive fallout, Lapp observed: "The true striking power of the atom was revealed on the decks of the Lucky Dragon. When men a hundred miles from an explosion can be killed by the silent touch of the bomb, the world suddenly becomes too small a sphere for men to clutch the atom."[49]

But, in the midst of the controversy over the H-bomb test effects in spring 1954, AEC Chairman Strauss assured the American public there would be no significant impacts on the continental U.S. The "small increase" in radiation, he said, was "far below the levels which could be harmful in any way to human beings, animals and crops."[50]

The AEC chief's pronouncement provoked disbelief among independent scientists. Particularly disturbed was Dr. A. H. Sturtevant, chairman of the genetics department at the California Institute of Technology. In an address to the Pacific division of the American Association for the Advancement of Science, Sturtevant declared there was "no possible escape from the conclusion that bombs already exploded will ultimately produce numerous defective individuals." He further stated that an estimated "1,800 deleterious mutations" had already resulted from fallout.[51]

The AEC was stunned that the nuclear weapons testing program was being openly questioned by a prominent scientist like Sturtevant.

By early 1955 the AEC released a written response to Sturtevant's charges. Pointing to a "rather wide range of admissible opinion in this subject," the AEC dismissed the geneticist's assessment.[52] The AEC failed, however, to do any of its own calculations of genetic mutations--thus ignoring the scientific basis of Sturtevant's conclusions, which were derived from the work of the AEC's own Division of Biology and Medicine.

Continuing Tests in Nevada

Senator Anderson was in the midst of a personal feud with AEC chairman Lewis L. Strauss.[55] As head of the congressional Joint Committee on Atomic Energy, Anderson could cause trouble. "I do not advocate taking any real risk with public health and safety," the senator said. But his message was clear: If the AEC was willing to let weather interrupt testing schedules at the Nevada Test Site, then the tests might be banished to the far-flung Pacific.[56]

AEC commissioner Willard F. Libby fumed that confining tests to the Pacific would "set the weapons program back a lot."[57] But disregarding weather conditions in Nevada would bring more fallout to the St. George area--"which they apparently always plaster," in the words of AEC Chairman Strauss.[58]

"I have forgotten the number of people at St. George," Strauss said. Informed that forty-five hundred people were living in the town, Strauss ruminated, "So you can't evacuate them."[59]

"St. George is hypertensified. It is not a question of health or safety with St. George, but a question of public relations," commented AEC fallout expert Dr. John C. Bugher. "You remember the uproar at St. George last series." After that experience, Dr. Bugher recollected, "We regarded southern Utah as a forbidden zone for future fallout in this series."[60]

But the AEC decided that the people of Utah were less important than the atomic testing schedule. Former Rear Admiral Strauss, into his second year as chairman, concurred with a suggestion by commissioner Thomas Murray to "get on with the test."[61]

"I don't think we can change them at this stage of the game," said Strauss, referring to Nevada testing criteria.[62]

A forty-three-kiloton blast, code-named Turk, proceeded as planned at the Nevada Test Site. So did ten more blasts in the Teapot series, totaling 114 more kilotons.

At an AEC meeting midway through Operation Teapot spirits seemed to have improved. "People have got to learn to live with the facts of life, and part of the facts of life are fallout," Commissioner Libby said.[63]

"It is certainly all right they say if you don't live next door to it," responded Chairman Strauss.[64]

"Or live under it," chimed in K. D. Nichols.[65]

Vowed Commissioner Murray: "We must not let anything interfere with this series of tests--nothing."[66]

Cancer, Genetics, and Fallout

In 1960, fifteen years after the first nuclear test, the AEC finally established a Fallout Studies Branch. Harold Knapp was working in the AEC general manager's office at the time. When he was asked to join the Fallout Studies Branch in 1962, his first task was to review the AEC's rebuttal to a series of criticisms by Ralph Lapp. Knapp found that the rejoinder, written by the prestigious General Advisory Committee of the AEC, "didn't answer anything" and was a "wholly inadequate response."[134] In particular, Knapp found that the issue of radioactive "hot spots" raised by Lapp deserved further exploration.

AEC officials were continuing to assume uniform distribution of fallout--a woefully inaccurate assumption that ignored variations in fallout patterns owing to weather conditions and other factors. "For three months I held them off on a daily basis," while working to come up with a better response, Knapp recollected in a 1981 interview.[135] He found evidence supporting Lapp's claims about hot spots. His paper, sent to the Joint Committee on Atomic Energy, elicited praise for its candor.

Knapp decided to make a systematic and detailed analysis of the problem of fallout by first looking at radioactive iodine. To his surprise "no systematic approach to the study of fallout had been done before." The monitoring data were "spotty," and evidently there was no real consistent approach to the collection of radiation samples.

"They had inadequate measuring techniques. It takes four days for the radioiodine to build up to a maximum in milk. Within two weeks everything is gone. Either they would analyze the sample too soon or wait too long."[136]

In examining milk data for the 1953 tests, Knapp discovered, "by pot luck someone was measuring the right thing at the right time" for St. George, Utah. Knapp estimated that during the 1950s the dose to the thyroid from iodine 131 in cow's milk was ten times the Federal Radiation Council standards.[137]

Knapp's report was sent upstairs to Charles Dunham, director of the AEC's Division of Biology and Medicine. It was immediately classified.[138] Dunham sent the paper to Gordon Dunning, AEC deputy director for operational safety, who suggested that a special AEC committee, composed of "qualified scientists with specialized backgrounds,"[139] be established to comment on the report.

Four of the five reviewers commented favorably on Knapp's paper and urged its release. The only unfavorable review came from the Nevada Test Site's off-site radiological safety officer, Oliver R. Placak.[140] Over Dunning's objections, the AEC assistant general manager for research, Spofford English, reluctantly okayed release of the Knapp report.

The basic point of Knapp's research was that after more than ten years of atomic weapons testing at the Nevada site, the AEC had never actually bothered to methodically assess the impact of fallout on people living nearby. The Knapp report, issued in early 1963, warned that "At the Nevada Test Site, over 1,000 kilotons equivalent of Iodine-131 were released before we obtained any reliable data on Iodine-131 in milk in off-site communities following deposition from specific shots." The amount was more than five thousand times as much as had been released in the 1957 accident at the British reactor at Windscale, which caused a national emergency to be declared because of milk contamination.[141]

The broad outlines of the fallout disaster came into focus even while atmospheric nuclear testing persisted. Two decades later Robert Minogue, research director for the Nuclear Regulatory Commission, told us: "High AEC officials knew very well the biological effects of low-level radiation in the 1950s. They can't use ignorance as an excuse."[142] But, as grim evidence mounted, the nuclear policymakers tried to keep the truth from the public.





from chapter 5: Continued Testing: Tragic Repetitions


American military officers, briefed by the Armed Forces Special Weapons Project during the late 1940s, were warned that underwater nuclear tests entailed special risks. The secret handbook used in the course cautioned that radioactive mist from an underwater nuclear blast could be expected to spray "serious contamination over a large area."[4]

On pages marked "RESTRICTED" the government's own experts elaborated on the dangers. Dr. Herbert Scoville, Jr., who later became deputy director of the Central Intelligence Agency, wrote: "In an underwater detonation the nuclear radiation effects are quite different from those resulting from an air burst and are of considerably greater magnitude." Scoville recalled that the only underwater nuclear test up until that time, in the lagoon at Bikini, had left enormous quantities of radioactivity--"estimated to be equivalent to thousands of tons of radium shortly after the detonation. This is a billion times the radioactivity from a gram of radium. Such is the truly fantastic radioactivity associated with an atomic bomb detonation."[5]

And, Scoville pointed out, in Bikini's lagoon "intensities above tolerance were measured for almost a week." Even "nontarget vessels" were severely contaminated.[6]

The "Clean" Bomb

After his June 1957 meeting with Teller and other physicists, President Eisenhower shared his enthusiasm with the nation. "What they are working on is . . . the production of clean bombs," Eisenhower proclaimed. "They tell me that already they are producing bombs that have 96 percent less fallout than was the case in our original ones, or what we call dirty bombs, but they go beyond this. They say: `Give us four or five more years to test each step of our development and we will produce an absolutely clean bomb.'" The New York Times headline, for the article conveying the President's statements, revealed one of the significant motives behind the announcement: "EISENHOWER WARY OF ATOMIC TEST BAN."[33]

But promises about the cleanliness of nuclear bombs did not decontaminate the radiation still rising from the Pacific Ocean and Nevada test sites in 1958--a year in which the U.S. exploded seventy-seven nuclear weapons. Even America's major metropolitan areas were not exempt from intensely radioactive fallout clouds. By the end of October 1958, rapid-fire atmospheric nuclear tests in Nevada, plus Russian atomic detonations, had sent radiation readings in Los Angeles to the highest levels ever recorded there. Government officials announced that the fallout on Los Angeles was "harmless." Yet privately the National Advisory Committee on Radiation termed the L.A. radioactivity "an emergency."[34]

Panel members met in secret session on November 10, 1958, to discuss the problem. "If you ever let these numbers get out to the public, you have had it," said Lauriston S. Taylor, head of the Atomic Radiation Physics Division of the National Bureau of Standards.[35]

The average radiation dose in Los Angeles hovered at the maximum levels deemed "permissible" according to federal guidelines--and some citizens received more than that amount. Taylor admitted that references to permissible levels "carry the implication that we know what we are talking about when we set them. But in actual fact, they really represent the best judgment we would exercise now in the total absence of any real knowledge as to whether they are correct or not."[36]

U.S. surgeon general Dr. LeRoy Burney commented, "If I were in Los Angeles, I would consider I was insulted for somebody in the Federal Government . . . to say, `This is nothing to be alarmed about.'"[37]

The huddled government scientists observed that radiation dosages at least as high as those besetting Los Angeles had been found the previous year in Salt Lake City. But twenty years would pass before residents of either city learned about what was said at that closed governmental meeting.[38]

Fallout in New York State

. . . In its early days, Science had strongly questioned the atomic establishment. In 1955 the magazine vigorously attacked Lewis Strauss for scientific suppression; it also published E. B. Lewis's papers opposing the "threshold" concept of radiation safety.

But now the editorship of Science had passed to Philip Abelson, a physicist deeply involved in the government's nuclear program from the Manhattan Project on. Abelson also served on the AEC's General Advisory Committee and on its Project Plowshare Committee, which was promoting "peaceful" uses of nuclear explosives.

Nuclear Experiments

In retrospect there is chilling irony in the atomic bomb's--and the nuclear industry's--origins. Stopping Nazi barbarism provided the initial rationale for the Manhattan Project, which developed the atomic bomb. At the Nuremberg trials some Nazi scientists and other functionaries were charged with grotesque experiments on humans; the Nuremberg judges rejected excuses and rationalizations.

But since then, in the United States, "we have already accepted the policy of experimentation on involuntary human subjects,"[56] concluded Dr. John W. Gofman, a pioneer in radiation research who codiscovered the fissionability of uranium 233 and helped isolate the world's first milligram of plutonium.

"In the mid-'50s--when the toxi[ci]ty of low-dose radiation was still uncertain--we were testing nuclear bombs in the atmosphere and launching the Atoms for Peace Program," Gofman recalled in a 1979 statement. "It should have been clear to me, even then, that both atmospheric bomb-testing and nuclear power constituted experimentation on involuntary human subjects, indeed on all forms of life."[57]

With extraordinarily blunt self-criticism Gofman--a physicist and medical doctor--went on: "I am on record in 1957 as not being worried yet about fallout and still being optimistic about the benefits of nuclear power. There is no way I can justify my failure to help sound an alarm over these activities many years sooner than I did. I feel that at least several hundred scientists trained in the biomedical aspect of atomic energy--myself definitely included--are candidates for Nuremberg-type trials for crimes against humanity through our gross negligence and irresponsibility." And, Gofman added, "Now that we know the hazard of low-dose radiation, the crime is not experimentation--it's murder."[58]

People viewing such an assessment as unfair or excessively strident might find it less so after visiting small towns like St. George, Utah, or Fredonia, Arizona, or Tonopah, Nevada. The pain, for many, has just begun.

Before dawn on January 27, 1981--exactly thirty years after the first mushroom cloud ascended from the Nevada Test Site--lifelong Utah residents gathered at the steps of the state capitol and lit candles in memory of dead relatives and friends. Around the state other memorial candles flickered in the darkness.

At the operations center for the Nevada Test Site daylight brought simply the beginning of another working day. An Associated Press reporter phoned for comment on the candlelight observances downwind. He took notes, and wrote in an article sent across the nation a few hours later: "The Department of Energy maintains there is `no positive evidence' of a link between fallout and the cancer cases, said Dee Jenkins, test site spokeswoman."[59]

We called Dee Jenkins and asked for clarification. Had she been accurately quoted?

Yes, she replied. "There is no positive link between low-level radiation and cancer cases."[60]

We asked whether the downwind residents had received "low-level radiation" exposure during the atmospheric testing years.

"I'm not qualified to answer that question," she responded after a pause.[61] Our request for a clarifying official statement was never answered.

Three decades after the first fallout clouds from Nevada, in some respects not much had really changed at federal agencies making pronouncements about nuclear testing.

More Radiation Clouds

In the late 1960s and beyond, the kind of additional fallout that underground testing critics had labored to prevent did indeed occur--with several subsurface nuclear tests shooting radioactivity across the U.S. and into Canada.

From 1966 to 1975 the federal officer responsible for monitoring of off-site fallout from underground detonations was Colonel Raymond E. Brim, chief of operations for the Air Force Technical Applications Center. . . .

While working for the Air Force, Brim went along with the Pentagon program and held his peace. During the first several years after retirement, however, Colonel Brim mulled the implications of underground testing radiation leaks. On August 1, 1979, he testified at a hearing of the House Subcommittee on Oversight and Investigations.

"There is indisputable evidence on record that shows that the people, not just of Utah and Nevada but of a much wider and more encompassing area of the United States, were unknowingly subjected to fallout of radioactive debris that resulted from ventings of underground and cratering tests conducted at the Nevada Test Site," Brim told the congressional panel. "Because of weather and wind patterns, this debris was frequently carried much farther than has been reported to the public."[96]

Although Brim's testimony came at an open hearing on Capitol Hill, The New York Times, The Washington Post, and the nation's other most influential newspapers did not print a word about it.

More than a year later, in January 1981, Brim declared flatly that "Americans were exposed to dangerous levels of radiation from `safe' underground tests all through the 1960s and 1970s, and remain in danger today." In an article published by The Washington Monthly magazine, Colonel Brim charged: "Just as the risk of fallout continues, so does the conscious government effort to cover up the situation. Department of Energy officials fully understand that underground testing can't fully contain radiation, yet downplay the information or even withhold it from the public. Exactly as they did in the 1950s, officials refuse to reveal information necessary for those who live near radiation accidents to protect themselves."[97]

It was a strong statement from someone who--for nearly ten years--served as the Pentagon's top officer in charge of monitoring leaks from underground nuclear tests. "Today it seems incredible that straight-faced government spokesmen could proclaim that standing downwind of an open-air nuclear explosion was perfectly safe," Brim went on. "It seems equally incredible that people believed the claims. Yet that twin mentality continues to operate, with Washington making what will, in years to come, be considered preposterous claims about the safety of underground tests, and most people nodding their heads in agreement."[98]

The Nevada Test Site's current manager, Mahlon Gates, made a public appearance before a 1979 congressional hearing, ostensibly making a clean breast of past underground test radiation ventings. Colonel Brim observed, however, that Gates's "estimate of the total amount of radiation downwind of a test site in the period from 1951 to 1969 . . . worked out to less than a quarter of the radiation the Public Health Service recorded after a single blast on the same site."[99]

Indicative of the kind of present-day hazards--and governmental deceit--Brim alluded to was the underground nuclear test Baneberry. When it vented on the morning of December 18, 1970, Baneberry sent a mushroom cloud of radioactivity eight thousand feet into the air. Ten years later the U.S. Government's official log of nuclear tests was still claiming that only "minor levels of radioactivity" were detected off-site from the Baneberry explosion.[100]

But Colonel Brim, who was responsible for off-site monitoring during the Baneberry test, has pointed to evidence "that a dangerously high concentration of Iodine-131, a radiation byproduct, was found in the milk of Utah and Nevada cows which had eaten vegetation exposed to Baneberry's fallout. Deer and sheep as far as 400 miles from the test range had abnormal concentrations of iodine in their thyroid glands, and the thyroid of a fetus from one sheep contained five times more iodine than the thyroid of its mother."[101] . . .

Was the Baneberry underground test venting a fluke unlikely to be repeated? The United States Government says yes. But a 1974 confidential U.S. military memo, written by nuclear testing program officer Captain William Gay, says otherwise. Made public through efforts by Senator Edward Kennedy in 1979, Captain Gay's memorandum stated that "on the basis of past experience at NTS [Nevada Test Site], a rather high incidence prevails for a release of radioactivity like Baneberry." The Gay memo added that "the risk is not like one in a million or so low as to be comfortable. Ventings have happened and will probably happen again."[107]

Captain Gay, director for tests in the Atomic Energy Commission's Division of Military Application, also wrote in the memo: "Considering past experience, massive venting can be expected in about one [ratio blanked out by censors] events."[108] Even after the decision was made to declassify the document in 1979, the American people apparently could not be trusted to hear a candid official estimate of the chances for future disastrous ventings of underground nuclear bomb tests.





from chapter 6: The Use and Misuse of Medical X Rays


Mammography and Other Problems

. . . Dr. Rosalie Bertell, a mathematician and an expert in radiation and the causes of cancer, later explained that a basic arithmetical error had been made in the design of the mammography program, which may well have resulted in serious health effects for early participants in the program. Some changes were made after the error was pointed out, she said, but had the program continued as originally planned, it might have caused up to twelve breast cancers for every one it picked up. "A lot of this I blame on the nuclear establishment," she said, "which has gone out of its way to convince everybody that low level radiation is no hazard. The nuclear physicist gives cancer risk per year, whereas health professionals give reproductive lifetime (30 year) or lifetime (70 year) risk. A physician using a physicist's estimates and not noting the timeframe difference will underestimate the risk." The medical profession, she said, was also accepting the weapons industry's word about the magnitude of the risk per year, even when corrected for longer time spans, and was letting nuclear physicists determine which doses of radiation were safe and which were not. Thus, she charged, "the doctors have abdicated responsibility in this area."[46] . . .

Though infants in utero and women have proved extremely sensitive to X rays, the problem is not restricted to them. In the early 1960s one of the largest radiation-related population studies ever done was begun at Johns Hopkins University. Known as the Tri-State Leukemia Survey, the study covered some six million subjects in New York, Maryland, and Minnesota who had undergone diagnostic X rays. By 1972 results of an analysis by Dr. Bross and Nachimuthu Natarajan indicated that children with chronic diseases were also at special risk from low levels of X ray. The study also lent crucial confirmation to the problem of in utero X rays, showing that children of mothers X-rayed during pregnancy suffered 1.5 times the leukemia rate of children of mothers not X-rayed. In certain selected subcategories of children, exposed groups were 5 or even 25 times as likely to develop leukemia as the general population.[52] Dr. Rosalie Bertell, in examining the data, added that "young adults with asthmas, severe allergies, heart disease, diabetes, arthritis and so on, were about 12 times as susceptible to radiation-related leukemia as were healthy adults." She also measured the equivalence in effect between X rays and natural aging. Although the aging acceleration had been recognized as radiation-related, the effect had gone unmeasured. Nor had there been a full accounting for what X rays might be doing to the gene pool. "I think we need to face up not only to the long-term effects on the individual of exposure to radiation," she warned, "but on the long-term effects to the species."[53]

In May 1977 the outspoken Bross coauthored an article in the Journal of the American Medical Association, blaming doctors for excess cancers and increased risks of genetic damage because of misuse of X rays. Within weeks he was notified that federal funding for his work on the Tri-State Survey was being terminated. The National Cancer Institute, which supported the survey for a decade, put two of Bross's best-known opponents on its review committee. Said Bross: "We became the most recent victims of a pattern of censorship and repression that has been going on in the United States ever since the furor over fallout from weapons."[54]

Why So Many X Rays?

Proponents of atomic power and weaponry have long been concerned that indications of harm from small doses of X rays would reflect badly on the viability of atomic reactors and the safety of bomb testing. Dr. Stewart's initial study, for example, was the first major epidemiological indication that low-level radiation could be far more dangerous than the accepted exposure limits implied. In fact, even as late as 1979, during the accident at Three Mile Island, nuclear proponents were arguing that exposure levels from the plant were comparable to a single X ray, and thus safe. But Dr. Stewart's study, and a host of others, had indicated that even a single X ray could have disastrous effects on an infant in utero and on other susceptible members of the community. As Dr. Allan Reiskin, professor of radiology at the University of Connecticut, told a congressional subcommittee in the wake of the accident, "these comparisons are inappropriate because they fail to recognize dramatically different distribution of radiation energies, different dose rates, different types of radiation, and different types of population that are irradiated."[55]

Another reason for an excess of X rays may be that they add to the income of doctors and medical institutions. X-ray equipment is costly, and as the state of the art changes quickly, older but still usable models become obsolete. Doctors who invest thousands of dollars in X-ray machines may well be inclined to use them more than is absolutely necessary in an attempt to recoup their investment. . . .

Perhaps the worst problem resides in the medical malpractice laws. These vary from state to state, but in general they are a strong incentive for doctors to give numerous X rays far in excess of real medical need, in the hope of establishing a record with which to defend themselves in case of a lawsuit. This "defensive medicine" can be carried to extremes. Dr. John McClenahan, a Pennsylvania radiologist, describes the syndrome thus: "If a tennis player suffers elbow pain after a truck scratched the fender of his car, a radiologist will be called on to take pictures of not only the elbow, but of a shoulder . . . a forearm, a neck, chest and, after the diarrhea ensuing as the result of stress imposed by the accident, of the patient's entire gastro-intestinal tract."[70] Though radiologists and doctors may find such treatment excessive, few would risk losing an expensive lawsuit by refusing to use it. A 1973 survey by the Federal Commission on Medical Malpractice found that more than half the doctors polled admitted to engaging in some form of defensive medicine, and four years later an American Medical Association poll found 75 percent of the doctors contacted were ordering extra X rays to protect themselves from lawsuits.[71]





from chapter 7: Nuclear Workers: Radiation on the Job


Death in the Mines

. . . Ventilation systems that had been installed in Czech mines as early as the 1930s, and that were being operated at a relatively low cost in France, were nowhere to be found in the U.S.[33] In fact the National Council on Radiation Protection had recommended mine-worker exposure standards as early as 1941. At that time the Atomic Energy Commission was the sole purchaser of uranium in the U.S. It also operated some of the mines directly. Under federal law it was responsible for working conditions in those mines. And at the end of the 1940s, as the nuclear arms race accelerated demand, the AEC's Office of Raw Materials Operations recommended taking control of exposure levels underground. "Since we were the only customer for the ore," said Dr. Merrill Eisenbud, who was head of that office at that time, "we should see to it that the standards that already existed could be met." Soon after that recommendation was issued, the functions of Eisenbud's office were inexplicably transferred from his department in New York to Washington.

Then, despite the billions of government dollars spent to develop atomic weaponry, the AEC claimed it lacked the funding to enforce mine safety, and turned the job over to the states and the mining companies.[34] The companies did little. And when the states tried to intervene, they were charged with bureaucratic meddling and endangering the national security. One Colorado inspector commented that in the 1950s "anybody that said a thing against uranium mining was suspected of being a communist."[35]

In 1967 Eisenbud helped develop a machine that could identify miners who had already suffered heavy radon exposures, thus aiding them in getting early treatment. The machines were available for use in both Denver and Salt Lake City. But the AEC and the Public Health Service declined to use them, claiming that funds for a testing program were not available. Eisenbud found that "hard to believe . . . because we were talking about a very small amount of money."[36]

And by that time evidence was beginning to pile up that the mines were creating an epidemic of lung cancer. Colorado and other states began to fear a landslide of compensation claims that could cost taxpayers and industry millions. Their fears were substantiated by a PHS study that had begun in 1950, when the service began collecting data on uranium miners and how they were dying. In 1960 the PHS handed the figures to Joseph Wagoner, a recent doctoral graduate of the Harvard School of Public Health. Wagoner told us in an extensive Washington interview that by 1964 "we showed twelve lung cancers in this group where just 2.8 were expected. We then updated the analysis one more year, and showed twenty-two lung cancers where there should have been only 5.7. When we went through 1965 we found thirty-seven lung cancers where there should have been just seven. And through 1978, with that same group, we now show 205 lung cancers where there should have been only forty. In other words there has been a consistent fivefold increase in lung cancer among this group right down the line."[37]
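The excess Wagoner describes can be checked directly from the figures he quotes. A quick sketch (the observed and expected counts are copied from the interview above; the stage labels are ours):

```python
# Observed vs. expected lung-cancer deaths in the PHS uranium-miner cohort,
# as quoted by Joseph Wagoner above. Each ratio is observed / expected.
cohort_counts = [
    ("by 1964", 12, 2.8),
    ("one year later", 22, 5.7),
    ("through 1965", 37, 7.0),
    ("through 1978", 205, 40.0),
]
for label, observed, expected in cohort_counts:
    print(f"{label}: {observed / expected:.1f}x the expected rate")
```

The ratios all fall between roughly four and five, which is the "consistent fivefold increase . . . right down the line" of the quote.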

Still, however, the AEC refused to take responsibility for the enforcement of mine-safety regulations. Backed by the pronuclear Joint Committee on Atomic Energy (JCAE), which had effectively blocked any congressional attempts to regulate the mining industry, the AEC sailed along with little regard for the health of its miners--until 1967. Then, at a stormy JCAE hearing session, Secretary of Labor Willard Wirtz charged that "the best available evidence is that over two-thirds of the approximately 2,500 underground miners are working under conditions which at least triple their prospects for dying from lung cancer if they continue this work and these conditions remain unchanged." Year after year of "debate and discussion" had produced nothing.[38]

The JCAE continued to insist that more study was needed. . . .

The Portsmouth Naval Shipyard

. . . Between 1954 and 1977 a total of sixty-three atomic subs were either built, overhauled, or repaired at Portsmouth. The General Dynamics Corporation operates the yard on government contract, and roughly a third of the 24,525 workers listed as having worked at PNS have been exposed to radiation, among them Adolph Pohopek.[65]

Pohopek told [blood specialist at Boston Veterans' Hospital] Najarian that numerous Portsmouth workers seemed to die unusually young, and that working conditions in the yards were not all they should be. Pohopek then gave Najarian the names of fifty people who had recently worked at Portsmouth. Najarian found that ten of them were already dead, and he asked the VA for funds to do some follow-up research. The VA turned him down, saying exposures at Portsmouth were too low to have caused any of the deaths.[66]

But Najarian persisted. Using his own money for postage and paper, he mailed questionnaires to about forty past and present Portsmouth workers. Within a week the head of the VA's research division in Washington called Najarian, demanding to know who was funding his research and asking for all his correspondence with naval personnel. When Najarian asked that the request be put in writing, he never heard from the VA official again.[67]

When the questionnaires themselves began coming in, they revealed what Najarian considered an alarmingly high rate of leukemia deaths. In mid-November of 1977 Najarian asked The Boston Globe for help. Although the Navy had refused to give Najarian any of its records, he and an investigative team from the Globe were able to gather some seventeen hundred death certificates relating to Portsmouth workers. The Navy also refused to release any worker exposure records. But with the help of statistician Dr. Theodore Colton, Najarian was able to isolate those workers whose families could confirm that they were exposed to radiation at Portsmouth. In June of 1978 Najarian and Colton published a paper in Lancet, indicating a leukemia rate among exposed Portsmouth workers that was four times normal.[68] . . .

Though controversy still rages over the Portsmouth studies, there seems little doubt in the minds of the people working there that something might be seriously wrong. In January of 1979 Dr. John Cobb of the University of Colorado Medical School, a member of the NIOSH advisory panel, visited Portsmouth to evaluate the situation for NIOSH director Dr. Tony Robbins. When he got there, Cobb found "antagonistic" and "explosive" differences between the unions and the Navy over health and safety issues, and that the unions felt "the Navy would lie, cheat and do anything to cover up their deficiencies in management."[73]

Cobb also discovered "that there could be an incentive for workers to keep their recorded radiation exposure lower than actual exposure," and that the Navy would often issue "waivers" to workers to keep them working in radioactive areas even after they had exceeded exposure limits. Cobb said he "was told that workers were led to believe that radiation exposure would not harm them."[74] Because radiation work brought higher pay, employees were reluctant to wear film badges for fear of being put in lower-paying jobs if they "burned out."

Rocky Flats

The case of Dan Karkenan, a college-trained professional who began work at Rocky Flats in 1968, was never resolved. Karkenan was a mechanical engineer who helped in the cleanup and reconstruction of Rocky Flats after a fire on May 11, 1969, seriously contaminated the plant and sent an uncertain amount of plutonium into the areas south of the plant.

By the spring of 1975 Karkenan began showing symptoms of numbness in his fingers and toes, followed by a loss of coordination and then paralysis in his arms and legs. Doctors were unable to diagnose Karkenan's disease, but he and his family were convinced it could be traced to his work during the cleanup after the 1969 fire, when the entire Rocky Flats area was heavily contaminated.[96] Just before Karkenan died in 1976, he asked his wife Miriam to have tissue samples examined as a part of his autopsy--as was later done with Don Gabel. But when she authorized the autopsy, Miriam Karkenan was told by the hospital that permission was required from Rockwell before her husband's tissues could be analyzed for radioactivity. After three months of wrangling with the company, she obtained permission--and was then told by the hospital that the tissues had been discarded. Miriam Karkenan continued to pursue her husband's records from Rockwell International and in late 1979 was sent a "report" ostensibly detailing her husband's exposure history. The document discussed Dan Karkenan's "on-the-job" exposures in 1977, 1978, and 1979--three years after he was already dead.[97]





from chapter 8: Bomb Production at Rocky Flats: Death Downwind


Disaster at Rocky Flats

But plutonium can catch fire spontaneously in air. On the evening of September 11, 1957, some of the "skulls" on the glove box line of Room 180 in Building 771 ignited. The fire was found by two plant production men shortly after 10:00 P.M.

The area was designed to be fireproof. But it was soon a radioactive inferno. Firemen switched on ventilating fans, but that backfired, spreading flames to still more plutonium. They then sprayed carbon dioxide into the area. That also failed. Meanwhile the filters designed to trap plutonium escaping up the stacks caught fire. The shift captain and other observers reported a billowing black cloud pouring some 80 to 160 feet into the air above the 150-foot-high stack of Building 771.

As the crisis intensified, plant officials struggled to find a solution. They knew water would destroy millions of dollars' worth of complex equipment. They also knew the intense heat might flash the water into steam violently enough to cause an explosion and send even more plutonium particles flying toward Denver. But when the carbon dioxide failed, there was no alternative. In the early hours of the morning water began pouring into the blaze. Fortunately it worked. The fire went dead roughly thirteen hours after it began.[12]

The damage was extensive. Initial AEC reports contended that there was "no spread of radioactive contamination of any consequence." Seth Woodruff, manager of the Rocky Flats AEC office, told the local media that "possibly" some radiation had escaped. "But if so," he emphasized, "the spread was so slight it could not immediately be distinguished from radioactive background at the plant."[13]

But--as at Three Mile Island twenty-two years later--there was no reliable equipment operable at the time to monitor the amount of radiation that actually went out the stacks. Not until a week after the fire were working gauges installed. Then, in a single day, emissions registered sixteen thousand times the permissible level--a full fifty years' worth of the allowable quota.

Some fourteen to twenty kilograms of plutonium were estimated to have burned in the fire, enough to make at least two bombs equivalent to the one dropped on Nagasaki.[14] And that may not have been the worst of it. According to a study based on figures from Dow Chemical, which operated Rocky Flats at the time, some thirteen grams of plutonium were routinely deposited daily on the first stage of filters in Building 771. According to government documents obtained in a lawsuit against the plant, the 620 filters in the building's main plenum had not been changed since they were installed four years before the fire. Thus a pair of local researchers theorized that as much as 250 kilograms of airborne plutonium could have gone out the stacks from the burning filters alone.[15]

Such an enormous release of plutonium struck some in the Denver area as beyond plausibility. But a much lower estimate of 48.8 pounds of plutonium--one tenth of the 250-kilogram figure--was calculated as enough to administer each of the 1.4 million people in the Denver environs a radiation dose one million times the maximum permissible lung burden.[16] "I find the high release estimates hard to believe," we were told by Dr. John Cobb of the University of Colorado Medical School. "But even if only one gram of plutonium escaped, as the plant operators say, that would be cause for concern."[17] Nor was plutonium the fire's only by-product. The water used to extinguish it became infused with radioactivity. In this case some thirty thousand gallons of it escaped unfiltered, thus spreading its contamination into local streams and the water table.

Through the whole crisis there had been no warning to local schools, health departments, police, or elected officials that something extraordinary and dangerous was happening at Rocky Flats. There were no backup plans for evacuation, no notification to area farmers or ranchers to safeguard their health or that of their animals.

And though some of the buildings were heavily contaminated, bomb-trigger production was back under way within a few days. Over the next thirteen months, Rocky Flats's operators recorded twenty-one fires, explosions, spills of radioactive material, and contamination incidents inside the plant.[18]

More Fires

A continent and an ocean away, in countryside that could hardly have been less like the flatland at the foot of the Rockies, Britain was also facing a disaster from bomb production. Amid the cold, deep lakes and lush farmlands of the English north country, fire struck the plutonium production reactor at Windscale in early October 1957--less than a month after the first fire at Rocky Flats. Windscale was designed to produce plutonium for bombs. Rocky Flats made such plutonium--once it was chemically processed--into triggers.

On October 7 uranium fuel pellets in the Windscale reactor caught fire. Attempts to quench them failed. . . .

Despite the national emergency that had been proclaimed, British officials told the public it was unlikely "in the highest degree" that anyone had been harmed by the accident.[19] But several months later British officials conceded to a United Nations conference at Geneva that nearly seven hundred curies of cesium and strontium had been released, plus twenty thousand curies of I-131. The admitted iodine dose represented more than fourteen hundred times the quantity American officials later claimed had been released during the 1979 accident at Three Mile Island.[20] . . .

DOE monitoring records kept from 1970 to 1977 indicated that levels of airborne plutonium were higher in the Rocky Flats area than at any of fifty other stations around the U.S. Dust samples downwind showed plutonium concentrations 3,390 times what might be expected from fallout. Evidence also surfaced that the nearby town reservoir had been contaminated.[27]

Constant mishaps at Rocky Flats led to a growing distrust among area residents. As early as 1969, in the wake of the fire that spring, a group of scientists from local industries and universities asked DOE and the AEC to monitor the soil downwind. Their request was refused.

So Dr. Edward Martell, a nuclear chemist working at the National Center for Atmospheric Research, with considerable experience from the bomb-testing era, decided in the fall of 1969 to conduct some tests of his own. His findings confirmed some of the community's worst fears. Abnormal plutonium levels were clearly evident in soil to the east and southeast of the plant.

Martell quickly came under attack from plant supporters. But when the AEC did its own study of downwind soil, it also had to admit to significant contamination. "We find his results are accurate," conceded a ranking military spokesman. "We don't disagree with his new data. As far as measurements, sampling techniques, and knowledge of science, we think Martell is a very competent scientist." The AEC did, however, question Martell's health conclusions. "While it is true," they said, "that some plutonium is escaping from the plant, we don't believe it presents a significant health hazard to Denver."

Dr. Arthur Tamplin--at the time a leading AEC health researcher--strongly disagreed. The Martell study "shows about one trillion pure plutonium oxide particles have escaped from Rocky Flats," he warned. "These are very hot particles. You may only have to inhale 300 of them to double your risk of lung cancer." Tamplin calculated that if plutonium had been spread as Martell suggested, lung-cancer rates in Denver could rise, over time, by 10 percent. An additional two thousand Coloradans could fall victim to Rocky Flats.[28]





from chapter 9: Uranium Milling and the Church Rock Disaster


Church Rock, New Mexico . . . hosts a series of underground uranium mine shafts, a mill, and a scattered community of Navajo families who survive by herding cattle, goats, and sheep.

A deep gully leads from the mine site into the Rio Puerco, which once flowed only when fed by spring rains. Now it is wet year round, bolstered by water pumped from the mine shafts to keep them from flooding. That water flowing from the mine is laced with radioactive isotopes. And the mill's tailings pond hides a burden of contaminated waste.

The 350 families who water livestock in the Rio Puerco rely on their small herds to eke out a meager existence. Many are members of the Dine--Navajo--Nation, with incomes in the range of two thousand dollars per year. During the hot days of the desert summer local children would play in the stream as their parents tended the goats, sheep, and cattle.

A Wall of Radioactive Water

In the early morning hours of July 16, 1979--fourteen weeks after the accident at Three Mile Island--all of that changed. The dam at Church Rock burst, sending eleven hundred tons of radioactive mill wastes and ninety million gallons of contaminated liquid pouring toward Arizona. The wall of water backed up sewers and lifted manhole covers in Gallup, twenty miles downstream, and caught people all along the river unawares. "There were no clouds, but all of a sudden the water came," remembered Herbert Morgan of Manuelito, New Mexico. "I was wondering where it came from. Not for a few days were we told."[1]

No one was killed in the actual flood. But along the way it left residues of radioactive uranium, thorium, radium, and polonium, as well as traces of metals such as cadmium, aluminum, magnesium, manganese, molybdenum, nickel, selenium, sodium, vanadium, zinc, iron, and lead, along with high concentrations of sulfates.[2] The spill degraded the western Rio Puerco as a water source. It carried toxic metals already detectable at least seventy miles downstream.[3] And it raised the specter that uranium mining in the Colorado River Basin may be endangering Arizona's Lake Mead, and with it the drinking water of Las Vegas, Los Angeles, and much of Arizona.

Except for the bomb tests, Church Rock was probably the biggest single release of radioactive poisons on American soil. Ironically it occurred thirty-four years to the day after the first atomic test explosion at Trinity, New Mexico, not far away.

The source of the catastrophe was uranium mill wastes. Usable uranium is extracted from the sandstone in which it is usually found by grinding it fine and leaching it with sulfuric acid. The acid carries off the desired isotopes. But the leftover waste sands--"tailings"--still contain 85 percent of the ore's original radioactivity and 99.9 percent of its original volume. There are now some 140 million tons of them scattered around the West. NRC commissioner Victor Gilinsky and others consider them "the dominant contribution to radiation exposure" of the entire nuclear fuel cycle.[4] The acid milling liquids--called "liquor"--also dissolve dangerous traces of thorium 230, radium 226, lead 210, and other isotopes. Because of their high radioactivity, the tailings and liquor both must be isolated from the environment--but nobody has yet demonstrated a method with any long-term success.

At Church Rock several hundred million gallons of the liquor were being held in a large pond so the liquids could evaporate off and the solid tailings be stored. The whole complex was owned by the United Nuclear Corporation (UNC), a Virginia-based firm with assets in the hundreds of millions of dollars and influence in the New Mexico state government. Its dam and pond at Church Rock were opened with the understanding that they would operate just eighteen months; twenty-five months later, at the time of the accident, no alternative sites were being developed.

Tailings Forever

According to the General Accounting Office (GAO) at least twenty-two uranium mills had shut down in the continental United States by 1978. They left behind some twenty-five million tons of tailings in "unattended piles and ponds" in eight western states plus Pennsylvania and New Jersey. Another sixteen mills were in operation, with an additional 115 million tons on site--bringing the total to 140 million tons. In the early 1980s another six to ten million tons of tailings were being produced per year. Based on high growth estimates, the NRC in 1981 predicted another 109 mills could be operating by the year 2000, producing 470 million more tons of tailings and scores of acid ponds like the one at Church Rock.[42] One estimate from Los Alamos Laboratory put the total far higher, predicting 900 million tons of tailings by the year 2000 in New Mexico alone.[43] Such a total would involve some twenty trillion cubic feet of tailings. . . .

In fact the NRC has attempted to present long-term calculations for New Mexico tailings-gas emission levels in such distant locations as Los Angeles, Chicago, Miami, Washington, D.C., and New York City.[45] NRC staff member Reginald Gotchy told us that despite its short half-life (3.8 days), radon gas from a tailings pile in New Mexico can carry to the East Coast of the United States. On its way contamination would appear "on grain grown in the Midwest" and elsewhere. "This stuff," he said, "goes everywhere." Gotchy hastened to add that he and the NRC consider the doses "minuscule."[46]
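Gotchy's point can be illustrated with simple decay arithmetic. This is only a sketch: the 3.8-day half-life comes from the text, but the transit times below are illustrative assumptions of ours, not figures from the NRC analysis.

```python
# Fraction of radon-222 surviving a given transit time, using the
# 3.8-day half-life quoted above. Transit times are assumed, not NRC data.
RADON_HALF_LIFE_DAYS = 3.8

def fraction_remaining(days: float) -> float:
    """Exponential decay: N/N0 = (1/2) ** (t / half-life)."""
    return 0.5 ** (days / RADON_HALF_LIFE_DAYS)

for days in (1, 4, 7):
    print(f"after {days} days in transit: {fraction_remaining(days):.0%} remains")
```

Even a full week downwind, more than a quarter of the gas has yet to decay--which is why emissions can register far from the pile, whatever one makes of the "minuscule" doses.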

But in 1977 Dr. Chauncy Kepford, a chemist based in State College, Pennsylvania, testified during hearings on the license for Three Mile Island Unit 2 (the reactor involved in the 1979 accident) that the quantity and health effects of radon tailings emissions had been vastly underestimated. Kepford stated that the NRC had failed to account for continued emissions over the full decay chains of the elements involved. Assuming a stable human population and society, he estimated that tailings from the fuel needed to operate TMI-2 for just one year could cause a million cancer cases over time.[47]

In 1978 Dr. William Lochstet of Pennsylvania State University argued that the operation of a single uranium mine could result in 8.5 million deaths over time.[48] And Dr. Robert O. Pohl of Cornell told the NRC that the potential health effects from mill tailings could "completely dwarf" those from the rest of the nuclear fuel cycle and add significantly to the worldwide toll of death and mutations.[49]

The essence of those conclusions was substantiated, surprisingly, from within the Nuclear Regulatory Commission itself. In the fall of 1977 Dr. Walter H. Jordan of the commission's Atomic Safety and Licensing Board wrote an internal memorandum arguing that the NRC "had underestimated radon emissions from tailings piles by a factor of 100,000."

Because of the long half-lives of the isotopes in the solid tailings, radiation will continue to be emitted from the tailings piles for billions of years. Said Jordan: "It is very difficult to argue that deaths to future generations are unimportant."[50]


In the meantime NRC Commissioner Gilinsky has warned that "none of the abandoned sites can be considered to be in satisfactory condition from the long-term standpoint."[53] In fact most of the piles continue to lie exposed to the winds and rain. Residents of Durango, Colorado, have experienced plumes towering thousands of feet in the air, covering cars and houses with radioactive dust. Children have played in the "dunes." The piles were "the biggest, best sandpile in the world," Greta Highland of Durango told the High Country News. "After school my friends would sneak into the mill yard and play in the tailings."[54]





from chapter 10: Tritium in Tucson, Wastes Worldwide


A World of Waste

So far numerous means of disposing of rad-wastes have been tried--all without proven success. In the wake of such failure the official focus has been on downplaying the potential dangers--especially in the case of commercial reactor waste. A standard industry claim has been that the spent fuel from operating a one-thousand-megawatt reactor for a year comes to about two cubic meters. As a public-relations gimmick, various utilities have handed out small plastic pellets said to represent each person's yearly share.

But the comparisons are deceptive.

First, they ignore the fact that mining and milling the fuel for one average reactor for one year will create roughly 180,000 metric tons of uranium mill tailings--of the type that poured out of the Church Rock dam, and that are sitting in piles throughout the West. According to the NRC's Ross Scarano, 1.6 metric tons of tailings occupy a cubic meter of space. The 180,000 metric tons of tailings created to fuel a reactor for a year will thus occupy roughly 100,000 cubic meters of space--a long way from the two cubic meters of "rear-end" wastes advertised by the industry.[38]
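The comparison works out as follows; a minimal check of the arithmetic, using only the figures quoted above:

```python
# Tailings generated to fuel one reactor for one year, per the text above.
tailings_metric_tons = 180_000
tons_per_cubic_meter = 1.6     # NRC (Ross Scarano) density figure
spent_fuel_m3 = 2              # the industry's advertised "rear-end" volume

tailings_m3 = tailings_metric_tons / tons_per_cubic_meter
print(f"tailings volume: {tailings_m3:,.0f} cubic meters")
print(f"versus spent fuel: {tailings_m3 / spent_fuel_m3:,.0f} times larger")
```

At 1.6 tons per cubic meter, 180,000 tons comes to 112,500 cubic meters--the "roughly 100,000" of the text, and more than fifty thousand times the advertised two cubic meters.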

As for those smaller volumes of "rear-end" wastes that come directly from the reactors, they make up in intensity what they may lack in size.

Longtime nuclear advocate Bernard Cohen, of the University of Pittsburgh, argues that those yearly rear-end wastes "would fit comfortably under a dining-room table."[39]

But anyone eating at that table would have a hard time walking away. The heat and radiation generated by spent fuel demand that it be diluted and spaced apart to avoid a chain reaction. Dr. Marvin Resnikoff of the State University of New York at Buffalo estimates those "two cubic meters" would require ten thousand times that much space for safe storage. Any "dining-room table" they'd fit under would need a top the size of a football field and legs ten feet high.[40]

Catastrophe at Kyshtym

In the fall or winter of 1957-1958--within months of the fires at Rocky Flats and Windscale, and while wastes were simmering at Lewiston and Tonawanda and still piling up at INEL, Savannah River, and Hanford--a massive explosion blew apart a radioactive-waste dump in the Ural Mountains of the Soviet Union.

The blast sent huge quantities of radiation into the air. It killed hundreds--possibly thousands--of people. It made permanently unlivable an area at least fifty kilometers square. And it ended forever any possible illusions about the dangers of radioactive waste.

. . . Though explicit mention of the accident was banned, scores of scientists had gone to the Urals to study its aftereffects. One of Kyshtym's great ironies was that despite official secrecy far more will be known to future generations about the radiation damage surrounding it than about either Windscale or Rocky Flats, where official scientific follow-up was virtually nonexistent.

Medvedev knew that his former colleagues had written more than a hundred studies involving lakes and the fish in them, insects, mammals, birds and vegetation that were "somehow" exposed to heavy doses of radiation in late 1957 or early 1958. By identifying the types of plant and animal life, the weather patterns, and other key features of the area, Medvedev pieced together an indisputable portrait of the "vast nothing" created by the catastrophe. . . .

In late 1979 a special report from the Oak Ridge National Laboratory confirmed that a system of fourteen lakes had been contaminated by the Kyshtym blast. About thirty small towns listed on Soviet maps before the accident were gone from contemporary maps.[58] . . .

Kyshtym was "a tragedy of extraordinary dimensions," added Richard Pollock of Ralph Nader's Critical Mass Energy Project, which had sued for the release of the CIA documents. The explosion of nuclear wastes had underscored the dangers of both weapons production and the "peaceful atom." Pollock called for a moratorium on nuclear reactor construction, and asked: "Will U.S. energy policy makers be willing to accept the risk of hundreds of square miles of heavily contaminated cropland or metropolitan areas as the price for electricity? Will we be willing to write off a New York or Chicago or a Seattle or Miami as the Soviets have with cities in their country?"[66]





from chapter 11: The Battle of Shippingport


Dwight Eisenhower stood in the Oval Office of the White House and waved what his press secretaries had dubbed a "neutron wand." The date was May 23, 1958, a year in which the United States would conduct seventy-seven atomic tests, but one that would also see the first tentative test ban agreement. The ceremonial shaft, which had been topped with a futuristic phosphorescent bulb, passed through an electric eye as Eisenhower waved it. The President thus tripped a circuit that fired up America's first commercial atomic reactor--at Shippingport, Pennsylvania, three hundred miles west of the White House.

. . . A new industry had been born.

But America's private utilities were skeptical. With a few exceptions, their generally conservative executives were worried about the dangers of a nuclear accident and the risks of sinking so much capital into an untested technology. It was only with government-insurance guarantees, fuel subsidies, and lavish research-and-development help that commercial atomic power moved ahead. Even at that, private utilities did not become heavily involved until faced with the threat of being squeezed out of business by federal competition in the form of the Tennessee Valley Authority and other government-owned utilities. To this day TVA remains the nation's single largest reactor buyer. As Sam Day, former editor of the Bulletin of the Atomic Scientists, told us: "The private electric companies did not jump into nuclear power. They were kicked in."[3]

Enter Ernest Sternglass

. . . When we talked with him in the fall of 1980, he was finishing work on a new method using a computer to take X rays without film.

During that interview Sternglass told us that the work of Alice Stewart had first alerted him to the dangers of small doses of radiation. "We all knew from the bombs that large doses could be dangerous," he said. "But when Dr. Stewart showed that small X-ray doses could harm infants in utero, that opened up a whole new way of looking at things."

Official researchers had made a crucial mistake in measuring the effects of radiation by looking primarily at damage to genes without also looking at the embryo. "A human fetus in the first trimester of development can be many times more radiation-sensitive than human genes," Sternglass said. "When the AEC failed to consider what fallout was doing to infants, they missed the most important effect of them all, and thus vastly underestimated the damage being done by the bomb tests."[16]

In 1969 Sternglass published an article in the Bulletin of the Atomic Scientists, contending that some 375,000 American infants had died as a result of atomic bomb testing. The thesis rested on the observation that as medical technology advanced, the rate of infant mortality dropped by an essentially constant percentage each year. The better the technology got, the fewer babies were dying at birth. But when the bomb testing began, the rate of decline slowed. When the tests stopped, the rates began to drop again as they had before, in keeping with continued medical advances.

It was the "bump" in the line--a bump involving roughly 375,000 American babies--that Sternglass attributed to radioactive bomb fallout. Particularly important in that calculation was iodine 131, which could travel through the placenta and irradiate the tiny prenatal thyroid. By destroying cells in that crucial gland at its early stage of development, radiation could cause stunted growth, brain damage, and underdeveloped lungs that could make it impossible for a newborn to survive the first few days of life. Congenital deformities, low birth weight, hypothyroidism, and a breathing problem called hyaline membrane disease can all be considered symptoms of I-131 poisoning from fallout. Those effects, said Sternglass, had slowed the downward trend of infant deaths during the height of the bomb testing, and in so doing had killed those 375,000 American babies.
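Sternglass's method is, at bottom, a trend extrapolation: project the pre-testing rate of decline forward and count the gap between it and the observed rates. A toy sketch of that shape--every number below is invented for illustration; none of this is Sternglass's actual data:

```python
# Toy model of the "bump": infant mortality falling a constant 4% a year,
# stalling during hypothetical "testing" years 3-5, then resuming.
# All numbers are invented for illustration only.
DECLINE = 0.04    # hypothetical annual improvement
START = 30.0      # hypothetical deaths per 1,000 live births

# What the rate would have been had the pre-testing decline continued.
expected = [START * (1 - DECLINE) ** t for t in range(10)]

# An "observed" series whose decline pauses during the testing years.
observed, rate = [], START
for year in range(10):
    observed.append(rate)
    if not (3 <= year <= 5):   # decline pauses in the testing years
        rate *= 1 - DECLINE

excess = [round(o - e, 2) for o, e in zip(observed, expected)]
print(excess)   # zero before the stall, positive afterward
```

The accumulated gap--scaled up to the size of the national birth cohorts--is the kind of quantity behind a figure like 375,000.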

Sternglass's assertions came in the same year--1969--as Gofman's and Tamplin's recommendation of a tenfold reduction in exposure levels from atomic reactors.

As shocking as Sternglass's findings seemed, they were by no means the most radical estimates of death from fallout. In 1958--eleven years before Sternglass's article--Nobel prizewinner Linus Pauling had predicted that 140,000 people would die from each and every bomb test, a prediction that translated into literally millions of total deaths over time.[18] Pauling also wrote that a single fallout product, radioactive carbon 14, from a single year's bomb testing--30 megatons of explosions--could cause 425,000 embryonic and neonatal deaths (deaths before one month of age), 170,000 stillbirths and childhood deaths, and result in another 55,000 children being born with "gross physical or mental defects."[19] Russian scientist Andrei Sakharov added his own calculation that bomb-produced carbon 14 would kill ten thousand people for every megaton detonated in the atmosphere, a toll that translated into millions of deaths over time. As a "conservative estimate" Sakharov said that testing by the mid-fifties had caused half a million human deaths. "We cannot exclude the possibility that the total number of victims is already approaching 1 million persons," he added, "and that each year continued testing increases this number by 200 to 300 thousand persons."[20]
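The per-megaton figures quoted above multiply out directly. In this sketch the cumulative-yield number is our own assumption, chosen only to show the scale of the arithmetic; it is not a figure from the text:

```python
# Sakharov's carbon-14 toll: ten thousand deaths per megaton tested.
deaths_per_megaton = 10_000
assumed_cumulative_megatons = 50   # hypothetical cumulative yield, mid-1950s

print(deaths_per_megaton * assumed_cumulative_megatons)   # prints 500000
```

On that assumed yield, the multiplication lands at half a million--the order of magnitude of Sakharov's "conservative estimate."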

A decade later Sternglass was pointing specifically at the American people. He was saying that as of 1969, based on national infant-mortality statistics, about 375,000 American infants had already died from the tests, and countless more American children and adults were suffering ill-effects. Because it dealt with hard statistics about American children, it was an assertion that cut to the very core of the nuclear industry. . . .

Sternglass also found later confirmation of some of his fallout conclusions from a most unexpected source--the U.S. Navy. In 1979 he and Stephen Bell, an educational psychologist, presented a paper before the American Psychological Association suggesting that the atmospheric tests were linked to a decline in college-entrance Scholastic Aptitude Test (SAT) scores among American teenagers. . . .

The paper met with harsh criticism from the nuclear establishment. Among other things, an increase in television watching and the consumption of junk food were blamed for the SAT declines.

But in 1980 a study commissioned by the U.S. Navy substantiated the thesis. The Navy was concerned that its increasingly complex weapons technology was outstripping the abilities of new recruits to manage it, and it worried about a decline in the mental abilities of American youth. Researchers Bernard Rimland and Gerald Larson agreed that radiation probably played an important role. In terms of the SAT, they said, the findings were "consistent with the hypothesis that the proximity to the tests or high rainfall downwind from the point of detonation should lead to the largest decline."

In fact, Rimland and Larson added: "The state having the largest drop in scores from children born during this two-year period [1956-58] was Utah, a fact which is consistent with Utah's proximity to the Nevada Test Site and the general northeastern motion of the fallout clouds produced by the Nevada tests." Thus, they said, "Sternglass and Bell provide very convincing and disquieting evidence closely linking the SAT score decline to the cumulative effects of nuclear fallout." "I wish it weren't so," Bernard Rimland told us in a 1981 interview, "but I don't think anyone could look at the data and come to any other conclusion. Sternglass's work is very sound and very convincing."[28]

But by the time Rimland and Larson had confirmed Sternglass's findings on fallout, another radiation source--atomic power reactors--had moved to center stage.





from chapter 12: How Much Radiation?


The Reactors' Safety Record

By 1979 sloppy reactor construction, poor design, and inept operation had become a national scandal. That year's NRC records revealed more than twenty-three hundred operating errors, including a failure of control rods at Browns Ferry; a temporary blackout in the control room of a power plant in Florida; the surprise development of a steam bubble in another Florida reactor; and the blowout of a coolant pump at Arkansas Nuclear One, near Herschel Bennett's farm. New York's Fitzpatrick II--where Nancy Weber's cows were dying--listed eighty-eight incidents of its own.[20] There were other incidents as well: one reactor cooling system had been hooked up to the plant's drinking supply. At another plant a basketball wrapped in tape had been used to plug a defective pipe.[21]

Through the end of 1979, the allowable average dose to residents near the plants remained at 170 millirems per year, a rate Drs. Gofman and Tamplin calculated would guarantee an extra thirty-two thousand deaths per year. And methods of measuring radioactive releases had not been systematically improved despite the recommendations of the Shapp Commission. If anything, standards were regressing.
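The Gofman-Tamplin figure comes from a population-dose calculation: multiply the allowable per-person dose by the exposed population to get a collective dose, then apply a cancer-risk coefficient. A hedged sketch of that arithmetic, where the U.S. population value is an assumption for illustration and only the 170-millirem allowable dose and the roughly 32,000-deaths conclusion come from the text:

```python
# Assumed U.S. population, circa 1970 (illustrative, not from the text):
US_POPULATION = 200_000_000
# Allowable average annual dose near the plants, from the text:
ALLOWABLE_DOSE_REM = 0.170  # 170 millirems

# Collective dose if everyone received the full allowable average:
person_rems = US_POPULATION * ALLOWABLE_DOSE_REM  # 34 million person-rems

# The risk coefficient implied by "32,000 extra deaths per year":
deaths_per_person_rem = 32_000 / person_rems
print(round(deaths_per_person_rem, 6))  # 0.000941
```

Roughly one death per thousand person-rems, in other words: the dispute between Gofman-Tamplin and the AEC was over exactly this coefficient, and over whether any dose could be assumed harmless.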

In 1975, for example, excessive strontium 90 radiation was found in milk at a farm near the Shippingport plant. The following year, monitoring at that farm was discontinued.[22]

In October of 1977 Ernest Sternglass charged that strontium emissions from the Millstone Nuclear Power Station at Waterford, Connecticut, were extraordinarily high, and had led to an increased rate of cancer.[23] Soon thereafter the NRC eliminated the requirement that utilities collect strontium 90 data. Budgetary reasons were cited.[24]

Also that fall the General Accounting Office released a report charging that the EPA's national radiation monitoring program did not measure exposure for 40 percent of the American people, "and provides only educated guesses for the remaining 60%." The GAO warned that "levels of radiation are increasing which affect not only the health of the current population, but of future generations because of genetic damage." Federal agencies lacked resources, staff, and know-how to deal with the problem, said the GAO. Environmental Protection Agency policy "may not be the result of public need, but rather reflects a crisis-oriented approach to the problem."[25] Despite the warning, the Reagan administration in 1981 drastically cut the EPA's radiation monitoring program well below the levels cited as inadequate by the GAO.





from chapter 13: Animals Died at Three Mile Island


The NRC Steps In

By the summer of 1980 stories about Dr. Weber, Jane Lee, Charles Conley, and other area farmers had begun to seep into the media.[14] It was precisely the kind of publicity the industry could least stand. The reactors were operating at roughly 65 percent of full capacity; originally the industry had promised 80 percent. And with just seventy plants on line, atomic power was producing a net of just 9 percent of the U.S. electricity supply, and less than 2 percent of all U.S. energy. After thirty-five years of research and development, $40 billion in taxpayer subsidies, and more than $100 billion in utility investments, commercial reactors were providing American consumers with less usable energy than firewood.[15] . . .

But nuclear power was not being abandoned. Those still in the industry had billions of dollars invested. First and foremost, it seemed necessary to dispel the idea that TMI had caused anyone any harm. And that meant the animal question. Just as Nevada sheep had become the first visible victims of the 1950s bomb tests, so the goats, pigs, cows, and cats of central Pennsylvania seemed destined to play the role at the dawn of the 1980s.

And like the AEC before it, the state of Pennsylvania stood firm. "There's not a shred of evidence that there's been a radiation-connected problem," Governor Richard Thornburgh said of the farmers' complaints. "If you could tell me of a single instance of a radiation-connected problem, then we'd want to take a look."[16]

But resistance at the state level to pursue the question further than the limited DOA study remained firm. "There was not enough radiation to give any evidence of any need to do such a study," said Robert Furrer, a management analyst for the DOA. "To do more study would have been chasing a ghost," added Nunzio Palladino, dean of the Pennsylvania State University College of Engineering. "I wouldn't put a nickel toward more study."[17] In 1981 Palladino became chairman of the Nuclear Regulatory Commission.

Despite such opinions the NRC teamed up with the EPA to study the animals around Three Mile Island in the spring of 1980. Headed by the NRC's Germain LaRoche, the task force set about contacting those farmers who had complained of problems with their animals. By the fall of 1980 their investigation was complete and their conclusions firm--"no reasonable connection" could be made between radiation from TMI and damage to any nearby animals.

Published in October of 1980, the study immediately became national news. The New York Times accepted it as definitive proof that the farmers' claims were without basis. . . .

But apparently neither the Times's editorial board nor much of the major media had read the NRC/EPA report carefully. Its authors had warned in their introduction that the survey "should in no way be thought of as an epidemiological study." There were, they said, numerous cases "that could not be investigated in depth because not enough data were available." There was also a "lack of background information" on many diseases in the area.

According to Germain LaRoche, whom we contacted by telephone in early 1981, the authors of the report "did not survey animals. We surveyed people and reports from the lab. We got a list of problems from the state and contacted as many of the farmers as we could."

In other words the Pennsylvania DOA's sketchy 1979 survey, which had been labeled "worthless" by The (Baltimore) News-American, had served as the basis for the "definitive" federal study of animal problems around the nation's biggest reactor accident. And in fact the NRC had contacted even fewer farmers--a year later--than had the state. "We did not go to all those people," LaRoche told us. "But we did go to quite a few."

Nor was there any improvement in actual testing of livestock. "We did not see any animals," LaRoche explained. "We did not do any autopsies. This [study] was done over a year after the accident. By the time we did our survey, all those animals had died or had been disposed of."[20]

In fact the final NRC/EPA report listed fewer than thirty-five cases involving animal problems near TMI. In more than half of them the investigators conceded that there were insufficient data to draw any conclusions about radiation poisoning one way or the other. Under the category of farm animal reproductive problems, for example, the report listed fourteen different cases. In ten of them the researchers acknowledged having either no data, insufficient data, or "cause unknown."[21]

As for the reports of Dr. Robert Weber that stillbirths and malformations among area pigs were epidemic, there was no survey or interview. The authors simply noted that "episodes of farm animals requiring caesarean delivery of young were reported after the accident." A repeat of "this specific problem was not evident in 1980; however, an increase of stillbirths in pigs was reported during the spring of 1980." There was no systematic poll of local veterinarians, no tabulated survey of area pig farmers. . . .

As for the widely reported disappearance of wildlife, the report blamed pesticides and the weather. There was no mention of independent studies showing high radiation levels in local rabbits, meadow voles, and milk.

To support one of the most crucial official health contentions in American history, the NRC and EPA had cited fewer than two dozen year-old autopsies and performed none of their own; presented no systematic survey of area hunters, farmers, gardeners, veterinarians, doctors, breeders, or fishermen; and made no substantial contributions to the very brief two-day survey done a year before by the state. "I was disappointed in the NRC's report," said Pennsylvania's John Nikoloff. "I felt with their resources they could have done a better job."[24]

Still the commission was prepared to promise that "concerned citizens may be assured that in keeping with its mission to safeguard the public health and safety, the staff of the Nuclear Regulatory Commission will continue to investigate reports of unusual problems experienced with plants and animals, and any pertinent findings will be made available."[25]





from chapter 14: People Died at Three Mile Island


The Mental Fallout

In those frightening first days of the accident at Three Mile Island fifth and sixth graders in nearby Middletown had gotten together to write their last wills and testaments.[46] The terror was at the gut and instinctual levels, and it dated all the way back to Hiroshima. "The first application of nuclear energy was the atomic bombs which destroyed two major Japanese cities," explained the Kemeny Commission in 1979. "The fear of radiation has been with us ever since, and is made worse by the fact that, unlike floods or tornadoes, we can neither hear nor see nor smell radiation."

Thus, predicted the commissioners, the "major health effect of the accident" appeared to be "mental distress" felt by "certain groups" living near the reactor. The problem, they said, was "short lived."[47] But nine months later state researchers confirmed a 113 percent jump in the number of TMI neighbors using sleeping pills and an 88 percent rise in those using tranquilizers. The use of alcohol was up by 14 percent and cigarette smoking had increased by nearly a third.

As documented by one thousand telephone interviews, a wide range of "psychosomatic illnesses" had surfaced, including chronic headaches, diarrhea, loss of appetite, sweating, rashes, and hypertension.[48] "The symptoms people are suffering are similar to those suffered by people who work at dangerous jobs," we were told by Dr. Robert Holt, a New York University psychologist who studied the TMI area. "In those situations you expect an increase of tension, shortened tempers, mood swings and more physical symptoms like hyperventilations, ulcers, and asthma."[49] . . .

Meanwhile some TMI neighbors questioned the fact that so much attention had been given to the mental fallout from the accident, and so little to its physical health effects. According to The New York Times, at least fourteen psychological surveys were taken of area residents in the wake of the accident, based in part on grants of $375,000 from the National Institute of Mental Health and another $52,000 from the utility industry. The state of Pennsylvania, which had mustered a bare hundred work hours to study area animals and which had established no systematic ongoing survey for possible physical damage from TMI radiation, did conduct a one-thousand-person telephone poll on the mental impact of the accident.[55] "It makes you wonder," Jane Lee told us, "how they can get so much money to study the psychological effects of this accident when they can't seem to pull it together to look at the physical effects on animal and human health."[56]

The Taste of Tragedy

For many in the TMI area the outcome of the reactor accident now seemed as obvious as it had become at Bikini Island, St. George, Utah, and other communities downwind from years of nuclear bomb testing; among the GIs who had helped clean up Hiroshima and Nagasaki; among the 300,000 who had served as guinea pigs at the tests in Nevada and the Pacific; among millions of citizens exposed to too many medical X rays; among workers in the uranium mines and mills such as Church Rock and Shiprock, and at nuclear facilities such as Hanford, Portsmouth, Paducah, Piketon, U.S. Radium Dial, American Atomics, and Rocky Flats; among citizens living downwind of Windscale, Kyshtym, American Atomics, Rocky Flats, and downriver from Church Rock, Durango, and other mill-tailing sites; among thousands of Americans living near those tailings piles, some of whom built homes with them, others of whom suffer from them in their water supply and air; among millions of Americans near low- and high-level waste dumps with reason to fear for their own and their children's long-term health; among farmers near the Shippingport, Arkansas Nuclear One, West Valley, Vermont Yankee, Rocky Flats, and Fitzpatrick and Nine Mile Point facilities with reason to believe that their animals are coming to the same ugly end as the sheep caught in the "Dirty Harry" bomb fallout of 1953; and among citizens near the Dresden, Humboldt, Indian Point, Shippingport, Millstone, Arkansas One, and seventy-odd other American reactors with reason to fear that their babies are being killed by radiation before they live even a month.





from chapter 15: Conclusion: Surviving the New Fire


. . . As the GAO reported in December of 1979, a review of radiation monitoring programs in eight key states indicated that "many sources of radiation were not regulated, the coverage of regulated sources was limited, and there was limited assurance that identified hazards were corrected."[9] And as the budget-cutting Reagan administration took office in 1981, the NRC and industry backers moved to speed the licensing process and gut the monitoring programs around atomic reactors.

Similar trends were evident in the study of public health. Despite the findings of Gofman, Tamplin, Pauling, Sakharov, Caldwell, Knapp, Lyon, Weiss, Martell, Livingston, Pendleton, Sternglass, Caldicott, Rimland, Larson, Dyson, Morgan, Stewart, Kneale, Bross, Blumenson, Bertell, Abrams, Kushner, Matanowski, Mancuso, Cobb, Najarian, Drinker, Flinn, Martland, Wagoner, Archer, Eisenbud, Johnson, Radford, Winterer, Gottleib, Odin, Goodman, Franke, Steinhilber-Schwab, Talbott, Jordan, Kepford, Pohl, Lochstet, Resnikoff, Medvedev, MacLeod, Takeshi, and a host of other "dissident" scientists, doctors, and researchers in the radiation field, no major systematic steps had been taken to survey public-health trends around America's nuclear facilities.

By attacking these experts on an ad hominem basis, by ignoring the findings of "nonprofessional" farmers and private citizens, and by failing to provide independent studies of their own, the nuclear industry and public-health authorities have denied thousands of victims of radiation poisoning access to speedy treatment, and millions of Americans the right to make an informed decision on this nation's nuclear policies. Official statistics have been uniformly sketchy or nonexistent. Nine years after Pennsylvania's Shapp Commission made its recommendations for modernizing radiation and health monitoring around nuclear facilities, and more than two years after TMI, none of the high-level recommendations had been put into law.

. . . A human embryo in its second month of development weighs 0.1 gram, one 600,000th the weight of its mother. Radiation doses received by the mother can have enormous impact on the unborn fetus.
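The weight ratio above is simple arithmetic to check; it implies a maternal weight of about 60 kilograms, which is an assumption here rather than a figure stated in the text:

```python
# Embryo weight in the second month of development, from the text:
embryo_g = 0.1
# Assumed maternal weight of 60 kg, expressed in grams:
mother_g = 60_000

# Ratio is about 600,000 -- "one 600,000th," as the text says:
print(mother_g / embryo_g)
```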





from Appendix A, The Basics of Radiation and Health


Radiation and Human Health

There has been considerable debate among radiobiologists about how often a cell must be hit by radiation to mutate into a cancer. Dr. E. B. Lewis in 1957 advanced the idea that it took just one "hit" to produce irreversible cell damage. Others believe it may take two or more. There is little dispute, however, over the fact that the cell is most vulnerable when it is dividing. The human fetus, infants, and young children--whose cells are multiplying most frequently--are thus the most sensitive to radiation damage; blood-forming organs such as the bone marrow are also particularly vulnerable.

Susceptible Groups

Exposure of the fetus to radiation during all stages of pregnancy increases the chances of developing leukemia and childhood cancers. Because their cells are dividing so rapidly, and because there are relatively so few of them involved in the vital functions of the body in the early stages, embryos are most vulnerable to radiation in the first trimester--particularly in the first two weeks after conception. This period carries the highest risk of radiation-induced abortion and adverse changes in organ development. During this stage of development the tiny fetus can be fifteen times more sensitive to radiation-induced cancer than in its last trimester of development, and up to a thousand or more times more sensitive than an adult. In general it is believed that fetuses in the very early stages of development are most vulnerable to penetrating radiation such as X rays and gamma rays.

In all stages, they are vulnerable to radiation-emitting isotopes taken in by the mother. For example, if a pregnant mother inhales or ingests radioiodine, it can be carried through the placenta to the fetus, where it can lodge in the fetal thyroid and where its gamma and beta emissions can cause serious damage to the developing organ. Once the fetal thyroid is damaged, changes in the hormonal balance of the body may result in serious--possibly fatal--consequences for the development of the child through pregnancy, early childhood, and beyond. Such effects include underweight and premature birth, poorly developed lungs causing an inability to breathe upon delivery, mental retardation, and general ill-health.

--

It is not enough for a handful of experts
to attempt the solution of a problem,
to solve it and then to apply it.
The restriction of knowledge to an elite group
destroys the spirit of society
and leads to its intellectual impoverishment.

-- Albert Einstein


back to Killing Our Own | radiation | rat haus | Index | Search