The sinister history of America's 'uranium gold rush'

The success of the Manhattan Project sent demand for uranium skyrocketing, and enterprising prospectors went out West in search of an overnight fortune. But many were exposed to lethal radiation in the mines.

Two miners, both wearing hardhats, push a cart out of a uranium mine in Kern County, California, in 1955. To bolster its nuclear arsenal, the U.S. government paid prospectors to mine uranium, inadvertently exposing miners to harmful levels of radiation.
Photograph by FPG, Getty Images
By Erin Blakemore
July 12, 2024

Armed with picks and shovels, the prospectors turned to the American West intent on finding deposits of the mineral that would make their fortunes. Their pursuit of wealth led to vast riches—and left ghost towns in its wake.

But the year wasn’t 1849, and the miners weren’t in search of gold. Instead, it was the 1950s, and they carried Geiger counters along with their shovels. They were part of the United States’ last big mineral rush—a forgotten race to find uranium deposits at the dawn of the nuclear age.

Uranium mining’s early days

Uranium hadn’t always been a hot commodity: When a prospector found a deposit of yellow rock in Montrose County, Colorado in 1881, radioactivity hadn’t even been discovered yet.

Uranium mines near the intersection of New Mexico, Colorado, Arizona, and Utah, an area now known as the Uravan Mineral Belt, were active in the early 20th century, but production was low and mining was mostly focused on radium and vanadium, elements also found in carnotite ore; vanadium in particular is used in steel production.

Uranium ore from Daybreak Mine in Washington state. Once considered a cheap byproduct of mining radium and vanadium, uranium became highly sought after when scientists learned it could power nuclear weapons and create massive amounts of energy.
Photograph by Bjoern Wylezich, Getty Images

By the dawn of the Second World War, uranium was still considered what historian Bernard Conway calls a “worthless byproduct of vanadium refinement.”

But that changed with the Manhattan Project, the top-secret effort to develop the world’s first nuclear weapons. Project scientists attempted to invent both a uranium bomb and one based on plutonium, an element that, they discovered, could be produced in a reactor fueled by uranium.

A new atomic age

In July 1945, Manhattan Project scientists successfully detonated Trinity, the first plutonium bomb. A similar bomb would be detonated over Nagasaki, Japan, less than a month later, just three days after an untested uranium-powered bomb was dropped on Hiroshima. The detonations ended the war and ushered in the nuclear age—and suddenly uranium and its derivatives were seen as not just a valuable commodity, but a matter of national security.

After the war, amidst a global debate on nuclear proliferation, the U.S. created the Atomic Energy Commission, a civilian-led agency tasked with overseeing all nuclear affairs. The Manhattan Project had purchased most of its uranium from the Belgian Congo. But the U.S. wanted to use home-mined uranium for its weapons—and keep its uranium out of the hands of the USSR amid growing Cold War tensions. To do so, the U.S. would need to monopolize its own uranium for its nuclear arsenal.

A waitress poses with a 'Uranium Burger' at a diner in Salt Lake City, Utah in 1954. The sandwich was named after the region's booming uranium industry.
Photograph by Carl Iwasaki, Getty Images

Though the federal government could have owned and operated all U.S.-based uranium mining, the AEC instead opted to pay civilians to discover and mine the uranium. The agency touted American ingenuity, claiming it needed civilian mining know-how to procure home-mined uranium and discover new deposits.

But in reality, writes historian Nate Housley, the decision to sponsor a free-enterprise uranium program was based on officials’ suspicion of organized labor and a desire to thrust oversight of the uranium industry onto the states. And so, when the AEC announced in 1948 that it would pay guaranteed minimum prices and discovery bonuses for uranium ore, the federal government became the uranium industry’s sole customer—and a race to discover and mine as much uranium as possible began.

The boom begins

The U.S. was rich in uranium deposits, and uranium mines sprang up throughout the Southwest.

In 1952, geologist Charlie Steen discovered a gigantic uranium deposit in Utah, the first major find of the program. He became an instant multi-millionaire. The Navajo Nation, which owned much of the land on which uranium ore was found, also took part in the boom.

Soon, prospectors from all over the nation headed west. The government even published a do-it-yourself guidebook that offered a crash course in radioactive elements, uranium prospecting, and how to cash in, assuring readers that the pursuit was not dangerous in any way. Between 1949 and 1962 alone, the U.S. purchased over 3.6 million tons of uranium ore, making fortunes and encouraging an ever-growing industry.

An ore bin abandoned at the Mi Vida Mine in Steen Canyon near La Sal, Utah. The mine was near the site of the first big uranium find in the U.S. As calls for nuclear disarmament grew, U.S. government demand for the mineral diminished and the booming industry abruptly ended.
Photograph by VW Pics, Getty Images

A radioactive legacy

But the boom had a sinister side. The health implications of working with radioactive substances were little known, and the industry was virtually unregulated in many areas. Navajo workers were particularly affected, with more than 1,000 mines dug on Navajo reservation land. Many workers were poorly paid and went to work with no protective gear of any kind.

Inside the mines, they faced dangerously high levels of radon, and at home, the dust carried on their clothing and shoes exposed their family members to radioactive materials, too.

Yet public health studies initially examined white miners, note historians Doug Brugge and Rob Goble. And despite growing awareness that work in the uranium mines was associated with high rates of lung cancer and other health problems, the AEC suppressed research and insisted that states regulate uranium mining, even though the federal government had been involved in mine safety in other industries for nearly a century. Only in 1967 did the federal government set its first enforceable radon regulations.

By then, the boom was headed toward a bust. International pressure for nuclear disarmament was on the rise, and in 1964 the government monopoly on uranium purchasing ended. The U.S. also faced a looming energy crisis, changing the nation’s nuclear priorities. In 1974, the AEC was disbanded with the passage of the Energy Reorganization Act.

The uranium industry has had its ups and downs since then, most notably when uranium prices tanked in the wake of the Chernobyl disaster in 1986.

But the boom days at the industry’s dawn still echo today—in the bodies of the workers whose mining cost them their health, in the fallout of nuclear contamination of groundwater and soil, and in the form of hundreds of abandoned mines, many of them Superfund sites, scattered throughout the Southwest.
