
"The Ultimate Eco-Catastrophe”

Wednesday, December 2, 2020
Stephen R. Dujack

Editor, The Environmental Forum®

The biggest machine ever built is run by a consortium of European governments called CERN. Its Large Hadron Collider accelerates heavy subatomic particles to near light speed around a circle 17 miles in circumference before smashing them together. Scientists then study the debris to obtain important clues about how the universe works.

The LHC occupies a huge donut-shaped tunnel looping across the Swiss-French border near Geneva. The neighborhood pairs our most advanced technology, buried just below alpine pastureland that seems lifted from the children’s book Heidi. The bucolic setting belies the fact that experiments at the LHC (and at smaller accelerators around the world built earlier) could conceivably trigger what Harvard physicist Sidney Coleman once called “the ultimate ecological catastrophe.”

The LHC was built with a singular purpose: finding the elusive final link in the Standard Model of particle physics. Predicted in the 1960s, the long-missing Higgs boson, working through the conjectured Higgs field, endows elementary particles like quarks and electrons with mass. Without the Higgs, there would be no stars, planets, or people.

The race to discover subatomic particles kicked off in earnest after World War II. Around 1970, some physicists became alarmed that the energy of particle collisions might push the patch of the universe-pervading vacuum inside the machine itself from its current state into an even more stable, lower-energy one. This vacuum decay could then start a cascade of change destroying the whole of creation. A later worry was that creating a Higgs boson could similarly end the cosmos through a change in the universe’s Higgs field.

Well, nothing happened as scientists discovered particle after particle using colliders, and the LHC found the Higgs in 2012, completing the set. Seemingly less problematic is the merely local catastrophe of mini black holes forming and then swallowing the Earth. A few weeks before the LHC was switched on, physicist Sean Carroll calculated the chances of creating such a black hole as about 0.00000000000000000000000001 percent. That is exceedingly small, but there was no notice-and-comment period in which the public was informed of the risk, as well as any rewards that might result from discovering the Higgs. Indeed, the same has been true of other potentially dangerous experiments performed earlier.

Naturally these risks have led not only to (failed) federal lawsuits to stop colliders but also to a chapter in a book by Richard Posner, the prolific former Seventh Circuit jurist. In Catastrophe: Risk and Response, he proposes a permanent special advisory body of experts to inform the public debate.

Instead of a public advisory body, there were decades of secret meetings, as revealed in Ian Sample’s award-winning 2013 book Massive: The Higgs Boson and the Greatest Hunt in Science. Physicists have in fact recognized the risks of colliders and carefully researched and assessed them — just as they had with conjectures that the atmosphere could catch fire as a result of the Manhattan Project’s atomic bombs. When it comes to the experimental accelerators that popped up after the war, physicists ended such secret debates time and again by noting that heavy particles in cosmic rays collide with the airless Moon at energies far higher than any current machine can generate.

Which isn’t to say there has been no public discussion. The fear of a black hole being created by an atom smasher looking for the Higgs became a media frenzy in 1999 after Scientific American published letters expressing concern. The editors had called on physicist Frank Wilczek to write a reassuring adjoining note. He concluded that black holes were unlikely to persist beyond a few seconds — but what should really worry the public are strangelets that might be produced by an experiment. Strangelets are a hypothetical, more stable form of matter. Much as with vacuum decay, making them could convert all adjacent particles into strangelets in a self-sustaining chain, ending the universe as we know it. A similar ecocatastrophe was later laid at the feet of conjectured magnetic monopoles.

One study that was discussed only in the literature put the upper limit of the risk of a universe-ending event at the Brookhaven accelerator near New York City at 1 in 50 million. That is within an order of magnitude of a Superfund site post-cleanup risk goal, but vastly more people are in danger. Sample notes that a 1 in 50 million chance of killing everybody on the planet works out to “the equivalent of expecting 134 people to die” as the result of a subatomic experiment, not to mention destabilizing the entire universe.
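Sample’s “134 people” figure is just expected-value arithmetic: multiply the number of people at risk by the probability of the catastrophe. A minimal sketch, assuming a world population of roughly 6.7 billion (a figure chosen here to reproduce the quoted number, not stated in the original):

```python
# Expected deaths = (people at risk) x (probability of catastrophe).
# The ~6.7 billion world population is an assumption for illustration.
world_population = 6_700_000_000
odds = 50_000_000  # the study's 1-in-50-million upper bound

expected_deaths = world_population / odds
print(expected_deaths)  # 134.0
```

The point of the expected-value framing is that even a vanishingly small probability, multiplied by everyone on Earth, yields a casualty figure comparable to a major industrial accident.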

Sample is hopeful that nothing is likely to go wrong with today’s technology. But it makes sense to widen the circle of decisionmaking along the lines of Posner’s public panel of experts — and to make all meetings open. There is also a need for a body of law to govern potentially dangerous experiments. And as Posner notes, for lawyers to play a useful role here, they need to become more scientifically literate.

Meanwhile, colliders are getting more powerful every year.

This blog originally appeared in the November/December 2020 issue of The Environmental Forum and is republished with permission.

All blog posts are the opinion of their author(s) and do not necessarily reflect the views of ELI the organization or its members.