First things first: oh dear — I apologise to all if I gave the impression I was in any way involved.
My involvement went no further than seeing the fluff items in the news, the “CERN courts disaster” controversy stuff.
Being naturally iconoclastic, I had a think to see if I should be worried.
That’s as far as my contribution to the safety of mankind went *that year*.
Secondly: I think you raise an excellent point, and I also think that for a certain class of issues there is perhaps a persuasive dividing line.
Do I stay up at night worrying that all the air in the room will one day leap to one corner and hence suffocate me?
There is nothing stopping it from doing so: it’s random.
But it is vanishingly unlikely.
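Just how vanishingly unlikely? Here is a back-of-the-envelope sketch (my own illustrative numbers, not anyone's measurement): treat each of the room's roughly 10^27 air molecules as independently sitting in either half of the room, so the chance that all of them are in one half at a given instant is (1/2)^N.

```python
import math

# Rough, illustrative figure: a small room holds on the order of 10^27
# air molecules. If each molecule independently occupies the left or
# right half of the room, the probability that *all* of them sit in one
# half at a given instant is (1/2)**N.
N = 10**27

# (1/2)**N underflows to zero in floating point, so work in logarithms.
log10_probability = N * math.log10(0.5)
print(f"P(all molecules in one half) ~ 10^{log10_probability:.3g}")
```

That is a probability around 10 to the minus three hundred-odd septillion: not zero, but not a category our small/medium/big intuition has a slot for.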
After Chernobyl, this kind of conversation was held at length in the pubs, I seem to recall.
The issue is perhaps that humans have a limited repertoire of numbers:
small, medium, big, really big.
And we tend to treat the categories as adjacent.
I blame science fiction.
Simply put: no-one can eliminate the possibility that CERN would blow up the world.
But set that against the fact that right now, farms feeding their animals antibiotics are literally running the experiment on how long it will take to breed a species-hopping, virulent, unstoppable superbug (we even have the interim results report!).
In my opinion, the existential risk to humankind posed by CERN’s experiment is not even a rounding error compared to what humankind is doing elsewhere.
There are two issues here, dealing with the infinite, the near-infinite, and the near-infinitesimal:
Pascal's Wager - Wikipedia
For infinity it’s actually easy; it is the numbers that are in some way quantifiable where the opportunity for debate arises.
So, your question is pertinent, and I would suggest responding to this:
And if we do say yes, we implicitly accept the future empowerment of countless future groups, which will operate on the edge of plausible catastrophe and imperfect certainty in the century to come, as synthetic biology, AI work, and eventually perhaps nanotech enable ever smaller, less vetted, and more numerous groups to make similarly fraught decisions without input or oversight from outsiders.
[i] CERN was not a referendum on risk-taking: it is funded by governments, has oversight, and publishes its reasoning for others to critique.
[ii] That for all such cases where everyone has a stake, we take a methodical, case-by-case approach. Similar to CERN in some ways?
Instead, rank each activity’s risk/reward alongside other human activities and make the process transparent: at least then, when things go wrong, no-one can complain (if there is anyone left).
- antibiotics as growth factors: insane WTF factor, medium benefit to food production
- CERN: low WTF factor (we think), endless benefit to science
- Mortgage-backed securities: ludicrous WTF factor, medium gain to a tiny set of 1%-ers
- CRISPR modification of genomes *live on stage*: insane WTF factor, medium gain to a group of risk-takers
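The ranking above can be sketched as a toy program (all the scores are invented placeholders of my own, just to show the shape of a transparent, comparable process):

```python
# Toy sketch: rank activities by a crude risk ("WTF factor") versus
# benefit score. Every number here is an invented placeholder, not a
# real assessment; the point is that the inputs are visible and open
# to argument.
activities = {
    "antibiotics as growth factors": {"wtf": 9,  "benefit": 5},
    "CERN":                          {"wtf": 2,  "benefit": 10},
    "mortgage-backed securities":    {"wtf": 10, "benefit": 3},
    "CRISPR live on stage":          {"wtf": 9,  "benefit": 5},
}

# Sort worst-first: high risk with low benefit floats to the top.
ranked = sorted(activities.items(),
                key=lambda kv: kv[1]["wtf"] - kv[1]["benefit"],
                reverse=True)
for name, score in ranked:
    print(f"{name}: WTF={score['wtf']}, benefit={score['benefit']}")
```

Anyone who disagrees with a ranking can then point at a specific number and say why it should be different, which is exactly the conversation I want.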
So, yeah: I think there are some things going on that are very troubling. I will concur with others on some matters and demur on others.
Note I weigh permanent effects differently from ephemeral ones: today’s burger enriches me once; today’s book enriches me for as long as I live; the book I write might enrich society forever.
All the above assessments are my opinion, but that’s the point.
We can always talk. And we have a fighting chance of having a sensible conversation if we use the language of probability.