My previous post was quite brief, so here are a few more thoughts on the two paragraphs quoted therein. The first paragraph was about estimating the danger posed by the LHC to the planet (if not the universe), and the big argument for safety is that cosmic rays produce such collisions all the time. Whatever a collider might produce has very likely already been produced on the moon, for example, lots and lots of times; and of course, the moon is still there. That argument doesn’t seem to depend upon the niceties of particle physics. But cosmic rays spread out from the sun, so they are most concentrated near the sun. What if some merging of the products of collisions is most likely where those rays are most concentrated, nearest the sun? Such events might not occur on the moon, but they might occur in the most concentrated beams of our biggest colliders. So how likely is it that such an event causes only tiny ripples on the sun? The problem is that such an event here would destroy the earth. And our most popular theories have little to say about such questions (and they did fail to predict dark matter).
Furthermore, suppose we could estimate the answer at no more than one in a billion. Would that be safe? We are talking about the possible destruction, not only of fewer than ten billion people, but of all possible future human beings. What figure should be given to that? So we also need some way of determining just how safe a one-in-a-billion chance of destroying the human race really is (as Sample noted). Since that problem is so intractable (cf. the St. Petersburg Paradox), surely the main thing here is that we do have better things to do, things associated with more mundane risks (and more immediate benefits). Even theoretical physicists have plenty of other puzzles to solve. A competitive academia may encourage them to excel at the language game of string theory, but surely the most puzzling thing in theoretical physics (given the materialism there) is the absence of anything at the fundamental level that could conceivably give rise to awareness when the fundamental particles are parts of complicated biochemical systems (the elephant in the room in which we debate synthetic biology).
Or they could address their big methodological problem, which is the question of what they should be thinking they’re doing. The language of science is mathematics, but standard mathematics is heavily influenced by Formalism, which encourages the move from general interest in a new type of theory to the adoption of the presuppositions of that type of theory. We all know what is meant by ‘1 + 1 = 2’, but standard mathematicians will tell you that it means that {0, {0}} follows {0} in the von Neumann series (where those are ZFC sets, and ‘0’ denotes the empty set). They will say that that is just the language game that is modern mathematics. But mathematics is not a game; it is the language of science, and that word ‘language’ is being used metaphorically. The literal languages of science are our natural languages, which include mathematical terminology when one is doing science. What, if anything, those terms refer to is a philosophical question; but the meaning of mathematical statements is clearly akin to logic (not a made-up, formal logic, but the logic that all scientists should apply).
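The von Neumann encoding the standard mathematician appeals to can be sketched directly: 0 is the empty set, and each number n + 1 is n ∪ {n}. Here `frozenset` merely stands in for a ZFC set; this is an illustration of the encoding, not a foundation for it:

```python
# A sketch of the von Neumann series: 0 = {}, and n + 1 = n ∪ {n}.
# frozenset is a stand-in for a ZFC set; the names are illustrative.

ZERO = frozenset()  # 0 = the empty set

def successor(n: frozenset) -> frozenset:
    """n + 1 = n ∪ {n} in the von Neumann series."""
    return n | frozenset({n})

ONE = successor(ZERO)   # 1 = {0}
TWO = successor(ONE)    # 2 = {0, {0}}

# '1 + 1 = 2' becomes: the successor of 1 is 2,
# i.e. {0, {0}} follows {0} in the series.
print(successor(ONE) == TWO)          # True
print(TWO == frozenset({ZERO, ONE}))  # True: 2 really is {0, {0}}
```

On this reading, the familiar arithmetic truth is re-identified with a fact about nested sets, which is precisely the Formalist move the paragraph questions.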