2012-06-15

Lubos Motl Explores the Epistemology of Mysticism

Lubos Motl has written a fascinating blog post exploring the epistemology of mysticism. "Epistemology of mysticism" is my term. Motl just thinks he's trying to explain the mindset of people who reject science for one reason or another.

He begins with the core assumption that people must learn to think scientifically before puberty; otherwise they will spend a lifetime rejecting it. He then elaborates by describing a few tiers of unscientific (or anti-scientific?) thought.

The first is based on superstition:
I believe that the deliberate priority of various superstitions, obsolete religious dogmas or, on the contrary, newly created religious superstitions such as a frying Earth and so on are seriously lowering the efficiency with which most people on Earth use their brains. For all of them, the world is full of witches, homeopathic solutions, prophets, dowsing sticks, lucky numbers, geopathogenic zones, miracles, divine truths revealed to shamans, tipping points leading to the Armageddon, and so on. Scientists – people who actually try to use brains, logic, and mathematics applied to the observations – are just some exotic freaks who deserve humiliation and who can't ever reach the glory of the true leaders such as the witches and prophets. Many of these 5+ billion people would be capable of disproving this opinion of theirs if they tried (and if they decided not to fool themselves) but they just don't want to try. They have already made their mind – either individually or they were forced to adopt it – and doubting it would mean for them to undermine their own spiritual existence which is what they don't want to do.
This is a pretty important point. Motl describes both a prevailing cultural bias against science, and a deliberate choice made by unscientific people to refuse to question their own mysticism. He then shifts gears and addresses people "who superficially claim that they want to pursue the scientific method but they don't."

The second tier of unscientific thought is the rejection of mathematical logic. What he says in this section is a bit hard to excerpt or condense. Motl points out that there are
...people who don't think or who don't "agree" that our knowledge or their knowledge about the world may be organized into propositions that are either right or wrong or something in between but whose validity may be, in principle, studied, whose validity matters, which can a priori be right or wrong, and which may be correlated with other propositions by the rules of mathematical logic.
This, I think, describes it best. You may have heard people frantically insist that "people aren't numbers" with respect to conclusions from economics. You may have heard people dismiss all questions about their own personal viewpoints as "just one opinion." To these folks there is no such thing as right and wrong at all - everything is a matter of opinion.

A third tier is what Ayn Rand termed context-dropping, although Motl seems to be unaware of Rand's work on the topic.
I decided to insert this short section that covers two errors that are opposite to one another.... The first error is that many people try to answer a question but they ignore other questions that are demonstrably relevant; the second error is that they fail to stop talking about questions that are demonstrably irrelevant.
Using the example of climate change, Motl notes that once we include all interacting factors in the study of climate change and incorporate every relevant time scale, we must conclude that the temperature is both rising and falling, which is of course "inconvenient" to many political pundits.

But of course, that's just one example. I find this kind of error to be particularly egregious in the social sciences. Historians seem to do it the most, determining for themselves what is "important" and what isn't, and refusing to consider adjacent events or facts.

The next tier Motl discusses is what he terms the "rejection of quantification of claims," but what he really means is the rejection of degrees or nuance.

In other words, once we have moved on from yes/no questions and determined that the answer is "yes," we next have to consider "yes by how much?" or "yes to what degree?" For the simple-minded, yes is enough. But as I have tried to point out on this blog many times, life is often quite complicated, and real-life issues often involve varying degrees of... everything.

Because I believe in objective right-and-wrong, I often get accused of seeing everything in black-and-white. A lot of this is the rejection of mathematical logic, as Motl would put it. It is all the more ironic that such people are often the most guilty of refusing to consider the degree to which something is true or false. In reality, they are the ones who see things in black-and-white; the only difference is that they think everything is a matter of black-and-white opinion, whereas I revel in the fact that a great many things are absolutely true or false - often to complicated and varying degrees.

Motl next writes two sections that seem to apply mostly to the realm of quantum physics and climate science. They are good sections, but I will not discuss them here.

His final point is something that I encounter most often in political discussions. People have an inherent desire to build common ground, and during controversies they do so by watering down their original statements until they are so vague that they feel the other side must concede "at least that much." They never consider that such statements have no descriptive power - they are exercises in vagueness designed to end the conversation, not to build consensus. Motl puts it this way:
At the beginning, I mentioned that one shouldn't talk about propositions that can't a priori have both answers at all. They're tautologies or anti-tautologies, whatever is the right term for a proposition that can be proved identically false just by using the rules of logic. But surprisingly many people seem to believe that if they leave "enough wiggle room" in their general statements, these statements must be true or, to say the least, they can't be shown to be false.

However, this is a complete misconception. Whether something may be disproved depends on the actual available evidence, not on your vague feelings whether your statement is sufficiently vague or whether your collection of candidate explanations looks like a large army.
Those guilty of this particular strategy are extremely numerous and cannot be dissuaded, mostly because - as Motl notes - they cannot seem to understand why their positions are still wrong, even after they have watered them down to near-disintegration.

Well, Motl's whole post is long and enlightening. Recommended reading for all!
 
