There are two mistakes you can make when you read a scientific paper: You can believe it (a) too much or (b) too little. The possibility of believing something too little does not occur to most professional scientists, at least if you judge them by their public statements, which are full of cautions against too much belief and literally never against too little belief. Never. If I’m wrong — if you have ever seen a scientist warn against too little belief — please let me know. Yet too little belief is just as costly as too much.
It’s a stunning imbalance that I have never seen pointed out. And it’s not just quantity, it’s quality. One of the most foolish statements that intelligent people constantly make is “correlation does not imply causation.” There is such a strong bias toward saying “don’t do that” and “that’s a bad thing to do” (I think because the people who say such things enjoy saying them) that they never grasp two not-very-difficult points: (a) nothing unerringly implies causation, so don’t single out correlations, and (b) a correlation increases the plausibility of causation. If your theory predicts Y and you observe Y, your theory gains credence. Causation predicts correlation.
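To make that last point concrete, here is a minimal sketch (not from the original post) of the Bayesian reading of the claim: if a correlation is more likely when the causal link is real than when it is not, then observing the correlation must raise the probability of causation. All the numbers are hypothetical.

```python
# Minimal sketch of why observing a correlation raises the plausibility of causation.
# The numbers are made up; the only assumption doing work is that a correlation is
# more likely when the causal link is real than when it is not.

def posterior_causation(prior, p_corr_given_cause, p_corr_given_no_cause):
    """Bayes' rule: P(causation | correlation observed)."""
    p_corr = prior * p_corr_given_cause + (1 - prior) * p_corr_given_no_cause
    return prior * p_corr_given_cause / p_corr

prior = 0.10                 # initial credence that X causes Y (hypothetical)
p_corr_given_cause = 0.90    # causation predicts correlation, so this is high
p_corr_given_no_cause = 0.30 # a correlation can still arise without causation

posterior = posterior_causation(prior, p_corr_given_cause, p_corr_given_no_cause)
print(f"before seeing the correlation: {prior:.2f}")   # 0.10
print(f"after seeing the correlation:  {posterior:.2f}")  # 0.25
```

As long as the correlation is more probable under causation than without it, the posterior exceeds the prior. That is all “correlations increase the plausibility of causation” claims.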
This tendency is so common it seems unfair to give examples.
If you owned a car that could turn right but not left, you would almost always drive off the road. When I watch professional scientists react to this or that new bit of information, they constantly drive off the road: They are absurdly dismissive. The result is that, like the broken car, they fail to get anywhere: They fail to learn something they could have learned.
Elsewhere, Joe Carter has an interesting post on How to Have Confidence In Your Opinions (Even Though They May Be Wrong).