Monday, August 10, 2009

Saying things you don't believe: A scientific virtue

In the course of scientific debates and discussions, I often say and claim things I don't actually believe. I may think these things have a good chance of being true, or maybe I just can't immediately think of why they're false, but I don't actually believe them the way that I believe that my lunch was tasty today or that my adviser is currently in Mexico. I may even think that what I'm saying definitely can't be right, for a number of reasons I'm already aware of, but that there is a nugget of truth in it that can be pulled out of the whole, clarified, and put forward on its own.

I think this is an activity that, outside of scientific discussions, seems pretty incomprehensible. After all, why would any sane person voluntarily debate and defend a position they're really not that sure of (or even blatantly know is wrong) just for fun? Is that person just being contrary? Arrogant?

But I think that when it comes to the deeper questions -- the questions none of us really know the answers to -- saying things we don't believe is really the only way to proceed, because it's the only way to say anything at all. Scientists would be a lot less productive if they waited until they had fully worked-out theories before proposing them to anyone, because there are enormous benefits in the discussion process itself. In many cases, saying things we don't believe is a sort of test -- if I can't think of any reason why what I'm saying is wrong, and you (a thousand yous) can't either, then maybe it's actually right. But of course, at the time I'm saying it, I'm far from sure it's right. I may even know it's wrong, but not know exactly why, or not know how to get at the one bit that seems right. Stating an idea and letting others tear it down is one way -- in my opinion, a very good one -- of getting at the truth. And that makes it, I think, a good scientific skill to have.

It's something I wish were more understood and appreciated in other domains -- in ordinary non-academic discussions of politics, religion and morality. It's especially something I wish were acknowledged by scientists themselves. Too often, I think, scientists feel the need to stand behind their claims, staking their reputations and careers not so much on the interestingness or eventual potential of their idea as on its truth in its current form. This necessarily causes ego to get involved and emotions to run high when theoretical matters are debated. It is as if the person feels that all the rest of their ideas would cease to have value if the one at hand didn't turn out to be true, while in reality, just about everyone who has ever made an amazing discovery has also had an incredible number of false (not to mention just plain bad) ideas. Take, for example, Newton's weird theological and occult theories. Of course, there are many different reasons egos get involved in science, and a full discussion isn't necessary here (which is lucky for me, to paraphrase Jerry Fodor, because I don't have one). I think, though, that this unfortunate tendency would be curbed somewhat if scientists became more comfortable expressing -- and graduate students were explicitly encouraged to express -- thoughtful, well-reasoned and interesting ideas that are acknowledged to be quite probably false.

4 comments:

  1. Maybe the next post could be about believing things you don't say. What kind of virtue is that?

  2. Quick thought on this.

I agree that proposing an idea in order to advance an argument or discussion (and not necessarily for its truth value) is an important part of scientific thinking. However, I am not sure I am ready to say that other disciplines do not do it.

The reason for this doubt is that I am not sure how "natural" scientific thinking is for most people. For as long as I can remember, I have thought about things in a very relativistic, premises -> conclusion kind of way, with empirical data acting as the strongest evidence. Most people I hang out with do the same. So -- is this way of thinking something that comes naturally from the way our minds work, or is it something that is learned?

My intuition tells me that it is the former, and, as such, that other people too, when arguing, must assert things they do not fully believe because those things follow from their previous assumptions (a pop-culture example that comes to mind is when Jon Stewart called Truman a war criminal).

One could argue against my intuition by pointing out that others clearly believe things that are not scientific (e.g., in religion, in aliens, etc.). But I don't think that this is non-scientific thinking, merely thinking that is not empirical in a third-person-evidence sort of way. I think that everyone believes things based on evidence, and that, quite often, for those who believe in aliens or religion, it comes down to a subjective kind of evidence (perhaps something they feel when they go to church that is as strong as or stronger than what they experience otherwise). And they would quite quickly reject the belief if sufficient evidence came to the contrary. In this sense, everyone else thinks in a scientific kind of way as well.

I know that I am greatly exaggerating my point here, because scientific thinking is far more complex than what I have illustrated above. But I would not be so quick to assume that scientists think in a different kind of way than others. Maybe they just take different kinds of evidence as support for their arguments.

  3. Hey Darko,

I didn't mean to imply that this kind of thinking doesn't exist in non-scientific domains. But I do think it's rarer, partly because scientists' training encourages us to act this way more than the training non-scientists get. Of course, some people stumble upon this by themselves, or are (like me) naturally argumentative. Nevertheless, I do think it's more common in science circles than elsewhere. It's funny that you used Jon Stewart as an example -- I think he's an exceptionally smart, argumentative layman. If all laypeople had the same sort of critical mind as he does, the attitude I'm talking about would be close to universal.

It's important to note that the issue I raised isn't so much about evidence-based vs. non-evidence-based thinking. I think it's more a matter of attitude. Many, if not all, non-scientists reason in various ways from what they consider to be evidence, but the specific skill or activity I'm describing -- arguing for things without really feeling the need to commit to believing their truth -- is an attitude more than a way of argument. That attitude, I think, is more common among scientists. Of course, I dedicated a large part of the post to pointing out that scientists would do well to cultivate this skill too. It may be somewhat more common among them, but many still don't have it.

  4. To Ori,

    Believing things you don't say -- A virtue I don't have... at all.
