Friday, August 24, 2012

In Offense of Rationality

Okay, I said I'd get around to my issues with Rationality several months ago. So maybe I actually should. To start, I'm not against being rational. It's just a mode of thought that says "don't use invalid premises." The problems start when you try to define what an "invalid premise" is. Depending on how you define it, Rationality as a worldview forks into two major camps:

Truth-Rationality: an invalid premise is one that contradicts objective reality. In other words, seek the truth, and only the truth. Believing in, say, astrology is right out. Sometimes called "epistemic rationality".

Value-Rationality: an invalid premise is one that makes it more difficult to achieve your goals. Believing in astrology is okay if it by some strange miracle benefits you in the long run. Sometimes called "instrumental rationality".

I feel that Value-Rationality is pretty tautological (Believe things that are good for you! Don't believe things that are bad for you!), so I'd like to focus most of the rest of this post on Truth-Rationality. Most people who've gushed to me about Rationality have, explicitly or not, taken that stance. And it's a very tempting stance to take. Truth is good, so why not use it in all things? I can think of three general reasons:

The Truth isn't always available. This mostly applies to non-naturalistic things, such as religion and the supernatural. When the rationalist says "it's irrational to believe in god", he's saying that there's no evidence to believe in god. But at the same time, there's no evidence against believing in god, either. If you start talking about how it's unnecessary or how it hurts people, then you're not arguing about the truth. You're making a value judgment.

The Truth isn't always necessary. Do we have free will? Who cares? Knowing the answer isn't gonna affect you one way or the other. Trying to come up with a "rational" argument about it is a waste of time.

The Truth isn't always helpful. This is the important one. People are far more empowered by their beliefs than by their knowledge. If you learn something that contradicts your beliefs, you could very well sabotage your ability to do well at something. Normally this point comes up in the context of religion, but that topic is so damn volatile I'd rather give some more down-to-earth examples:

-If you take a group of students and tell half of them they're smart and the other half they're hard workers, the latter group will develop considerably more over time. This is true even if that group didn't start out as hard workers. Their untrue beliefs give rise to actual, measurable differences in ability.

-Let's say you start going to the gym. Initially, you feel embarrassed that you're so much less fit than everybody else there, but you tell yourself that they're not paying attention and focus on exercising. Over the months you start getting much stronger and healthier. When you start talking to the other gym-goers, one of them admits that during the first two months everybody was laughing at you behind your back.

Would you really have wanted to know that from the start?

And no, don't go "that wouldn't have stopped me". Unless you've been bullied before (and even then), you have no way of predicting just how much pain it would have caused you. Maybe it would have been tolerable. Or maybe it would have caused you to go less often, or even stop entirely. Is having to deal with the truth really so much better than just pretending that everything is okay?

This isn't a special case, either. Confidence in yourself is almost a necessary condition for succeeding at something. It's easier to give a good speech if you're confident you're a good speaker, even if that means overestimating yourself. And confidence in yourself is often disconnected from the truth. It's very hard for factual knowledge to make you feel better about something; our minds are better at manifesting raw emotive will than at producing it from abstract facts. The truth can help, if it gives your confidence something to stand on, and it can prompt you to do better, if it shows you room for improvement. But for the truth to always help? Nope. It's a lot easier to use the truth to shatter your confidence than to bolster it.

This is not to say that the truth is unconditionally bad. Often it's incredibly important that we find it. One of the big advantages of Rationality as a worldview is that it pushes us to understand our cognitive biases, like our habit of externalizing our problems. And often, even if the truth hurts now, it will help in the long run. But that doesn't mean you should embrace it and damn the consequences. There may be a way of using false premises in the short run and the truth in the long run. Delusion is a pretty neat thing.

Now a lot of rationalists say there's No Such Thing. There's never a case where a false premise provides more value than a true one. This is such a sweeping statement about humanity that I find it utterly ridiculous. See the gym example. See religion. People are a mess of contradictions and groundless beliefs, but many draw power from them. If you can live completely free of false premises and still have the same level of empowerment, then you're not human. You're an ubermensch.

Truth-Rationality has some serious problems. You're welcome to still hold it as a worldview, but you should recognize that it cannot and does not work for everybody. Sadly, I do see a lot of Truth-Rationalists who see themselves as better than those who aren't. Maybe that would be justified if you could be a perfect TR, with your mind arranged just so and none of TR's flaws applying to you. But if you were a perfect TR, you'd see yourself as equal to everybody else on this earth. It doesn't matter what you believe, as long as it drives you to grow as a person.

Or not, if that's not your goal. I shouldn't be judging.

Monday, August 6, 2012

The Hardest Question


I find I write best when I am writing about someone who, justifiably or not, made me rage. This last happened on Saturday night, when a few of us were drunkenly playing the answers game. You know, ask somebody a question, and if they don't answer it they have to take a drink. Over the course of the game I grew progressively more frustrated with the answers I was getting. I was angling pretty heavily at darker, heavier questions than the rest of the group, and a lot of the answers I was getting were flippant or "I dunno".* Finally, when it came to my last question, I turned to the person across from me and asked "What's the hardest question someone can ask you, not because of what you'd have to admit to them but because of what you'd have to admit to yourself?"

At which point he mocked me and said that if there was any such question, his natural curiosity would mean he'd want to answer it anyway. The question, to him, was intrinsically stupid. So naturally the rest of this post will be me trying to argue that it isn't.

Like it or not, everybody is delusional. If you believe you aren't, then you've gone straight past 'deluded' and into 'crazy'. Anything you believe pragmatically as opposed to empirically is a delusion, and there are a LOT of those. "I am destined for greatness." "My ethical system is the best one." "Grad school will be worth it." Even when the claim is tenuous or outright false (like atheism to theists, or theism to atheists), what matters is that it affects you. And it doesn't necessarily have to be negative; it often isn't. Maybe you're getting motivation, or courage, or even just comfort. What's important is that you're getting something at the cost of truth.

Maintaining delusions is a delicate balancing act. More so for college students, who have both the opportunity and the temptation to break them. We are drawn to the truth, but truth is fatal to happy lies. It's possible to have both the truth and the lie, building a convoluted bridge between them. But that bridge is itself a delusion, and just as vulnerable to the truth. Easier to refuse to believe part of the truth, just enough that we can keep our toolbox of useful lies. It's not pleasant, but we have to function somehow.

Now some people would argue that delusions are inherently less useful than the truth, and that discarding them will eventually put you in a better place. I don't like this argument. There are definitely some cases where the truth sets you free. "I am inherently better than {racial group}" is a good one. Whatever comfort that provides does not make up for your terribleness, and the truth can only help you. But all delusions? No. Belief can move mountains. We know this. To say otherwise is an extraordinary claim requiring extraordinary evidence. Sometimes the truth can hurt you. Deciding when it won't is what makes it hard.

This brings us back to our original question. The hardest question you can ask yourself is a question about whether a cherished belief is a delusion. And if it is, whether to build the labyrinth of self-deception or give up a meaningful part of yourself. Whether to reject the ugly truth or the beautiful lie.

Sounds like a tough question to me.

*Yeah, I know that most people don't like playing this way. I'll admit it was a bit of a faux pas.**

**A lot of a faux pas.