Back in my high school days I struggled with the issue of the existence of God. At one point I determined there was little evidence that God, if he existed, was actively involved in our lives. So I ended up with belief in a non-interventionist God – a God that existed, perhaps did the initial ‘push’ to get the universe started, but was otherwise uninterested in the activities of human life.

One day it struck me – I don’t remember where the thought came from – that my belief in a non-interventionist God implied that there was no evidence in the universe that he did or did not exist. If God didn’t intervene in any way, there was no intervening to observe. In other words, my belief provided no way to verify itself! Here I had decided that it was impossible to ‘observe’ God, yet continued to believe that God existed. Why was that?

It turns out I did not have a good reason to believe God existed; I merely believed that I ought to believe in God.

That didn’t feel right. It felt a little forced. Why lean a certain way? Why believe I ought to believe anything at all? Obviously, you ought to believe something if the evidence is there, if you have good reasons. But believing that you ought to believe something, merely because you ought to… that seemed silly!

I dropped the belief, and went looking for something better.

It wasn’t explicitly obvious at the time, but it’s clear now that my belief was flawed for another reason: my anticipations were the same whether the belief was true or not. Observing no intervention by God wasn’t evidence that a non-interventionist God existed, because the very same observation could equally be used to argue that there is no God at all.

If evidence can be used to defend your belief and the opposite belief, it is not evidence!

///


Less Wrong has semi-monthly open threads for users to post quotes about rationality. DanielVarga recently compiled a collection of the best quotes from 2011.

Below are a few that stood out to me. Check out the full list here. Warning: Procrastination opportunities ahead!

From MinibearRex:

The greatest obstacle to discovery is not ignorance – it is the illusion of knowledge.
–Daniel J. Boorstin

From Maniakes:

The society which scorns excellence in plumbing as a humble activity and tolerates shoddiness in philosophy because it is an exalted activity will have neither good plumbing nor good philosophy: neither its pipes nor its theories will hold water.
–John W. Gardner

From anonym:

Although this may seem a paradox, all exact science is dominated by the idea of approximation. When a man tells you that he knows the exact truth about anything, you are safe in inferring that he is an inexact man.
–Bertrand Russell

From benelliott:

Day ends, market closes up or down, reporter looks for good or bad news respectively, and writes that the market was up on news of Intel’s earnings, or down on fears of instability in the Middle East. Suppose we could somehow feed these reporters false information about market closes, but give them all the other news intact. Does anyone believe they would notice the anomaly, and not simply write that stocks were up (or down) on whatever good (or bad) news there was that day? That they would say, hey, wait a minute, how can stocks be up with all this unrest in the Middle East?
–Paul Graham

From Tesseract:

It was a good answer that was made by one who when they showed him hanging in a temple a picture of those who had paid their vows as having escaped shipwreck, and would have him say whether he did not now acknowledge the power of the gods,—‘Aye,’ asked he again, ‘but where are they painted that were drowned after their vows?’ And such is the way of all superstition, whether in astrology, dreams, omens, divine judgments, or the like; wherein men, having a delight in such vanities, mark the events where they are fulfilled, but where they fail, though this happens much oftener, neglect and pass them by.
–Francis Bacon

From wallowinmaya:

Doubt is not a pleasant condition, but certainty is absurd.
–Voltaire

From AlexMennen:

The discovery that the universe has no purpose need not prevent a human being from having one.
–Irwin Edman

From Dreaded_Anomaly:

Complex problems have simple, easy to understand wrong answers.
–Grossman’s Law

From Jayson_Virissimo:

The typical citizen drops down to a lower level of mental performance as soon as he enters the political field. He argues and analyzes in a way which he would readily recognize as infantile within the sphere of his real interests. He becomes primitive again.
–Joseph A. Schumpeter, Capitalism, Socialism, and Democracy

From AndrewM:

We are built to be effective animals, not happy ones.
–Robert Wright, The Moral Animal

From MinibearRex:

The proposition here is that the human brain is, in large part, a machine for winning arguments, a machine for convincing others that its owner is in the right – and thus a machine for convincing its owner of the same thing. The brain is like a good lawyer: given any set of interests to defend, it sets about convincing the world of their moral and logical worth, regardless of whether they in fact have any of either. Like a lawyer, the human brain wants victory, not truth; and, like a lawyer, it is sometimes more admirable for skill than for virtue.
–Robert Wright, The Moral Animal

From aausch:

You’ll worry less about what people think about you when you realize how seldom they do.
–David Foster Wallace

From MinibearRex:

“Holmes,” I cried, “this is impossible.”
“Admirable!” he said. “A most illuminating remark. It is impossible as I state it, and therefore I must in some respect have stated it wrong.”
–Sherlock Holmes, The Adventure of the Priory School

From endoself:

Most people would rather die than think; many do.
–Bertrand Russell

From KenChen:

Hofstadter’s Law: It always takes longer than you expect, even when you take into account Hofstadter’s Law.
–Douglas Hofstadter, Gödel, Escher, Bach: An Eternal Golden Braid

From Kazuo_Thow:

Apathy on the individual level translates into insanity at the mass level.
–Douglas Hofstadter

Check out the complete list: Best of Rationality Quotes 2011

///

What is rationality? What do people think it is?

Many people misunderstand what the word ‘rationality’ really means, in large part because of its confused portrayal in movies, on TV, and in the media.

How exactly do they get it wrong? Julia Galef gave a presentation at Skepticon 4 on Hollywood rationality. It does a great job of introducing rationality in general and the common public misconceptions of it.

A transcript of the entire presentation can be found here. I also wrote my own summary of the presentation below.

Check it out:

https://www.youtube.com/watch?v=tLgNZ9aTEwc

The classic Hollywood example of rationality is the Vulcans from Star Trek. They are depicted as an ultra-rational race that has eschewed all emotion from their lives.

But is this truly rational? What is rationality?

A “Straw Vulcan”—an idea originally defined on TV Tropes—is a straw man used to show that emotion is better than logic. Traditionally, you have your ‘rational’ character who thinks perfectly ‘logically’, but then ends up running into trouble, having problems, or failing to achieve what they were trying to achieve.

These characters have a sort of fake rationality. They don’t fail because rationality failed, but because they aren’t actually being rational. Straw Vulcan rationality is not the same thing as actual rationality.

What is real rationality?

There are two different concepts that we refer to when we use the word ‘rationality’:

1. The method of obtaining an accurate view of reality. (Epistemic Rationality) — Learning new things, updating your beliefs based on the evidence, being as accurate as possible, being as close to what is true as possible, etc.

2. The method of achieving your goals. (Instrumental Rationality) — Whatever your goals are, be they selfish or altruistic, there are better and worse ways to achieve them, and instrumental rationality helps you figure this out.

These two concepts are obviously related. You want a clear model of the world to be able to achieve your goals. You also may have goals related to obtaining an accurate model of the world.

How do these concepts of rationality relate to Straw Vulcan rationality? What is the Straw Vulcan conception of rationality?

“Straw Vulcan” Rationality Principles

The Straw Vulcan, from measureofdoubt.com

Straw Vulcan Principle #1: Being rational means expecting other people to be rational too.

Galef uses an example from Star Trek where Spock, in an attempt to protect the crew of the crashed ship, decides to show aggression against the local aliens so that they will be scared and run away. Instead, they are angered by the display of aggression and attack even more fiercely, much to Spock’s dismay and confusion.

But this isn’t being rational! Spock’s model of the world is badly flawed by his expectation that everyone else will reason exactly as he does. Real rationality requires you to understand the situation as it actually is, including how the other agents actually behave, and to act accordingly.

Straw Vulcan Principle #2: Being rational means never making a decision until you have all the information.

This seems to assume that the only important criterion for making a decision is that you make the best one given all the information. But what about things like time and risk? Surely those should factor into your decisions too.

We know intuitively that this is true. If you want a really awesome sandwich you may be willing to pay an extra $1.00 for some cheese, but you wouldn’t pay $300 for a small increase in the quality of a sandwich. You want the best possible outcome, but this requires simultaneously weighing various things like time, cost, value, and risk.
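The sandwich tradeoff can be made concrete with a toy sketch (my own illustration, not from the presentation; the function name and dollar figures are hypothetical):

```python
# Hypothetical sketch: a rational choice weighs the marginal value of an
# upgrade against its marginal cost, not the quality improvement alone.

def worth_upgrading(value_gain, price):
    """Return True if the extra enjoyment exceeds the extra cost."""
    return value_gain > price

# $1.00 of cheese that adds (say) $2.00 of enjoyment is worth it.
cheese = worth_upgrading(2.00, 1.00)

# A $300 upgrade for the same small quality bump is not.
gold_leaf = worth_upgrading(2.00, 300.00)
```

The same shape of comparison extends to time and risk: each one enters the decision as another cost weighed against the expected gain.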

What is the most rational way to find a partner? Take this example from Gerd Gigerenzer, a well-respected psychologist describing how a rationalist would find a partner:

“He would have to look at the probabilities of various consequences of marrying each of them—whether the woman would still talk to him after they’re married, whether she’d take care of their children, whatever is important to him—and the utilities of each of these…After many years of research he’d probably find out that his final choice had already married another person who didn’t do these computations, and actually just fell in love with her.”

But clearly this isn’t optimal decision making. The rational thing to do isn’t to merely wait until you have as much information as you can possibly have. You need to factor in things like how long the research is taking, the decreasing number of available partners as time passes, etc.

Straw Vulcan Principle #3: Being rational means never relying on intuition.

Straw Vulcan rationality says that anything intuition-based is illogical. But what is intuition?

We have two systems in our brains, which have been unexcitingly called System 1 and System 2.

System 1—the intuitive system—is the older of the two and allows us to make quick, automatic judgments using shortcuts (i.e. heuristics) that are usually good most of the time, all while requiring very little of your time and attention.

System 2—the deliberative system—is the newer of the two and allows us to do things like abstract hypothetical thinking and make models that explain unexpected events. System 2 tends to do better when you have more resources and more time and worse when there are many factors to consider and you have limited time.

Take a sample puzzle: A bat and ball together cost $1.10. If the bat costs $1 more than the ball, how much does the ball cost?

When a group of Princeton students were given this question, about 50% of them got it wrong. The correct answer is $0.05, since then the bat would cost $1.05 for a total of $1.10. The wrong answer of $0.10 is easily generated (incorrectly) by our System 1, and our System 2 accepts it without question.
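The puzzle’s arithmetic is easy to verify once it is written out (a quick illustrative check of my own, not part of the original talk):

```python
# Bat-and-ball puzzle: the bat costs $1.00 more than the ball, and
# together they cost $1.10. Check candidate answers for the ball's price.

def total_cost(ball):
    bat = ball + 1.00          # the bat costs $1 more than the ball
    return bat + ball

# The intuitive System 1 answer, $0.10, overshoots: 1.10 + 0.10 = 1.20.
assert abs(total_cost(0.10) - 1.20) < 1e-9

# The correct answer, $0.05, works: 1.05 + 0.05 = 1.10.
assert abs(total_cost(0.05) - 1.10) < 1e-9
```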

Your System 1 is prone to biases, and it is also incredibly powerful. Our intuition tends to do well with purchasing decisions or other choices about our personal lives. System 1 is also very powerful for an expert. Chess grandmasters can glance at a chessboard and say, “white checkmates in three moves,” because of the vast amount of time and mental effort spent playing chess and building up a mental knowledge base about it.

Intuition can be less reliable when it is based on something irrelevant to the task at hand, or when you lack expert knowledge of the topic. Your opinions of AI, for example, may be heavily influenced by sci-fi movies that have little basis in reality.

The main thing to take away from this System 1 and 2 split is that both systems have strengths and weaknesses, and rationality is about finding the best path—using both systems at the right times—to epistemic and instrumental rationality.

Being “too rational” usually means you are using your System 2 brain intentionally but poorly. For example, teenagers were criticized in an article for being “too rational” because they could reason themselves into things like drugs and speeding. But this isn’t a problem with being too rational; it’s a problem with being very bad at System 2 reasoning!

Straw Vulcan Principle #4: Being rational means not having emotions.

Straw Vulcan rationalists are typically shown suppressing their emotions, as when Spock is excited to see that Captain Kirk isn’t dead and then quickly covers it up. The simplistic Hollywood portrayal of emotions and rationality is as follows:

The Straw Vulcan view of the relationship between rationality and emotion.

Note that emotions can get in the way of acting on our goals. For example, anxiety causes us to overestimate risks; depression causes us to underestimate how much we will enjoy an activity; and feeling threatened or vulnerable causes us to exhibit more superstitious behavior and makes us more likely to see patterns that don’t exist.

But emotions are also important for making the decisions themselves. Without having any emotional desires we would have no reason to have goals in the first place. You would have no motivations to choose between a calm beach and a nuclear waste site for your vacation. Emotions are necessary for forming goals; rationality is lame without them!

[Galef noted in a comment that the intended meaning is in line with “Emotions are necessary for forming goals among humans; rationality has no normative value to humans without goals.”]

This leaves us with a more accurate portrayal of the relationship between emotions and rationality:

The updated model.

How do emotions make us irrational? Emotions can be epistemically irrational if they are based on a false model of the world. You can be angry at your husband for not asking how your presentation at work went, but then realize on reflection that you never told him about it, so how would he know? Your anger was based on a false model of reality.

Emotions can be instrumentally irrational if they get in the way of you achieving your goals. If you feel things are hopeless and there is no way to change the situation, you may be wrong about that. Your emotions may prevent you from taking necessary actions.

Our emotions also influence each other. If you have a desire to be liked by others and a desire to sit on a couch all day, you may run into problems. These desires may influence and conflict with each other.

We can also change our emotions. For example, cognitive behavioral therapy has many exercises and techniques (e.g. Thought Records) for changing your emotions by changing your beliefs.

Straw Vulcan Principle #5: Being rational means valuing only quantifiable things, like money, efficiency, or productivity.

If it isn’t concrete and measurable then there is no reason to value it, right? Things like beauty, love, or joy are just irrational emotions, right?

What are the problems with this? For starters, money can’t be valuable in and of itself, because it is only a means to obtain other valued things. Also, there is no reason to assume that money and productivity are the only things of value.

The Main Takeaway

Galef finishes off with this final message:

“If you think you’re acting rationally but you consistently keep getting the wrong answer, and you consistently keep ending worse off than you could be, then the conclusion you should draw from that is not that rationality is bad, it’s that you’re bad at rationality.”

In other words, you’re doing it wrong!


///

How to Be Humble

December 2011

in Self-mastery


I’m writing a series that highlights key material from LessWrong.com. This post is based on The Proper Use of Humility by Eliezer Yudkowsky.

“Faced with the choice of changing one’s mind and proving there is no need to do so, almost everyone gets busy on the proof.” —John Kenneth Galbraith

Humility: Good and Bad

Proper humility is difficult, both to practice and to understand.

The word ‘humility’ is used in our society to mean several different things, and not all of them are good or useful.

Take the humble student, illustrated by Eliezer, who is encouraged to study harder for his test but replies, “No, it wouldn’t work for me; I’m not one of the smart kids like you; nay, one so lowly as me can hope for no better lot.”

This isn’t humility, it’s social modesty! As Eliezer points out, when told to be more humble, people tend to associate it with social modesty. Saying, “What makes you think you know all the answers? You should be more humble!” is a criticism of claiming too much social status. The student “humbly” lowers his status, but does nothing about the original situation (that he should study harder).

A second, and more common, use of humility is as an excuse to shrug.