Keeping an open mind is a great philosophy, so long as your mind doesn't change so quickly that no one can coordinate actions with you. But I'm having a difficult time inferring the significance of this post.
On the one hand, if I'm constantly changing my mind, and my mind tends to change toward a stable, slow-changing correct solution, by definition, I'll be "right a lot", so long as I've had sufficient time to converge. In any case, I'll be right a lot more than either a person whose mind does not tend to change toward the correct solution, or someone whose mind does not change. This seems true by definition.
On the other hand, if the correct solution changes rapidly and dramatically, and my mind does not change as quickly, I will trivially be wrong a lot.
Likewise, focusing too much on "details that only support one point of view" seems wrong by construction, unless you magically pick the right point of view to begin with.
I'm not trying to be snarky here, seriously. I just feel like I must be missing the significance. I've reread the post several times, but I don't see it. Perhaps someone could enlighten me?
As Keynes said, "When my information changes, I alter my conclusions. What do you do, sir?"
When I was younger, I didn't know what confirmation bias was. I had to be introduced to the concept. After being introduced, I was able to make better decisions. Although it may seem trivially true to you (now?), it's not to everyone.
You may be reading more into "change their minds a lot" than is intended. "A lot" in this context means "more than most people". There's a reason phrases like "strong opinions, weakly held" have become popular in rationalist circles: the emphasis is on better decision making.
I think the first time I heard about this idea was in Marilyn vos Savant's Brain Building. She argued that societal infatuation with "having the courage of your convictions" is not the redeeming quality it's made out to be. She mentioned that she could always give her opinion on an issue, but she was also always prepared to change her opinion upon new information.
I'm pretty sure Bezos does not mean you should change your mind 180 degrees at each new contradictory piece of information. Like a Bayesian spam filter, if you've had lots of pieces of evidence for one position, it should take lots or very significant new evidence to change that position.
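To make the spam-filter analogy concrete, here is a minimal sketch of Bayesian updating in Python (my own illustration, not anything from Bezos or the post): after many observations favoring a position, a single contradictory observation barely moves the posterior.

```python
# Minimal sketch of Bayesian belief updating in odds form (illustrative only).
# After many pieces of evidence for a hypothesis, one contradictory piece
# shifts the posterior only slightly.

def update(prior_odds, likelihood_ratio):
    """Multiply prior odds by the likelihood ratio of a new observation."""
    return prior_odds * likelihood_ratio

def odds_to_prob(odds):
    """Convert odds to a probability."""
    return odds / (1.0 + odds)

odds = 1.0                    # start at 50/50
for _ in range(10):           # ten observations, each 4x likelier if the hypothesis is true
    odds = update(odds, 4.0)
print(odds_to_prob(odds))     # ~0.999999 after consistent evidence

odds = update(odds, 1 / 4.0)  # one contradictory observation (4x likelier if it's false)
print(odds_to_prob(odds))     # still ~0.999996; belief barely budges
```

Under this toy model, it takes roughly as much counter-evidence as accumulated evidence to actually flip the conclusion, which seems to be the point: change your mind in proportion to the information, not at every contradictory data point.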
Decision-making biases of various sorts are my chief pet peeve in modern life. It's almost impossible to discuss public policy with people, even in the smartest online forums I know. I think it's banned at Less Wrong. When I listen to the media, I spend most of my time ticking off the biases I hear.
>When I was younger, I didn't know what confirmation bias was. I had to be introduced to the concept. After being introduced, I was able to make better decisions.
Hah! You only think that because you're not counting the bad decisions.
>Decision-making biases of various sorts are my chief pet peeve in modern life. It's almost impossible to discuss public policy with people, even in the smartest online forums I know. I think it's banned at Less Wrong. When I listen to the media, I spend most of my time ticking off the biases I hear.
But biases are like stereotypes in that they're shortcuts your mind has developed based on past experience. That's what people used to call "wisdom".
Not to be snarky or pedantic, but you're describing a heuristic, not a bias. On the other hand, wisdom is in large part knowing what not to do as a starting point. It may turn out to be wrong, which is simply an error, but it's probably not a true bias.
You're right about heuristics, but the relevant wiki page for bias is really "Cognitive Bias"[1] rather than "Statistical Bias".
Also Holmes is not a good example of a rationalist; his main ability comes from being a fictional protagonist. Holmes cannot be a much better source of rationality advice than Conan Doyle, who quite literally believed in fairies.
This is actually a very good comment. Some of these ideas are nested, logically. A heuristic is not per se biased, although a subset of heuristics is in fact associated with the origin of the term "cognitive bias". Viz.,
>Although much of the work of discovering heuristics in human decision-makers was done by Amos Tversky and Daniel Kahneman,[4] the concept was originally introduced by Nobel laureate Herbert A. Simon. Gerd Gigerenzer focuses on how heuristics can be used to make judgments that are in principle accurate, rather than producing cognitive biases.
Significance of the post - This is a list of common characteristics that Jeff has seen in people he considers to be geniuses. Maybe, if we normal people adopt these characteristics into our own identities, we might be able to achieve genius status as well.
As far as the details you get into about the constantly changing mind, the rapidly changing solution, and the focus on self-supporting details, I think you have made a misstep with the idea of a rapidly changing solution.
If there is a correct solution to a problem, it will never change. Our understanding of the solution can change, or the problem could change, which would require a new solution. But the correct solution (again, assuming one exists) to a set problem cannot change.
When he talks of this changing mind, he is referring to a human's natural process of building its understanding of the environment, or of some problem space, on previous data and conclusions. The problem most people have is that they accept their many previous conclusions as fact, so they are unable to see a problem from the correct perspective, because that perspective would require some previous belief to be wrong; thus they never consider that correct perspective. A good example here would be a creationist's inability to come to correct scientific conclusions because they believe the world is only a few thousand years old. Bill Nye!
But the worst kind of person to be if you desire to reach genius status is one that searches for validation of their beliefs over truth. Most of us have this in us, I know I do even though I think of myself as open minded. The best example for this would be for you to go out and debate a friend with opposing political views. At some point you will find yourself defending something that you don't fully understand, your friend will make a valid point, and this little asshole inside you will tell you to defy his point at all costs. But you won't, because you started this conversation searching for truth over validation, and then you will consciously understand the desire to search for validation.
If we pop another stack frame, I think there's a deeper level of meaning to the statement "Smart people change their minds".
Smart people are open to evidence and argument. But also I think many really smart people recognise that their own cognition is a mutable thing, that the mind changes as we age and that we can play a role in that. They change their minds.
It is by immersing themselves in hard, interesting problems, surrounding themselves with other smart or inspiring people, and savouring experiences that they change the thoughts that occur to them, the memories they can build upon, and the way in which they form conclusions.
Maybe the really really smart people can even direct change in their mind about how they change their mind.
Then again, maybe that kind of meta-cogitation is tail-recursive.
He's not talking about changing your mind for its own sake, but only about considering that you might be wrong and being open to changing your mind if that proves to be the right thing. This might sound obvious, but the point he's making is that most people don't do this. And one of the most remarkable characteristics of smart people is that they do change their minds when needed.
Average people think geniuses are geniuses because they're right all the time. But that's not the case. We're all human and we're all wrong a lot. The most remarkable characteristic of geniuses is that they realize they're wrong a lot, while others live under the illusion that they're right a lot. Both the average person and the genius are wrong often. What sets them apart is that the genius notices it, changes course and -- at the end of the day -- makes a better decision.
The significant bit to take from this is: don't try to be right at your first impression. You're not dumb for getting it wrong the first time. It's OK; just be a skeptic, think about it, and don't be afraid to change if you later find that you were wrong.