Tuesday, March 19, 2013

Beliefs: the Solution and Cause of our Problems

A well-known commenter suggested I be more open-minded on a subject where I have a strong opinion. He makes the reasonable claims that it's unlikely my single explanation is the only correct one among many, that the truth is probably a combination of several hypotheses, that singling out a pet theory dampens my credibility, and that it biases my view of the evidence.

I like to think about epistemology, and the nature of beliefs brings forth a mare's nest of paradoxes. As Richard Feynman noted, the first principle of a scientist is not to fool yourself, because you are the easiest person to fool. How can this be? For starters, overconfidence is not only ubiquitous, but very beneficial in moderation (Daniel Kahneman says it's the bias he would most want his children to have). I've found that those who claim to be not merely unbiased but rationally objective to a fault are usually the most biased thinkers around (Stephen Jay Gould, Richard Dawkins). This is why Alfred Adler noted the hardest thing is to know yourself, and then to change yourself. Further, it's hard to explain why two rational people can agree to disagree (see Milgrom and Stokey's No-Trade Theorem).

When discussing a problem, it's generally good not to propose solutions until the problem has been examined as thoroughly as possible. Griffin and Tversky find that once we can predict our beliefs, they are basically fixed: if you think there's a 70% chance you will vote for something, it's almost certain you will. This is why juries aren't supposed to discuss a trial until the end; if you discuss it earlier, you will unconsciously favor points that support your initial view.

As Eliezer Yudkowsky notes, your wisdom is determined by how you create your beliefs, not how you defend them. Alas, most thought is unconsciously focused on the latter.

Given that, there's a large cost to having beliefs, because as a practical matter they are difficult to change. On the other hand, there are many potential theories and facts, and being open to all of them creates indecision. As they say, if you don't know where you are going, any road gets you there, but if that's your goal, no one will follow you there either. You can always be a mercenary, someone who does something specific that other people find difficult or onerous (e.g., an accountant or programmer), but if you want to be a portfolio manager, you need a clear vision to have a chance of succeeding.

Doubt is the essence of human consciousness; our lack of instinct means we often think about what we are doing, why, and what to do next. It is necessary at least once to doubt all things, but eventually you have to choose beliefs. Sure, they should be tentative, but tentative like constitutional amendments, not annual budget items.

My thoughts on the risk premium aren't something I hold cavalierly, like my opinions on gay marriage or what ended the Roman Empire. Rather, it's something I've studied in depth for decades, looking at theory and data. If I didn't have a strong opinion after all that time, you could safely conclude I didn't see anything.

Theory is a lens and a blinder, necessarily focusing on some things to the exclusion of others. You need a theory to see. It may be wrong, but unless you take a point of view you'll never know, and everything will remain as it is: a blooming, buzzing confusion.

6 comments:

Anonymous said...

Has Cliff Asness ever admitted to being wrong about anything? Perhaps he could debate your points, rather than accuse you of confirmation bias.

TK said...

"unless you take a point of view you'll never know"

Strip out the double negative and you have "you [must] take a point of view to know"

Yikes! What sort of cognitive bias does this fall under?

bjk said...

Why is it necessary to come up with a cause for low-beta outperformance? Why not just point to the anomaly and acknowledge that causes can be mysterious? Doctors are glad to prescribe a drug even if they don't understand the mechanism. Forgive me if I haven't followed this debate from the beginning.

MSL said...

If Asness's mechanism is the sole mechanism, then removing constraints on leverage through regulation or private-sector innovation will cause the anomaly to diminish or possibly reverse.

If Falkenstein is right, then leverage innovations/deregulation will have no effect, or will even enhance the phenomenon of low-beta outperformance.

A mixture of the two means that anything is possible, so the underlying model matters for future predictions.

Mercury said...

Assuming that greater certainty is in fact the goal, there may be an art to favoring some beliefs over others as placeholders or guides in the face of uncertainty, but the ultimate goal should always be to replace belief with certainty. The best tool humans have for this is science, and the best rule of thumb for the practice of science is probably Karl Popper's concept of falsifiability.

The testability of a hypothesis determines certainty, and certainty displaces belief. Of course, some hypotheses are more testable than others, and the more testable a hypothesis is, the closer that field of inquiry is to science (the less so, the closer it is to philosophy). At the end of the day, opinions and beliefs have either inhibited more than they enabled the arrival at greater certainty, or they haven't. Generally I don't see much evidence that arguments on Falkenblog are an impediment to greater certainty, but I do see many posts that present a testable hypothesis, apply scientific scrutiny to the half-assed arguments of others, and connect ideas from the "philosophy" side of the scale to actual empirical data in an attempt to drag them toward science and certainty.

Not so bad actually. Is this well-known commentator familiar with the work of Dr. Paul Krugman?

Anonymous said...

One can trust Mercury to deliver a pile of pants in each of his comments... There's a French saying that goes something like this: "Culture is like jam: the less you have, the more you spread it"...

Anyway, Popper's main point is precisely that there is no certainty. Testing a hypothesis (attempting to falsify it) corroborates it but does not make it certain in any way. Please google "Popper" together with "certainty" and learn.