Yes, yes, of course we understand the risk of overconfidence

Overconfidence can be ruinous in real estate and elsewhere. Stephen Ryan discusses the issue and implications in this article for The Property Chronicle magazine.

Overconfidence is a cognitive error that we make routinely. It’s the tendency to hold a false assessment of our skills, intellect or talent. Take this article, for example: I expected to write it in about three hours, but it has taken much longer because I overestimated my writing skill.

A person’s subjective confidence in their judgements is typically greater than the objective accuracy of those judgements. We are particularly overconfident when it comes to harder decisions, it seems.

Overconfidence comes in three varieties: over-estimation of one’s actual performance; ‘over-placement’ of one’s performance relative to others; and over-precision, which means expressing unjustified certainty in the accuracy of one’s beliefs. History is full of examples of overconfidence, such as Napoleon’s catastrophic decision to invade Russia in 1812. Other examples include New Coke, introduced by Coca-Cola in 1985 and withdrawn following a huge backlash from Coke drinkers, and the recent European Super League of football clubs, which fell apart after fan protests.

Cognitive errors, of which overconfidence is just one, can affect real estate investors at two stages of the decision-making process: data collection and the decision itself. This means investment committees and boards of trustees should be alert to cognitive errors from the very outset of the process.

“We skate over relevant but unmeasured concerns because of what Daniel Kahneman, author of Thinking, Fast and Slow, termed WYSIATI, which stands for ‘What you see is all there is’”

“We have 100% confidence in our data.” Why? Protecting the data-collection stage from cognitive error can be challenging. To start with, important aspects of a potential investment may not be measured. As economist Robert Shiller noted, “It’s kind of amazing what is not measured”.

We skate over relevant but unmeasured concerns because of what psychologist and economist Daniel Kahneman, author of Thinking, Fast and Slow, termed WYSIATI, which stands for ‘What you see is all there is’. WYSIATI is the notion that we form impressions and judgements based on the information that is available to us, without asking, “What information am I missing?” To protect against it, ask yourself some specific questions, such as: does the data measure or even mention political risk? What about embodied carbon?

Another useful exercise is to compare the topics covered by your data with other sources. Sources could include the relevant regulations or a real estate due diligence questionnaire, such as INREV’s. If a topic is covered in one of these sources but not in your data, can the omission be justified? For pension schemes, compare your data with the Statement of Investment Principles – have you covered every single risk? This sort of comparison helps us avoid overconfidence in our data collection. It pushes us to venture beyond what is currently collected to what should be collected. It also requires us to grapple with unstructured data in the form of words. Unstructured means the data is not set out neatly in rows and columns like a spreadsheet.
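Where the topics on both sides already exist as structured lists, the comparison can be as simple as a set difference. The sketch below is purely illustrative – the topic names are invented, not a real checklist – and the harder case, where topics are buried in documents rather than lists, is where NLP (discussed next) comes in.

    # A minimal sketch of the coverage comparison described above.
    # Topic names are hypothetical, chosen only to show the shape of the exercise.
    our_data_topics = {"rental income", "vacancy rate", "tenant covenant", "yield"}
    checklist_topics = {"rental income", "vacancy rate", "tenant covenant",
                        "yield", "political risk", "embodied carbon"}

    # Topics the checklist expects but our data never measures.
    gaps = checklist_topics - our_data_topics
    for topic in sorted(gaps):
        print(f"Not covered by our data: {topic}")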

Managing unstructured data is not difficult if you use natural language processing (NLP). NLP is a branch of artificial intelligence that helps computers understand and manipulate human language. NLP software can ‘slice and dice’ words and phrases in different ways, almost as deftly as spreadsheets handle numbers.
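By way of illustration, the sketch below uses the open-source spaCy library (an assumption on my part – any NLP toolkit would do) to scan a block of unstructured text for mentions of the risk topics discussed earlier. The snippet of text and the topic list are invented for the example.

    import spacy
    from spacy.matcher import PhraseMatcher

    # A blank English pipeline is enough for tokenisation; no trained model is needed.
    nlp = spacy.blank("en")

    # Hypothetical topics we want the documents to mention.
    topics = ["political risk", "embodied carbon", "tenant covenant"]

    matcher = PhraseMatcher(nlp.vocab, attr="LOWER")
    matcher.add("RISK_TOPICS", [nlp.make_doc(t) for t in topics])

    # Invented snippet standing in for a due diligence report or manager commentary.
    text = ("The valuation assumes stable rental growth. Embodied carbon was not "
            "assessed, and no view is taken on political risk in the region.")
    doc = nlp(text)

    # Collect the phrases actually found, then report coverage topic by topic.
    found = {doc[start:end].text.lower() for _, start, end in matcher(doc)}
    for topic in topics:
        status = "mentioned" if topic in found else "NOT mentioned"
        print(f"{topic}: {status}")

Run across a folder of reports, a scan like this gives a quick, repeatable view of which concerns the documents address and which they are silent on.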

Data omissions are one problem, data choices are another. When things are measured and available as structured data (such as market indices), alternative data sources may exist. The choice between those data sources is also prone to overconfidence.

“Overconfidence affects us at the data collection and investment decision-making stages, and we need to be constantly vigilant against it”

Data collection precedes the investment decision. Overconfidence affects us at the data collection and investment decision-making stages, and we need to be constantly vigilant against it. There is another danger lurking, however: overconfident data collection could expose us to a second cognitive error – the framing effect – which could distort the investment decision itself.

Framing describes a judgmental shortcut where we react differently to the same scenario depending on how it is presented. Alternative representations of the same objective information significantly alter our thinking and ultimately our decisions. For real estate investors, how the data is presented shapes how we perceive risk and return. If the data is in some way incomplete or skewed, our decisions will be flawed.

Cognitive errors are not easily observable or quantifiable. The most damaging of these is overconfidence: the kind of optimism that leads governments to believe that wars are quickly winnable and capital projects will come in on budget despite statistics predicting exactly the opposite. Daniel Kahneman observed that overconfidence creates illusions. For him, it is the cognitive error he would most like to eliminate if he had a magic wand.

To make good investment decisions we need to guard against overconfidence at two stages: the data collection stage and the decision itself.
