Group-Unthink – Why Investors Need to Avoid Confused Analysis

by Plotinus

A recent CNBC article on how family offices’ investment decisions are changing with the transfer of wealth from older to younger generations1 inadvertently touched on an issue that should be at the heart of all family office (and indeed all sophisticated investors’) investment analysis, something we would term “Group-Unthink”.

A parody of groupthink, Group-Unthink is where bad decisions are made because of a collective failure to think. An idea is generally assumed to make sense, little further thought is applied, key logical confusions in the idea go unnoticed, and flawed decision making follows.

To return to the case in point: in the article on wealth transfer, a family office expert was quoted with reference to younger generations’ investments being more motivated by societal impact than by money. “The challenges are real … yes, we talked about climate in the 60s and 70s, you’ll find them in the American newspapers then, but it was just a little more abstract. Now, it’s real. Storms are coming, flooding is happening, hurricanes are more often… it’s proof [and] they see it.”

The closing words of that quote are what caught our attention, the ‘they’ in the sentence being younger family office members. The problem is that the proof “they” are apparently seeing, the US National Oceanic and Atmospheric Administration (the experts in this field) is not seeing. In fact, in a recent study2 it deduces that a 2°C increase in global mean temperature would lead to a 15% decrease in North Atlantic hurricanes and a likely increase of 5% in storm strength. Even that, however, is not the key finding of the study, which is that the variability of hurricane seasons, from extremely active to extremely inactive, has increased and is expected to increase further (by an estimated 36%).

Here we have an expert in family offices drawing an erroneous conclusion and suggesting that family offices are investing under the influence of the same premise. Similarly, CNBC included the quote editorially without any allusion to the statement’s inaccuracy. This is an example of how a lack of thinking leads to missed nuance, producing confused and incorrect analysis.

What is most disturbing about this absence of critical thinking is its inconsistency. One would imagine that anyone in a family office investment advisory role, for example, would never misunderstand tail risk by confusing greater magnitude with greater frequency, or either with increased variability of outcome, in an investment context. The stringency investment committees apply when vetting investment allocations is testament to rigorous analysis, so it is right to be worried, from an investment perspective, if that rigor is not applied across the board and narratives are negligently accepted without question when the data to verify them is readily available. After all, the goal of sophisticated investing is to identify, where possible, the value of information and to recognize its limitations, in order to make better investment decisions.
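
To make that distinction concrete, the short simulation below is a purely hypothetical sketch: the regimes, parameters and loss model are our own illustrative assumptions, not figures from the article or the NOAA study. It compares a regime of frequent but stable adverse events with one where events are somewhat rarer and slightly larger but far more variable from period to period, showing that the tail of the loss distribution can fatten even as the average loss falls.

```python
import numpy as np

rng = np.random.default_rng(0)
N_SEASONS = 50_000  # simulated periods per regime (illustrative only)

def simulate_losses(mean_events, dispersion, severity_scale, n=N_SEASONS):
    """Toy compound-loss model: event counts per period are negative binomial
    (mean = mean_events; lower 'dispersion' means more period-to-period
    variability), and each event's severity is lognormal. Returns the total
    loss per simulated period."""
    p = dispersion / (dispersion + mean_events)
    counts = rng.negative_binomial(dispersion, p, size=n)
    return np.array([
        rng.lognormal(mean=np.log(severity_scale), sigma=0.5, size=c).sum()
        for c in counts
    ])

# Regime A: more frequent events, low period-to-period variability
regime_a = simulate_losses(mean_events=10.0, dispersion=50.0, severity_scale=1.00)
# Regime B: ~15% fewer events, ~5% larger events, much higher variability
regime_b = simulate_losses(mean_events=8.5, dispersion=5.0, severity_scale=1.05)

for label, losses in [("A (frequent, stable)", regime_a),
                      ("B (rarer, volatile) ", regime_b)]:
    print(f"{label}  mean loss: {losses.mean():5.1f}   "
          f"99th-percentile loss: {np.percentile(losses, 99):5.1f}")
```

With these made-up parameters, regime B typically shows a slightly lower average loss yet a noticeably higher 99th-percentile loss: lower frequency, modestly higher magnitude and greater variability are three distinct effects, and it is the variability that dominates the tail.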

What You See Is What You Get – AI Is a Potential Solution but It Too Is a Nuanced One

The above anecdotal example shows the power of narrative and illustrates why investors should hedge against their own instinct to believe a story unquestioningly. At Plotinus, we are inclined to propose that the hedging mechanism lies in using AI in the investment decision-making process. That, though, is itself a matter of nuance, and AI too is subject to the flaws of human storytelling.

Simply assuming that using AI will strip out human irrationality is a fallacy. To explore this, one has to step back and look at the data on which the AI is trained: where it comes from and how it is interpreted. In most cases the data sources of large AI models are laden with the same human influences, biases and misinterpretations, because humanity created the data. The next level, AI-generated data, is in turn a reflection of that same human-created data. Furthermore, the development of AI itself has been profoundly influenced by a Turing Test mindset, in which the emphasis is on replicating human intelligence to the point where machine and human are indistinguishable. The problem is that human intelligence is very complex and often deeply flawed.

For our part at Plotinus, we are more comfortable with a form of AI that uses limited amounts of data and analyzes it in a non-conventional manner, not as an attempt to replicate human analysis but as a way to provide a different insight from which trading decisions can be made. This approach yields a deliberately crude form of decision making, which makes both correct and incorrect decisions. The unclutteredness of the process is key to identifying the limitations of its decision-making capabilities. The system is thus designed to avoid the human flaw of departing from reality and seeing proof where there is none. This is where it can add benefit for sophisticated investors who are aware of the dangers of making decisions from confused analysis and wish to mitigate them.

1 How Gen X and millennials are changing the face of the traditional family office as they inherit over $80 trillion.
2 Projected increase in the frequency of extremely active Atlantic hurricane seasons, Science Advances, Vol. 10, No. 46.

© 2024 Plotinus Asset Management. All rights reserved.
Unauthorized use and/or duplication of any material on this site without written permission is prohibited.

Image Credit: Chantal de Bruijne at Shutterstock.