In-depth analysis on Credit Writedowns Pro.

Spinoza, Descartes and suspension of disbelief in the ivory tower of economics

Here’s something I want to run by you on behavioural economics and the way economic issues are being debated in the blogosphere.

We are witnessing an implosion of long-held belief structures that go to the core of how we believed our economic system functioned. You heard Alan Greenspan admit this in his testimony before Congress in October 2008, after the financial system collapsed:

"Yes, I found a flaw," Greenspan said in response to grilling from the House Committee on Oversight and Government Reform. "That is precisely the reason I was shocked because I’d been going for 40 years or more with very considerable evidence that it was working exceptionally well."

Greenspan said he was "partially" wrong in opposing regulation of derivatives and acknowledged that financial institutions didn’t protect shareholders and investments as well as he expected.

-Greenspan Concedes to 'Flaw' in His Market Ideology, Bloomberg, 23 Oct 2008

The implosion of this neo-classical laissez-faire belief system of economics is a death about which we now grieve.

Disbelief

However, the grief is getting in the way of rational conversation. It goes to suspension of disbelief, cherished values, strongly-held beliefs and fear. I believe these are major issues in how accepting we are of new ideas  — and consequently for why this particular financial crisis is so devastating.

I have run into this problem on two specific occasions recently.

Despite my Austrian economics sympathies, I recently posted some articles inspired by Modern Monetary Theory (MMT).

Now, I don’t buy into some of what Modern Monetary Theory says about the source of money’s value and the role of the state in monetary affairs. But I do very much appreciate MMT’s understanding of the mechanics of the fiat currency monetary system (a system I don’t fully support, by the way).

So I have presented some MMT-based ideas from a neutral frame in order to demonstrate their applicability to the present financial crisis. Invariably, I run into a lot of spurious arguments by people who sound like they don’t understand the accounting.

Or maybe they just feel threatened on some strange existential level – as if what I am writing threatens their core belief system. I think that is a lot of what is going on. So I am writing this post to explain how the human brain processes information. And then I will make a few remarks about how this applies to the present day situation.

Suspension of disbelief

The core of my argument comes from James Montier, now at the fund manager GMO. As a strategist at Dresdner Kleinwort Benson in 2005, he wrote a timeless piece on the debate between two 17th-century philosophers: René Descartes of France and Baruch de Spinoza of the Netherlands. Descartes was of the view that people assess information for accuracy before filing it away in memory. Spinoza made the opposite claim: that people must suspend disbelief in order to process information at all. The two competing ideas were put to the test, and it appears that Spinoza was right about the need for naïve belief, something that has grave implications for investing, the subject of Montier’s essay.

Here is a long excerpt of what Montier wrote. The article is available online via John Mauldin (the link is at the bottom). This is a fantastic look into how people process information.

Sometime ago a client asked us to compile a list of myths that the markets seemed to hold dear. We came up with twelve potential myths, ranging from ‘stocks for the long run’ to ‘dividends don’t matter’, via such topics as ‘commodities for the future’ and ‘bond supply matters’. However, this exercise also made me wonder why it was that supposedly smart people ended up believing such strange things.

This pondering sent me (as is usually the case) to the annals of psychology. To some extent these errant beliefs seem to stem from bounded awareness/inattentional blindness and framing. We have explored such elements before. However, there may well be another factor at work. We seem to be hard wired to ‘believe’.

Daniel Gilbert, a professor of psychology at Harvard, has explored how we go about believing and understanding information. In a series of truly insightful papers Gilbert and co-authors have explored the belief process using two alternative philosophical viewpoints.

Cartesian systems
The first view is associated with the work of René Descartes. When it came to belief, Descartes suggested the mind performs two separate mental acts. First, it understands the idea. Second, the mind assesses the validity of the idea that has been presented. This two-stage process seems intuitively correct. After all, we can all imagine being presented with some novel idea, holding it in our minds and then pondering the truth or otherwise associated with the idea. The Cartesian approach fits well with folk psychology.

Descartes was educated by Jesuits and like many 17th century philosophers generally deployed psychology and philosophy in the aid of theology. Like anyone of any sense Descartes was well aware that people were capable of believing things that weren’t true. In order to protect the Church, Descartes argued that God had given man the power to assess ideas. So it clearly wasn’t God’s fault when people believed things that weren’t true.

As Gilbert (1993, op cit) notes, Descartes’ approach consisted of two axioms: firstly, the mental separation and sequencing of understanding and believing; and secondly, that people have no control over how or what they understand, but are totally free to believe or disbelieve ideas as they please.

Spinozan systems
Spinoza’s background and thinking could not have been much more different from Descartes’. Born a Jew, Baruch de Espinoza (later to become Benedict Spinoza) outraged his community and synagogue. The tensions finally resulted in Spinoza being excommunicated, accused of abominable heresies and monstrous deeds. The order of excommunication prohibited other members of the synagogue from having any contact with Spinoza.

Freed of the need to conform to his past, Spinoza was able to explore anything he chose. One of the areas to which he turned his considerable mental prowess was the faults contained in the Cartesian approach. Spinoza argued that all ideas were first represented as true and only later (with effort) evaluated for veracity. Effectively, Spinoza denied the parsing that Descartes put at the heart of his two-step approach. Spinoza argued that comprehension and belief were a single step. That is to say, in order for somebody to understand something, belief is a necessary precondition. Effectively, all information or ideas are first accepted as true, and only sometimes evaluated as to their truth; once this process is completed, a ‘corrected belief’ is constructed if necessary.

Libraries
Gilbert et al (1990, op cit) use the example of a library to draw out the differences between these two approaches. Imagine a library with several million volumes, of which only a few are works of fiction. The Cartesian approach to filing books would be to put a red tag on each volume of fiction and blue tag on each volume of non-fiction. Any new book that appeared in the library would be read, and then tagged as either fiction or nonfiction. Any book that is unread is simply present in the library until it is read.

In contrast, a Spinozan library would work in a very different fashion. Under this approach a tag would be added to each volume of fiction but the non-fiction would be left unmarked. The ease of this system should be clear; it requires a lot less effort to run this system than the Cartesian approach. However, the risk is that if a new book arrives it will be seen as non-fiction.

Gilbert et al note that under ideal conditions both systems produce the same outcome if allowed to run to conclusion. So if you picked up a copy of Darwin’s ‘The Expression of the Emotions in Man and Animals’ and asked the Cartesian librarian what he knew about the book, he would glance at the tag and say non-fiction. The Spinozan librarian would do pretty much the same thing, concluding the book was non-fiction because of the absence of a tag.

However, imagine sneaking a new book into the library, say the latest Patricia Cornwell thriller. If you took the book to the librarian and asked them what they knew about the book, their response would reveal a lot about the underlying process governing the library’s approach to filing. For instance, the Cartesian librarian would say "I don’t know what sort of book that is. Come back later when it has been read and tagged appropriately". The Spinozan librarian would glance up and see the absence of a tag and say "it doesn’t have a tag so it must be non-fiction" – an obviously incorrect assessment.
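[Editor's aside: the two filing rules can be sketched in a few lines of Python. This is only an illustration of Gilbert et al's library analogy; the class and method names are mine, not anything from the paper.]

```python
# Two toy catalogues illustrating Gilbert et al's librarians.

class CartesianLibrary:
    """Every book must be read and explicitly tagged before classification."""
    def __init__(self):
        self.tags = {}  # title -> "fiction" or "non-fiction"

    def file(self, title, is_fiction):
        self.tags[title] = "fiction" if is_fiction else "non-fiction"

    def classify(self, title):
        # An unread (untagged) book yields no judgment at all.
        return self.tags.get(title, "unknown")


class SpinozanLibrary:
    """Only fiction gets a tag; anything untagged is presumed non-fiction."""
    def __init__(self):
        self.fiction = set()

    def file(self, title, is_fiction):
        if is_fiction:
            self.fiction.add(title)

    def classify(self, title):
        # The cheap default: no tag means non-fiction, even for unread books.
        return "fiction" if title in self.fiction else "non-fiction"


# Sneak an unread thriller into each library.
cartesian, spinozan = CartesianLibrary(), SpinozanLibrary()
print(cartesian.classify("new thriller"))  # "unknown" - judgment withheld
print(spinozan.classify("new thriller"))   # "non-fiction" - misfiled
```

The Spinozan system is cheaper to run (most books never need a tag) but it fails open: whatever slips in untagged is accepted as non-fiction.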

 
A testing structure
The picture below taken from Gilbert (1993) shows the essential differences between the two approaches, and also suggests a clever way of testing which of the two approaches has more empirical support.

[Figure: the Cartesian vs. Spinozan belief process, from Gilbert (1993)]

Say an idea is presented to the brain, and then the person considering the idea is interrupted in some fashion. Under a Cartesian system, the person is left merely with an understanding of a false idea, but no belief in it. However, if people are better described by a Spinozan approach then interrupting the process should lead to a belief in the false idea. So giving people ideas or propositions and then interrupting them with another task should help to reveal whether people are Cartesian or Spinozan systems when it comes to beliefs.

The empirical evidence
It has long been known that distracting people can impact the belief they attach to arguments. For instance, in their 1994 review Petty et al report an experiment from 1976 which clearly demonstrated the impact of distraction techniques.

To test the impact of distraction, students were exposed to a message arguing that tuition at their university should be cut in half. Students listened to the ideas which were presented over headphones. Some heard strong arguments, others heard relatively weak arguments. At the same time, the students were subjected to a distraction task which consisted of tracking the positions of Xs that were flashed on a screen in front of them. In the high distraction version of the task, the Xs flashed up at a fast pace, in the low distraction task the rate was reduced heavily.

The results Petty et al found are shown in the chart below. When the message was weak, people who were highly distracted showed much more agreement with the message than did the people who only suffered mild distraction. When the message was strong and distraction was high, the students showed less agreement than when the message was strong and the distraction was low. Distraction did exactly what it was meant to do… prevented people from concentrating on the important issue.

[Chart: post-message attitudes by argument strength and level of distraction]

Petty et al conclude "Distraction, then, is an especially useful technique when a person’s arguments are poor because even though people might be aware that some arguments were presented, they might be unaware that the arguments were not very compelling." Something to bear in mind at your next meeting with brokers perhaps? The next time an analyst comes around and starts showing you pictures of the next generation of mobile phones, just stop and think about the quality of their investment arguments.

Is there more direct evidence of our minds housing a Spinozan system when it comes to belief? Gilbert et al (1990, op cit) decided to investigate. They asked people to help them with an experiment concerning language acquisition in a natural environment. Participants were shown ostensibly Hopi words with an explanation (such as ‘a monishna is a bat’). They had to wait until the experimenter told them whether the word they had been given was actually the correct word in Hopi or whether it was a false statement.

Subjects also had to listen for a specific tone which, when they heard it, required them to press a button. The tone sounded very shortly after the participant had been told whether the statement was true or false. This was aimed at interrupting the natural processing of information. Once they responded to the tone, the next Hopi word appeared, preventing them from going back and reconsidering the previous item.

When subjects were later asked about their beliefs, if they worked in a Spinozan way then people should recall false propositions as true more often after an interrupt than the rest of the time. As the chart below shows, this is exactly what Gilbert et al uncovered.

[Chart: percentage of Hopi statements correctly recognized, interrupted vs. uninterrupted]

Interruption had no effect on the correct identification of a true proposition (55% when uninterrupted vs. 58% when interrupted). However, interruption did significantly reduce the correct identification of false propositions (55% when uninterrupted vs. 35% when interrupted). Similarly, one could look at the number of true-false reversals (the right side of the chart above). When false propositions were uninterrupted, they were misidentified as true 21% of the time, which was roughly the same rate at which true propositions were identified as false. However, when interrupted the situation changes: false propositions were identified as true some 33% of the time, significantly higher than the rate at which true propositions were identified as false (17%).

In another test Gilbert et al (1993, op cit) showed that this habit of needing to believe in order to understand could have some disturbing consequences. They set up a study in which participants read crime reports with the goal of sentencing the perpetrators to prison. The subjects were told some of the statements they would read would be false and would appear on screen as red text, the true statements would be in black text.

By design, the false statements in one case happened to exacerbate the crime in question; in the other case they attenuated the crimes. The statements were also shown crawling across the screen – much like the tickers and prices on bubble vision. Below the text was a second row of crawling numbers. Some of the subjects were asked to scan the second row for the number (5) and when they saw it, they were asked to press a button.

At the end of the experiment, subjects were asked to state what they thought represented a fair sentence for the crimes they had read about. The chart below shows that, just like the previous example, interruption significantly reduced the correct recognition of false statements (69% vs. 34%) and increased the misrecognition of false statements as true (23% vs. 44%).

[Chart: percentage of statements correctly recognized, interrupted vs. uninterrupted]

The chart below shows the average recommended sentence depending on the degree of interruption. When the false statements were attenuating and processing was interrupted there wasn’t a huge difference in the recommended jail term. The interrupted sentences were around 4% lower than the uninterrupted ones. However, when the false statements were exacerbating and interruption occurred the recommended jail term was on average nearly 60% higher than in the uninterrupted case!

[Chart: average recommended jail terms, interrupted vs. uninterrupted]

The reptilian response 

Edward here. What Gilbert, Petty, and Montier have demonstrated is that human beings have to suspend disbelief to process information and make judgments based on that information. Unfortunately, distractions (think bread and circuses) can lead people to believe something is true when in fact it is not – with grave implications for investing.

However, that’s not what happens with strongly-held beliefs at all. I remember talking to my mother about the Montier post, asking her about her own strongly-held views on religion. Her answers were interesting because they demonstrated to me an unwillingness to even process information that ran counter to her most cherished and strongly-held beliefs. She admitted this interpretation was correct when we discussed it afterward. Remember what Montier said: "in order for somebody to understand something, belief is a necessary precondition." The point was that she didn’t even process the information – so great was the existential threat it posed to her.

Human beings have a very clear view of self and this is strongly intertwined with a belief system which generates what we describe as core values. So, if you attack those core values, you are likely to get an irrational and reptilian response. There is no processing of information as I described in "Through a glass darkly: the economy and confirmation bias in the econoblogosphere" going on; the cognitive dissonance is too great. Instead what you get is fear and an irrational defence. This is what my mother described.

The resolution of cognitive dissonance

So the world view widely held in Anglo-Saxon economies that markets are self-regulating and self-equilibrating is under threat because of the dislocations of the last two years. However, this view is deeply entrenched, having built up over nearly three decades of history. It is now adhered to with almost religious fervour (see my thoughts on this in The year in review at Credit Writedowns – Kleptocracy). People are not going to relinquish the self-equilibrating/regulating view overnight and not without overwhelming evidence to the contrary; the cognitive dissonance would be too great.

What this effectively means for me is that financial calamity and economic collapse are really the only way to dislodge this thinking.  Maybe I’m wrong – and, in fact, the markets are self-regulating and self-equilibrating.  Recent events suggest otherwise as does the frequency of what were viewed as similarly improbable market disturbances. And maybe I’m wrong about suspension of disbelief. Perhaps, humans are resilient and can process information despite the existential threat it poses to their sense of self.  I sure hope I am wrong for the sake of the economy.

Source

Scepticism is rare, or, Descartes vs. Spinoza – Investor Insight

About 

Edward Harrison is the founder of Credit Writedowns and a former career diplomat, investment banker and technology executive with over twenty years of business experience. He is also a regular economic and financial commentator on BBC World News, CNBC Television, Business News Network, CBC, Fox Television and RT Television. He speaks six languages and reads another five, skills he uses to provide a more global perspective. Edward holds an MBA in Finance from Columbia University and a BA in Economics from Dartmouth College. Edward also writes a premium financial newsletter. Sign up here for a free trial.

26 Comments

  1. John Haskell says:

    ok- so “Spinoza was right,” in that people suspend disbelief before processing information. Several studies support this conclusion. But you, in your personal observation, tried to have a discussion with your mother about religion and observed that she did not suspend disbelief at all- she rejected your proposition almost immediately.

    And we see in society at large a large group that also rejects uncomfortable ideas immediately – i.e. “banks are run by thoughtful people; unregulated financial markets work best.”

    It seems that “Spinoza was right” when college students are examining ideas of no great importance to them- e.g., should a hypothetical person be put in jail. But in the real world we see that very many important ideas are not assessed with suspension of disbelief- quite the opposite.

    • That’s it exactly. In order to process the information you have to suspend disbelief. But what if you don’t want to process the information? I mean if you told me you could drive across the US in 17 hours and had done it many times, I probably would dismiss this out of hand. People are even more dismissive of ideas that consciously or unconsciously contradict closely-held views of how the world works. It’s as if you already know the idea is patently false from prior examination. So why re-examine it?

  2. J. Powers says:

    Great stuff, Edward. Thanks so much for blogging about this. Real nourishing food for thought. One important reaction: I think your categorization of these findings as “disturbing” itself reveals a bias of the kind you’re criticizing.

    The valorization of certainty in practical matters, like commerce and politics, is a rather bizarre bastard of early modernity’s marriage of theology, science, and politics. From Attic Greece to the French Revolution, it was taken for granted that anything having to do with ordinary human affairs was uncertain. Politics and commerce were probabilistic (the history of probability, its migration from the name for legal or literary authority to a numerical “science”, is fascinating), entirely distinct from apodeictic mathematics, geometry, and logic. The cultivation of skepticism was therefore considered an essential task of civic education: Aristotle said that the mark of an educated mind is the ability to evaluate something while withholding assent. It was taken for granted that rational skepticism and critical evaluation wasn’t natural; it was something that had to be learned, painfully, through study and practice. (Anyone who teaches college students understands this intuitively.)

    Anyone who lives with children will understand why people have to believe before they understand or assess. Babies and children need to survive in an environment that needs to be taken for granted before it can be abstracted. First we eat, then we talk, then perhaps we think. Or take William James’s “The Will to Believe,” which argues convincingly that humans are at their best when they’re believing. Good thing we default to belief. I mean, providing credit to someone who has some idea for a new business–that’s not a decision that’s 100% bulletproof rational. That requires belief. Risking your savings by lending it to some corporation over which you have no control? That requires belief. Modern finance doesn’t manage risk (as it supposes), it manages belief. It’s more closely related to rhetoric than mathematics. (So, I think, is accounting–which is about words, not numbers. Defining what’s income and what’s not, etc. The rest is so mechanical a spreadsheet can do it.) Further, don’t most markets reflect a measure of confidence in future value? Belief again. Human commerce breaks down without belief that isn’t perfectly and entirely justifiable.

    Notwithstanding the pins and needles you feel when making a bet on some investment (and they’re all just bets), why should we be uncomfortable with belief at a theoretical level? That’s not disturbing, that’s a relief. Finally we can start to talk about how to train a generation of traders, economists, and regulators who grapple with the way markets really work, through persuasion and argumentation.

    • I am sure I have the same biases (and employ the same heuristics) everyone else does. But I never used the word “disturbing”. That is from the Montier excerpt.

      • J. Powers says:

        Yeah, sorry about that. Even with your cues, it’s a bit hard to follow who’s saying what. My bad.

        So, do you find the conclusions disturbing in general, or only in the special case of strongly-held beliefs?

        • J., I like the conclusions just because they are so powerful. But I would agree with Montier that it is disturbing that our ‘distractability’ could drastically alter the kinds of sentences we mete out in the criminal justice system.

          As for the last bit that I added on the ‘reptilian response,’ I live it every day. Trying to build some sort of framework to explain the chaos now going on around us is difficult. When you get there it is the product of hours and hours of work. So being confronted with non-confirming evidence after the fact induces a visceral and negative response. I guarantee you the same is true for me or for Marshall or anyone who writes on this blog as it is for everyone else.

          That part I find natural and understandable. But, of course, after the initial response, we need to overcome that defensiveness because it is anchored in fear. And letting fear dictate our actions is not a good way to live.

          • J. Powers says:

            Edward, thanks for the thoughtful reply. Reading my first reply, I can see why you took me as offering a critique. Apologies for the heavy-handed writing. It’s an exciting idea, and I didn’t edit carefully enough.

            I wasn’t thinking of the bias against irrational belief as belonging to you specifically. I see it as a general condition of our culture, which leads us to suppose that politics and commerce _ought_ to be rational. We’re disturbed when it turns out that they’re not, but we nevertheless seem to have a hard time just admitting, “Well, I guess markets and politics and people in general aren’t entirely rational.” I don’t mean that markets and politics and people can’t be explained, just that the explanation or model is never exhaustive. (Am I stating the obvious here?)

            I like to think that it’s possible to help people improve their capacity to, as you put it, “overcome that defensiveness” that “is anchored in fear.” I suppose you could say that I’m thinking about the emotional underpinnings of rational skepticism, and I’m wondering whether a market made up of people with a strongly-held belief that markets are irrational would be different than a market with strongly-held belief that markets are rational. Looking at it this way, it seems like training in rhetoric (how to see, analyze, and use persuasive techniques) would be more useful than simple numeracy.

            Just more food for thought. Again, thanks!

  3. Matt Stiles says:

    Great article. I like exploring the psychological side to explain what most just dismiss as malintent.

    But it still bothers me to hear that Greenspan or his type actually had any belief in “laissez-faire” or unregulated markets. If they had, they wouldn’t have advocated the numerous market interventions they did (interest rate manipulations, money supply, relaxing capital restrictions for banks, etc).

    Or perhaps they were so “pro business” that they blindly followed policies that in the short-run appeared to be of benefit to big business, but which in the long-run sought to destroy them… A blatant act of hypocrisy to an outside observer, but perhaps just a defensive mechanism by Greenspan et al to prevent outside skepticism of his worldview.

    His logic would have gone like this:

    Recession is bad because people will question my worldview, not because it is bad (or unnecessary itself);

    There are two ways in which I can prevent this questioning: fiscal stimulus, or the relaxation of accounting standards, derivatives regulation, capital ratios, etc. I prefer the latter;

    Therefore, because these policies will prevent a questioning of my worldview, they are necessary to perpetuate it.

    It is completely ignored that recession is a necessary precondition of free-markets – it facilitates a transfer of wealth from the incompetent to the competent (savers) via asset price deflation, thus enabling class mobility. Without this feature, capitalism becomes exactly what it supposedly opposes (authoritarianism). And outright collapse naturally follows.

  4. jimh009 says:

    This is a great post. And this type of thinking is clearly evident in all spheres of society, politics and government today.

    But I do someday hope you clarify on this little bomb you dropped in the post:

    > What this effectively means for me is that financial calamity and economic collapse are really the only way to dislodge this thinking.

    I actually agree with your statement – unpleasant but needed changes in life are usually forced by calamitous events. Yet, the terms “financial calamity” and “economical collapse” can mean many things. Maybe someday you can elaborate on what you see in the future?

    • Matt Stiles says:

      “Financially calamitous to whom?” is the question I want answered.

      Perhaps part of the reason I feel recession is a necessary function in a market economy is because of my age and financial position. I am 27 with no debt and some savings. I am self-employed in a fairly stable business.

      The ones saying that we have to avert deflation and/or depression “at all costs” seem to be those that are neck deep in debt, own real estate and have pensions invested in other assets. Or they are representing those with such interests (ex. bank economists).

      Naturally, we are both self-interested. Who isn’t?

  5. Scott says:

    “Invariably, I run into a lot of spurious arguments by people who sound like they don’t understand the accounting. ”

    Is there a possibility there are people mistaking apples for oranges? If an analyst says, “the formula is an identity so it must stand” that may be interpreted that if the private sector saves then government sector must spend because the formula, not actions dictate reality.

    GDP = Private Sector + Government Sector + Net Exports

    Period 1: GDP = 70 + 40 – 10 = 100

    G as a % of GDP 40%

    Period 2: The private sector saves 1 unit.

    GDP = 69 + 40 – 10 = 99

    G as a % of GDP = 40/99 = 40.4%

    G goes up as “a percentage of GDP”, G does not go to 41 unless the government decides to spend another 1 to offset the loss from private sector saving.
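    [Editor's note: Scott's arithmetic works out as follows in a minimal Python sketch; the function and variable names are mine, purely for illustration.]

```python
# Expenditure identity: GDP = Private + Government + Net Exports.
# G is held fixed at 40; only private spending changes between periods.

def gdp_and_g_share(private, government, net_exports):
    gdp = private + government + net_exports
    return gdp, 100.0 * government / gdp  # G as a percentage of GDP

# Period 1: GDP = 70 + 40 - 10 = 100, so G is 40.0% of GDP.
gdp1, share1 = gdp_and_g_share(70, 40, -10)

# Period 2: the private sector saves 1 unit (spends 69 rather than 70).
# GDP falls to 99; G is unchanged, yet its share rises to about 40.4%.
gdp2, share2 = gdp_and_g_share(69, 40, -10)

print(gdp1, round(share1, 1))  # 100 40.0
print(gdp2, round(share2, 1))  # 99 40.4
```

    The point of the sketch: G rises "as a percentage of GDP" without the government spending a single extra unit, because the denominator shrank.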

    I bring this up because I do understand accounting and I do understand the NIAI formula, but I don’t understand how the formula dictates that G must increase spending by 1 to offset saving of 1. It took me a while to figure out, but your recent post showing charts all with the Y axis as spending as a % of GDP finally made it click. I’ve been thinking apples and analysts have been thinking oranges. It’s so simple that I think I may not be the only one making the mistake, hence your accounting challenged readers.

    When Greece hits a deficit of X% we all go nuts, and when the US spends $X billion on a bailout we all go nuts, but the difference between the first and the second is that the first is a description, as represented by the formula, and the second is a flow counted in units. It’s difficult to keep it all straight when you add in philosophies and opinions and fancy rhetoric. In the first case we should not worry so much about percentages, as they are cyclical; in the second case, we should discuss the merits of using the formula as a guide to shape policy decisions to direct the trajectory of GDP growth.

    I am guilty of talking apples when the argument was about oranges. Whether or not that was an unconscious filter where I was only looking for oranges, I cannot determine without re-reading everything, but I’ll say it could have been my fault, but I think it could be less than an accounting error.

  6. Scott says:

    please moderate my comment out Ed if I’m wrong. I’m still not looking at deficits which are G – T, but I don’t think I’m wrong. If T goes down cyclically, it does not matter whether you use my analysis of total G or G – T, I think. I think the basis of the argument still stands. Some are thinking in G regardless, and some are thinking in G – T, and some are thinking in G – T as a percentage of total GDP, and my point was, analysts and the peanut gallery are rarely on the same page.

  7. John Lounsbury says:

    Edward – - -

    Excellent piece. While reading about the distraction experiments, I couldn’t help but think of how politicians spend so much effort with distractions to cover weak arguments.

  8. One thought I just had regarding Montier’s thesis makes me want to re-state how strongly-held beliefs work under the Spinoza paradigm.

    It is that rejecting new information out of hand based on strongly-held beliefs is entirely consistent with having ALREADY gone over the ideas contained in the new information previously and categorized them as false. There is no need to re-interpret the new information because you already know it is false.

    That is a huge impediment to engaging in suspension of disbelief in order to re-interpret your pre-conceived ideas. Instead people are likely to reject the new information and maintain the previous interpretation – entirely consistent with what psychologists call confirmation bias.

  9. dansecrest says:

    Very good post! And I also see similarity between economic and religious mythology. The mythologies serve a purpose, but eventually must evolve into something more credible…