A System of Intuition and Reasoning

In listening to Daniel Kahneman talk about intuition, I get the sense that he advocates for developing environments that accept intuition as a fundamental feature of human behavior and accommodate its flaws and strengths.

Kahneman and other cognitive psychologists describe a two-part process of thought involving intuition and reasoning.

System 1 (intuitive) is: fast, automatic, confident, parallel, effortless, associative, slow-learning, and emotional.

System 2 (reasoning) is: slow (it doubts and deliberates), serial, controlled, effortful, rule-governed, flexible (it can be taught/can learn), and (emotionally) neutral.

According to Kahneman, we are governed by System 1. It works subconsciously, responding to changes in our environment automatically, confidently (doubt and deliberation are not features of System 1), and emotionally. When it works well, it provides fast, useful impulses along with a rational and coherent network of associations. When it doesn't work well, it provides fast, confident impulses and irrational yet coherent associations for the reasoning mind (System 2) to work with. System 2 deliberates on whatever it is fed from System 1, so if the information or associative network is bad, our subsequent decisions are flawed. The speed, confidence, and subconscious nature of System 1 make training it very difficult; Kahneman points out that System 1 actually seems resistant to learning, resistant to redirecting stimulus from the perceptual/nervous system into a dialogue with one's reasoning mind. Training System 1 is possible, however.

Skilled, intuitive chess masters or violinists may spend upwards of 10,000 hours of practice cultivating (training) their intuition. Cultivating intuition is labor intensive and requires more than a time commitment: Kahneman points out that successful training of System 1 also requires quick, unequivocal, and reliable feedback. Cultivating one's intuition for chess or music, for example, can succeed because both domains are well understood and highly rule-based, so the feedback one receives from interacting with them can be unequivocal and reliable. Put another way, System 1 responds to interactions that are similar to its own tendencies. These conditions are not often met, and the result is flawed intuition and flawed reasoning. We often fall victim to the law of small numbers, for example: deriving a law or governing principle from a brief experience.
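The law of small numbers can be made concrete with a tiny simulation. This is my own illustrative sketch, not anything from Kahneman's text: it flips a fair coin in small batches and large batches and counts how often each batch looks "extreme" (80% or more heads, or 80% or more tails). The function name and parameters here are hypothetical choices for the example.

```python
# Sketch of the "law of small numbers": small samples routinely look
# dramatic even when the underlying process is perfectly fair.
import random

def extreme_rate(sample_size, trials=10_000, threshold=0.8, seed=42):
    """Fraction of samples whose heads-proportion is >= threshold
    or <= 1 - threshold (i.e., looks like a 'pattern')."""
    rng = random.Random(seed)
    extreme = 0
    for _ in range(trials):
        heads = sum(rng.random() < 0.5 for _ in range(sample_size))
        proportion = heads / sample_size
        if proportion >= threshold or proportion <= 1 - threshold:
            extreme += 1
    return extreme / trials

small = extreme_rate(5)    # batches of 5 flips
large = extreme_rate(500)  # batches of 500 flips
print(f"extreme-looking results: n=5 -> {small:.1%}, n=500 -> {large:.1%}")
```

Roughly a third of the five-flip batches look like a strong tendency toward heads or tails, while the 500-flip batches essentially never do. A brief experience, in other words, reliably produces patterns vivid enough to feel like a governing principle.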

Most of us, it seems, are slowly training our intuitive minds through persistent interactions with the environment, many of which are less than optimal.

This seems to be an opportunity for artists and designers, doesn't it? Obviously some of us have been exploiting these cognitive tendencies for a long time; think about advertising, product design, and commercial interior design. The perception-to-System 1-to-System 2 paradigm seems to make the world present itself to us one problem at a time (a consequence of the serial nature of System 2: we are most aware of the last thing that happens, and in this process that is the reasoning of System 2), and seems to make it difficult for us to draw critically and deeply on past knowledge to solve these immediate, present problems and dangers. Kahneman calls this narrow framing, and I wrote about it here two weeks ago.

It occurs to me that environmental elements are participants (silent partners, really) in the decision-making process. Those whose job it is to make things and arrange environments, and those who care about building knowledge and improving our conditions, may benefit from a point of view that includes an understanding of the psychology of interaction and a respect for just how much the things we live with matter.

Knowledge, as Piaget noted a long time ago, is operative. Through experimentation and observation, Kahneman and his colleagues have expanded our understanding of perception, intuition, and cognition in a way that makes possible an enhanced critical dialogue and practice surrounding art, design, and interdisciplinarity.
In an interview following his recent talk at Berkeley, Kahneman, in response to a question about decision making within the System 1 and 2 paradigm, had this to say:

Q: You say that skills are acquired in an environment of feedback and opportunity for learning in a social network. That would help us understand what makes it possible for [professional intuition] to be [successful].

A: They think about situations a lot and they talk about things a lot, so they develop models of various kinds of files. They don’t have to experience — you know, we are capable of learning a great deal from simulated experience. Even athletes can learn from simulating things in their minds, and they do: they practice a lot at night without doing anything. This is one piece of machinery that we dispose of. It will not help you in certain domains; it’s not something that a CIA analyst can do, because the systems that they deal with are fundamentally more complicated.

Q: Which raises the interesting question of how groups can learn from their own experience. Your work is related to decision making in the marketplace, and in a minute we’ll talk about your article in Foreign Policy. In those cases, what is the difference when you have institutions and groups that would like to correct these kinds of errors?

A: Well, in the first place, my main observation would be that groups, by and large, do not correct errors. That’s [from] recurrent observations. There’s a lot of lip service paid in organizations about improving the quality of our intelligence and the quality of our decision making but I think it’s mainly lip service, because imposing a discipline on decision making, as I illustrated by my example of the book — you know, I don’t want to impose discipline on my decision making, and the leaders of organizations — civilian and governmental and commercial — don’t like to be second-guessed. It’s the rare leader — [although] there are very salient examples; the Cuban missile crisis is the example that people think about, where President Kennedy developed a deliberating team that was superbly efficient in allowing dissent and in allowing ideas and slowing down the process of decision making to a rate that was appropriate to the complexity of the situation. That’s very rare.

“…President Kennedy developed a deliberating team that was superbly efficient in allowing dissent and in allowing ideas and slowing down the process of decision making to a rate that was appropriate to the complexity of the situation.” I think that's a brilliant and intriguing observation, and I find myself wondering how one might design an environment so that the rate of decision making is more often appropriate to the level of complexity of a given situation.

Here’s the interview quoted above – and, again, this is a companion event to Kahneman’s lecture linked in the previous post. Oh, and enjoy the intro music…