Reaction Paper

Thinking, Fast and Slow (2011), a book by Daniel Kahneman

Submitted in partial fulfillment of the requirements for the Master of Business Administration degree, Judgment in Managerial Decision Thinking

The secrets of the human brain: the two mechanisms that control our lives

Thinking, Fast and Slow is a 2011 book by Nobel Memorial Prize in Economics winner Daniel Kahneman which summarizes research that he conducted over decades, often in collaboration with Amos Tversky.
It covers all three phases of his career: his early days working on cognitive bias, his work on prospect theory, and his later work on happiness. The book’s central thesis is a dichotomy between two modes of thought: System 1 is fast, instinctive and emotional; System 2 is slower, more deliberative, and more logical. The book delineates cognitive biases associated with each type of thinking, starting with Kahneman’s own research on loss aversion.
From framing choices to substitution, the book draws on several decades of academic research to suggest that we place too much confidence in human judgment. Psychologist Daniel Kahneman, one of the best-known experts in cognition and a pioneer of behavioral economics, studied the decision-making mechanisms of the human brain for over four decades and identified numerous cognitive errors that influence our decisions without our realizing it. In 2002, Kahneman was awarded the Nobel Prize in Economics for work showing that man is not the "rational actor" that many economists claimed, but a being subject to numerous pitfalls of intuition.
Kahneman's Nobel marked a first: the top prize in economics was awarded to a specialist in another field (in this case, psychology). Kahneman argues that human thinking is controlled by two systems. System 1, which he calls "fast thinking," is unconscious, intuitive, and requires no voluntary effort or control; System 2, called "slow thinking," is conscious, uses deductive reasoning, and requires considerable effort.
Observing that the person in a picture is angry requires no conscious effort; we realize it instantly and involuntarily. This is an example of fast thinking, typical of System 1. By contrast, solving a multiplication problem such as 17 x 24 requires directing conscious attention to a voluntary effort, without which the answer cannot be obtained; this is an example of System 2 at work. System 1 is innate, a consequence of evolution and the outcome of adaptation to the environment over time, while System 2 is a specifically human component.
In fact, what we perceive as the self is specific to System 2: the conscious, rational self that manages beliefs, choices, and decisions. Although we live under the impression that System 2 is responsible for most of the decisions we make, our lives are controlled largely by System 1. The reason? Every day we must make many decisions, making it impossible to use System 2 for most of them. Because rational decisions require time for analysis and inference, which consumes energy, System 2 is used infrequently.
In most cases, System 1 generates suggestions for System 2 (impressions, intuitions, intentions, and feelings) that the latter adopts without modification. System 2 steps in when System 1 does not provide an immediate answer (e.g., for the problem 17 x 24) or when it detects that an error is about to occur (as when we refrain from reacting the wrong way to a difficult situation: System 2's control mechanism blocks System 1). System 2 has its limits, however: researchers found that when a person is occupied with a problem that requires System 2, self-control decreases and the person is more likely to yield to temptation.
System 1 exhibits systematic errors, cognitive errors that often lead to the adoption of wrong decisions. In his latest book, Thinking, Fast and Slow, Dr. Daniel Kahneman exposes some of these errors of thought, hoping that by describing them he can help others identify and better understand their own decisions. Because System 1 is active all the time (unlike System 2, which requires conscious effort), we are more prone to its cognitive errors. An example of System 1's autonomy is the Müller-Lyer optical illusion, in which two parallel lines appear to have different lengths.
Even if we measure the two lines and convince ourselves (with the help of System 2) that their lengths are the same, System 1 will continue to perceive them as unequal. Like optical illusions, cognitive illusions tend to be difficult to overcome, but the first step out from under the domination of these mistakes is becoming aware of them. When people are in moments of crisis or in uncertain situations, decisions are taken by System 1; it is therefore essential to know its weaknesses.

Cognitive mistakes that influence our decisions

It is vital to understand that no one is unaffected by the weaknesses of System 1.
This is demonstrated by a simple test that Kahneman has applied thousands of times: "A baseball bat and a ball together cost $1.10. The bat costs a dollar more than the ball. How much does the ball cost?" Even among the most intelligent students, such as those at Harvard and Princeton, more than half gave the obvious answer offered by System 1, which is also the wrong one: 10 cents. The correct answer is, of course, 5 cents.
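A brief worked check of the bat-and-ball arithmetic (a sketch added for clarity; the symbol b for the ball's price in dollars does not appear in the original text):

\[
\begin{aligned}
b + (b + 1.00) &= 1.10 \\
2b &= 0.10 \\
b &= 0.05
\end{aligned}
\]

If the ball really cost 10 cents, the bat would have to cost $1.10 and the pair would total $1.20 rather than $1.10, which is exactly the inconsistency that System 1's quick answer overlooks.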
One of the most common cognitive errors is "the overconfidence bias": the tendency to place excessive trust in one's own abilities. Statistics show that the chance of a new company founded in the U.S. surviving for five years is approximately 35%. However, a survey among entrepreneurs showed that they tend to estimate the chances of success of a new company at 60%, and the chances of their own company at 81%. Kahneman argues that optimism is the engine of capitalism, which is confirmed by the fact that leaders, inventors, and others who influence the lives of large numbers of people tend to be optimistic, taking risks because they are convinced they will succeed.
Another cognitive error identified by Kahneman is "the planning fallacy": the error of estimation in planning. The psychologist first encountered this problem in 1970, when the Ministry of Education of Israel asked him to design a textbook and a curriculum on the topic of decision making. Kahneman formed a team of specialists, among them an expert in curriculum design, and after a year of work he asked his colleagues to estimate how much longer they thought the project would take.
Most estimated project completion in about two years, with a margin of error of six months. Then Kahneman asked the expert how long similar projects had lasted on average. He explained that their average duration was seven to ten years, and that roughly 40% of them were never completed. Even though he knew this, the expert himself had forecast another two years of work. In the end, the project took eight years to complete, and by then the Ministry of Education was no longer interested.

Another example of the planning fallacy comes from the U.S. A survey of homeowners indicated that they expected to spend, on average, 18,500 dollars on a kitchen remodeling; the real average cost, however, turned out to be 39,000 dollars. An example from Scotland shows that the differences can be even larger: in 1997, when the plan for a new Parliament building was unveiled, cost estimates amounted to 40 million pounds. In 2004, when construction was completed, the total cost was 431 million pounds.
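Taking the figures above at face value, the size of the two overruns can be made explicit (a rough calculation, not part of the original text):

\[
\frac{39{,}000}{18{,}500} \approx 2.1, \qquad \frac{431\ \text{million}}{40\ \text{million}} \approx 10.8
\]

In other words, the homeowners underestimated costs by roughly a factor of two, and the Scottish Parliament project by roughly a factor of ten.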
Another mistake of thinking is what Kahneman calls "the availability bias": the tendency to judge based on what comes easily to mind. A survey of Americans revealed that they believed death from an accident to be 300 times more likely than death from diabetes, although the actual ratio is about 1:4. Kahneman considers this proof that the media influence how we perceive risks, which can have negative consequences for our lives. A study conducted after the terrorist attacks of 11 September 2001 showed that many Americans chose that year to drive long distances rather than take the plane; about 1,500 of them were killed in traffic accidents, having underestimated the risk of traveling by car and overestimated the risk of a terrorist attack.
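Taking the accident-versus-diabetes figures above at face value, the gap between intuition and reality can be quantified (a rough calculation, not part of the original text): respondents judged accidental death to be about 300 times more likely than death from diabetes, whereas in reality it is about four times less likely, so the intuitive judgment is off by a factor of roughly

\[
300 \times 4 = 1{,}200.
\]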
A serious cognitive error is "the anchoring effect." One example that illustrates this mental shortcut is a study of a group of German judges with over 15 years of experience. In the experiment, the judges were read a description of a case in which the defendant had been caught shoplifting, and before deciding on the punishment they were asked to throw a pair of dice. The dice were rigged so that they totaled either 3 or 9. The judges were then asked to decide the proper punishment for the defendant.
Although the dice should not have affected the decisions of experienced specialists, the researchers found that judges whose dice totaled 9 decided, on average, on a sentence of eight months in prison, while those whose dice totaled 3 gave, on average, a sentence of five months. The effectiveness of this error has led to its constant exploitation in commerce, to shape buyers' price expectations. For example, a company might offer three versions of the same service so that the cheapest option seems more attractive, compared with the more expensive alternatives, than it would if it were the only one proposed.
For the same reason, auctions usually set a starting price. Another important aspect of System 1 is that, when faced with a difficult question, it tends to answer another, simpler question instead, without our realizing it. Professor Kahneman gives the example of a study conducted on a group of German students. Some of them received the following two questions, in this order: "How happy are you?" and "How many romantic encounters did you have last month?". Others received the questions in reverse order.
While in the first case there was no correlation between the responses, in the second case a correlation could be observed between the number of encounters and the happiness reported by the students. Professor Kahneman explains that answering the question "How happy are you?" correctly requires real thought. The students who were asked first about romantic encounters did not feel the need to think, because they substituted the answer to this question with the answer to another one: "How happy am I with my love life?".
The students are aware that their love life is not the only thing that matters to them, but System 1 gave an easy answer, and they used it. Regarding happiness, says Dr. Kahneman, memories play an important role. People possess not one self but two: the experiencing self and the remembering self. Most people are guided by the second. To illustrate, Dr. Kahneman asks readers a question: would you be willing to pay for a wonderful vacation at the end of which you had to drink a potion that would erase any memory of the trip, with no photos or videos left to you either?
Probably not. To illustrate the difference between memories and experiences, Kahneman recounts a conversation he had with a member of the audience after a lecture. The man told him about listening, captivated, to an exceptional symphony, at the very end of which there was a horrible screeching noise because the disc was scratched. "The ending ruined the whole experience," he said. Kahneman explained that, in fact, the experience had not been ruined, because the man had enjoyed the music for 20 minutes; what had been affected was the memory of that experience.
The confusion of these two selves is a cognitive error that can have unpleasant consequences. This was demonstrated in an experiment in which volunteers were subjected to two painful experiences. When asked to choose one of them to be repeated, they chose the one that was more painful overall, which lasted longer but ended with less intense pain, because it left a less unpleasant memory. The experiencing self, Kahneman notes, tends not to have a strong voice when we plan our activities: when making decisions, people do not ask themselves "What will I feel, and for how long?" and tend to neglect the lived experience in favor of the memory that will remain. From the past we usually learn to maximize the quality of future memories, not the quality of future experiences, something Kahneman calls "the tyranny of the remembering self."

References
http://en.wikipedia.org/wiki/Thinking,_Fast_and_Slow
http://www.nytimes.com/2011/11/27/books/review/thinking-fast-and-slow-by-daniel-kahneman-book-review.html?_r=0
http://whytoread.com/why-to-read-thinking-fast-and-slow-by-daniel-kahneman/
http://www.findingtheworld.com/the-secrets-of-the-human-brain-two-mechanisms-that-control-our-lives/