HEURISTICS AND THEIR BIASES IN THE MILITARY DECISION MAKING PROCESS

Recommended reading of the month


NOTES (continued)

44. I am indebted to MAJ Nick Ayers, U.S. Army, for his explanation of tank gunnery training.
45. Judgment under Uncertainty: Heuristics and Biases, ed. Daniel Kahneman and Amos Tversky (New York: Cambridge University Press, 1982), 10.
46. For a complete description, see Holt, 31-32.
47. See <http://www.globalsecurity.org/military/ops/iraqi_freedom.htm>.
48. For this simple example, we assume independence of events. However, most of these events are conditional on the success of other events; therefore, Bayesian analysis may be more appropriate. The point of the example is that people do not usually think even in terms of simple independent probability, let alone more complex conditional probability.
49. 0.75*0.75*0.75*0.75*0.75*0.75 = 0.1779, or 17.79 percent.
50. See <http://www.globalsecurity.org/military/systems/ground/iaaps.htm>.
51. 0.95*0.95*0.95*0.95*0.95 = 0.77 = 77 percent. To be equivalent to the M1 tank, each APS component would have to have a success rate above 95 percent (the actual answer is greater than 95.64 percent).
52. These problems are relatively simple to analyze when the probabilities involve objective engineering data. They become much harder when we consider the subjective probabilities found in social situations.
53. 1 - 0.77 = 0.23 = 23 percent.
54. Judgment under Uncertainty: Heuristics and Biases, ed. Daniel Kahneman and Amos Tversky (New York: Cambridge University Press, 1982), 16.
55. Bayesian inferential techniques may be appropriate tools for overcoming anchoring; however, they take time to model and understand.
56. FM 3-0, Operations (Washington, DC: GPO, 27 February 2008), 5-11.
57. See Christopher R. Paparone and George Reed, “The Reflective Military Practitioner: How Military Professionals Think in Action,” Military Review 88, no. 2 (2008): 66-77.
58. Ibid., 74.

frequency in the third position. This experiment highlighted the difficulty of modifying established search sets. When we wish to find a word in the dictionary, we look it up by its first letter, not its third. Our available search sets are constructed in unique patterns that are usually linear. We tend to think in a series of steps versus in parallel streams.27

The effectiveness of our search set has a big impact on operations in Iraq and Afghanistan. When observing IED strikes and ambushes along routes, we typically search those routes repeatedly for high-value targets, yet our operations rarely find them. Our search set is mentally constrained to the map of strikes we observe on the charts in our operation centers. We should look for our adversaries in areas where there are no IEDs or ambushes. They may be more likely to hide there. In another scenario, our enemy takes note of our vehicle bumper numbers and draws rough boundaries for our respective unit areas of operation (AOs). They become used to exploiting operations between unit boundaries and their search set becomes fixed; therefore, we should take advantage of their bias for established boundaries by irregularly adjusting our unit AOs. From this example, we can see that to better structure our thinking to escape search set bias, we should think along a spectrum instead of categorically.28 (Using both methods allows us to think in opposites, which may enhance our mental processing ability.)

Imaginability bias. When confronted with a situation without any available memory, we use our imagination to make a subjective premonition.29 If we play up the dangerous elements of a future mission, then naturally we may perceive our likelihood of success as low. If we emphasize the easy elements of a mission, we may assess our probability of success too high. The ease or lack thereof in imagining elements of the mission most likely does not affect the mission's true probability of success. Our psychological preconditioning to risk (either low or high) biases our assessment of the future. Following the deadly experience of the U.S. Army Rangers in Mogadishu in 1993, force protection issues dominated future military deployments. Deployments to Haiti and Bosnia were different from Somalia, yet force protection issues were assumed tantamount to mission success. We could easily imagine dead American soldiers dragged through the streets of Port-au-Prince or Tuzla. This bias of imaginability concerning force protection

PHOTO: 1LT Matthew Hilderbrand, left, and SSG Kevin Sentieri, Delta Company, 1st Battalion, 4th Infantry Regiment, patrol in search of a weapons cache outside Combat Outpost Sangar in Zabul Province, Afghanistan, 27 June 2010. (U.S. Army photo by SPC Eric Cabral)


a result, we suffered tremendous organizational distress as we struggled for answers to the insurgency in Iraq. We were trapped in a mental cave of our own making and were unable to escape our preconceived notions of military operations and decision making.6

Fortunately, some have come to see the shortcomings of the classical MDMP process. It is ill-suited for the analysis of problems exhibiting high volatility, uncertainty, complexity, and ambiguity. The Army's nascent answer, called “Design,” looks promising. As outlined in the new version of FM 5-0, Operations Process, Chapter 3, Design is defined as “a methodology for applying critical and creative thinking to understand, visualize, and describe complex, ill-structured problems and develop approaches to solve them.”7 Instead of a universal process to solve all types of problems (MDMP), the Design approach acknowledges that military commanders must first appreciate the situation and recognize that any solution will be unique.8 With Design, the most important task is framing a problem and then reframing it when conditions change.9

Framing involves improvisation and on-the-spot experimentation, especially when we face time and space constraints in our operating environment. FM 6-0, Mission Command, Chapter 6, states, “Methods for making adjustment decisions fall along a continuum from analytical to intuitive . . . As underlying factors push the method further to the intuitive side of the continuum, at some point the [planning] methodology no longer applies.”10 In the course of intuitive decision making, we use mental heuristics to quickly reduce complexity. The use of these heuristics exposes us to cognitive biases, so it is important to ask a number of questions.11 What heuristics do we use to reduce the high volatility, uncertainty, complexity, and ambiguity, and how do these heuristics introduce inherent bias into our decision making? How do these biases affect our probabilistic assessments of future events? Once apprised of the hazards rising from these heuristic tools, how do we improve our decisions? This article explores these questions and their implications for the future of military decision making.

Behavioral Economics

The examination of heuristics and biases began with the groundbreaking work of Nobel Laureate Daniel Kahneman and Professor Amos Tversky. Dissatisfied with the discrepancies of classical economics in explaining human decision making, Kahneman and Tversky developed the initial tenets of a discipline now widely known as behavioral economics.12 In contrast to preexisting classical models (such as expected utility theory), which sought to describe human behavior as a rational maximization of cost-benefit decisions, Kahneman and Tversky provided a simple framework of observed human behavior based upon choices under uncertainty, risk, and ambiguity. They proposed that when facing numerous sensory inputs, human beings reduce complexity via the use of heuristics. In the course of these mental processes of simplifying an otherwise overwhelming amount of information, we regularly inject cognitive bias. Cognitive bias comes from the unconscious errors generated by our mental simplification methods. It is important to note that the use of a heuristic does not generate bias every time. We are simply more prone to induce error. Additionally, this bias is not cultural or ideological bias, both of which are semi-conscious processes.13 Kahneman and Tversky's identified phenomena have withstood numerous experimental and real-world tests. They are considered robust, consistent, and predictable.14 In this article, we will survey three heuristics important to military decision making: availability, representativeness, and anchoring.15

Availability

When faced with new circumstances, people naturally compare them to similar situations residing in their memory.16 These situations often “come to one's mind” automatically. These past occurrences are available for use, and generally, they are adequate for us to make sense of new situations encountered in routine life. However, they rarely are the product of thoughtful deliberation, especially in a time-constrained environment. These available recollections have been unconsciously predetermined by the circumstances we experienced when we made them. These past images of like circumstances affect our judgment when assessing risk and/or the probability of future events. Ultimately, four biases arise from the availability heuristic: retrievability bias, search set bias, imaginability bias, and illusory correlation.

Retrievability bias. The frequency of similar events in our past reinforces preconceived notions of comparable situations occurring in the future. For example, a soldier will assess his risk of being wounded or killed in combat based on its frequency of occurrence among his buddies. Likewise, an officer may assess his probability of promotion based on the past promotion rates of peers. Availability of these frequent occurrences helps us to quickly judge the subjective probability of future events; however, availability is also affected by other factors, such as salience and vividness of memory. For example, the subjective probability assessment of future improvised explosive device (IED) attacks will most likely be higher from a lieutenant who witnessed such attacks than one who read about them in situation reports. Bias in their assessment occurs because the actual probability of future attacks is not related to the personal experience of either officer.17 Similarly, consistent fixation on a previous event or series of events may also increase availability.18 Naval officers most likely experienced a temporary rise in their subjective assessment of the risk of ship collision after the highly publicized reports of the collision between the USS Hartford and USS New Orleans.19 The true probability of a future collision is no more likely than it was prior to the

PHOTO: U.S. Marine Corps SSgt Tommy Webb of Headquarters Battalion, Marine Forces Reserve, teaches a class on grid coordinates and plotting points on a map, 22 February 2010. The course emphasizes combat conditioning, decision making, critical thinking skills, military traditions, and military drill. These professional courses must focus on critical reflection when examining new problems in order to avoid bias. (U.S. Marine Corps photo by Lance CPL Abby Burtne)

Overcoming this anchoring phenomenon is difficult. Even when test subjects are apprised of the bias, research has shown anchoring and inadequate adjustment persist. In dealing with highly volatile, uncertain, complex, and ambiguous environments, military professionals need to improvise and experiment with a variety of new methods. These activities are part of the critical task of reframing the problem, outlined in FM 5-0. In order to avoid anchoring, it may be necessary to reframe a problem anew; however, this may be a difficult proposition in a time-constrained environment.55

Summary

The volatility, uncertainty, complexity, and ambiguity of our operating environment demand that military professionals make rapid decisions in situations where established military decision making processes are either too narrow or ineffective. The fast tempo of operational decisions potentially may render any elaborate approach, either MDMP or Design, infeasible. As a result, commanders and staff may find themselves engaged in more intuitive decision making. FM 3-0, Operations, states that intuitive decision making rests on “reaching a conclusion that emphasizes pattern recognition based upon knowledge, judgment, experience, education, intelligence, boldness, perception, and character.”56 This article has identified several heuristics that people use to make intuitive decisions to emphasize the potential cognitive biases that subconsciously arise and can produce poor outcomes. When subjective assessments, ego, and emotion are intertwined with cognitive processes, we realize that intuitive decision making is fraught with potential traps. We must constantly strive to avoid these mental snares and plan to compensate for them when they arise. The solution may lie in the organizational embrace of the concept of reflective practice, as advocated by previous authors in this journal.57 Instead of the usual striving toward a “best practices” methodology, which is also full of potential heuristic biases, reflective practice calls for “valuing the processes that challenge assimilative knowledge (i.e., continuous truth seeking) and by embracing the inevitable conflict associated with truth seeking.”58 Institutionalizing this approach may help us to avoid some of the intrinsic human mental frailties that inhibit good decision making. MR

PHOTO: The XM1203 Non-Line-of-Sight Cannon was a mobile 155-mm cannon intended to provide improved responsiveness and lethality to the unit of action commander as part of the U.S. Army's Future Combat Systems project, Yuma, AZ, 2009. (DOD)

Heuristics and Biases in Military Decision Making

Major Blair S. Williams, U.S. Army

Major Blair S. Williams, U.S. Army, is a Joint planner at U.S. Strategic Command. He holds a B.S. from the U.S. Military Academy (USMA), an M.S. from the University of Missouri, and a Ph.D. from Harvard University. He has served in a variety of command and staff positions, including deployments to Iraq and Afghanistan, as well as an assignment as an assistant professor of economics in the Department of Social Sciences at USMA.

The author is indebted to COL(R) Christopher Paparone, MAJ Rob Meine, MAJ Mike Shekleton, and COL(R) Doug Williams for reviewing this article and providing insightful suggestions for its improvement.

PHOTO: U.S. Army SSG Clarence Washington, Provincial Reconstruction Team Zabul security forces squad leader, takes accountability after an indirect fire attack in Qalat City, Zabul Province, Afghanistan, 27 July 2010. (U.S. Air Force photo/SrA Nathanael Callon)

If we now consider briefly the subjective nature of war—the means by which war has to be fought—it will look more than ever like a gamble . . . From the very start there is an interplay of possibilities, probabilities, good luck, and bad that weaves its way throughout the length and breadth of the tapestry. In the whole range of human activities, war most closely resembles a game of cards.
—Clausewitz, On War.1

CARL VON CLAUSEWITZ'S metaphoric description of the condition of war is as accurate today as it was when he wrote it in the early 19th century. The Army faces an operating environment characterized by volatility, uncertainty, complexity, and ambiguity.2 Military professionals struggle to make sense of this paradoxical and chaotic setting. Succeeding in this environment requires an emergent style of decision making, where practitioners are willing to embrace improvisation and reflection.3 The theory of reflection-in-action requires practitioners to question the structure of assumptions within their professional military knowledge.4 For commanders and staff officers to willingly try new approaches and experiment on the spot in response to surprises, they must critically examine the heuristics (or “rules of thumb”) by which they make decisions and understand how they may lead to potential bias. The institutional nature of the military decision making process (MDMP), our organizational culture, and our individual mental processes in how we make decisions shape these heuristics and their accompanying biases.

The theory of reflection-in-action and its implications for decision making may sit uneasily with many military professionals. Our established doctrine for decision making is the MDMP. The process assumes objective rationality and is based on a linear, step-based model that generates a specific course of action and is useful for the examination of problems that exhibit stability and are underpinned by assumptions of “technical-rationality.”5 The Army values MDMP as the sanctioned approach for solving problems and making decisions. This stolid template is comforting; we are familiar with it. However, what do we do when our enemy does not conform to our assumptions embedded in the process? We discovered early in Iraq that our opponents fought differently than we expected. As

is individually differentiated from his peers due to his high training score. This available information potentially causes the lieutenant to order information based upon its perceived level of importance. The high detection ability in training may facilitate overconfidence in actual ability and neglect of the base rate of actual insurgents in the population of only 10 percent. The result is that the lieutenant is far more likely to mistake the innocent civilian for an insurgent.39 Outside of the lieutenant's mind (and ego), the base rate actually has a far greater impact on the probability that the apprehended man is an innocent civilian rather than an insurgent.40

Insensitivity to sample size. Consider a problem from Afghanistan:

We suspect two primary drug trafficking routes along the Afghan-Pakistani border. A small village is located along the first suspected route, while a larger village is located along the other suspected route. We also suspect that local residents of each village guide the opium caravans along the mountainous routes for money. Human intelligence sources indicate that thirty men from the small village and sixty-five men from the large village engaged in guide activities over the last month. Furthermore, coalition checkpoints and patrols recently confirmed the G2 long-term estimate that on average, twenty-five percent of the male population of each village is engaged monthly in guide activity. The smuggling activity fluctuates monthly, sometimes higher and other times lower. Which village is likely to experience more months of over forty percent participation rate in smuggling?

If you selected the large village, then you are incorrect. If you guessed it would be 25 percent for both villages, you are also incorrect. The small village would have greater fluctuations in activity due to the “law of large numbers.” As population size grows, the average number becomes more stable with less variation; therefore, the larger village's monthly percentage of guide activity is closer to the long-term average of 25 percent. The smaller village has greater monthly deviations from the long-term average value. This example highlights that insensitivity to sample size occurs because many people do not consider the “law of large numbers” when making probability assessments and decisions.41
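This effect is easy to see in a quick simulation. The sketch below is not from the article; it assumes village populations of 120 and 260 military-age males, back-solved from the stated 30 and 65 guides at the 25 percent long-term average, and treats each man's monthly decision as an independent draw with probability 0.25.

    # Monte Carlo check of the two-village problem (illustrative sketch).
    # Assumed populations: 30/0.25 = 120 and 65/0.25 = 260 males.
    import numpy as np

    rng = np.random.default_rng(seed=1)
    months = 100_000
    for n_males in (120, 260):
        # Each month, every man guides independently with probability 0.25.
        rates = rng.binomial(n_males, 0.25, size=months) / n_males
        print(f"{n_males} males: std of monthly rate = {rates.std():.3f}, "
              f"share of months over 40 percent = {(rates > 0.40).mean():.4%}")

The smaller village's monthly rate swings roughly 50 percent more widely, so it crosses the 40 percent line far more often, exactly as the “law of large numbers” predicts.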
Misconceptions of chance. Many people misunderstand the elements of chance. For example, suppose you observe roulette in a casino. The following three sequences of red and black could occur: RBRBRB or RRRBBB or RBBBBB. Which sequence is more likely? The answer is that all of these sequences are equally likely; however, if you were like most people in similar experiments, then you most likely picked RBRBRB.42 This sequence is the most popular because people expect the fundamental traits of the equilibrium sequence (50 percent black and 50 percent red) to be represented, yet if you stopped to do the math, each sequence has a probability of 1.56 percent.43 If the sequence was RBBBBB, then you most likely would hear people say, “Red is coming up for sure”; this is the gambler's fallacy. Many people expect the equilibrium pattern to return after a long run of black; however, the laws of randomness have not changed. The probability of red is equal to black. The implication is that we unconsciously judge future events based on representativeness of sequence, not on probability.

Now, consider the following question. Which is more likely: 1) “Iran tests a nuclear weapon in 2013,” or 2) “Iran has domestic unrest after its next election and tests a nuclear weapon sometime in 2013”? If you selected the second scenario, then you are incorrect. The reason is that the more specific the description, the less likely the event. The two events occurring in the same year are less likely than only one event occurring; however, many people tend to judge an event more likely as more specific information is uncovered. This human tendency has potential implications for military decision making as situational awareness improves with technology. Adding new details to a situation may make that scenario seem more plausible, yet the mere discovery of further information does not affect the probability of the situation actually occurring.

Failure to identify regression to the mean. Suppose we examine the training records of tank crews during gunnery qualification.44 Observer-controllers (OCs) may report that praise to a tank crew after an exceptional run on Table VII is normally followed by a poor run on Table VIII.

They might also maintain that harsh scorn after a miserable run on Table VII is normally followed by a great run on Table VIII. As a result, OCs may assume that praise is ineffective (makes a crew cocky) and that criticism is valuable (makes a crew buckle down and perform). This assumption is false due to the phenomenon known as regression to the mean. If a tank crew repeatedly executed Tables VII and VIII, then the crew's scores would eventually converge (or regress) to an average score over the long term. However, at the beginning of this process, the scores are likely to be highly volatile, with some scores alternating far above and others far below the average. OCs may falsely assume that their social interaction with the crew has a causal effect on the crew's future scores. Kahneman and Tversky write that the inability to recognize the regression to the mean pattern “remains elusive because it is incompatible with the belief that the predicted outcome should be maximally representative of the input, and, hence, that the value of the outcome variable should be as extreme as the value of the input variable.”45 In other words, many times we fail to identify settings that follow the regression to the mean phenomenon because we intuitively expect future scores to be representative of a previous score. Furthermore, we attribute causal explanations to performance that are actually irrelevant to the outcome.
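A small simulation makes the point concrete. The sketch below is not from the article; it invents notional gunnery scores in which each crew has a fixed skill level plus run-to-run luck, and no coaching effect of any kind is modeled.

    # Regression to the mean with notional, invented score data.
    import numpy as np

    rng = np.random.default_rng(seed=7)
    crews = 10_000
    skill = rng.normal(700, 50, size=crews)         # stable (notional) crew skill
    table7 = skill + rng.normal(0, 80, size=crews)  # Table VII = skill + luck
    table8 = skill + rng.normal(0, 80, size=crews)  # Table VIII = same skill, fresh luck

    top = table7 >= np.quantile(table7, 0.90)       # the "praised" crews
    bottom = table7 <= np.quantile(table7, 0.10)    # the "scorned" crews
    print(f"top decile:    Table VII {table7[top].mean():.0f} -> Table VIII {table8[top].mean():.0f}")
    print(f"bottom decile: Table VII {table7[bottom].mean():.0f} -> Table VIII {table8[bottom].mean():.0f}")

The top decile falls and the bottom decile rises toward the overall mean even though praise and scorn appear nowhere in the model, which is exactly the pattern the OCs misread as a coaching effect.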
Anchoring

When facing a new problem, most people estimate an initial condition. As time unfolds, they adjust this original appraisal. Unfortunately, this adjustment is usually inadequate to match the true final condition. For example, the average number of U.S. troops in Iraq from May 2003 to April 2007 was 138,000. Mounting evidence during this time exposed this initial estimate as insufficient, yet decision makers were anchored on this number over the course of this four-year period. They did not upwardly adjust the number until Iraq was on the verge of a civil war between Sunnis and Shiites. The anchoring phenomenon kept the value closer to the initial value than it should have been.

Historically, anchoring bias has had harmful effects on military operations. As previously identified, the British in World War II were masters of exploiting human mental errors. They exploited German anchoring bias with the deception scheme called the Cyprus Defense Plan.46 Following the German seizure of Crete, the British were concerned that the 4,000 troops on Cyprus were insufficient to repel a German attack. Via the creation of a false division headquarters, barracks, and motor pools, along with phony radio transmissions and telegrams, the British set out to convince the Germans that 20,000 troops garrisoned the island. A fake defensive plan with maps, graphics, and orders was passed via double agents and a lost briefcase. The Germans and Italians fell for the ruse. This deception anchored the Germans on the 20,000-troop number for the remaining three years of the war. In spite of their own analysis that the number might be too high, intelligence intercepts and post-war documents revealed the Germans believed the number almost without question. This exposes another negative effect of anchoring: excessively tight confidence intervals. The Germans were more confident in their assessment than justified when considering the contradictory information they had. In summary, the Germans were anchored, made insufficient adjustments, and had overly narrow confidence intervals.

“Heuristics and Biases in Military Decision Making,” by Blair S. Williams

Ricardo Simonaio Morata
Capitán de Fragata
Brazilian Navy Advisor at the Academia de Guerra Naval

Military professionals strive to understand the paradoxical and chaotic setting of conflict. Success in this environment requires a style of decision making in which practitioners are willing to embrace improvisation and reflection. The theory of reflection-in-action requires practitioners to question the structure of the assumptions within their professional military knowledge. For commanders and staff officers to be willing to try new approaches and experiment in response to surprise situations, they must critically examine their heuristics.

With that in mind, and in brief, Major Blair S. Williams, U.S. Army, wrote an article exploring the principal heuristic biases in decision making processes as they actually occur during conflict. Throughout the intuitive process, we use mental heuristics to reduce complexity quickly. Although several biases exist, the use of these heuristics exposes us, for the most part, to the three most important cognitive biases:

- REPRESENTATIVENESS;
- AVAILABILITY; and
- ANCHORING and ADJUSTMENT.

Bibliography

Williams, B. (2010). Heuristics and Biases in Military Decision Making. Military Review, 90(5), 40-52. https://www.armyupress.army.mil/Portals/7/military-review/Archives/English/MilitaryReview_20120630MC_art011.pdf

Disclaimer: The opinions expressed in this document are the sole responsibility of their authors and do not necessarily represent the views of the Academia de Guerra Naval or the Armada del Ecuador.

Biases in the evaluation of conjunctive and disjunctive events. Anchoring bias appears in our assessments of conjunctive and disjunctive events. A conjunctive event is comprised of a series of stages where the previous stage must be successful for the next stage to begin. In spite of each individual stage having a high probability of success, the probability of total event success may be low due to a large number of stages. Unfortunately, researchers have shown that many people do not think in terms of total event (or system) probability. Instead, they anchor on initial stage probabilities and fail to adjust their probability assessment. This results in overestimating the likelihood of success for a conjunctive event.

A disjunctive event occurs in risk assessment. When examining complex systems, we may find that the likelihood of failure of individual critical components or stages is very small. However, as complexity grows and the number of critical components increases, we find mathematically that the probability of event (or system) failure increases. Here again, people anchor incorrectly. In this case, they anchor on the low probabilities of initial stage failure. Consequently, people frequently underestimate the probability of event failure. This overestimation of success with a conjunctive event and underestimation of failure with a disjunctive event has implications for military decision making.

For example, military planners in 2002 and 2003 may have fallen victim to conjunctive event bias during strategic planning for the Iraq invasion. In order to realize success in Iraq, a number of military objectives had to occur. These included:

● Ending the regime of Saddam Hussein.
● Identifying, isolating, and eliminating Iraq's WMD programs.
● Searching for, capturing, and driving terrorists out of Iraq.
● Ending sanctions and immediately delivering humanitarian assistance to support the Iraqi people.
● Securing Iraqi oil fields and resources for the Iraqi people.
● Helping the Iraqi people create conditions for a transition to a representative self-government.47

For illustrative purposes, suppose planners gave each stage a 75 percent independent probability of success.48 This level of probability potentially anchored decision makers on a 75 percent chance of overall mission success in Iraq, while the actual probability of success is approximately 18 percent.49 The total probability of accomplishing all of these objectives gets smaller with the addition of more objectives. As a result, the conclusion by strategic leaders that Operation Iraqi Freedom had a high likelihood of success was potentially overoptimistic and unwarranted.

A more recent example of conjunctive event bias occurs in procurement decisions. One of the main selling points of the Future Combat Systems Manned Ground Vehicle (MGV) family was tank-level survivability combined with low weight for rapid deployability. While the M1 tank relies on passive armor for its protective level, the MGV would reach an equivalent level via increased situational awareness (“why worry about armor when you are never surprised by your enemy?”) and an Active Protection System (APS) that vertically deploys an interceptor to strike an incoming threat munition. The APS is a conjunctive system that requires a chain of stages to occur for overall system success: 1) detect an incoming threat munition, 2) track and identify munition trajectory, 3) deploy appropriate countermeasure, 4) hit incoming munition, and 5) destroy or deflect the munition.50 Again for illustrative purposes, assume that the individual probability of success for each of these five stages is 95 percent. Suppose that the M1A2's passive armor is only 80 percent effective against the threat munition. Anchoring bias occurs in that people may conflate the 95 percent individual stage rate with an overall APS system success rate. This is a false conclusion. In this example, the overall APS probability of success is actually 77 percent.51 When compared to the M1 tank, the APS is actually less survivable than passive armor with this notional data.52

We could also view the APS as a disjunctive system. Instead of success rate, suppose the failure rate of each component is five percent. Naturally, a five percent failure rate looks better than the M1 tank's 20 percent failure rate. Framed this way, many people may erroneously anchor on a total system failure probability of five percent, when the disjunctive probability that at least one critical APS component fails is actually 23 percent.53 Again, we find that the APS is worse than the M1 tank's passive armor. This simple example shows that disjunctive and conjunctive events are opposite sides of the same coin. Kahneman and Tversky write, “The chain-like structure of conjunctions leads to overestimation; the funnel-like structure of disjunction leads to underestimation.”54 The direction of the flawed probability estimate is a matter of framing the problem, yet the bias exists in both types of events.
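Because the arithmetic behind notes 49, 51, and 53 is easy to mistype, here is a short sketch that reproduces the article's illustrative figures directly; every number is the article's notional assumption, not engineering data.

    # Conjunctive and disjunctive totals from the article's notional figures.
    p_iraq = 0.75 ** 6                 # six objectives, each 75 percent likely (note 49)
    p_aps = 0.95 ** 5                  # five APS stages, each 95 percent likely (note 51)
    p_aps_fail = 1 - p_aps             # at least one of the five stages fails (note 53)
    p_stage_needed = 0.80 ** (1 / 5)   # per-stage rate needed to match 80 percent armor

    print(f"overall Iraq success:  {p_iraq:.4f}")          # ~0.1780, not 0.75
    print(f"overall APS success:   {p_aps:.4f}")           # ~0.7738, not 0.95
    print(f"overall APS failure:   {p_aps_fail:.4f}")      # ~0.2262, not 0.05
    print(f"per-stage rate needed: {p_stage_needed:.4f}")  # ~0.9564 (note 51)

Anchoring on the per-stage numbers (75 or 95 percent) instead of these totals is exactly the error the text describes.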

actually hampered our ability to execute other critical elements of the overall strategic mission.30

Biases of imaginability may potentially become worse as we gain more situational awareness on the battlefield. This seems counterintuitive, yet we may find units with near-perfect information becoming paralyzed on the battlefield. A unit that knows an enemy position is just around the corner may not engage it because the knowledge of certain danger makes its members susceptible to inflating risk beyond its true value. These soldiers may envision their own death or that of their buddies if they attack this known position. Units with imperfect information (but well-versed in unit battle drills) may fare better because they are not biased by their imagination. They will react to contact as the situation develops.31 As an organization, we desire our officers and NCOs to show creativity in making decisions, yet we have to exercise critical reflection lest our selective imagination get the best of us.

Illusory correlation. Correlation describes the relationship between two events.32 People often incorrectly conclude that two events are correlated due to their mentally available associative bond between similar events in the past.33 For example, we may think that the traffic is only heavy when we are running late, or our baby sleeps in only on mornings that we have to get up early. These memorable anecdotes form false associative bonds in our memories. Consider the following example regarding military deception operations from CIA analyst Richard Heuer:

The hypothesis has been advanced that deception is most likely when the stakes are exceptionally high. If this hypothesis is correct, analysts should be especially alert for deception in such instances. One can cite prominent examples to support the hypothesis, such as Pearl Harbor, the Normandy landings, and the German invasion of the Soviet Union. It seems as though the hypothesis has considerable support, given that it is so easy to recall examples of high stakes situations . . . How common is deception when the stakes are not high . . . What are low-stakes situations in this context? High stakes situations are definable, but there is an almost infinite number and variety of low-stakes situations . . . we cannot demonstrate empirically that one should be more alert to deception in high-stakes situations, because there is no basis for comparing high-stakes to low-stakes cases.34

Heuer highlights the potentially pernicious effect illusory correlation can have on our decision making. Exposure to salient experiences in the past generates stereotypes that are difficult to consciously break. In fact, we may fall victim to confirmation bias, where we actively pursue only the information that will validate the link between the two events. We may ignore or discard important data that would weaken our illusory correlation. In social settings (such as staff work), the effects of illusory correlation and confirmation bias are reinforcing factors to the concept of groupthink, whereby members of a group minimize conflict and reach consensus without critically examining or testing ideas. Groupthink generates systematic errors and poor decisions. Scholars have identified a number of military disasters, such as the Bay of Pigs fiasco and the Vietnam War, as examples of the dangers of heuristics associated with groupthink.35

To avoid illusory correlation, we should ask ourselves whether our intuitive or gut feeling on the relationship between two events is correct, and why. This does not come naturally. It takes a deliberative mental effort to ask ourselves a contrary proposition to our assumed correlation. Individually, we may be unable to overcome illusory correlation. The solution potentially lies in

collision, yet organizational efforts to avoid collisions increased due to the subjective impression that collisions were now somehow more likely. People exposed to the outcome of a probabilistic event give a much higher post-event subjective probability than those not exposed to the outcome. This is called hindsight bias.

When combining hindsight bias and retrievability biases, we potentially fail to guard against an event popularized euphemistically as a black swan. Nassim Taleb describes black swans as historical events that surprised humanity because they were thought of as nonexistent or exceedingly rare. We assume all swans are white; they are in our available memory.20 For example, in hindsight the 11 September 2001 terrorist attacks look completely conceivable; therefore, we hold the various intelligence agencies of the U.S. government publicly accountable for something that was not even considered plausible before the event. Furthermore, mentally available disasters set an upper bound on our perceived risk. Many of our precautionary homeland security measures are based on stopping another 9/11-type attack, when in fact the next attempt may take on a completely different context that we cannot imagine (because our searches for past experiences are limited).21

Availability played a role in the current global financial crisis. Our collective memories contained two decades of stable market conditions. The inability to conceive a major economic downturn and the flawed assumption that systemic risk to the national real estate market was minuscule contributed to creating a black swan event.22 Taleb wrote the following passage before the collapse of the asset-backed securities market (a major element of the current economic recession):

Globalization creates interlocking fragility, while reducing volatility and giving the appearance of stability. In other words, it creates devastating Black Swans. We have never lived before under the threat of a global collapse. Financial institutions have been merging into a smaller number of very large banks. Almost all banks are interrelated. So the financial ecology is swelling into gigantic, incestuous banks—when one fails, they all fail. The increased concentration among banks seems to have the effect of making financial crises less likely, but when they happen they are more global in scale and hit us very hard.23

Given the possibility of black swans, we should constantly question our available memories when faced with new situations. Are these memories leading us astray? Are they making our decisions more or less risky? Are our enemies exploiting this phenomenon? Military planners have done so in the past, seeking the advantage of surprise. For example, the British were masters at exploiting retrievability biases during World War II. They employed the COLLECT plan in North Africa in 1941 to obfuscate the exact timing of General Auchinleck's offensive (Operation Crusader) against Rommel's forces in Libya.24 Via official, unofficial, and false channels, the British repeatedly signaled specific dates of the commencement of the operation, only to rescind these orders for plausible reasons. These artificial reasons ranged from the inability to quickly move forces from Syria to take part in the operation to the failure of logistics ships to arrive in Egypt. Planners wanted to lull Rommel into expecting the repeated pattern of preparation and cancellation so that when the actual operation began, his memory would retrieve the repeated pattern. The plan worked. The British achieved operational deception. They surprised Rommel, and after 19 days of fighting, ultimately succeeded in breaking the siege at Tobruk. The repetitive nature of orders and their cancellation demonstrates the power of availability on human decision making.25

Search set bias. As we face uncertainty in piecing together patterns of enemy activity, the effectiveness of our patterns of information retrieval constrains our ability to coherently create a holistic appreciation of the situation. These patterns are called our search set. A simple example of search set is the Mayzner-Tresselt experiment, in which subjects were told to randomly select words longer than three letters from memory. Experimenters asked if the words more likely had the letter R in the first position or third position. Furthermore, they asked subjects to estimate the ratio of these two positions for the given letter. They also asked about K, L, N, and V. The subjects overwhelmingly selected the first position for each letter given over the third position, and the median subjective ratio for the first position was 2:1.26 In fact, the aforementioned letters appear with far more

ARMADA DEL ECUADOR
ACADEMIA DE GUERRA NAVAL
Guayaquil

RECOMMENDED READINGS

HEURISTICS AND BIASES IN THE MILITARY DECISION MAKING PROCESS
BLAIR S. WILLIAMS, MILITARY REVIEW

Recommended reading by: CPFG Ricardo Simonaio Morata, Brazilian Navy Advisor at the Academia de Guerra Naval

2020

a collective staff process where we organize into teams to evaluate competing hypotheses.36

Representativeness

Representativeness is a heuristic that people use to assess the probability that an event, person, or object falls into a larger category of events, people, or things. In order to quickly categorize a new occurrence, we mentally examine it for characteristics of the larger grouping of preexisting occurrences. If we find it to “represent” the traits of the broader category, we mentally place it into this class of occurrences. This heuristic is a normal part of mental processing, yet it is also prone to errors. Representativeness leads to five potential biases: insensitivity to prior probability of outcomes, base-rate neglect, insensitivity to sample size, misconceptions of chance, and failure to identify regression to the mean.

Insensitivity to prior probability of outcomes. Consider the following description of a company-grade Army officer:

He is a prudent, details-oriented person. He meticulously follows rules and is very thrifty. He dresses conservatively and drives a Ford Focus.

Is this officer more likely to be an aviator or a finance officer? If you picked finance officer, then your stereotype of the traits of a typical finance officer may have fooled you into making the less likely answer. You may even hold the stereotype that aviators are hot-shot pilots who fly by the seat of their pants. It is common to view pilots as individuals who believe rules are made to be broken, and money is made to be spent on fast cars and hard partying. Given these stereotypes, you chose unwisely because there are statistically more aviators than finance officers who fit the given description. As a branch, aviation assesses approximately 20 times more officers than finance each year. It is always important to understand the size of the populations you are comparing before making a decision. Stereotypes often arise unconsciously; therefore, it is important to remain on guard against their potentially misleading effects.

Base-rate neglect. Consider the following problem given to cadets at West Point:

While on a platoon patrol, you observe a man near a garbage pile on the side of a major road. In recent IED attacks in the area, the primary method of concealment for the device is in the numerous piles of garbage that lay festering in the street (trash removal is effectively nonexistent due to insurgent attacks on any government employee, including sanitation workers). You immediately direct one of your squad leaders to apprehend the man. Based on S2 reports, you know that 90 percent of the population are innocent civilians, while 10 percent are insurgents. The battalion S3 recently provided information from detainee operations training: your platoon correctly identified one of two types of the population 75 percent of the time and incorrectly 25 percent of the time. You quickly interrogate the man. He claims innocence, but acts suspiciously. There is no IED in the trash pile. What is the probability that you detain the man and that he turns out to be an insurgent rather than a civilian?

Most cadets answered between 50 percent and 75 percent.37 This estimate is far too high. The actual probability is 25 percent.38
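Note 38 says the problem is solved by Bayesian inference; one way to write out the update, a sketch using only the probabilities stated in the problem, is below.

    # Bayes' rule on the numbers stated in the patrol problem.
    p_insurgent = 0.10             # base rate of insurgents (S2 report)
    p_flag_given_insurgent = 0.75  # a true insurgent is correctly identified
    p_flag_given_civilian = 0.25   # an innocent civilian is wrongly identified

    # Total probability that an apprehended man is identified as an insurgent.
    p_flag = (p_insurgent * p_flag_given_insurgent
              + (1 - p_insurgent) * p_flag_given_civilian)
    posterior = p_insurgent * p_flag_given_insurgent / p_flag
    print(f"P(insurgent | identified) = {posterior:.2f}")  # 0.25, not 0.75

The 10 percent base rate drags the answer down to 25 percent; the cadets' higher guesses come from anchoring on the 75 percent training figure while neglecting that base rate.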
The 75 percent detection probability from the platoon's training provides available individuating information. Individuating information allows the lieutenant to believe that he

PHOTO: President John F. Kennedy addresses the 2506 Cuban Invasion Brigade, 29 December 1962, Miami, FL. (Cecil Stoughton, White House, in the John F. Kennedy Presidential Library and Museum)

NOTES

1. Carl von Clausewitz, On War, trans. and ed. Michael Howard and Peter Paret (Princeton University Press, 1976), 85-86.
2. The specific terms volatility, uncertainty, complexity, and ambiguity (VUCA) gained favor in the curricula of the military senior service colleges. For a history of its pedagogical evolution, see Judith Stiehm, The U.S. Army War College: Military Education in a Democracy (Temple University Press, 2002).
3. The origins of these concepts come from Nobel Laureate Herbert Simon and Charles Lindblom. Simon's concept of “satisficing” and Lindblom's notion of “muddling through” challenged the dominant technical-rational view (still prevalent in the operations research community) that optimally efficient solutions can be found to inherently social problems. See Charles E. Lindblom, “The Science of ‘Muddling Through,’” Public Administration Review 19 (1959): 79-88, and Herbert A. Simon, Administrative Behavior, 4th ed. (Simon and Schuster, 1997). Later theorists applied it to business organizations (Karl E. Weick, “Improvisation as a Mindset for Organizational Analysis,” Organization Science 9, no. 5 [1998]: 543-55) and to codes of professional knowledge (Donald A. Schön, Educating the Reflective Practitioner [Jossey-Bass, 1987]). There are a number of recent works that apply these concepts to the military: Don M. Snider and Gayle L. Watkins, The Future of the Army Profession, 2d ed. (McGraw-Hill, 2005), and Christopher R. Paparone and George Reed, “The Reflective Military Practitioner: How Military Professionals Think in Action,” Military Review 88, no. 2 (2008): 66-77.
4. Donald A. Schön writes that if “we think critically about the thinking that got us into this fix or this opportunity . . . we may, in the process, restructure strategies of action, understandings of phenomena, or ways of framing problems,” Educating the Reflective Practitioner (Jossey-Bass, 1987), 28.
5. “Technical-rationality” is the positive epistemology that has largely structured our current view of knowledge. It is the view that we can reduce the elements of a complex system, analyze them individually, and then reconstruct them into a holistic appreciation of the system. Simultaneous causality and endogeneity make this type of analysis very difficult when analyzing social situations.
6. Plato uses this metaphor to describe a group of people unable to perceive the true nature of the world because they are chained in a cave of their own making. See Gareth Morgan, “Exploring Plato's Cave: Organizations as Psychic Prisons,” in Images of Organization (Sage, 2006).
7. Field Manual (FM) 5-0 (Washington, DC: U.S. Government Printing Office [GPO]), 3-1.
8. At its core, Design calls for an open mind that examines problems from multiple lenses. It is not a systems engineering process with a sequence of steps similar to MDMP. It calls for a broader intellectual examination of a problem. Unfortunately, educating many in our profession to examine problems in this manner will most likely meet institutional resistance. We are a culture of doers, not thinkers. We decisively execute rather than thoughtfully deliberate. Process checklists are easy to use and require little thought in a time-constrained environment. Understanding and using Design may require more officers with liberal arts educations over engineering training. The full embrace of a Design-type methodology to face volatile, uncertain, complex, and ambiguous environments may require the complete re-tooling of the core curricula at West Point, Command and General Staff College, and the War College. This topic is highly controversial (and provocative).
9. For more on framing effects, see Erving Goffman, Frame Analysis (Cambridge: Harvard University Press, 1974).
10. FM 6-0 (Washington, DC: GPO), 6-116.
11. We are examining individual heuristics as identified in behavioral economics, not social heuristics (how a culture appraises a situation). The effect of social influences on decision making is a topic beyond the scope of this paper. However, a merging of individual and social influences is proposed in Mark Granovetter, “Economic Action and Social Structure: The Problem of Embeddedness,” The American Journal of Sociology 91, no. 3 (1985): 481-510.
12. See Daniel Kahneman and Amos Tversky, “Judgment under Uncertainty: Heuristics and Biases,” Science 185 (1974): 1124-31; Daniel Kahneman and Amos Tversky, “Prospect Theory: An Analysis of Decision under Risk,” Econometrica 47, no. 2 (1979): 263-92; and Choices, Values, and Frames, ed. Daniel Kahneman and Amos Tversky (New York: Cambridge University Press, 2000).
13. These assumptions are not critical for this analysis of unconscious decision making heuristics. Viewed from a sociological perspective, we could potentially relax these assumptions and examine the complex interplay of unconscious organizational influences on decision making. This would be an interesting topic for future research.
14. In spite of experimental and real-world tests, behavioral economics is not without critics. For more, see Mikhail Myagkov and Charles R. Plott, “Exchange Economies and Loss Exposure: Experiments Exploring Prospect Theory and Competitive Equilibria in Market Economics,” American Economic Review 87, no. 5 (1997): 801-28.
15. These heuristics and their attendant biases are previewed in Judgment under Uncertainty: Heuristics and Biases, ed. Daniel Kahneman and Amos Tversky (New York: Cambridge University Press, 1982), 1-20.
16. Professor Christopher Paparone suggests that one might call these references a search for metaphors. For more, see Christopher R. Paparone, “On Metaphors We Are Led By,” Military Review 88, no. 6 (2008): 55-64.
17. Unless one is to believe the superstitious notion of a Soldier with the unlucky distinction of being a “bullet-magnet.”
18. Kahneman and Tversky write, “Continued preoccupation with an outcome may increase its availability, and hence its perceived likelihood. People are preoccupied with highly desirable outcomes, such as winning the sweepstakes, or highly undesirable outcomes, such as an airplane crash. Consequently, availability provides a mechanism by which occurrences of extreme utility (or disutility) may appear more likely than they actually are,” Judgment under Uncertainty: Heuristics and Biases, ed. Daniel Kahneman and Amos Tversky (New York: Cambridge University Press, 1982), 178.
19. Commander, U.S. 5th Fleet Public Affairs, “USS Hartford and USS New Orleans Arrive in Port Bahrain,” 21 March 2009, story number NNS090321-03, <http://www.navy.mil/search/display.asp?story_id=43630>.
20. See Nassim N. Taleb, The Black Swan: The Impact of the Highly Improbable (Random House, 2007).
21. We see this same type of phenomenon occurring in the sale of insurance. People use the last accident or disaster as an upper limit on what is possible for the future; therefore, they only insure up to this level.
22. The assumption made was that all real estate market fluctuations are local. At the national (or systemic) level, the local markets would never fall at the same time. In fact, the local markets did fall at the same time.
23. Nassim N. Taleb, <http://www.fooledbyrandomness.com/imbeciles.htm>.
24. See Thaddeus Holt, The Deceivers: Allied Military Deception in the Second World War (New York: Scribner, 2004), 39-40.
25. One must be careful using historical examples. The study of military history potentially exposes us to availability-related biases. We do all that reading to learn what has worked and what hasn't worked in the past, yet this source of professional knowledge can tether us to specific courses of action. If we apply lessons from the past that are incorrectly suited for the problems of today, then we may sow the seeds of disaster. Military history is useful for informing our understanding of the problem, but we must be cautious not to let history inappropriately guide our actions.
26. Mark S. Mayzner and Margaret Tresselt, “Tables of single-letter and bigram frequency counts for various word-length and letter-position combinations,” Psychonomic Monograph Supplements 1 (1965): 13-32.
27. Although I generalize about mental search sets, it is important to acknowledge that some personality types may exhibit parallel thought processes. We might find this capacity in “creative” people, such as painters, musicians, and architects.
28. I am indebted to Professor Christopher Paparone for this insight. Also see Deborah A. Stone, Policy Paradox: The Art of Political Decision Making, 2d ed. (New York: W.W. Norton, 2001).
29. See Daniel Kahneman and Amos Tversky, “Judgment under Uncertainty: Heuristics and Biases,” Science 185 (1974): 1124-31.
30. See John T. Fishel, “Operation Uphold Democracy: Old Principles, New Realities,” Military Review 77, no. 4 (1997): 22-30, and Robert F. Baumann, “Operation Uphold Democracy: Power Under Control,” Military Review 77, no. 4 (1997): 13-21.
31. In light of this potential bias, we may want to re-evaluate the allocation of our budget resources. Which contributes more to combat effectiveness: dollars spent on technical systems that enhance situational awareness, or dollars spent on realistic, tough training?
32. In technical terms, correlation is a measure of covariance, which is a measure of the linear dependence between two random variables. It does not imply causality. For example, people carrying umbrellas are positively correlated with the possibility of rain, yet carrying umbrellas does not cause it to rain.
33. See Loren J. Chapman and Jean P. Chapman, “Genesis of popular but erroneous psychodiagnostic observations,” Journal of Abnormal Psychology 72 (1967): 193-204; Loren J. Chapman and Jean P. Chapman, “Illusory correlation as an obstacle to the use of valid psychodiagnostic signs,” Journal of Abnormal Psychology 74 (1969); and Dennis L. Jennings, Teresa M. Amabile, and Lee Ross, “Informal covariation assessment: Data-based versus theory-based judgments,” in Judgment under Uncertainty: Heuristics and Biases, ed. Daniel Kahneman and Amos Tversky (Cambridge, 1982).
34. Richard J. Heuer, Psychology of Intelligence Analysis (Center for the Study of Intelligence, 1999), 144-45.
35. Irving L. Janis, Groupthink: Psychological Studies of Policy Decisions and Fiascoes, 2d ed. (Boston, MA: Houghton Mifflin, 1982). I am indebted to Major Robert Meine, U.S. Army, for his comments on this article. He noted that the Army is particularly vulnerable to the effects of groupthink given our rank structure, deference to authority, and organizational structure.
36. Heuer, ch. 8. The military has named this process “red teaming.”
37. This problem was a variation of Kahneman and Tversky's famous taxicab experiment in Judgment under Uncertainty: Heuristics and Biases, ed. Daniel Kahneman and Amos Tversky (New York: Cambridge University Press, 1982), 156-57. It is similar to a quiz I gave during my game theory class at West Point.
38. Mathematically, this problem can be solved using Bayesian inference.
39. Some may feel that the lieutenant should err on the side of caution and assume the man is an insurgent until proven otherwise. This may save the lives of soldiers. However, in the broader context, this approach most definitely will increase the innocent man's sympathy for the insurgency (as well as his family's). In fact, he and his kin may begin to actively support or join the insurgency.
40. For more, see Maya Bar-Hillel, “The base-rate fallacy in probability judgments,” Acta Psychologica 44 (1980): 211-33; Maya Bar-Hillel, “Studies of Representativeness,” in Judgment under Uncertainty: Heuristics and Biases, ed. Daniel Kahneman, Paul Slovic, and Amos Tversky (New York: Cambridge, 1982); and Daniel Kahneman and Amos Tversky, “Evidential impact of base rates,” in Judgment under Uncertainty: Heuristics and Biases, ed. Daniel Kahneman, Paul Slovic, and Amos Tversky (New York: Cambridge, 1982).
41. See the hospital example in Daniel Kahneman and Amos Tversky, “Subjective probability: A judgment of representativeness,” Cognitive Psychology 3 (1972): 430-54.
42. See the coin example in Daniel Kahneman and Amos Tversky, “Subjective probability: A judgment of representativeness,” Cognitive Psychology 3 (1972): 430-54.
43. 0.5*0.5*0.5*0.5*0.5*0.5 = 0.015625, or 1.56 percent.
