Action bias |
Sometimes people have an impulse to act in order to gain a sense of control over a situation and eliminate a problem (Patt & Zeckhauser, 2000). For example, a person may opt for medical treatment rather than a no-treatment alternative, even though clinical trials have not supported the treatment’s effectiveness. |
Affect heuristic |
This represents a reliance on good or bad feelings experienced in relation to a stimulus. Affect-based evaluations are quick, automatic, and rooted in experiential thought that is activated prior to reflective judgments (see dual-system theory). For example, experiential judgments are evident when people are influenced by risks framed in terms of counts (e.g. “of every 100 patients similar to Mr. Jones, 10 are estimated to commit an act of violence”) more than an abstract but equivalent probability frame (e.g. “Patients similar to Mr. Jones are estimated to have a 10% chance of committing an act of violence to others”).
Affect-based judgments are more pronounced when people do not have the resources or time to reflect. See Slovic, 2000. |
Ambiguity effect |
The tendency to avoid options for which missing information makes the probability seem “unknown.” |
Anchoring or focalism |
The tendency to rely too heavily on, or “anchor” to, one trait or piece of information when making decisions (usually the first piece of information acquired on that subject), even when that information is irrelevant. |
Anthropomorphism or personification |
The tendency to characterize animals, objects, and abstract concepts as possessing human-like traits, emotions, and intentions. |
Altruism |
When people make sacrifices to benefit others without expecting a personal reward (Rushton, 1984). While altruism focuses on sacrifices made to benefit others, similar concepts explore making sacrifices to ensure fairness (see inequity aversion and social preferences). |
Anticipated regret |
The feeling we experience when, with only a moment to take a certain action, we can’t stop imagining how we’ll feel if we don’t do it. |
Association bias |
The way our minds connect unrelated things because of coincidence (such as when we blame the messengers of bad news). |
Attentional bias |
The tendency of our perception to be affected by our recurring thoughts. |
Attribute scaling |
Manipulating a metric or scale to magnify or minimize a perceived magnitude. |
Auditory Encoding Price Perception Effect |
We verbally encode written numbers and that affects our perception of how expensive a product is. See “The effects of auditory representation encoding on price magnitude perceptions” – Coulter, Choi, Monroe (2012) |
Automation bias |
The tendency to depend excessively on automated systems which can lead to erroneous automated information overriding correct decisions. |
Availability heuristic |
The tendency to overestimate the likelihood of events with greater “availability” in memory, which can be influenced by how recent the memories are or how unusual or emotionally charged they may be. Closely linked to vividness. |
Availability cascade |
A self-reinforcing process in which a collective belief gains more and more plausibility through its increasing repetition in public discourse (or “repeat something long enough and it will become true”). [Cass Sunstein notes one way to counteract it is to say, “You’re more likely to marry a movie star than have X happen to you.”] |
Aversion to extremes |
The tendency to avoid extremes, whether too risky or too safe. An investment that is too risky triggers our loss aversion; one that is too safe gives us a bit of FOMO. The same applies to prices that are too expensive or too cheap, which is why restaurants design their menus the way they do. |
Backfire effect |
The reaction to disconfirming evidence by strengthening one’s previous beliefs. cf. Continued influence effect. |
Bandwagon effect |
The tendency to do (or believe) things because many other people do (or believe) the same. Related to groupthink and herd behavior. |
Bait & switch heuristic |
When we substitute an easy question like “Does this person look knowledgeable?” for a difficult question, “Does this person actually know something about what I need?” (Phil Tetlock) |
Base rate fallacy or Base rate neglect |
The tendency to ignore base rate information (generic, general information) and focus on specific information (information only pertaining to a certain case). |
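The classic demonstration is a diagnostic test: when the base rate is low, even an accurate test produces mostly false positives, yet people judge from the test’s accuracy alone. A minimal Bayes’ rule sketch in Python (the 1% prevalence and 95%/5% accuracy figures are illustrative, not from the source):

```python
# Bayes' rule: P(disease | positive test), with an illustrative
# low base rate and a fairly accurate test.
base_rate = 0.01        # P(disease): 1% of the population
sensitivity = 0.95      # P(positive | disease)
false_positive = 0.05   # P(positive | no disease)

# Total probability of testing positive, sick or not.
p_positive = sensitivity * base_rate + false_positive * (1 - base_rate)

# Posterior probability of actually having the disease.
p_disease_given_positive = sensitivity * base_rate / p_positive

# Despite the "95% accurate" test, the posterior is only about 16%:
print(round(p_disease_given_positive, 3))  # → 0.161
```

Ignoring the 1% base rate and answering “95%” is exactly the fallacy the entry describes.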
Belief bias |
An effect where someone’s evaluation of the logical strength of an argument is biased by the believability of the conclusion. |
Ben Franklin effect |
A person who has performed a favor for someone is more likely to do another favor for that person than they would be if they had received a favor from that person. |
Berkson’s paradox |
The tendency to misinterpret statistical experiments involving conditional probabilities. |
Bias blind spot |
The tendency to see oneself as less biased than other people, or to be able to identify more cognitive biases in others than in oneself. |
Bounded rationality |
A concept proposed by Herbert Simon that challenges the notion of human rationality as implied by the concept of homo economicus. There are limits to our thinking capacity, available information, and time (Simon, 1982). It is similar to the social-psychological concept that describes people as “cognitive misers” (Fiske & Taylor, 1991). (See also satisficing.) |
Cheerleader effect |
The tendency for people to appear more attractive in a group than in isolation. |
Choice architecture |
Presenting choices in a leading way but still allowing for the freedom of choice. |
Choice overload |
Also referred to as ‘overchoice’, the phenomenon of choice overload occurs as a result of too many choices being available to consumers. Choice overload may refer to either choice attributes or alternatives. The greater the number or complexity of choices offered, the more likely a consumer will apply heuristics. Overchoice has been associated with unhappiness (Schwartz, 2004), decision fatigue, going with the default option, as well as choice deferral—avoiding making a decision altogether, such as not buying a product (Iyengar & Lepper, 2000). |
Choice-supportive bias |
The tendency to remember one’s choices as better than they were. |
Clustering illusion |
The tendency to overestimate the importance of small runs, streaks, or clusters in large samples of random data (that is, seeing phantom patterns). It also addresses why we see faces in clouds but not clouds in faces – we humanize things that aren’t really there. |
Cobra effect |
Setting an incentive that accidentally produces the opposite result to the one intended. Also known as the Perverse Incentive. |
Cognitive dissonance |
Refers to the uncomfortable tension that can exist between two simultaneous and conflicting ideas or feelings—often as a person realizes that s/he has engaged in a behavior inconsistent with the type of person s/he would like to be, or be seen publicly to be. According to the theory, people are motivated to reduce this tension by changing their attitudes, beliefs, or actions. (Festinger, 1957) |
Commitment bias |
The tendency to remain consistent with what we have already done or said we will do in the past, particularly if public. Inconsistency is not a desirable trait; thus, people try to keep their promises and appear consistent. |
Commitment device |
A way to lock yourself into following a plan of action that you might not want to follow but know is good for you. In other words, a commitment device is a way to give yourself a reward or punishment to make an empty promise stronger and more believable. |
Confirmation bias |
The tendency to search for, interpret, focus on and remember information in a way that confirms one’s preconceptions. |
Congruence bias |
The tendency to test hypotheses exclusively through direct testing, instead of testing possible alternative hypotheses. |
Conjunction fallacy |
The tendency to assume that specific conditions are more probable than general ones. |
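The underlying rule is that a conjunction can never be more probable than either of its parts, as in Tversky and Kahneman’s famous “Linda the bank teller” problem. A small simulation sketch in Python (the attribute probabilities are arbitrary illustrations):

```python
import random

random.seed(1)

# Draw random "people" with two independent binary attributes and
# compare the frequency of the conjunction with each attribute alone.
n = 100_000
bank_teller = [random.random() < 0.10 for _ in range(n)]
feminist = [random.random() < 0.30 for _ in range(n)]

p_teller = sum(bank_teller) / n
p_both = sum(t and f for t, f in zip(bank_teller, feminist)) / n

# P(teller AND feminist) can never exceed P(teller) alone, yet people
# often rate the more specific conjunction as more probable.
assert p_both <= p_teller
print(p_teller, p_both)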
Conservatism (belief revision) |
The tendency to revise one’s belief insufficiently when presented with new evidence. |
Continuation probability signaling |
Where we treat each other as if our relationship has a future. |
Continued influence effect |
The tendency to believe previously learned misinformation even after it has been corrected. Misinformation can still influence inferences one generates after a correction has occurred. Also see Backfire effect |
Contrast effect |
The enhancement or reduction of a certain perception’s stimuli when compared with a recently observed, contrasting object. If an average looking young adult goes out looking for a date with a supermodel friend, the contrast will work against them. Related to the Cornsweet illusion. |
Control premium |
Refers to people’s willingness to forego potential rewards in order to retain control of (avoid delegating) their own payoffs. |
Correlation neglect |
According to Colin Camerer, it is “the idea that people neglect how one information source can influence different people and get mistakenly double-counted. It is a reduced-form concept trying to capture something much more basic, which is why and when representations of information structure are simplified. To be clear, I think this kind of error can occur, but I don’t think it is a fundamental major construct that is going to account for a lot of different effects. Twenty years from now people will look back and reminisce about it, as they reminisce about fashion fads.” |
Correspondence bias |
When we fail to account for the difference in the informational sources. For example, if one person took an easier exam and scored higher than another person taking a harder exam, people tend to judge the person taking the easier exam as more knowledgeable (despite the fact that they know that the test is easier). F. Gino – Sidetracked |
Courtesy bias |
The tendency to give an opinion that is more socially correct than one’s true opinion, to avoid offending anyone. |
Curse of knowledge |
When better-informed people find it extremely difficult to think about problems from the perspective of lesser-informed people. |
Declinism |
The belief that a society or institution is tending towards decline. Particularly, it is the predisposition to view the past favorably (rosy retrospection) and future negatively. |
Decision fatigue |
There are psychological costs to making decisions. Since choosing can be difficult and requires effort, just like any other activity, long sessions of decision making can lead to poor choices. Similar to other activities that consume resources required for executive functions, decision fatigue is reflected in self-regulation, such as a diminished ability to exercise self-control (Vohs et al., 2008). (See also choice overload and ego depletion.) |
Decision staging |
When people make complex or long decisions, such as buying a car, they tend to explore their options successively. This involves deciding what information to focus on, as well as choices between attributes and alternatives. |
Decoy effect |
Preferences for either option A or B change in favor of option B when option C is presented, which is like option B but in no way better. |
Default |
Default options are pre-set courses of action that take effect if nothing is specified by the decision maker (Thaler & Sunstein, 2008), and setting defaults is an effective tool in choice architecture when there is inertia or uncertainty in decision making (Samson, 2014). |
Denomination effect |
The tendency to spend more money when it is denominated in small amounts (e.g., coins) rather than large amounts (e.g., bills). We’re less likely to break a $50 than a $5 bill. |
Diderot effect |
The way goods are purchased to become part of a complementary mosaic of identity, such that the introduction of a new item that deviates from that identity causes a spiral of consumption in an attempt to forge a new cohesive whole. This helps explain why some lottery winners end up destitute after they begin re-creating their lives in a luxurious manner. |
Discount-Distance Congruity Effect |
Numerical difference perceptions may be influenced by the physical distance between two prices. When it comes to numeric discounts, it’s the opposite of the Cornsweet Effect and demonstrates that discounts further apart are more effective at conveying value than those that are closer together. See “The effects of physical distance between regular and sale prices on numerical difference perceptions” – Coulter/Norberg 2008 |
Disposition effect |
The tendency to sell an asset that has appreciated in value and resist selling an asset that has declined in value. |
Distinction bias |
The tendency to view two options as more dissimilar when evaluating them simultaneously than when evaluating them separately. |
Diversification bias |
People seek more variety when they choose multiple items for future consumption simultaneously than when they make choices sequentially, i.e. on an ‘in the moment’ basis. Diversification is non-optimal when people overestimate their need for diversity (Read & Loewenstein,1995). In other words, sequential choices lead to greater experienced utility. For example, before going on vacation I may upload classical, rock and pop music to my MP3 player, but on the actual trip I may mostly end up listening to my favorite rock music. (See also projection bias.) |
Dual-system theory |
Dual-system models of the human mind contrast automatic, fast, and non-conscious (System 1) with controlled, slow, and conscious (System 2) thinking. Many heuristics and cognitive biases studied by behavioral economists are the result of intuitions, impressions, or automatic thoughts generated by System 1 (Kahneman, 2011). Factors that make System 1’s processes more dominant in decision making include cognitive busyness, distraction, time pressure, and positive mood, while System 2’s processes tend to be enhanced when the decision involves an important object, has heightened personal relevance, and when the decision maker is held accountable by others (Samson & Voyer, 2012; Samson & Voyer, 2014). |
Dunning–Kruger effect |
The tendency for unskilled individuals to overestimate their own ability and the tendency for experts to underestimate their own ability. That “the miscalibration of the incompetent stems from an error about the self, whereas the miscalibration of the highly competent stems from an error about others.” Also known as the Delusion of Competence. |
Duration heuristic |
The tendency to evaluate services based on their duration rather than on their content. Consumers rely on the duration heuristic because it simplifies the evaluation process. In particular, the duration heuristic is most likely to be seen when the duration of the service experience is evaluable relative to other features and when duration is considered in relation to price. (Imagine how you’d feel if the locksmith was done in 5 minutes versus 35 minutes.) |
Duration neglect |
The neglect of the duration of an episode in determining its value. |
Edison effect |
Edison believed that without testing 1,000 unsuccessful ways to build a light bulb, the requirements for building a successful one would not have become clear. |
Elimination by aspects heuristic |
When decision makers gradually reduce the number of alternatives in a choice set, starting with the aspect that they see as most significant. One cue is evaluated at a time until fewer and fewer alternatives remain in the set of available options (Tversky, 1972); for example, a consumer may first compare a number of television sets on the basis of brand, then screen size, and finally price, etc., until only one option remains. |
Empathy gap |
The tendency to underestimate the influence or strength of feelings, in either oneself or others. A hot-cold empathy gap occurs when people underestimate the influence of visceral states (e.g. being angry, in pain, or hungry) on their behavior or preferences. |
Endowment effect |
The tendency for people to demand much more to give up an object than they would be willing to pay to acquire it. |
Exaggerated expectation |
The tendency for real-world evidence to turn out less extreme than our expectations (conditionally the inverse of the conservatism bias). |
Experimenter’s or expectation bias |
The tendency for experimenters to believe, certify, and publish data that agree with their expectations for the outcome of an experiment, and to disbelieve, discard, or downgrade the corresponding weightings for data that appear to conflict with those expectations. |
Fairness |
In behavioral science, fairness refers to our social preference for equitable outcomes. This can present itself as inequity aversion, people’s tendency to dislike unequal payoffs in their own or someone else’s favor. This tendency has been documented through experimental games, such as the ultimatum, dictator, and trust games (Fehr & Schmidt, 1999). |
Fear of a better option (FOBO) |
FOBO, the younger sister of FOMO (fear of missing out), has come upon us. Both terms were coined by Patrick McGinnis, a US venture capitalist, while he was a student at Harvard. He proposes that the fear of a better option is the reason we can’t commit to anything.
FOBO as a theoretical concept indicates you’re scared something better is out there, or is going to come along, and as such you find it difficult to make a choice. You might find the choice so difficult that you are willing to postpone it, indefinitely. |
Fear of missing out (FOMO) |
Social media has enabled us to connect and interact with others, but the number of options offered to us through these channels is far greater than what we can realistically take up, due to limited time and practical constraints. The popular concept of FoMO refers to “a pervasive apprehension that others might be having rewarding experiences from which one is absent” (Przybylski et al., 2013). People suffering from FoMO have a strong desire to stay continually informed about what others are doing (see also scarcity, regret aversion, and loss aversion). |
Focusing effect/illusion |
The tendency to place too much importance on one aspect of an event. / When people are induced to believe that they “must have” a product, they greatly exaggerate the difference that a particular product will make to the quality of their life. |
Forer effect or Barnum effect |
The observation that individuals will give high accuracy ratings to descriptions of their personality that supposedly are tailored specifically for them but are in fact vague and general enough to apply to a wide range of people. This effect can provide a partial explanation for the widespread acceptance of some beliefs and practices, such as astrology, fortune telling, graphology, and some types of personality tests. |
Framing effect |
Drawing different conclusions from the same information, depending on how that information is presented. This technique was part of Tversky and Kahneman’s development of prospect theory, which framed gambles in terms of losses or gains (Kahneman & Tversky, 1979a). |
Frequency illusion |
The illusion in which a word, a name, or other thing that has recently come to one’s attention suddenly seems to appear with improbable frequency shortly afterwards (not to be confused with the recency illusion or selection bias). This illusion may explain some examples of the Baader-Meinhof Phenomenon, when someone repeatedly notices a newly learned word or phrase shortly after learning it. |
Functional fixedness |
Limits a person to using an object only in the way it is traditionally used. |
GI Joe fallacy |
The misconception that decision making is driven by knowledge. “Knowing is half the battle,” said TV cartoon hero GI Joe. |
Gambler’s fallacy |
The tendency to think that future probabilities are altered by past events when they are actually unchanged. The fallacy arises from an erroneous conceptualization of the law of large numbers. For example, “I’ve flipped heads with this coin five times consecutively, so the chance of tails coming out on the sixth flip is much greater than heads.” |
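A quick simulation makes the independence concrete: conditioning on a streak of five heads does not shift the next flip away from 50/50. A sketch in Python (the seed and trial count are arbitrary choices for illustration):

```python
import random

random.seed(42)

# Simulate coin flips; whenever the last five flips were all heads,
# record the outcome of the *next* flip.
next_flips = []
streak = 0
for _ in range(1_000_000):
    flip = random.random() < 0.5  # True = heads
    if streak >= 5:
        next_flips.append(flip)
    streak = streak + 1 if flip else 0

# The proportion of heads after a five-heads streak stays near 0.5,
# not the "much greater chance of tails" the fallacy predicts.
print(sum(next_flips) / len(next_flips))
```

Each flip is independent, so the streak carries no information about the next outcome; the law of large numbers only guarantees convergence over many future flips, not short-run correction.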
Goal gradient effect |
The increased effort we naturally put forth as we approach the end goal. |
Groupthink |
A set of negative group-level processes, including illusions of invulnerability, self-censorship, and pressures to conform, that occur when highly cohesive groups seek concurrence when making a decision. |
Habit |
Habit is an automatic and rigid pattern of behavior in specific situations, which is usually acquired through repetition and develops through associative learning (see also System 1 in dual-system theory), when actions become paired repeatedly with a context or an event (Dolan et al., 2010). |
Hanlon’s razor |
An aphorism expressed in various ways, including: “Never attribute to malice that which can be adequately explained by stupidity.” Named after Robert J. Hanlon, it is a philosophical razor which suggests a way of eliminating unlikely explanations for human behavior. (See Ockham’s razor) |
Hard–easy effect |
The tendency for confidence in judgments to be miscalibrated according to task difficulty: people tend to be overconfident on hard tasks and underconfident on easy ones. |
Hindsight bias |
Sometimes called the “I-knew-it-all-along” effect, the tendency to see past events as being predictable before they happened. |
Hofstadter’s law |
The observation that “It always takes longer than you expect, even when you take into account Hofstadter’s Law.” In other words, time estimates for how long anything will take to accomplish always fall short of the actual time required even when the time allotment is increased to compensate for the human tendency to underestimate it. |
Honesty |
In both business and our private lives, relationships are made and broken based on our trust in the other party’s honesty and reciprocity. |
Hot-hand fallacy |
The “hot-hand fallacy” (also known as the “hot hand phenomenon” or “hot hand”) is the fallacious belief that a person who has experienced success with a random event has a greater chance of further success in additional attempts. |
Hyperbolic discounting |
This is the tendency for people to have a stronger preference for more immediate payoffs relative to later payoffs. Hyperbolic discounting leads to choices that are inconsistent over time – people make choices today that their future selves would prefer not to have made, despite using the same reasoning. Also known as current moment bias, present-bias, and related to Dynamic inconsistency. |
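The time-inconsistency can be shown numerically with the commonly used hyperbolic form V = A / (1 + kD), where A is the amount, D the delay, and k a discount rate. In this sketch (the k = 0.2 rate and dollar amounts are illustrative assumptions), $100 now beats $110 tomorrow, but pushing both options 30 days out reverses the preference:

```python
def hyperbolic_value(amount, delay_days, k=0.2):
    """Present value under hyperbolic discounting: V = A / (1 + k*D)."""
    return amount / (1 + k * delay_days)

# Today: the immediate $100 beats $110 tomorrow...
assert hyperbolic_value(100, 0) > hyperbolic_value(110, 1)

# ...but shift both options 30 days into the future and the preference
# reverses, even though the gap between them is still exactly one day.
assert hyperbolic_value(100, 30) < hyperbolic_value(110, 31)
```

An exponential discounter (V = A * d**D for a fixed d) would rank the two pairs the same way, which is why hyperbolic discounting, not discounting per se, produces the inconsistency.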
Identifiable victim effect |
The tendency to respond more strongly to a single identified person at risk than to a large group of people at risk. |
Identity economics |
Identity economics describes the idea that we make economic choices based on both monetary incentives and our identity: a person’s sense of self affects economic outcomes. This was outlined in Akerlof & Kranton’s (2000) seminal paper, which expanded the standard utility function to include both pecuniary payoffs and identity in a simple game-theoretic model of behavior, further integrating psychology and sociology into economic thinking. |
IKEA effect |
The tendency for people to place a disproportionately high value on objects that they partially assembled themselves, such as furniture from IKEA, regardless of the quality of the end result. |
Incentive |
An incentive is something that motivates an individual to perform an action. It is therefore essential to the study of any economic activity. Incentives, whether they are intrinsic or extrinsic, can be effective in encouraging behavior change, such as ceasing to smoke, doing more exercise, complying with tax laws or increasing public good contributions. Traditionally the importance of intrinsic incentives was underestimated, and the focus was put on monetary ones. Monetary incentives may backfire and reduce the performance of agents or their compliance with rules (see also over-justification effect), especially when motives such as the desire to reciprocate or the desire to avoid social disapproval (see social norms) are neglected. These intrinsic motives often help to understand changes in behavior (Fehr & Falk, 2002). |
Input bias |
Where our judgments are influenced by the input of a situation. For example, a piece of work is judged to be better when we hear that the creator labored over it for a long time. F. Gino – Sidetracked |
Illusion of control |
The tendency to overestimate one’s degree of influence over other external events. |
Illusion of validity |
Belief that incremental acquired information generates additional relevant data for predictions, even when it evidently does not. |
Illusory progress to goal |
Increased effort due to the perception that we are closer to the goal than we really are. |
Illusory correlation |
Inaccurately perceiving a relationship between two unrelated events. |
Illusory truth effect |
A tendency to believe that a statement is true if it is easier to process, or if it has been stated multiple times, regardless of its actual veracity. These are specific cases of truthiness. |
Impact bias |
The tendency to overestimate the length or the intensity of the impact of future feeling states. |
Inequity aversion |
Human resistance to “unfair” outcomes is known as ‘inequity aversion’, which occurs when people prefer fairness and resist inequalities. |
Inertia |
In behavioral economics, inertia is the endurance of a stable state associated with inaction and the concept of status quo bias (Madrian & Shea 2001). In social psychology the term is sometimes also used in relation to persistence in (or commitments to) attitudes and relationships. Decision inertia is frequently counter-acted by setting defaults. |
Information avoidance |
Information avoidance in behavioral economics (Golman et al., 2017) refers to situations in which people choose not to obtain knowledge that is freely available. Active information avoidance includes physical avoidance, inattention, the biased interpretation of information (see also confirmation bias), and even some forms of forgetting. |
Information bias |
The tendency to seek information even when it cannot affect action. |
Intertemporal choice |
Intertemporal choice is a field of research concerned with the relative value people assign to payoffs at different points in time. It generally finds that people are biased towards the present (see present bias) and tend to discount the future (see time discounting and dual-self model). |
Insensitivity to sample size |
The tendency to under-expect variation in small samples. |
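Kahneman and Tversky’s hospital problem illustrates this: a small hospital records far more days on which over 60% of births are boys than a large one, simply because small samples vary more. A simulation sketch in Python (the 15/45 births-per-day figures and day count are illustrative assumptions):

```python
import random

random.seed(0)

def share_of_extreme_days(births_per_day, days=10_000, threshold=0.6):
    """Fraction of days on which more than `threshold` of births are boys,
    assuming each birth is an independent fair 50/50 draw."""
    extreme = 0
    for _ in range(days):
        boys = sum(random.random() < 0.5 for _ in range(births_per_day))
        if boys / births_per_day > threshold:
            extreme += 1
    return extreme / days

# The small hospital (15 births/day) records many more >60%-boys days
# than the large one (45 births/day): variation shrinks as samples grow.
print(share_of_extreme_days(15), share_of_extreme_days(45))
```

People who judge both hospitals equally likely to record such days are under-expecting exactly this extra variation in the smaller sample.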
Irrational escalation |
The phenomenon where people justify increased investment in a decision, based on the cumulative prior investment, despite new evidence suggesting that the decision was probably wrong. Also known as the sunk cost fallacy. |
Lollapalooza effect |
When multiple biases are acting in concert at the same time. |
Last disaster effect |
When our emergency responses to the current disaster reflect our experience from the last disaster. This varies by geography, too. Annie Duke (3/27/2020) noted that prior to the coronavirus, the biggest and most recent disasters in the United States were hurricanes; whereas in SE Asia, the most recent disasters were epidemics. |
Law of the instrument |
“If all you have is a hammer, everything looks like a nail.” |
Less-is-better effect |
The tendency to prefer a smaller set to a larger set judged separately, but not jointly. |
Look-elsewhere effect |
An apparently statistically significant observation may have actually arisen by chance because of the size of the parameter space to be searched. |
Loss aversion |
The disutility of giving up an object is greater than the utility associated with acquiring it. (see also Sunk cost effects and endowment effect). |
Maslow’s hammer |
The idea that we can be lulled into an overreliance on our most familiar tools—a bias also known as the “law of the instrument.” |
McNamara fallacy |
Relying solely on metrics in complex situations and losing sight of the bigger picture. |
Mere exposure effect |
The tendency to express undue liking for things merely because of familiarity with them. |
Money illusion |
The tendency to concentrate on the nominal value (face value) of money rather than its value in terms of purchasing power. |
Moral credential effect |
The tendency of a track record of non-prejudice to increase subsequent prejudice. |
Negativity bias or Negativity effect |
Psychological phenomenon by which humans have a greater recall of unpleasant memories compared with positive memories. (see also actor-observer bias, group attribution error, positivity effect, and negativity effect). |
Neglect of probability |
The tendency to completely disregard probability when making a decision under uncertainty. (Opposite of the overweighting of small probabilities.) |
Normalcy bias |
The refusal to plan for, or react to, a disaster which has never happened before. |
Not invented here |
Aversion to contact with or use of products, research, standards, or knowledge developed outside a group. Related to IKEA effect. |
Numerosity bias |
Epitomized by the quote from Yogi Berra: “Cut my pizza into four slices, I can’t eat eight.” Despite a difference in the expression of amount, the size of the resource (i.e., the pie) remains the same, but we don’t see it that way. The numerosity bias explains that individuals tend to over-infer quantity when it is represented with higher numeric values or bigger numbers (2,000 cents vs. $20.00). |
Observer-expectancy effect |
When a researcher expects a given result and therefore unconsciously manipulates an experiment or misinterprets data in order to find it (see also subject-expectancy effect). |
Ockham’s broom |
Attributable to Sydney Brenner, it is the principle whereby inconvenient facts are swept under the carpet in the interests of a clear interpretation of a messy reality. |
Ockham’s razor |
The principle whereby gratuitous suppositions are shaved from the interpretation of facts. Created by a Franciscan monk, William of Ockham in the fourteenth century. |
Omission bias |
The tendency to judge harmful actions as worse, or less moral, than equally harmful omissions (inactions). |
Optimism bias |
The tendency to be over-optimistic, overestimating favorable and pleasing outcomes (see also wishful thinking, valence effect, positive outcome bias). |
Ostrich effect |
Ignoring an obvious (negative) situation. |
Outcome bias |
The tendency to judge a decision by its eventual outcome instead of based on the quality of the decision at the time it was made. |
Output bias |
Where our judgments are influenced by the outcome of a situation; for example, we are more forgiving of risky operations if the results are positive. All these biases show the tendency to focus on irrelevant information when making a decision. F. Gino – Sidetracked (Also see Annie Duke: RESULTING) |
Overconfidence effect |
Excessive confidence in one’s own answers to questions. For example, for certain types of questions, answers that people rate as “99% certain” turn out to be wrong 40% of the time. Overconfidence also grows with task difficulty: the more difficult or complex the task, the more an expert overestimates his ability to complete it. Conversely, when the task gets easier, the expert still believes he can accomplish it better than an average layperson, but his advantage is not as great. |
Peltzman effect |
The reduction of predicted benefit from regulations that intend to increase safety. |
Pareidolia |
A vague and random stimulus (often an image or sound) is perceived as significant, e.g., seeing images of animals or faces in clouds, the man in the moon, and hearing non-existent hidden messages on records played in reverse. |
Pessimism bias |
The tendency for some people, especially those suffering from depression, to overestimate the likelihood of negative things happening to them. |
Planning fallacy |
The tendency to underestimate task-completion times. |
Post-purchase rationalization |
The tendency to persuade oneself through rational argument that a purchase was good value. |
Preference reversal |
The tendency for the relative preference for one option over another to change with the order or framing in which the options are presented. Preference reversals contradict the predictions of rational choice theory. |
Press secretary |
Jonathan Haidt, Daniel Dennett, and Robert Kurzban argue that the brain has a “press secretary” that, like a president’s or prime minister’s press secretary, must explain our actions to other people; “I don’t know” is simply not an acceptable answer. Many times the brain’s press secretary is correct, but other times it is just guessing, piecing together clues to give a plausible answer. The tricky thing is that we don’t know when our press secretary is using facts and when it is just guessing. |
Pro-innovation bias |
The tendency toward excessive optimism about the usefulness of an invention or innovation throughout society, often coupled with a failure to identify its limitations and weaknesses. |
Probability neglect |
The tendency to disregard probability when making a decision under uncertainty; one simple way in which people regularly violate the normative rules of decision making. Small risks are typically either neglected entirely or hugely overrated, and the continuum between these extremes is ignored. The term probability neglect was coined by Cass Sunstein. |
Projection bias |
The tendency to overestimate how much our future selves will share our current preferences, thoughts, and values, leading to suboptimal choices. |
Proportionality bias |
The desire for the magnitude of an event to match the magnitude of its cause. For example, gamblers roll dice gently to get low numbers and more forcefully to get double sixes, and the idea that JFK was the victim of a lone gunman feels intuitively less plausible than the theory that his assassination was part of a much larger plan. |
Pseudocertainty effect |
The tendency to make risk-averse choices if the expected outcome is positive but make risk-seeking choices to avoid negative outcomes. |
Pseudo set framing |
An arbitrary set manufactured for the sole purpose of creating the idea of wholeness. |
Quantification bias |
Overvaluing data we are measuring and undervaluing data we aren’t. (See Tricia Wang at Sudden Compass.) |
Reactance |
The urge to do the opposite of what someone wants you to do out of a need to resist a perceived attempt to constrain your freedom of choice (see also Reverse psychology). |
Reactive devaluation |
Devaluing proposals only because they purportedly originated with an adversary. |
Recency illusion |
The illusion that a word or language usage is a recent innovation when it is in fact long-established (see also frequency illusion). |
Regressive bias |
A certain state of mind wherein high values and high likelihoods are overestimated while low values and low likelihoods are underestimated. |
Restraint bias |
The tendency to overestimate one’s ability to show restraint in the face of temptation or to manage our self-control. |
Rhyme as reason effect |
Rhyming statements are perceived as more truthful. A famous example is the defense’s use of the phrase “If it doesn’t fit, you must acquit” in the O.J. Simpson trial. |
Risk compensation |
The tendency to take greater risks when perceived safety increases: people become more careful where they sense greater risk and less careful where they feel more protected. |
Selective perception |
The tendency for expectations to affect perception. |
Semmelweis reflex |
The tendency to reject new evidence that contradicts a paradigm. |
Sexual overperception bias / sexual under-perception bias |
The tendency to over/underestimate sexual interest of another person in oneself. |
Social comparison bias |
The tendency, when making hiring decisions, to favor potential candidates who don’t compete with one’s own particular strengths. |
Social desirability bias |
The tendency to over-report socially desirable characteristics or behaviors in oneself and under-report socially undesirable characteristics or behaviors. |
Somatic marker theory |
Emotional processes guide (or bias) behavior, particularly decision-making. “Somatic markers” are feelings in the body that are associated with emotions, such as the association of rapid heartbeat with anxiety or of nausea with disgust. |
Status quo bias |
The tendency to like things to stay relatively the same (see also loss aversion, endowment effect, and system justification). |
Stereotyping |
Expecting a member of a group to have certain characteristics without having actual information about that individual. |
Subadditivity effect |
The tendency to judge the probability of the whole to be less than the sum of the probabilities of its parts. |
Subjective validation |
The perception that something is true if a subject’s belief demands it to be true; also the assigning of perceived connections to mere coincidences. |
Survivorship bias |
Concentrating on the people or things that “survived” some process while inadvertently overlooking those that didn’t because of their lack of visibility. Similar to how people process lucky breaks in their lives as brilliant decisions rather than fortunate coincidences. |
Temporal construal |
Near-term events are valued in very concrete ways, but distant-term events are valued in very vague ways. Similar to hyperbolic discounting. |
Time-saving bias |
The tendency to underestimate the time that could be saved (or lost) when increasing (or decreasing) from a relatively low speed, and to overestimate the time that could be saved (or lost) when increasing (or decreasing) from a relatively high speed. |
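A short worked example makes the arithmetic behind this bias concrete; the function name, distance, and speeds below are illustrative choices, not drawn from the original list. Because travel time is distance divided by speed, equal speed increases save far less time at high speeds than at low ones:

```python
def minutes_saved(distance_miles, v_from_mph, v_to_mph):
    """Minutes saved over a fixed distance by raising speed from v_from to v_to."""
    return 60 * distance_miles * (1 / v_from_mph - 1 / v_to_mph)

# Over a 10-mile trip, the same +10 mph increase saves very different amounts:
low = minutes_saved(10, 40, 50)   # from a low speed: 3.0 minutes saved
high = minutes_saved(10, 80, 90)  # from a high speed: about 0.83 minutes saved
```

People tend to misjudge exactly this asymmetry, underrating the 3-minute gain and overrating the sub-minute one.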
Third-person effect |
Belief that mass-communicated media messages have a greater effect on others than on themselves. |
Triviality / Parkinson’s law of triviality |
The tendency to give disproportionate weight to trivial issues. Also known as bikeshedding, this bias explains why an organization may avoid specialized or complex subjects, such as the design of a nuclear reactor, and instead focus on something easy to grasp or rewarding to the average participant, such as the design of an adjacent bike shed. |
Unit bias |
The tendency to want to finish a given unit of a task or an item. Strong effects on the consumption of food in particular. |
Visual Depiction Effect |
We are more attracted to a product when it is shown in a way that helps us visualize ourselves using it. For example, because most people are right-handed, showing an image of a cup with the handle on the right side is most effective. See “The Visual Depiction Effect in Advertising: Facilitating Embodied Mental Simulation through Product Orientation” – Elder & Krishna (2012). |
Warm glow |
When someone feels good for helping a charity but does not pay attention to the actual return to the charity. |
Weber–Fechner law |
The difficulty of comparing small differences in large quantities: the smallest noticeable difference between two stimuli grows in proportion to their magnitude. |
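A minimal sketch of why this happens, assuming the standard logarithmic form of the law (the function name and the specific quantities are illustrative): if perceived intensity grows with the logarithm of the stimulus, the same absolute increment feels much smaller on top of a large quantity than on top of a small one.

```python
import math

def perceived_change(stimulus, delta, k=1.0):
    """Weber-Fechner sketch: perceived intensity ~ k * ln(stimulus),
    so the felt change for a fixed increment shrinks as the stimulus grows."""
    return k * (math.log(stimulus + delta) - math.log(stimulus))

# The same +10 added to 100 is easy to notice; added to 10,000 it is nearly invisible.
small_base = perceived_change(100, 10)     # ln(1.1), roughly 0.095
large_base = perceived_change(10_000, 10)  # ln(1.001), roughly 0.001
```

This is why a $10 discount feels significant on a $100 purchase but negligible on a $10,000 one.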
Well-travelled road effect |
Underestimation of the duration taken to traverse oft-traveled routes and overestimation of the duration taken to traverse less familiar routes. |
Winner’s curse |
In auction environments, the tendency of the winner to over-bid relative to the item’s true value. Related to the endowment effect. |
Zero-risk bias |
Preference for reducing a small risk to zero over a greater reduction in a larger risk. |
Zero-sum bias |
A bias whereby a situation is perceived to be like a zero-sum game (i.e., one person gains at the expense of another). |
W.O.O.P. |
We tend to focus on our ideal outcome (I want to lose weight) rather than the obstacles we’ll face to get there (pizza is delicious). Gabriele Oettingen developed the W.O.O.P. tactic to overcome such hurdles.
First you identify your WISH (losing weight) and imagine the OUTCOME (having lost weight). Then you think about a likely OBSTACLE (I love pizza) and make a concrete PLAN to get around it (avoid all pizzerias). |