Friday, June 28, 2019

Professor Thaler’s Nobel Prize Speech (2018)

Richard H. Thaler, “From Cashews to Nudges: The Evolution of Behavioral Economics.” American Economic Review 108(6): 1265–1287, June 2018.

• "In the beginning there were stories [p. 1265]." Those stories were of anomalies with respect to the standard economic model of rational choice. For instance, the "cashews" in the title refer to an incident where guests thanked then-graduate-student Richard Thaler for rendering inaccessible the pre-dinner snacks: the guests seemed to believe that the loss of the option to eat cashews made them better off.

• A key development in behavioral science was the work of Kahneman and Tversky, which indicated that departures from fully rational behavior are systematic. This non-randomness suggests that modifications of the rational model could do a better job at explaining behavior. Further, Kahneman and Tversky's prospect theory offers a simple explanation for some of these systematic departures.
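A compact way to see what a “systematic departure” looks like is the prospect theory value function, where v(x) is the value of a gain or loss x measured relative to a reference point. The sketch below uses the functional form and the parameter estimates commonly cited from Tversky and Kahneman’s 1992 follow-up work; the specific numbers are added here for illustration and are not part of Thaler’s lecture.

\[
v(x) =
\begin{cases}
x^{\alpha} & \text{if } x \ge 0 \quad \text{(gains)} \\
-\lambda\,(-x)^{\alpha} & \text{if } x < 0 \quad \text{(losses)}
\end{cases}
\qquad \alpha \approx 0.88, \quad \lambda \approx 2.25.
\]

With λ > 1, a loss looms larger than an equal-sized gain (loss aversion), and the curvature implies risk aversion over gains but risk seeking over losses – exactly the sort of non-random pattern described above.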

• Standard economic theory often is presented as a descriptive theory of how people choose as well as a normative theory of how people should choose. It is better at the latter (normative) task than at the former (descriptive) task. 

• Thaler and Shefrin propose a planner/doer model of intrapersonal (or intra-firm) conflict; the planner component of a decision maker can alter the doer's incentives by employing commitment strategies or by inducing (payoff-compromising) guilt.

• Economics typically assumes that dollars are fungible, that the best way to spend them is independent of where the dollars originated. But people assign income to different "mental accounts" according to its origin. Some income might be allocated to a mental checking account, so it is psychologically available for spending, whereas other funds might be assigned to mental savings, and therefore, are psychologically unavailable for quotidian consumer purchases. Dollars, in practice, are not fungible -- as anyone who has ever "played with house money" knows. 

• Is it fair to raise prices just because demand increases, as when a snowstorm makes snow shovels highly sought after? Many people (but not many economists!) think that such price increases are unfair, and might punish businesses that engage in such behavior. 

• The endowment effect arises when mere “ownership” of an object (like a coffee mug) seems to raise the valuation that the “owner” places on it. Endowment effects reduce owners' willingness to trade, and some of Professor Thaler's work shows that these effects are common even when decision makers operate in markets and have opportunities to learn over time.

• Even in financial markets, with their ongoing nature, high stakes, and sophisticated participants, security prices are not always right: for instance, parts of a firm can bizarrely be deemed more valuable in financial markets than is the entire firm (even though the complementary parts are not themselves of negative value). 
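A stylized bit of arithmetic (with made-up numbers, not figures from the article) makes the “parts worth more than the whole” anomaly concrete:

\[
\text{implied ``stub'' value} \;=\; \underbrace{V_{\text{parent firm}}}_{\approx \$20\text{B}} \;-\; \underbrace{V_{\text{parent's stake in a carved-out subsidiary}}}_{\approx \$25\text{B}} \;=\; -\$5\text{B} \;<\; 0.
\]

Here the market is pricing the rest of the parent firm at negative five billion dollars, even though those remaining assets cannot plausibly be worth less than zero – a violation of the law of one price that frictionless arbitrage would rule out.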

• Draft picks in the National Football League are mispriced: the best value for teams seems to lie in picks early in the second round. Those making the picks might be excessively optimistic about their ability to identify winners in the first round.

• In the standard rational choice model, nudges (aspects of the choice architecture that don't affect standard payoffs or incentives) would not have much influence on choices -- but they often do. 

• Some firms try to use nudges to take advantage of those systematic departures from rational behavior on the part of their consumers; Professor Thaler calls this nefarious sort of choice architecture "sludge."




Tuesday, June 25, 2019

Golman, Hagmann, and Loewenstein (2017) on Information Avoidance

Russell Golman, David Hagmann, and George Loewenstein, “Information Avoidance.” Journal of Economic Literature 55(1): 96–135, March 2017 [pdf].

• Why do you not want to see my outline of this article?


• Rational people in individual decision-making situations – that is, situations where information possession holds no implications for interpersonal strategic concerns – should always be happy to receive accurate information if it is freely available. But people often actively avoid acquiring such information.

• Patients avoid medical information, teachers don’t read their evaluations, and, in down markets, investors don’t check the value of their portfolios. Golman et al. define “active information avoidance” as avoiding information even when it is free and the person knows it is available.

• A studied ignorance might lend us moral wiggle room: for instance, if we “don’t know” the conditions on factory farms, we can live with our dining choices. (OK, that example is mine, not in the article.) We might want to maintain plausible deniability as a way to avoid unwelcome responsibilities.

• Even if you learn information, you might (strangely?) choose to ignore it.

• We might have protected beliefs, so we actively avoid information that might challenge those beliefs. We could go so far as to seek to silence dissenters from our views.

• Surely some information avoidance is good: if you will eat dessert anyway, perhaps it is best not to know its caloric load. Perhaps you can live a more pleasant life by not knowing of some looming medical problem, or about your partner’s infidelity.

• When new information becomes universally available, confirmation biases can lead to further polarization of opinions. New (controverting) information might even remind you of (and sustain) the reasons that you held your initial beliefs in the first place.

• Intelligence does not protect people against biased beliefs; rather, it might make it easier to justify one’s preferred beliefs.

• People can self-sabotage – undertake actions not suited to their abilities, for instance – to avoid learning the quality of their talents.

• People might avoid information because compound lotteries (that resolve over time) might best be enjoyed by learning only the final, overall outcome. (Do I want to know the score of the White Sox game in the third inning? If the Sox are behind, I will be sad, and if they are ahead, then the only new “news” I can get later occurs if they lose, and the loss will then be extra painful.)

• Sometimes people are just plain curious: they seek out information that is of no instrumental value to them.

• Humans often are mistaken in thinking that the receipt of bad news will unleash anxiety – and therefore they might be mistaken in trying to shield themselves from bad news. Sometimes the elimination of uncertainty can begin the process of adaptation to the new normal.

• Bad news attracts our attention, and so we might try to avoid potentially bad news as a method to prevent this “distraction” from occurring.

• Regret aversion might lead people to avoid information on how things would have turned out if they had taken a different path.

• People tend to be overly optimistic, and such optimism appears to hold benefits. Information might be avoided or ignored for the sake of sustaining optimism – within individuals as well as groups.

• One reason people might try to maintain existing beliefs is that those beliefs are serviceable, even if flawed – you can rebuild a leaky boat at sea only piecemeal; you can’t tear it down and start fresh.

• If you are committed to a belief, you will not want to acquire information challenging it. The belief might even be part of your self-image, your identity.  

• Information avoidance might help with self-control issues, such as side-stepping temptations or maintaining motivation. 

• Spoiler alert! People might avoid information to maintain pleasant suspense.