10.10.2018

Here is a little article on political correctness that is sure to spark some lively conversation in interpersonal and intercultural communication classes.

https://www.amny.com/opinion/columnists/mike-vogel/a-new-low-for-political-correctness-outrage-1.21777884

10.03.2018

Biases in Making Choices

This is a copy of an article published in ETC: A Review of General Semantics, 73, October 2016, pp. 314-320. (The journal is a bit behind schedule, so this actually appeared in October 2018.) The purpose of the article was not to propose new theories but simply to bring some of these cognitive biases together and apply them to choice-making. And, as usual, the insights of General Semantics make a lot of confusing things a lot clearer.

Making choices is not an easy task; it regularly creates stress and regret. Everyone wants to make the right choice, or at least what they imagine the right choice might be.
The process is complicated and made less effective than it might be by a variety of cognitive biases that impair logical thinking and analysis and lead to errors of judgment, misevaluations, and bad choices. The trick is to identify the biases and to confront them with more logical, more mindful analysis.
Here we single out just five of the many biases (the ambiguity bias, the bandwagon bias, the anchoring bias, the confirmation bias, and the status quo bias), explaining how each operates, offering some examples, and proposing countermeasures you can take to help reduce their influence.
The Ambiguity Bias
All choices involve some degree of ambiguity—there is always some unknown factor that can alter the effectiveness of the choice. The ambiguity bias leads you to select the choice that is least ambiguous, the choice that is most certain (Ellsberg, 1961, 2001; Chew, Ebstein, & Zhong, 2012).
For example, assume you're buying a used car and trying to decide between two very similar cars: one has 70,000 miles, and the other has a broken odometer and so may have fewer or more miles than that. All other things being equal, you'd be more likely to choose the one with the known mileage, even if it's high.
Consider a student selecting a course taught by one of two professors: one has average ratings on RateMyProfessors.com and the other is unrated. It's likely that the student would select the professor who has a rating, even though it's just average.
Or consider selecting a restaurant from the reviews on Yelp. Of the nearby choices, one has a 3-star rating and the other is unrated. More than likely you'd choose the 3-star restaurant.
In all of these cases, you figure that the known average (or even below average) is better than the unknown, which could be a lot worse. Of course, if the professor's rating is horrible and the restaurant is given no stars, you'd likely go with the unknown. But when things are within an acceptable range, you'd likely choose the known.
It’s interesting to note in this connection that cultures differ widely in their tolerance of ambiguity (Hofstede, Hofstede, & Minkov, 2010). Persons from Singapore, Jamaica, Denmark, Sweden, Hong Kong, and other high-ambiguity-tolerance cultures are relatively comfortable with uncertainty, while persons from Greece, Portugal, Guatemala, Uruguay, Belgium, and other low-ambiguity-tolerance cultures experience greater discomfort when situations are ambiguous. Students from low-ambiguity-tolerance cultures, for example, will want assignments that are clear and specific and will feel uncomfortable with ambiguous assignments; students from high-ambiguity-tolerance cultures will respond in the opposite way.
The antidote to the ambiguity bias is—when possible—to reduce the ambiguity so that all choices can be examined more objectively. When that’s not possible—as in the case of the broken odometer—you’d need to make inferences based on other factors—the wear of the tires, the condition of the seats, the dings, and so on—that might help you better estimate mileage. And the student might reduce uncertainty by sitting in on a class or two with each instructor before making a decision.
The Bandwagon Bias
The term bandwagon seems originally to have referred to the wagon that carried the band in a circus. Later, politicians used a wagon carrying a band to promote their candidates and urged voters to jump on the bandwagon in support.
As a cognitive bias, the bandwagon bias refers to the very common tendency to go with the crowd, to believe what others believe and to do what others do (Nelson, 2016). Going along with the crowd reduces the stress of choice making. Even if you make the wrong choice, you have lots of company, and that’s comforting. As business leader Lei Jun put it: “Things get much easier if one jumps on the bandwagon of existing trends.” But that’s obviously not always the most effective choice.
And it’s not just the majority that influences us; it’s also the attractiveness of the other people. This attractiveness bias leads you to follow and to be influenced by those you consider attractive. That is, you’re more likely to believe and act in the way attractive people do than in the way unattractive people do. It’s one of the reasons that clothing models are all above average in attractiveness. It’s also one of the reasons that political polls are so important and so influential; you want to vote for and support a winner, not a loser, and so polls may do more to influence voter preferences than to report them. In groups, this tendency is referred to as groupthink, the tendency to agree with the majority and not to voice opposition (Janis, 1983; Richmond, McCroskey, & McCroskey, 2005).
The antidote here is to recognize that the majority—however attractive—can be wrong. “Even when the experts all agree,” noted Bertrand Russell, “they may well be mistaken.” A good example is the bystander bias, in which a group of people, even while witnessing a crime, may do nothing (Darley & Latané, 1968)—a bias first articulated after a crowd of onlookers reportedly did nothing while a woman was being murdered. More important, perhaps, is to recognize that other people are not you. They have different needs, wants, motivations, objectives, talents, and so on. What is right for them—however many or however attractive—is not necessarily right for you.
The Anchoring Bias
The anchoring bias, as you can guess from its name, leads you to focus on just one piece of information about the choices and to ignore or downplay other factors. It anchors your thinking to one aspect, usually the first impression (McElhany, 2016; Dean, 2013).
Often it takes the form of anchoring decisions to what you’re used to or what you expect based on past experience. For example, let’s say you’re a college student who has worked at a variety of fast-food restaurants for $13 an hour. Now you’re offered a job at $14 an hour, and you take it because you’ve anchored your pay at $13 and the extra dollar seems reason enough to take the new job.
The anchoring bias leads you to evaluate your choices according to some baseline. For example, in buying a house, the anchoring figure is the asking price and it is around this asking price that negotiations will take place. The same is true for a car; the sticker price is the baseline; it’s the anchor.
The antidote here is to examine the negatives of your focus and the positives of the other aspects of the choice. Recognizing that the anchor is in fact an anchor, and that it is getting in the way of your impartial examination of other choices, is likely to help reduce the effect of this bias. And, of course, you can always try to reset the anchor: “I’m looking for a BMW 5 for X-amount; what can you do for me?”
The Confirmation Bias
Umberto Eco once wrote that “followers of the occult believe in only what they already know, and in those things that confirm what they have already learned.” In reality, most people, not just followers of the occult, seek confirmation for their beliefs; they operate with the confirmation bias, a bias that leads you (sometimes consciously and sometimes unconsciously) to seek confirmation of your selected choice (Cherry, 2017). Whatever choice you want to be right, you will seek reasons why it is the right choice and why the other possible choices are not as good. It leads you, in fact, to seek out confirming evidence and to ignore evidence that is disconfirming.
And so, if you decide you want to marry Chris, you’ll look for reasons why Chris would be an appropriate life partner. If your preconception is that German cars are the best, you’ll focus on evidence that supports this preconception—advertisements for Audi and Mercedes or talks with those who are pleased with their German cars. At the same time, you’d avoid ads for Lexus and Hyundai and dismiss or minimize any positive reactions from those who like cars that are not German.
The problem with the confirmation bias is that it leads you to limit your information search to confirming evidence and to avoid evidence that would disconfirm your preconceived choice. And, of course, it is exactly this disconfirming evidence that can lead to a more careful, thorough, and unbiased analysis.
The antidote here is to analyze your choice by actively seeking disconfirming evidence. If you decide to buy a BMW, look for evidence against this choice; you likely already have evidence for why you should buy it, and you need to balance that evidence with other evidence, especially contradictory evidence. One simple way to do this is to read positive reviews of other choices and negative reviews of your selected choice. Listing—literally making a list of—the negative aspects of your choice and the positive aspects of other choices can help balance the evaluation. Taken too far, however, this strategy will keep you from ever making a choice at all.
The Status Quo Bias
The status quo bias leads you to make decisions that essentially retain what you already have (Samuelson & Zeckhauser, 1988; Henderson, 2016). It leads you to choose the familiar over the unfamiliar. The well-known adage—better the devil you know than the devil you don’t—captures the status quo bias and also echoes the ambiguity bias, in which you prefer the known to the unknown.
In many cases, you make a decision to not make a decision or not to change and simply stick with what you have. The emotional advantage here is that not making a choice doesn’t seem like making a choice and so it enables you (1) to avoid the stress of making a choice and (2) to avoid the regret that follows many choices.
Not surprisingly, you’re more likely to want to remain with the status quo when there is an overabundance of choices. When there are too many possible choices, the entire process can appear too confusing and too complex. And here the status quo feels a lot more comfortable than going through the process of examining all these potential choices and, perhaps, making a mistake in the process.
Examples of the status quo bias are all around us: retaining your current insurance without looking for less expensive policies, or simply renewing your cell phone plan without examining the alternatives. The status quo bias also seems a likely reason why so many unhappy couples stay together; it’s easier to remain with the status quo and endure the unhappiness.
The status quo bias is closely related to another natural tendency, the risk avoidance bias: you want to avoid risk. Because you’re probably like most people and risk averse, staying with the status quo enables you to avoid losing something—a partner, money, a job, for example. Even though a change may well bring additional benefits, it may also lead you to incur a loss, and losing (say, money) is more distressing than gaining money is enjoyable (Ellsberg, 1961, 2001). The potential benefits of a decision to change are seen as less consequential than the potential downsides of a decision to change that turns out to be a poor one.
Another closely related bias is the omission bias: the perceived negative impact of a wrong choice is less if it was a choice to do nothing and greater if it was a choice to do something, that is, to change the status quo (Schwartz, 2016). So, for example, the fear of a child becoming ill leads many parents not to have their children vaccinated against any of a variety of viruses. If the child does get sick, the negative impact is felt to be greater if some action was taken than if no action was taken. One of the problems with this way of thinking is that the weight of the evidence is clearly on the side of doing something (that is, getting vaccinated) and against doing nothing (that is, not getting vaccinated). Yet the omission bias persists (Ritov & Baron, 1990).
Recognizing that you may be influenced by this bias—bringing it to a mindful state—will help reduce its effects. Perhaps it will also help to recall Ronald Reagan’s observation that “status quo, you know, is Latin for ‘the mess we’re in’.” Or, equally appropriate, is leadership theorist Warren Bennis’ claim that “the manager accepts the status quo, the leader challenges it.” Visualizing what things would be like if changes were made may also prove of value.

All of these cognitive biases are based on faulty assumptions. So:
• To combat the ambiguity bias: Don’t assume you know or can know everything. Everything is to some degree ambiguous and the goal should be to reduce uncertainty as much as possible.
• To combat the bandwagon bias (and its related attractiveness bias and bystander bias): Don’t assume that the majority is always right.
• To combat the anchoring bias: Don’t assume that what comes first is necessarily the most important.
• To combat the confirmation bias: Don’t assume that your beliefs aren’t getting in the way of logical analysis.
• To combat the status quo bias (and its related risk avoidance bias and omission bias): Don’t assume that doing nothing is necessarily the best choice or that taking risks is necessarily a bad choice.
References
Cherry, K. (2017). What is a confirmation bias? Retrieved February 3, 2018 from https://www.verywell.com/what-is-a-confirmation-bias-2795024

Chew, S. H., Ebstein, R. P., & Zhong, S. (2012). Ambiguity aversion and familiarity bias: Evidence from behavioral and gene association studies. Journal of Risk and Uncertainty, 44, 1-18.

Darley, J. M., & Latané, B. (1968). Bystander intervention in emergencies: Diffusion of responsibility. Journal of Personality and Social Psychology, 8, 377-383.

Dean, J. (2013). Anchoring effect: How the mind is biased by first impressions. Psyblog. Retrieved February 3, 2018 from http://www.spring.org.uk/2013/05/the-anchoring-effect-how-the-mind-is-biased-by-first-impressions.php.

Ellsberg, D. (1961). Risk, ambiguity, and the Savage axioms. Quarterly Journal of Economics, 75(4), 643-669. doi:10.2307/1884324

Ellsberg, D. (2001). Risk, ambiguity, and decision. New York: Taylor & Francis.

Henderson, R. (2016). How powerful is status quo bias? Psychology Today. Retrieved February 3, 2018 from https://www.psychologytoday.com/blog/after-service/201609/how-powerful-is-status-quo-bias

Hofstede, G., Hofstede, G. J., & Minkov, M. (2010). Cultures and organizations: Software of the mind (3rd ed.). New York: McGraw-Hill.

Janis, I. (1983). Victims of groupthink: A psychological study of foreign policy decisions and fiascoes (2nd ed.). Boston, MA: Houghton Mifflin.

McElhany, R. (2016). The effects of anchoring bias on human behavior. Retrieved February 3, 2018 from https://www.sagu.edu/thoughthub/the-affects-of-anchoring-bias-on-human-behavior.

Nelson, K. (2016). Bandwagon effect—cognitive biases (Pt. 8). Retrieved February 3, 2018 from https://evolveconsciousness.org/bandwagon-effect-cognitive-biases-pt-8/.

Richmond, V. P., McCroskey, J. C., & McCroskey, L. L. (2005). Organizational communication for survival: Making work, work. Boston, MA: Allyn & Bacon.

Ritov, I., & Baron, J. (1990). Reluctance to vaccinate: Omission bias and ambiguity. Journal of Behavioral Decision Making, 3(4), 263-277. doi:10.1002/bdm.3960030404

Samuelson, W., & Zeckhauser, R. (1988). Status quo bias in decision making. Journal of Risk and Uncertainty, 1, 7-59.

Schwartz, B. (2016). The paradox of choice: Why more is less (Rev. ed.). New York: HarperCollins.


*Joseph A. DeVito is Professor Emeritus, Hunter College of the City University of New York.