Norms. People are typically conditionally cooperative, meaning that they are willing to cooperate more when they believe others contribute more. For example, students asked to donate to a university charity gave 2.3 percentage points more when told that others had given at a rate of 64% than when they were told giving rates were 46% (Frey & Meier, 2004). Hotel patrons were 26% more likely to reuse their towels when informed that most others had done the same (Goldstein, Cialdini, & Griskevicius, 2008). Households have been shown to meaningfully reduce electricity consumption when told neighbors are consuming less, both in the United States (Ayres, Raseman, & Shih, 2012) and in India (Sudarshan, 2014).
Such conditional cooperation is easily explained by the game theory model:
When others give, one can infer that one is expected to give and may be socially
sanctioned if one does not.
Strategic Ignorance. Those at high risk of contracting a sexually transmitted disease (STD) often go untested, presumably because if they knew they had the STD, they would feel morally obliged to refrain from otherwise desirable activity that risks spreading it. Why is it more reproachable to put a sexual partner at risk when one knows one has the STD than to put a sexual partner at risk by not getting tested in the first place? There is evidence that we sometimes pursue strategic ignorance and avoid information about the negative consequences our decisions have for others. When subjects are shown two options, one that is better for themselves but worse for their partners and one that is worse for themselves but better for their partners, many choose the option that is better for their partners. But when subjects must first press a button (at no cost) to reveal which option is better for their partners, many choose to remain ignorant and simply select the option that is best for themselves (Dana, Weber, & Kuang, 2007).
This quirk of our moral system is again easy to explain with the above model. Typically, information about how one's actions affect others is hard to obtain, so people cannot be blamed for not having it. When such information can be obtained easily, others may not know that it is easy to obtain and so will not punish someone who lacks it. For example, although it is trivially easy to look up charities' financial ratings on websites like charitynavigator.org, few people know this, and so few would think to negatively judge those who donate without first checking such websites. And even when others know that the information is easy to obtain, they might suspect that still others do not know this, and so they avoid punishing, since punishment would not be expected. To summarize, strategic ignorance prevents common knowledge of a violation and so is likely to go unpunished. We again emphasize that we remain lenient toward strategic ignorance even when punishment is not literally an option.
Norm of Reciprocity. We feel compelled to reciprocate favors, even if we know that the favors were done merely to elicit reciprocation and even if the favor asked in return is larger than the one initially granted (Cialdini, 2001). For instance, members of the Hare Krishna movement successfully collect donations by handing out flowers to disembarking passengers at airports, even though the passengers want nothing to do with the flowers: they walk just a few feet before discarding them in the nearest bin.