PURETICS...
Wednesday, May 21, 2008

The art of integrity

To paraphrase Shakespeare's Falstaff, "honor pricks us on." And although Sir John famously concludes "I'll none of it," the reality is that for most people, honor is more than a "mere scutcheon." Many colleges have honor codes, sometimes elaborated into complex systems: The list includes small colleges (e.g., Gustavus Adolphus, Haverford), large universities (e.g., the University of Virginia, Texas A&M), Ivies like Dartmouth and Princeton, sectarian institutions like Brigham Young, science-tech (Caltech) as well as liberal-arts (Reed) colleges, and, with particular solemnity, the three military academies.

The code at West Point is especially terse and predictably directive: "A cadet will not lie, cheat, or steal, nor tolerate those who do." The first "three commandments" — thou shalt not lie, cheat, or steal — speak for themselves. Of particular interest for our purposes, however, is that fourth admonition: "nor tolerate those who do." (Sure enough, Prince Hal shows himself true to this martial virtue when he eventually — and for many of us, hurtfully — turns away from Falstaff, showing that as king he disowns Fat Jack's dishonorable behavior.)

Doesn't it stand to reason that everyone would be intolerant of violators? After all, when someone lies, cheats, or steals, it hurts the rest of us while making a mockery of society itself (cue Immanuel Kant, and his categorical imperative). The "fourth commandment" should, therefore, be altogether logical and hardly need specifying. The problem for theorists — if not for the "naturally intolerant" — is that blowing the whistle on liars, cheaters, or thieves is likely to impose a cost on the whistle-blower, while everyone else benefits from her act of conscience. Why not mind your own business and let someone else do the dirty work? Isn't that why we have police: to, as the word suggests, police the behavior of others, at least in part so we don't have to do so ourselves?

A conceivable explanation is that if no one else perceives the transgression or, similarly, if no one else is willing to do anything about it, then perhaps the miscreant will get away with it, whereupon everyone — including you — will be worse off. In short, turning in a violator could be a simple act of self-aggrandizement, if the cost of doing so is less than the shared penalty of keeping silent. Another possibility, of course, is that people are indeed predisposed to ignore code violations, which is precisely why the "fourth commandment" exists — because otherwise malefactors would be tolerated. Yet another, and of particular interest to evolutionists, is that people are, at least on occasion, inclined to do things that are detrimental to their personal interests so long as their actions are sufficiently beneficial to the larger social unit.

That process, known as "group selection," has a long and checkered history in biological theory. Since natural selection should consistently reward selfish acts, how to explain the existence of morality that induces people to behave, as Bertolt Brecht puts it in The Threepenny Opera, "more kindly than we are"? These days, evolutionary explanations lean heavily on kin selection (also known as inclusive fitness theory, whereby apparent altruism at the level of bodies can actually be selfishness playing itself out at the level of genes), and on reciprocity, essentially "you scratch my back, I'll scratch yours."
But there is also the possibility that beneficent acts are biologically generated by a payoff enjoyed by the group, of which the altruist is a member. At one point in The Descent of Man and Selection in Relation to Sex, Darwin gave impetus to the group selectionists: "Although a high standard of morality gives but a slight or no advantage to each individual man and his children over the other men of the same tribe, … an increase in the number of well-endowed men and advancement in the standard of morality will certainly give an immense advantage to one tribe over another."

But just because Darwin said it doesn't make it true. The problem is that even if people "well endowed" with morality provide their "tribe" with an "immense advantage," those same people run the risk of being immensely disadvantaged within their group if such endowment equates to spending time, energy, or money on behalf of others, or running risks that help the larger social unit while hurting the altruist. As a result, although group selection — along with its companion concept, "the good of the species" — was uncritically accepted for about a century, it has been deservedly out of favor for several decades, displaced by the understanding that selection operates most effectively at the lowest possible level: individuals or, better yet, genes.

Or does it? Maybe reports of group selection's demise have been greatly exaggerated. Various mathematical models now suggest that under certain stringent conditions, selection could (at least in theory) operate at the level of groups. Some of the most promising formulations involve so-called multilevel selection, in which natural selection operates in simultaneous, nested baskets: among genes within individuals, among individuals within groups, among groups within species, and presumably among species within ecosystems, among ecosystems on the planet Earth, and (why not?) among planets in galaxies, and galaxies in the universe.

Received wisdom these days is that if a behavior is costly for the individual, it is unlikely to evolve, regardless of whether it is beneficial for the group or the species. Nonetheless, even if we grant that group selection has probably been inconsequential when it comes to changing gene frequencies, that does not mean that selection at the group level hasn't been instrumental in shaping human psychology, producing some pro-social tendencies via cultural evolution rather than its genetic counterpart.

And so we return to honor codes, violators thereof, and those who turn them in. Notably, along with the fertile mathematical modeling, there has been a flurry of experimental simulations by economists and social psychologists showing that under certain conditions, people are inclined to engage in "third-party punishment." That is, they will punish cheaters, even at distinct cost to themselves. If two people are playing, say, prisoner's dilemma, and one cheats ("defects," in game-theory terminology), the other may well defect in turn; that is the basis of the celebrated "tit for tat" strategy, which is selfish, or at least self-protective, and therefore not perplexing. Especially interesting for those of us who contemplate honor systems, however, are those third-party simulations in which an observer is given the chance to reward or punish defectors.
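For readers who want to see the mechanics behind such simulations, here is a minimal sketch (not the design of any particular study) of an iterated prisoner's dilemma with a "tit for tat" player and, optionally, a third-party observer who pays a fee to fine every defection. The payoff values are the conventional ones (temptation 5, mutual cooperation 3, mutual defection 1, sucker's payoff 0); the observer's cost and the size of the fine are purely illustrative assumptions.

# Minimal sketch of an iterated prisoner's dilemma with a "tit for tat" player,
# plus an optional third-party observer who pays a fee to fine each defection.
# Payoffs (5, 3, 1, 0) are the conventional values; the observer's cost (1)
# and the fine (3) are illustrative assumptions, not taken from any study.

COOPERATE, DEFECT = "C", "D"

# Payoff to the first (row) player; the game is symmetric.
PAYOFF = {
    (COOPERATE, COOPERATE): 3,  # mutual cooperation
    (COOPERATE, DEFECT):    0,  # sucker's payoff
    (DEFECT,    COOPERATE): 5,  # temptation to defect
    (DEFECT,    DEFECT):    1,  # mutual defection
}

def tit_for_tat(own_history, other_history):
    """Cooperate on the first move, then copy the opponent's last move."""
    return COOPERATE if not other_history else other_history[-1]

def always_defect(own_history, other_history):
    """The unrepentant cheater."""
    return DEFECT

def play(strategy_a, strategy_b, rounds=10, observer_punishes=False,
         punish_cost=1, fine=3):
    """Iterate the game; optionally let a third party fine every defection."""
    hist_a, hist_b = [], []
    score_a = score_b = observer = 0
    for _ in range(rounds):
        move_a = strategy_a(hist_a, hist_b)
        move_b = strategy_b(hist_b, hist_a)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        if observer_punishes and move_a == DEFECT:
            observer -= punish_cost  # punishing is costly to the punisher...
            score_a -= fine          # ...and costlier still to the defector
        if observer_punishes and move_b == DEFECT:
            observer -= punish_cost
            score_b -= fine
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b, observer

if __name__ == "__main__":
    # Without a punisher, then with one: the observer only ever loses points.
    print(play(tit_for_tat, always_defect))                          # (9, 14, 0)
    print(play(tit_for_tat, always_defect, observer_punishes=True))  # (-18, -16, -19)

Run it and the puzzle appears in miniature: the observer who punishes ends up strictly poorer than one who looks the other way, which is exactly why costly third-party punishment demands an explanation.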
To an extent that has surprised most biologists, third-party punishment is doled out quite enthusiastically, with the self-appointed adjudicators willingly absorbing a cost in order to police the behavior of others. Devotees of group selection (some of whom evidence an almost religious zeal, perhaps because they have been wandering in the biological wilderness since the mid-1960s) have seized on these results as demonstrating how human moral psychology may well have been shaped by an urge to benefit one's group, even at substantial personal expense.

Such behavior can also be explained, however, by what evolutionary theorists call the "three R's": reputation, reciprocation, and retribution. Be moral, and your reputation will benefit (and thus, your fitness); you might also profit from the reciprocal morality of others. And if you are seen as immoral, you run the risk of painful retaliation. A parallel suite of inducements could generate third-party punishment, including an inclination to turn in honor-code violators, even if, paradoxically, society also takes a dim view of the "snitch" or "stool pigeon."

That raises the problem of who administers the rebuke to code violators. Ideally it should be everyone, but that simply opens the opportunity for yet more defection, on the part of individuals who stand back and let others do the dirty work. One answer is for punishment to be meted out not only toward defectors, but toward anyone who refrains from punishing them. The next step, then, is to punish those who won't punish those who defect, and so on, ad infinitum.

In the close-formation battle phalanxes favored by the Roman legions, each foot soldier carried a sword in his right hand and a shield in his left. Hence, each legionnaire depended on the man alongside to provide protection via the other's shield. Desertion in battle was a capital offense, with punishment to be administered on the spot; moreover, anyone who failed to kill a deserter was himself subject to immediate death! ("A legionnaire will not run away during battle, nor tolerate those who do." Nor will he tolerate those who tolerate those who do.)

There are other responses to observing a cheater, including being more likely to cheat oneself, that may go a long way toward explaining all the officially orchestrated intolerance of cheating. Bad enough if one man breaks ranks and runs; worse yet if that induces everyone else to do the same. Consider a familiar circumstance in which the transgression, and the penalty for tolerating a transgressor, are both considerably less drastic: how difficult it is to wait at a crosswalk when all those around you are jaywalking.

The consistent results of third-party-punishment experiments — willingness or eagerness to enforce group norms, even when doing so is costly to the "enforcers" — can be seen as revealing something nasty and spiteful about people: They will go out of their way, even enduring personal financial loss, just to be mean to someone else. But in addition to this glass-half-empty interpretation, there is a half-full counterpart. The fact that people will punish a cheater in a so-called public-goods situation, even if doing so may be costly, is evidence for a kind of altruism. By maintaining social norms at their own cost, "punishers" are unpaid policemen, making a kind of selfless citizen's arrest.
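The "public-goods situation" mentioned above can likewise be made concrete. The sketch below uses entirely illustrative numbers (an endowment of 20, a multiplier of 1.6, a punishment cost of 1, a fine of 3, and a contribution threshold of 10, none drawn from any specific experiment) to show the two facts the argument turns on: free riding pays, and the punisher who fines the free rider finishes the round poorer than the cooperators she is protecting.

# Minimal sketch of one public-goods round with costly punishment, in the
# spirit of the economics experiments described above. All parameter values
# here are illustrative assumptions, not those of any particular study.

def public_goods_round(contributions, endowment=20, multiplier=1.6):
    """Each player keeps whatever she does not contribute; the common pot is
    multiplied and split equally, so contributing less than average pays."""
    share = sum(contributions) * multiplier / len(contributions)
    return [endowment - c + share for c in contributions]

def punish_free_riders(payoffs, contributions, punisher, threshold=10,
                       punish_cost=1, fine=3):
    """The punisher pays punish_cost for every low contributor she fines."""
    payoffs = list(payoffs)
    for i, contribution in enumerate(contributions):
        if i != punisher and contribution < threshold:
            payoffs[punisher] -= punish_cost  # enforcement costs the enforcer
            payoffs[i] -= fine                # and costs the free rider more
    return payoffs

if __name__ == "__main__":
    contributions = [20, 20, 20, 0]             # player 3 free rides
    before = public_goods_round(contributions)
    after = punish_free_riders(before, contributions, punisher=0)
    print(before)  # [24.0, 24.0, 24.0, 44.0]: the free rider earns the most
    print(after)   # [23.0, 24.0, 24.0, 41.0]: the punisher now earns the least

The arithmetic lines up with the half-full reading above: the punisher absorbs a small loss to uphold the norm, which is what makes her an unpaid policeman rather than merely spiteful.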
Although such behavior is admittedly rare at crosswalks, it is clear that people are readily inclined to turn on cheaters, as anyone who has ever bridled at the boor who breaks into line at a ticket window or supermarket checkout can readily attest. In those cases, and unlike jaywalking, code violations impose a disadvantage on those who wait responsibly in their queue. Yet, even here, it is tempting to swallow one's annoyance and hope that someone else will step up and enforce the norm for everyone else. There is much to be said for having such enforcers around, not only because the possibility of someone's acting on the cadet's fourth commandment probably makes cheating less likely in the first place, but also — and not least — because someone else is thereby charged with the task.