Category Archives: Morality

Unholy Trinities

The superstitious say that bad things come in threes, though this is probably due to the clustering illusion, cognitive bias, and an emphasis on trinities in western culture. We can only hope, pathetically, that all the blood shed over Arianism was not for nothing. I am feeling superstitious today because it has been a gloaming week here in America. It began (first) with Duck Dynasty “star” Phil Robertson giving a gruesome speech, to applause from Christians at a prayer breakfast, about the rape, killing, and torture of a hypothetical atheist family:

Two guys break into an atheist’s home. He has a little atheist wife and two little atheist daughters. Two guys break into his home and tie him up in a chair and gag him. And then they take his two daughters in front of him and rape both of them and then shoot them and they take his wife and then decapitate her head off in front of him. And then they can look at [the atheist father-husband] and say, “Isn’t it great that I don’t have to worry about being judged? Isn’t it great that there’s nothing wrong with this? There’s no right or wrong, now is it [sic] dude?” Then [they] take a sharp knife and take his manhood and hold it in front of him and say: “Wouldn’t it be something if this [sic] was something wrong with this? But you’re the one who says there is no God, there’s no right, there’s no wrong, so we’re just having fun. We’re sick in the head, have a nice day.”

Who exactly is sick in the head? Is it the Christians in the audience who applauded this hate speech or the Christians who are now defending it? Ironically, I am glad that these people — who clearly suffer from an absolute failure of moral imagination — believe in a “moral” God. Without such beliefs, they might feel free to act out these sorts of sick fantasies. This is the kind of thing that plays well in large parts of camouflage-wearing Christian America. God may yet save the South, but it has not happened so far.

Moving north to Indiana, where things are supposedly more sober, we find (second bad thing) that the “Religious Freedom Restoration Act” has been enacted. For those who did not know that religious liberty was under siege in Indiana, this may seem a bit strange. It would indeed be odd if Hoosiers, protected in their religious beliefs by the Constitution and favored in those beliefs by tax-exempt status, were being prevented from worshiping as they see fit. Needless to say, nothing of the sort was happening. What did happen is that Indiana’s ban on gay marriage was overturned last year, so horrified lawmakers in the state needed to strike back. They apparently were having nightmares about “religious” bakers, florists, and photographers being forced to do gay wedding business.

Let’s be clear about this: when we are talking about “religion” in Indiana, we are talking about Christianity. Eighty percent of all Hoosiers are Christian.* So while Christian proponents of this law talk loftily about “religious liberty,” it really has nothing to do with imperiled beliefs. For the non-sophists among us, the intent and purpose of the law are clear: it enables Indiana business owners to refuse anyone service if it would offend their Christian religious sensibilities. While Indiana’s governor appeared on national television today to assure us that the law won’t be used that way because Hoosiers are “nice” and “don’t discriminate,” this is hardly reassuring. Having just handed religionists a legal weapon, are we now to believe it will not be wielded? This is an especially pertinent question for Indiana, which has a history of not being nice.

Let us not forget that during the 1920s, Indiana was the national epicenter of the Ku Klux Klan. In 1925, thirty percent of Indiana’s white males belonged to the Klan, and the Indiana KKK had over 250,000 members (the largest of any state). That same year, over half the elected members of the Indiana General Assembly were Klan members, as were the Governor and many other high-ranking state and local officials. While some may wish to say this is long past and best forgotten, the Indiana Magazine of History instructs otherwise in its lesson plan on the subject:

As a political influence, the Klan faded quickly in Indiana, but its social and cultural influence dovetailed more subtly into Hoosier life. Klan literature capitalized on American racism, nativism, patriotism, and traditional moral and family values. Klan members targeted blacks, Catholics, and Jews, but also immigrants, political radicals, feminists, intellectuals, gamblers, bootleggers, thrill-seeking teenagers, and motion picture producers. In one sense, Indiana’s Klan was a populist organization: it engaged community interests, presented a program of action, and promised political changes.

The Klan’s message of patriotism, American superiority, and Protestant Christianity united native-born Hoosiers across many lines — gender, geography (north and south), class (white and blue collar), religious (many denominations of Protestants), and residential (urban and rural). But this populist club also propagated a negative and wicked influence. Historians have found no documentary evidence to directly link Hoosier Klan members to lynchings in Indiana, but their marches, burned crosses, brazen publications, and boycotts of community businesses evoked fear, intimidation, and lifelong trauma. Historian James Madison has observed that Indiana’s Klan “cannot be dismissed as either an aberration or as simply the insidious appeal of a fanatical few. Nor should the Klan be seen as thoroughly dominating the state and accurately reflecting racist, violent, or provincial beliefs shared for all time by all Hoosiers” (The Indiana Way, 291). Somewhere in the middle we find the meaning of the Klan in Indiana history.

Given this sordid history, with its lingering cultural legacy now making an appearance in the form of a Christian “religious freedom” law, we should justly be suspicious. One way to evaluate a law is to ask if it stands the test of different times. We should thus consider whether Indiana’s new RFRA would have been a good law during the 1920s, when the Protestant KKK was dominant in the state. How might white-Christian Hoosiers have used RFRA back then? Would they have been nice? Would they have used it to discriminate? These are of course just rhetorical questions. Hoosiers should be ashamed.

And just to show that the South and Indiana are not alone in their Christian foibles, here in Colorado we find our third event to complete the cluster. Some may have heard about the young woman in Longmont whose 34-week-old fetus was cut from her womb by a lunatic who wanted a baby of her own. Fortunately the expectant mother survived, but unfortunately the developing child did not. One of Colorado’s state legislators, Republican Gordon Klingenschmitt, linked this tragedy to biblical prophecy and claimed that the crime was committed because God is punishing America for legal abortion. Klingenschmitt, a former Navy chaplain and current Christian minister, laid out this logic himself.

God Bless and/or Curse America, but please only in clusters of threes. This was quite enough for one week.


Science of Good & Evil

Are the New Atheists “scared” and panicking? Are they “fervently vocal” because they realize that religion is not in retreat and is instead flourishing? I don’t have answers to these questions because I don’t know any New Atheists, don’t read their books, don’t listen to their podcasts, don’t attend their gatherings, and don’t pay them much mind. These don’ts derive from my assessment of New Atheism as a cultural or dialectical response to an historically particular form of western Christian religion. To combat this peculiar form and its Abrahamic relatives, New Atheists fight on a field of the theists’ choosing. Because the parameters of this debate have been established by western theists, evangelical atheists counter with a series of conceptual inversions. Ironically, this forces a mirror substitution of one metaphysics for another. While this may be well and good within the confines of the cultural and philosophical gutter, where New Atheists and their Christian opponents do loud and dirty battle, it offers little to those of us not bound by the sterile binaries of belief/unbelief and theism/atheism.

My understanding of this localized (i.e., the US/Britain) and provincial (i.e., Christians/Atheists) phenomenon owes something to John Gray, who for several years now has been scourging the New Atheists for their foibles and faults. With his latest crack of the whip over at the Guardian, Gray takes aim at Sam Harris and his dubious arguments for “scientific morality.” While Harris claims that morals can be derived from and founded on science, it seems odd that the morals he deduces match perfectly with liberal values. Surely this is no coincidence and Gray is justly skeptical. He notes that “science” (there is no such reified or unified thing) has historically been deployed on behalf of all manner of morals, many of them odious. Harris, a neuroscientist by training and polemicist by penchant, has not finally discovered the elusive philosopher’s stone which transmutes science into morals:

Following many earlier atheist ideologues, Harris wants a “scientific morality”; but whereas earlier exponents of this sort of atheism used science to prop up values everyone would now agree were illiberal, Harris takes for granted that what he calls a “science of good and evil” cannot be other than liberal in content.

Harris’s militancy in asserting these values seems to be largely a reaction to Islamist terrorism. For secular liberals of his generation, the shock of the 11 September attacks went beyond the atrocious loss of life they entailed. The effect of the attacks was to place a question mark over the belief that their values were spreading – slowly, and at times fitfully, but in the long run irresistibly – throughout the world. As society became ever more reliant on science, they had assumed, religion would inexorably decline. No doubt the process would be bumpy, and pockets of irrationality would linger on the margins of modern life; but religion would dwindle away as a factor in human conflict. The road would be long and winding. But the grand march of secular reason would continue, with more and more societies joining the modern west in marginalising religion. Someday, religious belief would be no more important than personal hobbies or ethnic cuisines.

This progressive march of science and secularism, which was never more than a minority movement found mostly in Europe, has been rudely interrupted:

Today, it’s clear that no grand march is under way. The rise of violent jihadism is only the most obvious example of a rejection of secular life…The resurgence of religion is a worldwide development. Russian Orthodoxy is stronger than it has been for over a century, while China is the scene of a reawakening of its indigenous faiths and of underground movements that could make it the largest Christian country in the world by the end of this century. Despite tentative shifts in opinion that have been hailed as evidence it is becoming less pious, the US remains massively and pervasively religious – it’s inconceivable that a professed unbeliever could become president, for example.

These are the facts, Gray asserts, which have thrown New Atheists into a panic and account for their quixotic quest to establish a “science of good and evil.” This phrase, coined by Harris, immediately arouses suspicion in anyone well-versed in Nietzsche, whose Beyond Good and Evil (1886) and Genealogy of Morals (1887) thoroughly historicized and deconstructed the contingent categories of “good” and “evil.” Alert to these issues, Gray brings them to the fore:

How could any increase in scientific knowledge validate values such as human equality and personal autonomy? The source of these values is not science. In fact, as the most widely-read atheist thinker of all time argued, these quintessential liberal values have their origins in monotheism.

The new atheists rarely mention Friedrich Nietzsche, and when they do it is usually to dismiss him…It’s impossible to read much contemporary polemic against religion without the impression that for the “new atheists” the world would be a better place if Jewish and Christian monotheism had never existed. If only the world wasn’t plagued by these troublesome God-botherers, they are always lamenting, liberal values would be so much more secure.

Awkwardly for these atheists, Nietzsche understood that modern liberalism was a secular incarnation of these religious traditions. As a classical scholar, he recognised that a mystical Greek faith in reason had shaped the cultural matrix from which modern liberalism emerged. Some ancient Stoics defended the ideal of a cosmopolitan society; but this was based in the belief that humans share in the Logos, an immortal principle of rationality that was later absorbed into the conception of God with which we are familiar. Nietzsche was clear that the chief sources of liberalism were in Jewish and Christian theism: that is why he was so bitterly hostile to these religions. He was an atheist in large part because he rejected liberal values.

While this is not an entirely accurate, and certainly not a complete, rendering of Nietzsche’s genealogical project, it’s accurate and complete enough for Gray’s well-taken point. Science, sensu lato, has some enlightening things to say about morals, or what I would call talking primate ethics. History, in my estimation, has even more enlightening things to say about the development of morals. But I’m not sure that anything Sam Harris says about the so-called “science of good and evil” is enlightening; indeed, it may be darkening.



Slavish Conscience

Over at the London Review of Books, Adam Phillips criticizes self-criticism in an essay that includes this brilliant bit:

We are never as good as we should be; and neither, it seems, are other people. A life without a so-called critical faculty would seem an idiocy: what are we, after all, but our powers of discrimination, our taste, the violence of our preferences? Self-criticism, and the self as critical, are essential to our sense, our picture, of our so-called selves. Nothing makes us more critical – more suspicious or appalled or even mildly amused – than the suggestion that we should drop all this relentless criticism, that we should be less impressed by it and start really loving ourselves. But the self-critical part of ourselves, the part that Freud calls the super-ego, has some striking deficiencies: it is remarkably narrow-minded; it has an unusually impoverished vocabulary; and it is, like all propagandists, relentlessly repetitive. It is cruelly intimidating…and it never brings us any news about ourselves. There are only ever two or three things we endlessly accuse ourselves of, and they are all too familiar; a stuck record, as we say, but in both senses – the super-ego is reiterative. It is the stuck record of the past…and it insists on diminishing us. It is, in short, unimaginative; both about morality, and about ourselves. Were we to meet this figure socially, this accusatory character, this internal critic, this unrelenting fault-finder, we would think there was something wrong with him. He would just be boring and cruel. We might think that something terrible had happened to him, that he was living in the aftermath, in the fallout, of some catastrophe. And we would be right.

Phillips is right: there is something seriously wrong with the homunculi in our heads. With Freud as his theory-master and Hamlet as ego-actor, Phillips engages with conscience, that most intractable and culturally inflected aspect of ourselves. Though Michel Foucault merits no mention in his essay, Phillips is also talking about discipline: that resolve, sometimes steely but always nagging, which seemingly arises from within but which is implanted from without. In near modernity, or in Abrahamic times and places, this conscience or discipline is the voice of God, whose state-serving accoutrements present as morals. In modernity, or in consumer-capitalist times and places, this conscience or discipline is the voice of the Market, whose state-serving accoutrements present as desires. These are the shaming and punishing voices of masters, in which case we are slaves.

— Cris



Dishonor Thy Father

From the metaphorical standpoint of selfish genes and their male human vessels, the worst possible fitness outcome is to invest in another man’s child while mistakenly believing the child carries half your genes. This view, espoused by evolutionary psychologists, receives ironic support from marriage rules and adultery sanctions promulgated by many world religions. But, as I explained in One Flew Over the Cuckold’s Nest and EP & Paternity Paranoia, this view is wrong.

In these posts, I noted that biological paternity is a non-issue for many hunter-gatherers. Among foragers, children are usually raised by large alloparenting groups in which the biological father may or may not play an important role. Thus, the identification and attribution of “father” is fluid, malleable, and often inconsistent with genetic parentage. Despite this variability, there is usually at least one person (or several) who will be identified and addressed as “father.”

It is therefore surprising to learn that an ethnic group in China’s Himalayan region, the Mosuo, takes this paternity plasticity to another level: the Mosuo do not recognize “fathers” and do not even have a word for “father.” This remarkable fact is a product of “walking marriages,” which give women the right to have overnight male visitors as they wish. These visits, which obviously may result in biological paternity, do not consequently lead to fatherhood:

Most significantly, when children are born, the father may have little or no responsibility for his offspring (in fact, some children may not even know who their father is). If a father does want to be involved with the upbringing of his children, he will bring gifts to the mother’s family, and state his intention to do so. This gives him a kind of official status within that family, but does not actually make him part of the family. Regardless of whether the father is involved or not, the child will be raised in the mother’s family, and take on her family name.

This does not mean, however, that the men get off scot-free, with no responsibilities for children. Quite the opposite, in fact. Every man will share responsibilities in caring for all children born to women within their own family, be they a sister, niece, aunt, etc. In fact, children will grow up with many “aunts” and “uncles”, as all members of the extended family share in the duties of supporting and raising the children.

Although the Mosuo are agrarian, there are strong echoes here of hunter-gatherer practices and flexibility. Though there is no historical data by which to judge the issue, this could be a cultural survival that has been adapted to new ways of life. It seems to be working for the Mosuo:

The result – as different as it may be from other systems – is a family structure which is, in fact, extremely stable. Divorce is a non-issue…there are no questions over child custody (the child belongs to the mother’s family), splitting of property (property is never shared), etc. If a parent dies, there is still a large extended family to provide care. 

So here we have another ethnographic example of a society that contradicts the standard and widely accepted stories about pair bonding and ritual marriage. Among the Mosuo, there is no “marriage” of the kind theoretically envisioned by evolutionary psychologists and doctrinally affirmed by post-Neolithic religions. With all this in mind, a more apt aphorism might be “Honor thy Alloparenting Group.” These groups, variably consisting of genetic and fictive kin, bear little resemblance to the historically derived (i.e., post-Neolithic) ideal of dyadic nuclear families.

Triptych of the Holy Kinship, Städelsches Kunstinstitut, Frankfurt, 1509


EP & Paternity Paranoia

Among the many problems that ail evolutionary psychology or EP, one of the most glaring is the field’s ignorance of hunter-gatherer ethnography. When evolutionary psychologists have finished testing their thoroughly modern and deeply acculturated subjects, they usually claim to have identified some deep-seated or hard-wired psychological propensity. With the supposedly universal trait in hand, evolutionary psychologists then explain how it would have made sense, and thus adaptively evolved, in “ancestral environments.”

Setting aside the ex post facto or just-so storytelling that usually follows, few evolutionary psychologists have intensive knowledge of hunter-gatherers. Consequently, they freely speculate about ancestral environments. While hunter-gatherers are imperfect proxies for the evolutionary past (and certainly are not static exemplars of that past), they at least provide us with constraining data.

When evolutionary psychologists ask themselves how some emotional trait or psychological propensity might have worked in ancestral environments, their first methodological step should be to evaluate or test the trait using the hunter-gatherer ethnohistoric record. The next step should be to evaluate or test the trait using primate studies, and the third should be testing with the hominin archaeological record. Because most evolutionary psychologists skip all three steps (or are largely ignorant of these three constraining datasets), they tell speculative just-so stories about “ancestral environments.”

If an allegedly universal trait or propensity (1) is not found or is not significant among hunter-gatherers, (2) is not found or is not significant among non-human primates, and (3) is not evidenced by hominin archaeology, the trait-propensity probably did not evolve as an adaptation in ancestral environments. Moreover, if we can archaeologically or historically identify places and times where the trait-propensity appears and subsequently develops, the trait-propensity probably is cultural or learned.

Eschewing this methodology, evolutionary psychologists often mistakenly identify fairly recent cultural-historical developments as “evolutionary” and “ancestral.” A classic example of this mistake is the supposed “evolutionary-biological” problem of cuckoldry. As evolutionary psychologists spin this particular story, the worst possible genetic-fitness outcome for a man is to be cuckolded and then unknowingly raise another man’s child. The horror, they (and the math) say!

But as everyone familiar with hunter-gatherer ethnography knows, paternity assurance is a non-issue in such societies. Biological fatherhood, while often known and acknowledged, is in most cases not of paramount or even primary importance. It is often the case that the mother’s brother will be the most important male figure in a child’s life, and this biological “uncle” will be called “father.” In other cases, a child may have many “fathers” consisting of “uncles” and “grandfathers.” These uncles and grandfathers may be biological, fictive, or both, and they are often the most important adult male figures in a child’s life. In still other cases, children freely circulate among group members and may be adopted by non-related adults who are then called “mother” and “father.”

There are additional variations on these themes, but the message we get from them is consistent and clear: biological paternity is not a matter of major or overriding concern. This is because “father” relationships are structured so differently in these societies. As I explained in “One Flew Over the Cuckold’s Nest,” biological fatherhood and paternity assurance became important concepts, indeed overriding concerns, only in those societies that settled down to produce food. For these societies, the phase change known as the Neolithic transition was accompanied by shifts from communal to private property, and in conjunction with private property, shifts toward patriarchy and primogeniture.

These are the historical circumstances and cultural conditions in which biological fatherhood and paternity assurance become great anxiety inducers. These concerns did not evolve in prehistoric or “ancestral” environments for “adaptive” or biological reasons. Paternity paranoia is a product of particular times and places. It is not a universal trait or genetic imperative.

With these things in mind, we can evaluate a recent Atlantic article touting a new study that “looks at the evolutionary psychology behind ideas of sexual morality.” As is often the case with EP studies, what sounds promising quickly devolves into yet another ancestral story:

We’ve evolved to consider sex, the researchers argue, as a game of finite resources. For our ancestors, multiple sexual partners meant things could get knotty when it came to proving whose kids were whose. For women who depended on men for their livelihoods (and the livelihoods of their offspring), that uncertainty meant losing out on the support of their male partners. Bad news. For men, it meant investing in the well-being of children they hadn’t necessarily fathered. Also bad news.

The connection between sexual behavior and morality, then, may have come about as a way of keeping a gender-based social order intact. “Through moralizing,” the researchers wrote, “individuals can promote behavior which serves their own personal and coalitional interests.” Back in the day, judgment was a form of defense.

The key to this story is what precisely is meant by “our ancestors” and “back in the day.” If these researchers are referring to sedentary, food-producing ancestors who developed notions of property, patriarchy, and primogeniture, then this story makes sense. It was in these societies that gender-based hierarchies were created, female dependence was encouraged, and individual — rather than group — interests came to the fore. But this makes it a provincial cultural and historical story, not a universal evolutionary and biological one.

If, however, these researchers are talking about our hunting and gathering ancestors, then this story is surely wrong. These ancestors probably had multiple sexual partners, did not worry about paternity, and did not moralize these issues.

What the study in question actually shows is that in modern or post-Neolithic societies, female dependence on males correlates strongly with moral judgments (and religious strictures) against adultery and promiscuity. While historians and anthropologists have known this for quite a long time, it’s always nice to have psychologists experimentally and statistically confirm what we already knew.



Evolution of Neolithic Hells

It is hard to know what our correspondents at the Economist were thinking when they published this coal-lump journey through all kinds of Hells on December 22, but it sure was cheeky. I’ve always been a bit partial to Jean-Paul Sartre’s rendering of hell as an exitless room filled with other people, but that’s just misanthropic me. Hells come in all shapes and sizes so there is a bit of something for everyone.

Despite all this variety, one thing is clear: hells have histories and are imagined only in food-producing (Neolithic and post-Neolithic) societies. Hells don’t exist in hunter-gatherer and similar small-scale societies. There is obviously something about settling down to live in ever larger groups that drives people to hellish distraction. It’s an ingenious displacement whereby threats of actual punishment get an assist from fears of potential punishment. Hells are of course useful constructs for getting people to internalize the disciplines required for life in larger-scale societies, but they are not indicia of intellectual progress or symbolic sophistication.

“Happy Hunting Ground” by Sophie Iremonger


Ethnohistoric Plagiarism (Wrong)

In a recent post I commented on the classic ethnohistory by George Bird Grinnell, The Cheyenne Indians: History and Society (Vol. 1, 1923) and The Cheyenne Indians: War, Ceremonies, and Religion (Vol. 2, 1928). Because the post addressed Cheyenne ethics, I quoted the following from Volume 1:

The Indian child was carefully trained, and from early childhood. Its training began before it walked, and continued through its child life. The training consisted almost wholly of advice and counsel, and the child was told to do, or warned to refrain from doing, certain things, not because they were right or wrong, but because the act or failure to act was for his advantage or disadvantage later in life. He was told that to pursue a certain course would benefit him, and for that reason was advised to follow it. His pride and ambition were appealed to, and worthy examples among living men in the tribe were pointed out for emulation.

Of abstract principles of right and wrong, as we understand them, the Indian knew nothing. He had never been told of them. The instructor of the Indian child did not attempt to entice him to do right by presenting the hope of heaven, nor to frighten him from evil by the fear of hell; instead, he pointed out that the respect and approbation of one’s fellow men were to be desired, while their condemnation and contempt were to be dreaded.

The Indian lived in public. He was constantly under the eyes of the members of his tribe, and most of his doings were known to them. As he was eager for the approval of his fellows, and greedy of their praise, so public opinion promised the reward he hoped for and threatened the punishment he feared. (pp. 103-04, 1972 Bison edition).

Last night I was reading the classic ethnohistory by Ernest Wallace and E. Adamson Hoebel, The Comanches: Lords of the South Plains (1952). Wallace was a professor of history at Texas Tech and Hoebel was a professor of anthropology at the University of Minnesota. As professors, they should not have been plagiarists. Yet it appears they were.

Let’s compare Grinnell’s Cheyenne comments (above) to Wallace and Hoebel’s Comanche comments (below):

The training of children consisted almost entirely of precept, advice, and counsel. A baby soon learned that he would not have his whims gratified by crying. He was told to do, or warned to refrain from doing, certain things, not because they were right or wrong, but because they were to his advantage or disadvantage. He was taught that he would benefit by acting in a certain way. His pride and ambition were appealed to, and worthy examples among living men in the tribe were pointed out for emulation.

Of abstract principles of right or wrong, as we understand them, the Comanches knew little or nothing. The approach to life was pragmatic. The child was not coaxed toward good with the hope of Heaven as a reward or frightened from evil by the fear of Hell as punishment. Instead, he was shown by word and example that the respect and approbation of his fellow tribesmen were to be desired, and their condemnation and contempt were to be dreaded and avoided. He saw that the men who were brave and generous were applauded and respected.

In Comanche society it was impossible to live a secluded life. The members were aware of the conduct of each person. Each was eager for the approval of his fellows and greedy for their praise, and public opinion promised the reward he hoped for and threatened the punishment he feared — lack of esteem among the People. This was their method of control. (pp. 123-24, 1986 Oklahoma edition).

Wallace and Hoebel not only plagiarized — they also failed to cite Grinnell’s book. This omission was surely deliberate, given that the professors knew that what was true of the Cheyenne may not have been true of the Comanche.

This is not good. I know that among Comanche scholars there are questions about the reliability of the Wallace-Hoebel book. This simply adds to those questions.

I find it ironic that Wallace and Hoebel chose to plagiarize a passage discussing right from wrong, or ethics.
