Whatever happened to empathy?

Some time ago I was listening to an interview with Alice Walker, the famous poet, on NPR. She was lamenting the loss of civility and the disintegration of empathy. She said—and I’m paraphrasing—that humanity needs to relearn how to be empathetic. We have become too self-centered and too self-absorbed. The word humanity was not a paraphrase. She used that specific word several times.

Humanity. What is that, exactly? There are two ways the term is commonly used. Sometimes it is meant to refer to all human beings taken collectively, as a synonym for the human species. Sometimes it refers to a psychological posture, a disposition toward empathy and compassion. Walker was invoking the former, while bemoaning the scarcity of the latter. Her words were compelling, and her passion was irresistible. Even the most illiberal and cynical listeners would be hard-pressed to disagree with her. Nevertheless, she was wrong. Or, more accurately, what she said, taken at face value, was logically absurd and devoid of meaningful content. Humanity cannot be empathetic. Humanity is incapable of expressing empathy, or of demonstrating any other psychological—or, for that matter, physical—attribute. Simply put, humanity is not really a thing. Humanity doesn’t exist as an actual entity, as a tangible inhabitant of the universe. Humanity is a rhetorical device.

Humanity doesn’t exist. Anywhere. It never has and never can. Humanity is one of a considerable number of commonly used abstract terms and expressions, such as “the American people” or “the global economy,” that, while they have no actual material designation in the world, nonetheless suggest something of paramount significance is being referenced. There is a particularly pernicious symbolic illusion at play here. Even as shorthand for “the human species,” or simply “humans,” humanity is invariably used in ways that are deceptive and potentially misleading. For example, it has been said that humanity has walked on the moon, when in fact only a dozen individual humans have done so. And if Walker meant to say “the human species needs to become more empathetic,” then she was speaking gibberish—unpoetic gibberish. The human species is as incapable of empathy as it is of walking on the moon. The human species is a taxonomic category, a conceptual convenience, not an actual thing in the world.

Humans, individual human beings, individual people, are actual things in the world. And, while it is true that there are plenty of individual humans who could probably use a little more empathy, there are also plenty of folks out there who ooze empathy from their pores. When Walker says that humanity has lost, or is in need of learning, empathy, what is she saying? That there are more and more people who are less and less empathetic? And, further, perhaps, that the world would be a better place if these people could add a bit more empathy into their daily thoughts and actions? Given the context of the conversation, however, it appeared that she intended something more than just this. I do remember her saying, specifically, that humanity needs to “learn” empathy. What does that mean? How can an abstraction learn? Where is this newly acquired knowledge to be housed? And, surely, she didn’t mean that all people need to learn this; such a blanket statement is entirely unwarranted. Again, there are plenty of people who are at this very moment operating at the very top end of the human niceness spectrum. I know several personally.

Perhaps some additional context might help to sort this out. Walker’s NPR interview was given in the midst of congressional hearings relating to a belligerent and misogynistic Supreme Court justice nominee who had been accused of sexual assault, and on the heels of the President of the United States openly and publicly mocking the nominee’s psychologically wounded accuser in extremely demeaning ways. The man is clearly an oaf who does not deserve to be in any leadership position, no matter how trivial (and by “the man,” I mean both the Supreme Court nominee who was, unfortunately, confirmed, and the President). The media is saturated to the very brim with similar stories, alongside stories of mass shootings paired with open disdain for any suggestions that something substantive should be done to prevent them, and a number of other clear indicators that empathy—in even the most rudimentary sense of that term—appears to be a rarefied element of contemporary society. Walker’s passionate concern is clearly justified. But, again, the dearth of empathy cannot be humanity’s doing. So, if not humanity, then what?

Human empathy is an evolved capacity, an adaptive capacity that came about because of its potent utility as a tool for maintaining group cohesion. In terms of survival value, it is second only to our inborn sense of fairness and the resulting social norms of reciprocity that are a defining feature of the anarchistic and largely egalitarian human lifestyles that predominated up until just the last few thousand years. Neither anarchism nor egalitarianism exists in the modern world. They disappeared among the civilized the moment that the civilized came into being. The elimination of these features of the social landscape is part and parcel of the civilizing process. An egalitarian civilization would not be a civilization. Anarchistic civilization is an oxymoron. Hierarchical divisions of power and sharp inequalities in access to essential resources are necessary conditions for civilization—even stronger, they are, to a large extent, what civilization ultimately is: a complex collection of mechanisms for creating and maintaining the unequal distribution of power and resources. The second you add a power differential to society, the second that people no longer have equal and unrestricted access to essential resources, is the second that empathy starts to lose its survival utility. Modern civilized society pushes empathy to the vanishing point; the degradation of empathy is a direct result of forced participation in a system based on rigged competition, a system that leads to perpetually expanding chasms of inequality, a system that rewards selfishness and overtly punishes empathetic behavior.

Walker is right about the shortage of empathy. But she is wrong to blame humanity or to suggest that humanity can play any role whatsoever in a possible solution. Even if we allow the slippery non-thing of humanity to mean something concrete, humanity has nothing to do with the lack of empathy in the world because the situations Walker is responding to have nothing to do with actual humans. They have to do with the operative design of civilized society, with the complex collection of bureaucratically structured systems of power that are being forcefully imposed on people. Humanity (whatever that really means) isn’t the problem. The problem is global corporate consumer society itself. The problem is civilization.

The stale bread of progress

I remember my first egg salad sandwich. I’m not sure how old I was, perhaps only four. It was a plain, unadulterated sandwich. No pickles. No lettuce. Just hardboiled egg smashed up with cheap mayonnaise inside a single folded slice of margarine-smeared bread—moist, somewhat sweet, gluten-rich, additive-laced, snow-white Wonder Bread. I was an instant fan. And, in retrospect, it wasn’t so much the egg salad itself that I liked, but the way that the margarine-egg salad combination enhanced the spongy-sweetness of the over-processed bread.

Bread was a staple in my diet as a kid. It was a staple for most other American kids during the 1960s and 1970s as well, and probably still is. At the time of my first egg salad sandwich, the average American got upwards of 30% of their daily calories from white bread—a pound and a half per week—and I am pretty sure that my weekly intake was at least average. I don’t eat much bread these days. In fact, I actively avoid it. I regularly go entire months without eating bread in any form whatsoever. For the last eight years or so, I have been eating “Paleo.”

The Paleo diet is not really a diet in the way that word is typically used. Although it is frequently lumped in with other “fad” diets, and often conflated with the high-fat, high-protein Keto diet, the Paleo diet, in its plainest, non-commercial, non-fad form, is simply the use of ancestral lifestyles as a way of framing food choices. It differs from most other diets in that it is primarily proscriptive: a list of food categories not to eat, along with loose suggestions about what to eat instead.

A quick rule of thumb with respect to what is allowed and what is disallowed on the Paleo diet is to ask, for each menu item you are considering, “Would this, or something like this, have been available to eat 20,000 years ago?” And if the answer is “No,” then don’t eat it. The basic idea is that humans have evolved to thrive on a wide variety of food substances—we are, after all, omnivores—but the overwhelming bulk of that “evolution” occurred in a hunter-gatherer context, prior to the agricultural revolution. Agricultural products that were not available in a wild form are foreign substances according to our body’s evolved expectations, and our physical systems are not prepared to deal with them to the extent that they are prepared to deal with more authentically human foods. This rules out dairy products, all modern versions of cereal grains, most legumes, and anything assembled from artificial, factory-generated ingredients.

With all “lifestyle” diets, there are variations in strictness. Some vegetarians still eat eggs and dairy. Some avoid the eggs, and some, vegans, attempt to avoid all animal byproducts entirely. Likewise with eating Paleo. On the ultra-strict end of things, you have folks who not only avoid grains and legumes, but nightshades (tomatoes, potatoes, peppers, eggplant) as well, because most of the edible nightshades are New World foods, and would not have been available to humans in the Pleistocene; in addition, nightshades contain toxic alkaloids and lectins that have been anecdotally linked to immune system issues in sensitive individuals. I am on the opposite end of the Paleo-strictness continuum. Not only do I eat nightshades, but I also eat grains, legumes, and dairy on occasion—an occasional pizza or deli sandwich—and more-than-occasional wine. I estimate that I eat Paleo close to 90% of the time—and that holds any way you choose to measure it: by calories (90% or more come in a Paleo-approved form) or by meals (only one or two meals per week might include cheese or grains or legumes).

All diets are controversial. Sometimes the controversy has to do with differing views of the nutritional benefits. Sometimes it has to do with whether the diet is truly effective relative to its stated purpose—in all but a small minority of cases, the main purpose is weight loss. The Paleo diet has proved controversial for both of these reasons. But it is also controversial for another reason. It is controversial because the mere notion that anything humans did prior to civilization could be superior in an unqualified way to what civilized humans are doing now flies in the face of the deeply entrenched orthodoxy of human progress, a demonstrably false orthodoxy that has been with us since at least as far back as the Enlightenment.

***

Civilization is a way of life that, in many ways, runs counter to our evolved physical, social, and psychological expectations as hunter-gatherers, and the mismatch would not be possible to maintain, let alone endure, without a thorough and extremely effective system of justification, a network of beliefs—many of them clearly false—that function to validate and perpetuate the civilized status quo. To challenge these “truths” of civilization is to challenge core notions about what it means to be a human being. The “march of human progress,” the idea that human history is progressive, that human innovation and achievement are continually making things better and better, is a keystone belief. And the fact that it is easily disproven by even the most superficial objective examination of the facts means that it needs to be aggressively defended, actively and proactively.  

In July 2018, an article was published in the Proceedings of the National Academy of Sciences that got considerable media attention, as scientific articles go, about the discovery of 14,500-year-old bread at an archaeological site in Jordan. The bread that the archaeologists found was more along the lines of a tortilla, made from wild-harvested grain and the root of an aquatic plant (presumably as a binding agent), and baked in a stone oven by prehistoric people known as the Natufians. The scientific community was supposedly shocked by this finding. Why? What is surprising about a group of people who figured out how to process grass seed in a way that makes it more portable and perhaps more palatable? Imagine that instead of the Natufians we were talking about a group of people in a slum in the Philippines who figured out how to do exactly the same thing. What makes the first case surprising and the second not is that the first is a challenge to the orthodoxy of human progress. Each of the sites I found that reported on this article made a big deal about the dates involved. This “bread” happened too soon to fit easily into the standard narrative of progress. These people were uncivilized, after all. And, in an attempt to assimilate the new information into existing orthodoxy—and in an egregious misapplication of hindsight—it was speculated that the discovery of bread might have played a pivotal role in the beginnings of agriculture itself. The Natufians were living in the part of the world that would later be known as the Fertile Crescent, the place where the first verifiable large-scale agriculture occurred. The bread’s popularity might have encouraged the intentional cultivation of the grass seed it was made from.

But one of the most flagrant distortions of the findings, in a reflexive attempt to safeguard the orthodoxy of progress, can be seen in an article from the site Geek.com with the headline “Turns Out Early Humans weren’t Paleo – Ancient Bread Oven Discovered.” The article began by denigrating the Paleo diet and its followers: “Paleo dieting is trendy. In essence, its practitioners think it best, or at least try to limit themselves to foods that a few very poorly informed people think early humans ate. Anyone familiar with the bulk of the research on what ancient humans ate could tell you that the practice is silly, but the plan got another nail in the coffin earlier this week.” The folks over at Geek.com are actively antagonistic toward the Paleo diet. And, given that they bill themselves as a technology news weblog, they are clearly committed to promoting the orthodoxy of progress. The use of the pejoratives “silly” and “poorly informed” is not accidental. It reflects a strawman rhetorical strategy designed to nip any potential challenge to the orthodoxy of progress in the bud. Ignoring the fact that the Natufians were not even close to being “early humans”—humans have existed for somewhere between 250,000 and 2.5 million years, depending on how nitpicky you are with your definition of “human”—they are wrong to assume that there is any meaningful relationship between the wild-harvested einkorn grain-tuber flatbread and what ends up on the grocery shelves today, other than that both were baked in an oven. In addition, the Natufian bread meets the major criteria for a Paleo food: wild-harvested, non-domesticated, non-GMO seeds milled by hand and mixed with organic wild-harvested tubers is about as far away from Wonder Bread as it is possible to be and still serve as a vehicle for egg salad.

The danger of orthodoxy is that it works behind the scenes, framing our worldview and preventing us from detecting critical flaws in our thinking. The orthodoxy of progress crosses political boundaries, showing up in conservative rhetoric and undergirding left-leaning—progressive—ideology. Many folks in the environmental activism community have also fallen prey to its seductive siren song, believing that there are innovative regulatory or technological solutions to the problem of civilization. I cannot count the times I have heard someone say that we should adopt strategy X as a way of slowing down the increase in atmospheric carbon or reducing population growth or limiting species extinction, as if slowing an increase in something or reducing its growth or imposing limits somehow fixes things, or, at the very least, buys us more time so that progress can work its magic. Unfortunately, no matter how much you slow the increase or reduce growth, the fact that things are still increasing and growing means that they are getting progressively worse. Suppose that you had your hand on a burner that was becoming increasingly hot, causing you to suffer increasing levels of pain. Slowing down the rate at which the burner heats up is not going to make your pain go away. You need to pull your hand off the burner for that.

And as far as the Paleo diet goes, food is important, but it is only one part of an authentic human lifestyle. We are also being forced to engage in artificial “processed” behavior and to participate in unnatural forms of interpersonal interaction that leave us socially and emotionally malnourished. The folks over at Geek.com might be right. The Paleo diet might be silly, and its popularity might simply reflect a point in the natural life course of yet another consumer fad. But maybe, just maybe, if you start to eat from an undomesticated plate, you might start to wonder what it would be like to think with an undomesticated mind.

Primal unity in a willow tree

I bought my wife a weeping willow for her birthday shortly after we were married, and planted the three-foot sapling in the middle of our front yard, between the front porch and the main sidewalk. It became apparent almost immediately that my choice of location was a horrible mistake. The particular variety that I purchased can grow up to six feet in a single season, topping out at over fifty feet tall. Weeping willows are relatively “fat” trees as well, with a dripline diameter roughly equal to their height. My feeble attempts to keep the tree in check through pruning only seemed to make it grow faster, and it quickly overwhelmed our small front yard, pushing its way up and over the porch roof, and merging with the basswood and linden trees in the parking strip, shading the sidewalk beneath a dark arboreal tunnel that forced the neighbor kids to duck low as they rode through on their bicycles. One evening, and for no reason other than spontaneous impulse, I took a chainsaw and cut the tree down.

For years afterward, the residual roots would send up shoots that had to be periodically hacked into submission. One spring, I planted a cutting from one of these volunteer shoots out by the river at the far end of the back yard, a spot where it could spread out and express its true nature—where I probably should have planted the tree in the first place. The cutting grew fast. Extremely fast. A decade later, when we were forced by circumstance to move to a different state, the tree was well on its way to its fifty-foot height expectancy, and had assumed the iconic weeping willow shape, with a broad umbrella of limbs and branches draped with leafy stems cascading to the ground.

When we moved, I couldn’t bear to leave the tree behind. It was originally a gift to my wife, after all. So, I rooted a few cuttings and packed them into large pots, and wedged them into the back of the U-Haul. Now, two years later, one of those cuttings sits in front of me as I write this, a healthy but awkward spider of willow switches splayed in the sun on the deck of our second-floor apartment, waiting patiently for the day when it can cast a broad wispy shadow across some future backyard.

There is something strange in this. When I look at the tree in the pot on my deck, I can’t help but see it as the same tree that I planted twenty years ago. Not a piece or a part of the same tree, not a permutation or the generational offspring of the same tree, but the original tree itself. The tree that I chopped down and the tree that I planted in the back yard—the tree that likely still stands there—and the tree in the pot on my deck, despite the fact that they occupy different physical spaces and different—some overlapping—moments in time, are all one and the same organic manifestation, one and the same being, one and the same tree.

Such a thing seems ridiculous from an analytical perspective, from the perspective of clear-headed material objectivity that holds that the world is populated with independent entities and objects separated by borders and boundaries from other independent entities and objects. But I can’t shake the perception. It is too fundamental, too primal, too deeply rooted, as it were. And it is a perception, immediate and present, not an afterthought or a product of reflection, not something superadded after the fact. It is on par with, and as unshakable as, my experience of my arms as extensions of my own body. When I look at the tree in the pot, I see it in my mind’s eye extending across time and space in a way that fuses the multiple, historically separate plants into a single willow tree, one tree in multiple places, one tree emerging in memory as multiple forms, multiple faces of the same living being.

This reminds me of something I read once about an indigenous perspective regarding animals, in which the deer killed today is the same deer that was killed yesterday—although they are unique, distinct, and separate occurrences, they are both identical manifestations of the same, unitary deer-being. My experience of the tree is something like that: a strong sense of a primal unity expressing itself in superficial multiplicity, a single tree-process stretching across time and space, capable of taking on a multitude of transient local forms.