The Cheap Trick of Whiteness

A half-truth, as John Lukacs is fond of saying, is more dangerous than a lie, because the element of truth in it, speaking to our hearts and minds, can mask the accompanying falsehood. We see this in the current embrace of multiculturalism, which propagates the dangerous lie that a civilized human society can exist—whether at the level of the family, the city, or the nation—without a unifying culture.  (That, and not the claim that all cultures are equally “valid” or valuable, or even that all other cultures are more to be admired than ours, is the greatest danger posed by multiculturalism.) Despite the evident falsity of this claim (history presents no example of a lasting society without a dominant, unifying culture), the ideology of multiculturalism has flourished in the United States not because it has been imposed by political and cultural institutions, such as public schools and universities (though it has), nor because the former elites of the once-dominant culture in this country have been ill prepared to defend that culture as a unifying force (though they have), but because of the element of truth that the proponents of multiculturalism use like a katana to slice through any resistance to their destructive agenda: Diversity, like unity, is a positive good.

We do not have to draw on parallels from agriculture about the dangers of large-scale monoculture, or from genetics about the dead end of restrictive gene pools, to recognize this truth. It is not simply boredom that leads us to seek out new friends and to sample different cuisines, to learn languages other than the one we were born into and to study the history of other civilizations, or even the far-flung corners of our own. Russell Kirk argued that diversity—true diversity, not multiculturalism masquerading under that name—is a conservative principle, because (like all other true conservative principles) it is a reflection of the good, the true, and the beautiful. The Christian God Himself is a diversity in unity.

The problem, as always, is one of balance. Unity is a positive good; diversity is a positive good; but either one, taken to the extreme, destroys the other. Variety (the saying goes) is the spice of life, and sometimes a dish can become unbalanced because too little salt has been added. Yet, as any good cook knows, it is easier to destroy a dish through an overabundance of spices. Multiculturalism, as practiced in the United States, isn’t a measured dose of garlic or cumin or harissa incorporated into a hearty beef stew; it’s a cup of MSG poured on top of a Big Mac. The initial dish is toxic enough without any help from the Orient’s secret salt.

If the half-truth of multiculturalism is that diversity is a positive good, the half-truth that some opponents of multiculturalism push beyond the limit is that unity is a positive good. When unity becomes the highest value, we end up not with, say, the vibrant yet diverse Christian civilization of Europe in the Middle Ages but with the excesses of Enlightenment rationalism and (as I discussed last month) the post-Christian hypermonotheism of Islam. And among the excesses of Enlightenment rationalism is found an obsession with race as a unifying principle, among both liberals who see “whiteness” as the root of all evil, and some of their opponents who increasingly see it as the sole source and foundation of everything worth preserving.

A.D. 2015 will long be remembered here in Rockford as the year when that great “white power” band Cheap Trick (“Mommy’s all white / Daddy’s all white / They just seem a little weird”) finally received the recognition that they deserve, with the announcement that they would be inducted into the Rock and Roll Hall of Fame in Cleveland, Ohio. Bad puns aside, it is hard to imagine four men who are collectively more white than Rick Nielsen, Robin Zander, Tom Petersson, and Bun E. Carlos (Brad M. Carlson, who says that he chose his stage surname because “We sounded like a bunch of Swedes”).  Yet it is absurd to speak of Cheap Trick as a “white” band, even in the sense that it is legitimate to speak of their fellow 2016 inductees N.W.A. as a black one.  Cheap Trick’s music cannot be reduced to a product of their “genetic endowment,” or even to some generic “white culture.”  Nielsen, Zander, Petersson, and Carlos are men of a certain time and a certain place—the mid-20th to early 21st century Upper Midwest, and specifically Northern Illinois—and their music has its feeder roots here and now (and then), even if other roots run deeper. Their longevity is the result, in large part, of their continued connection to this place and to the people who make their home here. As my barber recently noted, Rockford has changed a lot since he was young, but if you’re trying to find out something about a fellow Rockfordian you’ve never met, chances are you know not just one but several people who have worked with him, eaten with him, had one too many drinks with him, or worshiped with him.

Too many use the terms patriotism and nationalism today as if they were interchangeable, but they mean radically different things, especially in the context of a nation spread across an entire continent. Patriotism not only implies a connection to a certain people but demands a mutual connection to a certain place. There may be reasons why it is hard for me, a native of West Michigan, to be a Rockford patriot even after 20 years of living here, but it is many orders of magnitude easier than being a generically American one. America is not a place; it is many places—thousands of towns and regions and 50 states, all within the bounds of a continental empire that even in its infancy was more political than cultural. (The cultural differences between the original states, and even within each state, are almost incomprehensible to those whose historical imaginations have been fed from infancy on a steady diet of Thanksgiving turkey.) This country has always had, by its very nature, an inherent diversity that nationalism at best glosses over and at worst, reflecting its roots in Enlightenment rationalism, seeks to destroy in favor of an artificial unity. The subtitle of this magazine notwithstanding, there can be no single, deep, and lasting “American culture,” but there have been and still are many American cultures, local and regional, and the stronger they are, the more likely it is that the country as a whole will manage to survive.

Fame, alas, is fleeting, and the music of Cheap Trick may not be remembered outside of Rockford a century from now, much less four centuries, but what is true of Nielsen and Zander, Petersson and Carlos is just as true of Bach and Beethoven, Brahms and Mozart. When the multiculturalists dismiss the latter as “Dead White European Males,” and some of their opponents respond by lumping them together as “White Western Christians,” both sides turn these great composers into abstractions, as if the works of each one were (absurdly) interchangeable with those of any of the others. Notre Dame de Paris, Hagia Sophia, Saint Paul’s Cathedral, and Sagrada Família were all built by Christian men of varying shades of whiteness, but the individual beauty and majesty of each edifice arises from the differences between those men and their cultures as much as it does from their underlying unity. Those who look at these churches through a monochromatic lens will never experience their full beauty—much less the fullness of truth that each represents and was built to honor. That some of those people, in fact, celebrated the blasphemous suicide of Dominique Venner in Notre Dame de Paris in May 2013 speaks volumes about what they truly worship.

Ostensibly, one of the reasons Venner chose to commit his “eminently political” act in the sanctuary at Notre Dame was to awaken the people of Europe to the dangers of Islamic immigration—a real threat that he correctly understood might spell the death of Europe as we know it. But the nations of Europe have faced this threat before, and they did not repel it through individual or mass suicide. Jan Sobieski, János Hunyadi, and Giovanni da Capestrano were all Western white Christian men named John who fought Islam, but they did not do so on behalf of the abstractions of “Europe” or “the West,” much less of “whiteness.” Each fought for the truth incarnate in his native land and people, in “the ashes of his fathers / and the temples of his gods.”

Abstractions draw man away from reality and lead him to despair; a firm grounding in reality gives man hope—or at least something that he can fight for when the odds seem overwhelming. A man, history shows us, will fight for his wife and children; for his family and friends; for his home and native land. Given time, talent, and resources, he may build things that last for generations yet to come. He may go to his grave knowing that his name may be lost to the ages within a century or two, but his presence will still be felt.

If, however, he abjures all of this, cuts his ties to his native soil (and never puts down roots anywhere else), makes few lasting friendships, chooses not to marry (or, if he marries, refuses to have children), and devotes his life instead to battles so large they cannot be won, all on behalf of an abstraction spun out of centuries of mass delusion—then such a man has not fought the enemies of civilization; he has joined forces with them.

First published in the February 2016 issue of Chronicles: A Magazine of American Culture.

A New Teleology

That which will be is to some extent the cause of that which is. What is going to happen tomorrow is already to some extent the cause of what is happening today; indeed, of what has happened yesterday. We are not merely the products of the past; we are also the creators of the future . . .

— John Lukacs, Historical Consciousness: The Remembered Past

Richard M. Weaver famously declared that “ideas have consequences.” His book of that title presents an argument that is somewhat more nuanced than the use to which conservatives have put the phrase in the decades since; but both Weaver and those he inspired have largely seen the role of ideas in history as linear, a matter of cause and effect. Hold a certain set of ideas (an ideology), and you will act—or, at least, be more likely to act—a certain way. Ideas, in this view, are motives, pushing us along; change the dominant ideas of a culture, and you will change the future.

There’s something attractive, even comforting, in this understanding of the grand, sweeping role that ideas play, even though—or perhaps, to some extent, because—it denies personal moral agency. At its worst, it’s the conservative equivalent of the most simplistic form of Darwinism. If everything that has happened and is happening can be reduced to broad historical trends tied to one particular cause, then none of us can be held personally accountable for being carried away in that stream; and conversely, if we can spread the right ideas far and wide, that stream not only can but will be diverted.

But the role that ideas play in shaping human action and the course of history is more complex and more personal than the popular understanding of the phrase “ideas have consequences.” As John Lukacs often wrote, “Men do not have ideas; they choose them.” There’s nothing deterministic or mechanical about those choices. As Lukacs writes in Historical Consciousness, “in historical life events are not only ‘pushed’ by the past but ‘pulled,’ too, by the future; desires, aspirations, expectations, perceptions, premonitions, purposes, all play their parts . . . ”

Desires, aspirations, expectations, perceptions, premonitions, purposes: These words aren’t the language of mechanical causality, nor words that would normally remind us of the phrase “ideas have consequences”; they are, instead, words that we associate with free will, with moral agency, with personal beliefs and actions rather than impersonal historical trends.

“[W]e are products, and creators, of the past,” Lukacs writes, “and also creators, and products, of the future.” The central word here is we: With respect to both the past and the future, we remain—both mankind as a whole, and each of us personally—at the center of history.

Such a recognition is not only liberating, freeing us from the grip that determinism holds on the mind of modern man, but also profoundly Christian, returning free will to its central role in the life of every person. And that brings with it a deep responsibility to choose our ideas, and to act on them, with care.

Every action is a moral action; and that includes our choice of ideas. The elevation of opinion—“that’s just your opinion; it’s not what I believe”—as the chief currency of intellectual life is, at root, an avoidance of our duty to conform our lives to what is true. Pope Benedict XVI called this the “dictatorship of relativism”; but it is a dictatorship not imposed from without but embraced from within, because conforming our lives to the truth is harder than not doing so. “You shall know the Truth, and the Truth shall set you free”; but freedom comes with moral responsibilities that bondage allows us to shirk.

Weaver believed that the great intellectual break in the history of mankind was the rise of nominalism and the abandonment of the classical and Christian insistence that everything that exists takes part in a reality beyond this world. For the nominalists, there are only things we call horses; there is no essential horse-ness. Yet from the standpoint of epistemology, our understanding of how we know what we know, the world of the forms was always an abstraction, a conclusion we reached through our empirical study of the world around us rather than a realm we accessed directly. The nominalists insisted that there were only horses precisely because we can never directly experience horse-ness.

Yet, pace Weaver, the more important intellectual shift occurred somewhat later, when we began to apply the concept of mechanical causality in science to human thought and action. When men longed for heaven, their desires and aspirations could pull them forward; when they cast their eyes down to earth, they made themselves no better than the animals, acting on base instinct and physical necessity—yet seeing in this slavery to the material a kind of freedom from the need to seek the truth and to act on it.

What we need today is a new teleology, a recognition of the end for which man was created, and a desire to live our lives with that end in sight.

Returning to Earth

What lies at the root of the abstractionism that I discussed last month, which afflicts the modern world like a mania, especially here in the United States?  Walker Percy dubbed the phenomenon angelism, by which he did not mean that those who exhibit it have evolved to a state of moral purity but that we have individually and collectively cut ourselves loose mentally from the ties that bind us to the world and the people around us.  And yet (for reasons that should be obvious) we have not been able, through such abstraction, to overcome the limitations that are inherent in human life and the material world.  Stymied, we have come increasingly to despise the world and our place in it.  And so our response is not to become more human but less so, as Percy’s Dr. Tom More put it so clearly in Love in the Ruins almost 50 years ago:

For the world is broken, sundered, busted down the middle, self ripped from self and man pasted back together as mythical monster, half angel, half beast, but no man.  Even now I can diagnose and shall one day cure: cure the new plague, the modern Black Death, the current hermaphroditism of the spirit, namely: More’s syndrome, or: chronic angelism-bestialism that rives soul from body and sets it orbiting the great world as the spirit of abstraction whence it takes the form of beasts, swans and bulls, werewolves, blood-suckers, Mr. Hydes, or just poor lonesome ghost locked in its own machinery.

Walker Percy did not live to see the rise of social media (he died in 1990), but the various forms that social media have taken and the conduct they have engendered among so many of their users would not have surprised him.  For all of the potential that social media have to draw people closer together, to rekindle ties with old friends and relatives, to keep us rooted in one another and therefore in the communities of which we are mutually a part, in practice they have all too often enabled the opposite: Social media allow us to engage in flights of fancy, to escape from the reality of our lives by imagining ourselves (consciously or even unconsciously) to be someone different, or even just to cast aside the manners and mores that are essential to civilized life in an actual community.

There have been dozens of investigative articles over the past several years on the phenomenon of “trolling”—people whose behavior toward others they interact with online would, in face-to-face encounters, skirt the line of diagnosable sociopathy, or even cross over it.  A common theme runs through all of them: When the trolls meet the reporters in person, they behave very differently.  They are frequently shy, almost invariably polite, and express hurt when the reporters, in tones that imply condemnation or disapproval, ask them about their actions online.  The reporters themselves experience cognitive dissonance—they expect to dislike, even hate, the trolls but find themselves liking and even sympathizing with them.

The behavior exhibited by trolls looks increasingly like one extreme of a broader phenomenon that afflicts an ever-wider swath of users of social media, and I don’t mean just white nationalists and “social-justice warriors” on Twitter.  More and more of us find it both easy and a relief to create identities on social media that do not reflect the reality of our everyday lives—even if we use our own names.  (And I use us here not as a rhetorical device but as an admission that I strayed in this direction myself over the years before recognizing that I had loosed the bonds of earth and needed to return to reality.)

Were Walker Percy still alive, I suspect he would see in this parallels to the psychological condition of dissociation.  With our increasing use of social media (and other electronic media, such as email and texts) as a substitute for the hard reality of dealing with flesh-and-blood human beings, we create alternative unrealities that consume more and more of our attention and consciousness until, one day, we look in the mirror and no longer recognize the man we see there.  We become strangers to ourselves, but the ghosts we have created through our abstraction can never truly replace the creatures that God has made us to be.  Bound by time and ties to people and place, we have only two options: keep raging against reality and losing our true self in the process, or start recovering that true self by accepting the limitations inherent in it, and returning to earth.    

First published in the April 2019 issue of Chronicles: A Magazine of American Culture.

Life Is Not a Fantasy

The reality of place has weighed heavily on me from a very young age.  My knowledge of self has always been inseparable from the place in which I live.  My understanding of who I am has been closely tied to those with whom I most often interact—family, friends, coworkers, neighbors, and even those with whom I have a nodding acquaintance (a phrase that has become unfortunately abstract in a world that no longer values simple signs of courtesy and respect).  Remove me from familiar places, and I become a stranger in a strange land, longing for my home.

Even when, as a typical teenager, I longed to leave my hometown, my departure always ended, in my imagination, with my return.  A life elsewhere, among other people, is an abstraction: Home is reality.

Of course, I no longer live in my hometown—and yet, in fact, I do.  In Huntington, as in Rockford, as in Spring Lake, I have walked the streets until they have become a part of me, and found my place among a people who are not simply passing through but are deeply rooted in this portion of God’s green earth and the little bit of civilization that has been built upon it, for all intents and purposes autochthonous and autonomous, a true community made up not of individuals with entirely separate lives but of persons whose sense of themselves is tightly woven with their sense of their neighbor and of their place.

Chaucer was among the first to claim that familiarity breeds contempt, and most (if not all) of us can point to concrete examples that seem to prove his adage true.  Yet these words are, at best, a half-truth, which makes them (as John Lukacs reminds us) more dangerous than a lie.  For it is even more true to say that familiarity breeds community, and that civilization cannot arise among an agglomeration of rootless individuals, but only among men and women who are rooted in a particular place and in deep knowledge of one another.

These brief thoughts were occasioned by continued reflection on what role, if any, aphantasia—my complete inability to create mental images—may have played in the development of my theological, philosophical, and political understanding.  As I mentioned last month, I was initially dismissive of David Mills’s suggestion even to consider this.  But the centrality of incarnationalism in my theological understanding, my visceral rejection of abstraction in philosophy, and my preference for localism in politics, economics (broadly understood), and culture, taken together, do seem like the positions one might expect a person who can’t imagine an orange sheep with five legs perched on the dome of the Huntington County courthouse to have arrived at.

On the other hand, shouldn’t we expect a Catholic who has truly encountered Christ to place the Incarnation at the center of his theological thought and, therefore, to reject philosophical abstraction in favor of an epistemology resembling a traditional Aristotelian empiricism?  If even God must become man in order for us truly to know Him, why would we think that we can have true knowledge of anything else outside of experience?  Even book larnin’ must build on experience, moving from analogy to analogy, and the mental images that people who are not aphantasic form of things they have not directly experienced are still conditioned by their actual experiences.  Thus, the presentation of the Blessed Virgin in medieval art as more European than Middle Eastern is no more a form of cultural imperialism than the images of Mary with Asian or Ethiopian features that emerged from other Christian communities at roughly the same time.  We know what we know because we have experienced it.  Even those with the ability to create extraordinarily vivid mental images—hyperphantasia, we might call it—cannot conjure up a mental figment that does not correspond in some way to something they have experienced.

Yet there are Catholics today who intellectually accept the Incarnation as a reality but whose theology is otherwise maddeningly abstract, and philosophical abstractionism, like centralism in politics, economics, and culture, has become more the norm among the intellectual classes than the exception.  Over the last century—and accelerating exponentially in recent years—those tendencies have spread beyond the intellectual classes into the broader populace.  Mass communications, and now social media, have turned abstractionism into a form of mania, a type of mental illness no longer confined to individuals but affecting society as a whole.

Walker Percy saw it coming nearly 50 years ago, and it’s no coincidence that this Catholic convert made the hero of Love in the Ruins (1971) and its sequel, The Thanatos Syndrome (1987), both a psychiatrist and a descendant of St. Thomas More.  The answer to the abstraction that’s making us all mad lies in the faith that is the substance of things hoped for, the evidence of things not seen.  Far from abstraction, that faith is an experience, a personal relationship with the God made Man; not a fantasy, but the ultimate ground of reality.

First published in the March 2019 issue of Chronicles: A Magazine of American Culture.

Picture This

Last year, just before his 21st birthday, my son Jacob learned of a condition called aphantasia.  In its strictest form, aphantasia is the inability to create mental images.  Like many such conditions, aphantasia affects those who have it to varying degrees.  In Jacob’s case, his mental images are very fuzzy and indistinct.  In my case, they are utterly nonexistent.  When I close my eyes and try to conjure up an image, all I see—all I have ever seen—is blackness.

I was a few months shy of 50 years old when Jacob made his discovery.  I had never heard of aphantasia, and my first reaction was disbelief—not that such a condition could exist, but that it wasn’t universal.  From the time I became aware of language implying that we should be able to create mental images (at will or involuntarily), I had always assumed that such language was metaphorical.  I had never thought that “Picture this” was meant—much less could have been meant—as a literal command.

The next several days were both disconcerting and exciting, as I experimented with family and friends and coworkers.  I discovered that the ability to make mental images is not consistent—while almost everyone else, it seems, can conjure up an image, its vividness and detail vary from person to person, as the difference between my experience and Jacob’s had already indicated.  My daughter Cordelia has a very active and extraordinarily malleable ability to create mental images, and she quickly grew tired of my interrogations:

Me: Imagine a sheep.  What color is it?

Cordelia: White.

Me: How many legs does it have?

Cordelia: Four.

Me: Are you sure it’s white?  Isn’t it purple?

Cordelia, with a nervous laugh: It is now.

Me: And doesn’t it have six legs?

Cordelia, exasperated: It does NOW.

At 13 years old, Cordelia has just won two local prizes for her art—hardly, it seems to me, a coincidence.  In a similar interrogation, her older sister Grace, now 18, saw an orange sheep with five legs standing on the dome of the Huntington County courthouse, once I told her it was there.  Grace’s images, though, are more cartoonish, while Cordelia’s are vividly realistic, even when they cannot exist in reality.

Assuming you haven’t turned the page already—since, in all likelihood, you have no trouble visualizing images and you find my inability to do so an uninteresting defect—the point of this column is not to introduce you to aphantasia, much less to declare myself special for having this condition, and even less to excite your pity.  Rather, it is to explore the implications of a question that the quondam Chronicles author David Mills raised when I discussed aphantasia on Facebook.  In response to my self-diagnosis, David asked whether I thought my condition may have affected my politics, and how.

Because of the way in which David phrased a portion of his question (“For example, does this make you more rational/more principled/less swayed by emotion or [as a critic of your politics would say] less sympathetic/less caring?”), I responded a bit churlishly at the time, but I’ve thought a lot about his question over the intervening year, and I think that David may be on to something.

My political views, as well as my religious ones, have always been deeply connected to my epistemology—my understanding of how we know what we know.  Epistemologically, I am an empiricist—not in the modern, limited sense that excludes any experience that is not reproducible and quantifiable, but in the Aristotelian sense: We have no knowledge of reality except through our experience.  Even our leaps of intuition depend, at base, on prior experience.

By my early 20’s, as a grad student in political theory at The Catholic University of America and long before I learned of aphantasia, I had become a dedicated foe of philosophical abstraction—and the social and political consequences of the modern embrace of it.  Reconstructing society on the basis of theories that have no grounding in the lives of real people living in real places makes as much sense to me as worshiping an orange sheep with five legs perched on the dome of the Huntington County courthouse would.  I have no use for economic “laws” based in “self-interest” that are contradicted by the everyday experience of family and community life.  I recognize that men and women and even children routinely set aside their “self-interest” out of love for others, and that characterizing such actions as exceptions to the norm is, in fact, an attempt to redefine the norm.

The love of a mother for her child, family ties, the bonds that bind a community together—all of these are things that we can and do experience, but they are not quantifiable in ways that translate into economic laws or political systems.  They are, however, all experiences that remind us, as Christians, of our encounter with the One Who created us, Who mourned our fall, and Who died to save us from ourselves.

A god who does not become man must remain, in a very real sense, forever outside of human experience.  Those who are not aphantasic may conjure him up, but they risk creating him in their own image.  A God Who becomes man, however, is like the angel whom Jacob faced at the ford of the Jabbok: someone with whom one must wrestle—a reality, and not an abstraction.  And wrestling with Him must inevitably affect how one views the rest of the world.

First published in the February 2019 issue of Chronicles: A Magazine of American Culture.