Returning to Earth

What lies at the root of the abstractionism that I discussed last month, which afflicts the modern world like a mania, especially here in the United States?  Walker Percy dubbed the phenomenon angelism, by which he did not mean that those who exhibit it have evolved to a state of moral purity but that we have individually and collectively cut ourselves loose mentally from the ties that bind us to the world and the people around us.  And yet (for reasons that should be obvious) we have not been able, through such abstraction, to overcome the limitations that are inherent in human life and the material world.  Stymied by our inability to overcome those limitations, we have come increasingly to despise the world and our place in it.  And so our response is not to become more human but less so, as Percy’s Dr. Tom More put it so clearly in Love in the Ruins almost 50 years ago:

For the world is broken, sundered, busted down the middle, self ripped from self and man pasted back together as mythical monster, half angel, half beast, but no man.  Even now I can diagnose and shall one day cure: cure the new plague, the modern Black Death, the current hermaphroditism of the spirit, namely: More’s syndrome, or: chronic angelism-bestialism that rives soul from body and sets it orbiting the great world as the spirit of abstraction whence it takes the form of beasts, swans and bulls, werewolves, blood-suckers, Mr. Hydes, or just poor lonesome ghost locked in its own machinery.

Walker Percy did not live to see the rise of social media (he died in 1990), but the various forms that social media have taken and the conduct they have engendered among so many of their users would not have surprised him.  For all of the potential that social media have to draw people closer together, to rekindle ties with old friends and relatives, to keep us rooted in one another and therefore in the communities of which we are mutually a part, in practice they have all too often enabled the opposite: Social media allow us to engage in flights of fancy, to escape from the reality of our lives by imagining ourselves (consciously or even unconsciously) to be someone different, or even just to cast aside the manners and mores that are essential to civilized life in an actual community.

There have been dozens of investigative articles over the past several years on the phenomenon of “trolling”—people exhibiting behavior toward others with whom they interact online that would, in face-to-face encounters, skirt the line of diagnosable sociopathy, or even cross over it.  A common theme runs through all of them: When trolls meet the reporters, they behave much differently in person.  They are frequently shy, almost invariably polite, and express hurt when the reporters ask them about their actions online in tones that imply condemnation or disapproval.  The reporters themselves experience cognitive dissonance—they expect to dislike, even hate, the trolls but find themselves liking and even sympathizing with them.

The behavior exhibited by trolls looks increasingly like one extreme of a broader phenomenon that afflicts an ever-wider swath of users of social media, and I don’t mean just white nationalists and “social-justice warriors” on Twitter.  More and more of us find it both easy and a relief to create identities on social media that do not reflect the reality of our everyday lives—even if we use our own names.  (And I use us here not as a rhetorical device but as a recognition that I have strayed in this direction myself over the years before recognizing that I had loosed the bonds of earth and needed to return to reality.)

Were Walker Percy still alive, I suspect he would see in this parallels to the psychological condition of dissociation.  With our increasing use of social media (and other electronic media, such as email and texts) as a substitute for the hard reality of dealing with flesh-and-blood human beings, we create alternative unrealities that consume more and more of our attention and consciousness until, one day, we look in the mirror and no longer recognize the man we see there.  We become strangers to ourselves, but the ghosts we have created through our abstraction can never truly replace the creatures that God has made us to be.  Bound by time and ties to people and place, we have only two options: keep raging against reality and losing our true self in the process, or start recovering that true self by accepting the limitations inherent in it, and returning to earth.    

First published in the April 2019 issue of Chronicles: A Magazine of American Culture.

Life Is Not a Fantasy

The reality of place has weighed heavily on me from a very young age.  My knowledge of self has always been inseparable from the place in which I live.  My understanding of who I am has been closely tied to those with whom I most often interact—family, friends, coworkers, neighbors, and even those with whom I have a nodding acquaintance (a phrase that has become unfortunately abstract in a world that no longer values simple signs of courtesy and respect).  Remove me from familiar places, and I become a stranger in a strange land, longing for my home.

Even when, as a typical teenager, I longed to leave my hometown, my departure always ended, in my imagination, with my return.  A life elsewhere, among other people, is an abstraction: Home is reality.

Of course, I no longer live in my hometown—and yet, in fact, I do.  In Huntington, as in Rockford, as in Spring Lake, I have walked the streets until they have become a part of me, and found my place among a people who are not simply passing through but are deeply rooted in this portion of God’s green earth and the little bit of civilization that has been built upon it, for all intents and purposes autochthonous and autonomous, a true community made up not of individuals with entirely separate lives but of persons whose sense of themselves is tightly woven with their sense of their neighbor and of their place.

Chaucer was the first to claim that familiarity breeds contempt, and most (if not all) of us can point to concrete examples that seem to prove his adage true.  Yet these words are, at best, a half-truth, which makes them (as John Lukacs reminds us) more dangerous than a lie.  Because it is even more true to say that familiarity breeds community, and that civilization cannot arise among an agglomeration of rootless individuals, but only among men and women who are rooted in a particular place and in deep knowledge of one another.

These brief thoughts were occasioned by continued reflection on what role, if any, aphantasia—my complete inability to create mental images—may have had on the development of my theological, philosophical, and political understanding.  As I mentioned last month, I was initially dismissive of David Mills’s suggestion even to consider this.  But the centrality of incarnationalism in my theological understanding, my visceral rejection of abstraction in philosophy, and my preference for localism in politics, economics (broadly understood), and culture, taken together, do seem like the positions one might expect a person who can’t imagine an orange sheep with five legs perched on the dome of the Huntington County courthouse to have arrived at.

On the other hand, shouldn’t we expect a Catholic who has truly encountered Christ to place the Incarnation at the center of his theological thought and, therefore, to reject philosophical abstraction in favor of an epistemology resembling a traditional Aristotelian empiricism?  If even God must become man in order for us truly to know Him, why would we think that we can have true knowledge of anything else outside of experience?  Even book larnin’ must build on experience, moving from analogy to analogy, and the mental images created by people who are not aphantasic of things they have not directly experienced are still conditioned by their actual experiences.  Thus, the presentation of the Blessed Virgin in medieval art as more European than Middle Eastern is no more a form of cultural imperialism than images emerging from other Christian communities at roughly the same time of Mary with Asian or Ethiopian features.  We know what we know because we have experienced it.  Even those with the ability to create extraordinarily vivid mental images—hyperphantasia, we might call it—cannot conjure up a mental figment that does not correspond in some way to something they have experienced.

Yet there are Catholics today who intellectually accept the Incarnation as a reality but whose theology is otherwise maddeningly abstract, and philosophical abstractionism, like centralism in politics, economics, and culture, has become more the norm among the intellectual classes than the exception.  Over the last century—and accelerating exponentially in recent years—those tendencies have spread beyond the intellectual classes into the broader populace.  Mass communications, and now social media, have turned abstractionism into a form of mania, a type of mental illness no longer confined to individuals but affecting society as a whole.

Walker Percy saw it coming nearly 50 years ago, and it’s no coincidence that this Catholic convert made the hero of Love in the Ruins (1971) and its sequel, The Thanatos Syndrome (1987), both a psychiatrist and a descendant of St. Thomas More.  The answer to the abstraction that’s making us all mad lies in the faith that is the substance of things hoped for, the evidence of things not seen.  Far from abstraction, that faith is an experience, a personal relationship with the God made Man; not a fantasy, but the ultimate ground of reality.

First published in the March 2019 issue of Chronicles: A Magazine of American Culture.

Picture This

Last year, just before his 21st birthday, my son Jacob learned of a condition called aphantasia.  In its strictest form, aphantasia is the inability to create mental images.  Like many such conditions, aphantasia affects those who have it to varying degrees.  In Jacob’s case, his mental images are very fuzzy and indistinct.  In my case, they are utterly nonexistent.  When I close my eyes and try to conjure up an image, all I see—all I have ever seen—is blackness.

I was a few months shy of 50 years old when Jacob made his discovery.  I had never heard of aphantasia, and my first reaction was disbelief—not that such a condition could exist, but that it wasn’t universal.  From the time I became aware of language implying that we should be able to create mental images (at will or involuntarily), I had always assumed that such language was metaphorical.  I had never thought that “Picture this” was meant—much less could have been meant—as a literal command.

The next several days were both disconcerting and exciting, as I experimented with family and friends and coworkers.  I discovered that the ability to make mental images is not consistent—while almost everyone else, it seems, can conjure up a vivid image to some extent, there is a range of detail, as the difference between my experience and Jacob’s had already indicated.  My daughter Cordelia has a very active and extraordinarily malleable ability to create mental images, and she quickly grew tired of my interrogations:

Me: Imagine a sheep.  What color is it?

Cordelia: White.

Me: How many legs does it have?

Cordelia: Four.

Me: Are you sure it’s white?  Isn’t it purple?

Cordelia, with a nervous laugh: It is now.

Me: And doesn’t it have six legs?

Cordelia, exasperated: It does NOW.

At 13 years old, Cordelia has just won two local prizes for her art—hardly, it seems to me, a coincidence.  In a similar interrogation, her older sister Grace, now 18, saw an orange sheep with five legs standing on the dome of the Huntington County courthouse, once I told her it was there.  Grace’s images, though, are more cartoonish, while Cordelia’s are vividly realistic, even when they cannot exist in reality.

Assuming you haven’t turned the page already—since, in all likelihood, you have no trouble visualizing images and you find my inability to do so an uninteresting defect—the point of this column is not to introduce you to aphantasia, much less to declare myself special for having this condition, and even less to excite your pity.  Rather, it is to explore the implications of a question that the quondam Chronicles author David Mills raised when I discussed aphantasia on Facebook.  In response to my self-diagnosis, David asked whether I thought my condition may have affected my politics, and how.

Because of the way in which David phrased a portion of his question (“For example, does this make you more rational/more principled/less swayed by emotion or [as a critic of your politics would say] less sympathetic/less caring?”), I responded a bit churlishly at the time, but I’ve thought a lot about his question over the intervening year, and I think that David may be on to something.

My political views, as well as my religious ones, have always been deeply connected to my epistemology—my understanding of how we know what we know.  Epistemologically, I am an empiricist—not in the modern, limited sense that excludes any experience that is not reproducible and quantifiable, but in the Aristotelian sense: We have no knowledge of reality except through our experience.  Even our leaps of intuition depend, at base, on prior experience.

By my early 20’s, as a grad student in political theory at The Catholic University of America and long before I learned of aphantasia, I had become a dedicated foe of philosophical abstraction—and the social and political consequences of the modern embrace of it.  Reconstructing society on the basis of theories that have no basis in the lives of real people living in real places makes as much sense to me as worshiping an orange sheep with five legs perched on the dome of the Huntington County courthouse would.  I have no use for economic “laws” based in “self-interest” that are contradicted by the everyday experience of family and community life.  I recognize that men and women and even children routinely set aside their “self-interest” out of love for others, and that characterizing such actions as exceptions to the norm is, in fact, an attempt to redefine the norm.

The love of a mother for her child, family ties, the bonds that bind a community together—all of these are things that we can and do experience, but they are not quantifiable in ways that translate into economic laws or political systems.  They are, however, all experiences that remind us, as Christians, of our encounter with the One Who created us, Who mourned our fall, and Who died to save us from ourselves.

A god who does not become man must remain, in a very real sense, forever outside of human experience.  Those who are not aphantasic may conjure him up, but they risk creating him in their own image.  A God Who becomes man, however, is like the angel whom Jacob faced at the ford of the Jabbok: someone with whom one must wrestle—a reality, and not an abstraction.  And wrestling with Him must inevitably affect how one views the rest of the world.

First published in the February 2019 issue of Chronicles: A Magazine of American Culture.

Pontius Pilate, Ora Pro Nobis

To the leaders of the Free Speech Movement of the 1960’s, self-censorship—once known as civility and decorum—was as dangerous as the social enforcement of civility by private organizations and by public educational institutions, and those social norms were, in turn, just as destructive as attempts by government to limit the freedom of speech guaranteed by the First Amendment.  Yet the chief aim of the Free Speech Movement was not the same as the aim of the authors and ratifiers of the First Amendment.  The provision that “Congress shall make no law . . . abridging the freedom of speech” was intended to prevent a legal stifling of political debate that would allow a dominant faction in the federal government to concentrate power at the expense of the states and the people.  (We can see how well that worked.)  That freedom of speech would eventually be invoked to defend the word f--k, the depraved imagination of Larry Flynt, and even the promotion of murder would have boggled James Madison’s mind.

The ultimate aim of the Free Speech Movement, on the other hand, was to make a decisive break with the institutions and practices that had emerged from, and sustained, what we once called Christendom.  Those who rallied behind the banner of free speech recognized that words had power—both the power to build up and (more importantly for their purposes) the power to tear down.  Those who want to create and sustain civilization and those who wish to destroy it have the same tool at their disposal.

Back then, as free speech progressed from tittering over the seven dirty words to campus sit-ins to throwing firebombs both figurative and literal, some conservatives (more of the Kirkian variety than of the Nixonian one) recognized the Free Speech Movement for what it was: less of a political threat than a civilizational one.  The importance of civility and decorum is no more self-evident to those who have never exercised them than the need for a knife and a fork is to the barbarian who is used to eating with his hands.  Restraint in speech, like table manners, is a learned behavior, and a mark of civilization.

While table manners speak to man’s sense of his own dignity, a man can remain dignified if forced, by circumstance, to grab a turkey drumstick or to cup his hand in a running stream.  Civility and decorum in speech, however, reflect something even deeper: the recognition that speech is a moral act and, therefore, that the choice of one’s words matters.  Language can reveal the truth, or it can deceive; and the chief reason we choose words that reveal the truth is to communicate that truth to others.  And we attempt to communicate truth to others not to do damage to them, but because we know that the truth is something they need to know.

The constructive use of language, then, is tied very closely to tradition—not tradition as a collection of things that are passed down but, as Josef Pieper saw it, an action that conveys truth from person to person and from generation to generation.  Indeed, language is the chief vessel of tradition, properly understood.  And for Christians, all truth has both its root and its end in the Truth that created and sustains us, and that gave Himself to save mankind because we chose to believe, and then to imitate, the Father of Lies.  It is no mere coincidence that John calls that Truth the Word.

A funny thing happened, though, over the last 50 years, reaching its apotheosis in the past few.  An increasing number of those who declare themselves the defenders of civilization and of Christianity have come to regard civility and decorum not as aids in communicating the truth but as shackles preventing “us” from triumphing over “them.”  And so they have embraced the idol of free speech, for the same reason as those activists of the 1960’s whom they would never acknowledge as their forebears: They are more interested today in destruction than they are in preservation, much less in the construction of a truly Christian civilization.  They attack not only their putative enemies (whom they resemble more than they will ever admit), but also those they would once have embraced as their allies, when the latter dare to suggest that words have meaning, that language is properly used to convey truth, and that the ends can never justify the means because all lies have their source in the Father of Lies, just as all truth belongs to the Word Who said, “I am the Way, and the Truth, and the Life.”

With Pontius Pilate, they dismiss truth as “fake news,” standing between them and political power, the modern equivalent of the friendship of Caesar.  But Pilate, seeing the Man he had condemned to death hanging upon the Cross, wrote words of truth and defended them: “What I have written, I have written.”  Early traditions claim that he was baptized and may even have suffered a martyr’s death.  If so, we could use his intercession today.

First published in the January 2019 issue of Chronicles: A Magazine of American Culture.

Quod Scripsi, Scripsi

Reader: I wasn’t quoting you.  I was characterizing your analysis as such.

Me: You were mischaracterizing my analysis.  What I have written, I have written.  What you have written, I did not.

Reader: Says you.

Words have meaning.  We live our lives, for the most part, in a world in which, on a clear spring day, one can say, “The sky is blue,” and everyone else will cheerfully agree (or wonder why you’re bothering to state the obvious).  When we make plans to meet at five o’clock at the Rusty Dog for a drink, no one thinks that means a slice of pumpkin pie at Nick’s Kitchen at 7:30.  We routinely travel at our own pace on the highway, but we all understand how fast we’re supposed to be going when we see the words SPEED LIMIT 70.

Community depends on our ability to communicate.  Language barriers don’t make community impossible, but they do make it much harder to achieve.  If you don’t know a word of French, you’re not likely to feel at home in the streets of Paris, much less in a village in Bretagne.  And if you don’t know that the cheese toastie listed on the menu is localspeak for a grilled-cheese sandwich or understand why the waitress is asking you if you’d like a sack for your leftovers, you might feel a bit out of sorts when you first move to Northeast Indiana.

As daunting as it is for most of us to learn a foreign language, we can do so if necessary, and coming to understand and eventually adopt regionalisms is a sign that you’re taking root in a community.  Stubbornly insisting on speaking only English in a café in Brest or asking a waitress in Fort Wayne to put the remains of your child’s grilled-cheese sandwich in a bag is a sign that, at best, you’re an outsider who wishes to remain that way, and more likely that you’re an ass.  We expect people to try to make themselves understood, and when they do, most of us, most of the time, will make an effort to try to understand them.

Or at least, once upon a time, we did.  Today, when discussing any subject that is in the least bit controversial, more and more Americans not only are unwilling to make any effort to understand others but seem to consider such willingness a weakness.  It is not necessary to understand what another person is saying; it is only necessary to decide where he stands in relation to you.  Once you have made that determination—rightly or wrongly—you can judge everything he says without actually hearing or reading, much less understanding, a word.

As late as 30 years ago, self-identified conservatives still prided themselves on their embrace of logic and reason and evidence.  By then, fewer and fewer of them were studying philosophy or reading history, but they continued to acknowledge the value of both.  Against the rise of an illiberal left that was increasingly embracing the irrationalism of deconstructionism and postmodernism, they continued to defend clarity of thought and expression.

Those days are long gone.  Today, when it comes to his attitudes toward the importance of language and logic and clarity of thought and expression, the average political conservative is just as much in thrall to deconstructionism and postmodernism as the average political liberal.  He would vociferously deny it, of course.  Conservatives don’t believe in such things, just as they didn’t use to believe in divorce or abortion or gay marriage—until, of course, they did.  We are all good liberals now.

It may be tempting to blame this change in attitudes among conservatives on the rise of Donald Trump, but his inability to separate what is true from what he wishes to be true is more a symptom of this malady than a cause.

The true cause was what Russell Kirk called “the conservative rout” (the original title of what became The Conservative Mind).  The essence of the revolt of modernity against the classical and Christian world consisted in the subjugation of more and more of human life to politics.  The conservative counterrevolution, from Burke to Kirk, fought to contain politics within its proper—and limited—sphere, and to reassert the primacy of religion and culture.

The left’s Long March Through the Institutions didn’t begin with Antonio Gramsci and his disciples; they simply put a name, and brought a clear sense of purpose, to a movement that began in the Renaissance, achieved full force in the French Revolution, and reached its nadir when conservatives decided that President Reagan had it wrong in his First Inaugural Address: Government was the solution to the problem after all.  The triumph of politics over religion and culture was complete; everything since then has been nothing more than the logic of the revolution playing itself out.

And that includes the rise of scorn for clarity of thought and expression, both in one’s own utterances and in the words of others.  The lines that I quoted at the beginning of this column are repeated incessantly (though usually not quite so succinctly) by self-proclaimed conservatives on conservative (in this case, conservative Catholic) websites, and of course on social media.  We no longer take others’ words seriously, because we are no longer serious in the choice of our own words.  Language was once a tool for expressing truth; now, it is merely a weapon for winning arguments.  The ends justify the means, and the truth be damned.

First published in the December 2018 issue of Chronicles: A Magazine of American Culture.