Adventures in Liturgy: Funeral, or Celebration of Life?

Recently, I was distributing Holy Communion during a Mass of Christian Burial. The coffin was to my immediate right, and the family of the deceased to my immediate left. The Communion Procession was moving in an orderly fashion when suddenly there was a bottleneck. When I looked up to see what was happening, I couldn’t believe my eyes: having just received Our Lord in the Blessed Sacrament, people were greeting members of the immediate family who were sitting in the front row. I was stunned! I whispered, “Please keep moving; you are blocking the other communicants.”

How did we get here? Answering this question is at once simple and complex. While one may say that people no longer know how to act properly in public, I propose that there are other realities at work as well.

The General Introduction to the Order of Christian Funerals states, “Christians celebrate the funeral rites to offer worship, praise, and thanksgiving to God for the gift of a life which has now returned to God, the author of life and the hope of the just.”[1] Our worship, whether at a funeral or at many regular parish Masses, has become so anthropocentric that we have lost our grip on the reality that we gather to worship, praise, and thank God; instead we often make ourselves the source, center, and end of our liturgical celebration. At a funeral, we gather not for a celebration of life, but to encounter the mercy of God and the promise of eternal life found only in Christ.

Secondly, we live in a world that acts as if there were no sin. To admit that there is sin in the world and that we are sinners does not mean that we are bad people. To admit that we are sinners and that there are acts that are objectively right or wrong proclaims that we are human beings who need to be redeemed through the Paschal Mystery of Christ. Death is a consequence of sin. The Church through its funeral rites “commends the dead to God’s merciful love and pleads for the forgiveness of their sin.”[2] To admit that we are sinners is to acknowledge that the deceased, and all those present, are truly human, and that God alone is the healer of our pain and the source of forgiveness.

Death is very hard, and the reality of separation from those we love most dearly is heart-wrenching. At the rite of final commendation and farewell, “the community acknowledges the reality of separation and commends the deceased to God. In this way it recognizes the spiritual bond that still exists between the living and the dead and proclaims its belief that all the faithful will be raised up and reunited in the new heavens and a new earth, where death will be no more.”[3] We have come from God and we are returning to God: our origin is a reality, and our return to God is our goal. Is this basic reality present to the minds and hearts of believers today? While life is to be lived, and lived to the fullest of the potential God has given us, do we keep before us that our time on earth is not what gives us meaning, but rather that we are destined for God? The preaching, life, liturgy, and catechesis of the Church need to proclaim loudly that “our citizenship is in heaven, and from it we await a savior, the Lord Jesus Christ.”[4] A celebration of life fails to lead us to embrace our true citizenship.

If the Liturgy truly forms our faith and shapes our living, our approach to death and the rites of Christian burial may reflect more accurately our belief that all the ties of friendship and affection which knit us as one throughout our lives do not unravel in death.[5]

Post by: Fr. Jordan Kelly, O.P.

Main image: A Funeral at Ornans, Gustave Courbet, 1850

[1] Order of Christian Funerals, hereafter OCF, #5.

[2] OCF, #6.

[3] Ibid.

[4] Philippians 3:20.

[5] OCF, #71.

A Forgotten Sin

There is a strange and subtle fault that plagues human hearts. It is strange because it is committed only along with other sins, and it is subtle because one already has forgiveness on his mind when he commits it, and so he is unlikely to think it needs repentance. What is this sin?

Presumption.

Presumption is opposed to the virtue of hope, whereby we desire and expect God’s forgiveness and help in obtaining Heaven. It is the contrary of despair… The presumptuous person throws aside the moral law on account of the excessive character of his hope. He expects too much from God: he expects a thing not promised. Salvation has not been promised to those who merely fulfill a formula (e.g., announcing one’s sins in sacramental confession), but rather to those who exhibit perfect contrition – the rejection of all that has to do with sin, its evil effects, its evil content, and its evil motivation, out of love for God (with the intention of making confession soon, if not presently making one) – and to those who at least have true “attrition” (fear of punishment) within the sacrament of confession itself.

Presumption is a special kind of motivation… a “meta-sin,” if you will. One is in danger of not having adequate repentance for the sacrament of confession to receive absolution if he fails to mention presumption, since he brings his lack of the fear of God into the confessional with him. For a valid confession, one must at least have true attrition – fear of punishment. The presumptuous person does not have this fear with regard to himself. (If you have just become aware of this sin in your life, you should assume that your prior confessions were valid unless you are certain that you were not really trying very hard to examine your conscience. Simply mention presumption in your next confession.)

To help understand this sin, here is a natural, human form of presumption. Imagine a child who stays out well past his curfew. When he comes home, his parents are upset, but he apologizes for his lateness and they forgive him. Then, on their way to bed, they hear their son talking on the phone to a friend: “Yeah, they were mad, but they forgave me. I knew they would – that’s why I did it.”

Ouch. What parent wouldn’t then proceed with an even more severe punishment than what mere lateness merited?

Unlike an unsuspecting parent, God is wise to this game. A person has “too much hope” if he thinks that “God will forgive me” is an excuse for doing whatever he wants, and he then confesses only the faults he committed because of his expectation of forgiveness. He must also confess his motivation – presuming upon God’s mercy. In this sense, presumption is “an inordinate conversion to God,” as St. Thomas puts it. This is strange to our ears, but it is indeed what this sin is: a person hopes so much for forgiveness that his servile fear is entirely demolished and replaced not by filial fear but by disobedience.

Presumption is a daughter of pride. One who thinks he is so great as to deserve Heaven is likely to fall into halfhearted repentance, or even into no repentance at all. What a calamity! Pride can also lead to another kind of presumption, namely, the rash assumption that God has blessed one’s endeavors in such a way that failure will be impossible, or at least improbable, in the project one has undertaken. For example, a man decides to become a missionary in China. He has prayed, but he has not sought the approval of any ecclesial authority nor taken counsel with a prudent spiritual director. How does he know that this is really God’s will? He does not. He would be guilty of this secondary kind of presumption. So too would a person who thinks himself to have “the gift of healing” and so goes about laying hands on people without authentic discretion. This presumes upon God’s grace and also exposes the Gospel to ridicule.

Knowing you have committed this sin is not always so easy. There is a difference between the hope of forgiveness motivating a sin and the hope of forgiveness occasioning a sin… I have given an example of the former in the context of human relationships. An example of the latter would be something more like a child who has become used to his parents forgiving him and so loses some respect and fear of punishment. He does not consciously choose to violate their legitimate demands on him because he knows they will forgive him, but a kind of vicious habit has been ingrained nonetheless. Where is the line between these two cases? It might not always be so clear. What we can say is that a person who consciously makes forgiveness a condition of his sinful action has certainly committed this sin, and a person who has lost respect and fear of punishment is in serious danger of committing this sin.

To reiterate, presumption requires its own mention in confession, as it is its own distinct sin. Often a person will know he has done something seriously wrong by using “God will forgive me” as a motivation for sin but will not have the vocabulary to explain himself in confession. The word is “presumption.”

 

Post by: Eamonn Clark

Main image: Pope Francis goes to confession – via Catholic News Agency

First Fridays: Leviticus 23

“The LORD said to Moses, ‘These are the festivals of the LORD which you shall celebrate at their proper time with a sacred assembly.’”

So begins the Old Testament reading for today. Following this introduction, the reading continues with God pronouncing the major feasts that would make up the Jewish calendar: the Sabbath, the Passover and the Festival of Unleavened Bread, the Offering of First Fruits, the Festival of Weeks, the Festival of Trumpets, the Day of Atonement, and the Festival of Tabernacles. This list of Jewish festivals may not appear particularly relevant to the modern Christian. After all, we don’t celebrate these feasts; so why did this passage, and others like it, make their way into our lectionary? What can we take away from them?

One reason why these readings are important to us is the historical background they provide about our ancestors in faith and the religious culture into which our Lord was born. The present is shaped by the past, so learning about the lived experience of those who preceded us, and how they kept their traditions alive, gives us a blueprint for doing the same today. For example, that the Festival of Weeks is a celebration commemorating the wheat harvest in Israel does not seem to be of utmost importance to the modern Christian. However, the Festival of Weeks was also known as Pentecost because it fell 50 days after Passover, just as the Christian Pentecost follows 50 days after Easter; and apart from being a harvest celebration, it commemorates the giving of the law at Mt. Sinai. Knowing this allows us to enter into the liturgical importance of the festival. Understanding the relationship between the giving of the law at Sinai and the giving of the Holy Spirit at Pentecost allows us to see the gradual fulfillment of salvation history and the slow unveiling of God’s love throughout time.

But for this post, I don’t want to talk about the rich theological insights a close study of each of these festivals would give us. There are others who have done a much better job than I could; interested readers could do no better than to check out the Berit Olam commentaries published by The Liturgical Press. Rather, I want to focus on the general theme present throughout the entire narrative (which in the reading is presented in a redacted form): the elements of time and space, and how they are ordered to the worship of God. If you look at Leviticus 23:1-44, there are several phrases that you will find repeated multiple times: “The Lord said to Moses, speak to the Israelites,” “the appointed festivals” or “sacred assemblies,” “Do no regular work,” “a lasting ordinance for the generations to come,” and “wherever you live.” I want to concentrate on these repeated phrases as revelatory of the kind of relationship God wants the children of Israel to have with Him and with each other. Finally, after looking at these, I want to address the seemingly out-of-place verse of Leviticus 23:22, which I think is central to the passage.

To begin, it should be noted that Leviticus 23 begins a new “section” in the book of Leviticus. The previous “section” dealt with the conduct of the priests, and now we seem to have moved on to norms of general conduct for all the Israelites. How fascinating that the first directive God gives to His people is a calendar! Even before the seemingly paramount sections on rewards for obedience and punishments for disobedience (Leviticus 26), God gives very specific instructions for when to celebrate liturgical feasts. Furthermore, the passage makes it clear that this is a divine command. “The Lord said to Moses, speak to the Israelites,” is repeated several times, reiterating how the giving of the calendar of feasts comes from God Himself. In fact, the chapter ends with God saying, “I am the Lord your God,” using the divine “I am” with which He first identified Himself to Moses to underline the sacred nature of the festivals just commanded. Finally, the repetition of “lasting ordinance for generations to come” and “wherever you live” reflects the universality of these commands. These commands hold true, not just for the small group being spoken to, but for all of God’s people, wherever and whenever they are.

It is because of this that we hear repeated the command to do “no regular work” (in other translations, servile or laborious work). Is this command given because God disapproved of the work the Israelites did? Of course not. God commands that sacred days be days of rest as a reminder that these are not normal “work days.” They are days on which we rededicate ourselves to the work of the Lord, that is, prayer. Just as God “rested” on the seventh day after the work of creation, we rest after our participation in that unfolding work of creation to remind ourselves of what that work is ordered to – God.

And that is what I think this passage reveals most about God and about ourselves. Our work is ordered to our rest, which is itself ordered to our relationship with God. As human beings, we are transcendent creatures. We have limited needs like any other animal: we eat until we aren’t hungry anymore, we sleep until we aren’t tired anymore, we seek shelter from the elements, and so on with the other basic necessities. But we also have unlimited, transcendent desires. We have a desire for beauty, for companionship, for wholeness, for infinite joy. That is, we have a desire for God. God led the Israelites out of Egypt and gave them the calendar of feasts not to satisfy their basic animal needs, but their transcendent human needs. God gave the Israelites a calendar of feasts, and directions for how to celebrate them, even before He finished leading Israel into the Holy Land, because the land was given to them for the purpose of worship, and so their time in the land and their use of it must be ordered to that purpose.

Do we find this to be the reality in our lives today? Do we order our time and our space to that reality? How often do we find our work encroaching on our time with God? How often are we tempted to skip prayer, or even just healthy social activities, in order to get work done, because we think that is what is expected of us? Could you imagine what kind of a society we would be if our calendars were arranged around preserving the sacredness of the day of rest? Imagine if employers arranged work schedules in a way that not only provided employees with sufficient “days of rest” but also allowed them to participate with dignity in community activities (both religious and other healthy communal gatherings).

It is to that point, I think, that the seemingly out-of-place verse of Leviticus 23:22 is ordered: “‘When you reap the harvest of your land, do not reap to the very edges of your field or gather the gleanings of your harvest. Leave them for the poor and for the foreigner residing among you. I am the Lord your God.’” This is the “gleaning law” of ancient Israel, which essentially stated that those who owned and worked farmland ought not be so exacting in their harvest that those without land would be unable to find food should they glean from the field. In a passage about liturgical feasts, why would this command be placed in the exact middle? It’s true that the Festival of Weeks is a harvest festival, so making a point about harvesting is not completely out of place here, but it still seems a little strange.

However, reflecting on the idea that our time and resources are ultimately ordered to the service of God, we might find religious significance in the gleaning law. In some sense, the gleaning law made it possible for the poor to participate in the festival. It ensured that there would be food available after the harvest for those who begged in order to fulfill their basic needs. The poor would not have to worry that taking time off from their work for the festival would impact their ability to meet their needs. Just as a farmer has a right to collect the fruit of his labor from his field but ought not be so exacting that there is none left for others, an employer has a right to his employees’ time (for a fair wage, of course) but ought not be so exacting in his demands that an employee has no time or energy left for religious and community-oriented activities in a respite from “regular work.”

As a reflection, we might ask ourselves, do we keep the “gleaning law” in our own lives? Do we ensure that every day we have several periods of time protected from the encroachment of our daily demands, our regular and laborious work? Do we use that time to concentrate not on our basic animal needs, but our transcendent human needs? What “mini-festivals” do we have planned in our day in which our focus is on prayer to God and charity towards our neighbor? Is our time away from the office ordered towards these higher things, or is it only a brief respite to prepare for the next day on the job? Essentially, do we work to live, or are we living to work? With these thoughts in mind, thanks be to our God, who takes care of our needs so that we can use this time on Earth to grow closer to Him!

 

Post by: Niko Wentworth

Main image: The Gleaners, Jean-François Millet, 1857, oil on canvas

Jesus and the Aliens

By now, the question is no longer fresh and new. If aliens found us, or vice versa, what is the appropriate pastoral response? The Holy Father wants us to go to “the peripheries” – well, what could be more peripheral than some planetary system in the GN-z11 galaxy, which is a whopping 32 billion light years away? Shouldn’t we want to share our Faith even there?

 

Father Jack Landry, from ABC’s show “V” looks up at a UFO. His skepticism eventually earns him laicization – by the aliens secretly running the Vatican.

Let’s say a peaceful race of aliens shows up on our front doorstep. We can tell that they are rational, living creatures with bodies. We can communicate with them about higher-order concepts. They want to be part of our culture and society. So, do we tell them about God? Do we invite them to Mass? Do we baptize them? Pope Francis has said he would, and the Vatican’s chief astronomer, Br. Guy Consolmagno, has said the same.

I suggest the following possibilities, given the above scenario.

  1. They already know and worship God and don’t stand in need of redemption.
  2. Whether or not they already know and worship God, they do stand in need of redemption – but their redemption can’t possibly be found in Jesus Christ.

Before the reader accuses me of heresy – or even apostasy – let me explain.

The first possibility is that these aliens do not need redemption. It would be easy to rule this one out as soon as we found any kind of habitual moral failure in them… Given that their first parents (or parent?) were like our own, sin (and death too) would indicate a corruption of nature. If they are not sinful at all and do not die, it would make sense to assume they are prelapsarian. Creatures that don’t have a broken nature do not need that nature to be healed. No sin, no need for redemption.

By most accounts, if they are sinful creatures, we will know right away.

The second possibility is that they do need redemption, which it seems indicates the need for a Savior and a sacramental economy. Because the task of redemption is specially suited to the Second Person of the Trinity, it would make sense for the Son to become incarnate in order to pay the price of the sin of their common ancestor from whom they inherited sin and death. That ancestor, however, is not descended from Adam. If there is a race of intelligent life apart from the progeny of Adam, Jesus Christ, a descendant of Adam, cannot be that race’s Savior. Recall Pope Pius XII’s famous words in paragraph 37 of Humani Generis:

“When, however, there is question of another conjectural opinion, namely polygenism, the children of the Church by no means enjoy such liberty. For the faithful cannot embrace that opinion which maintains that either after Adam there existed on this earth true men who did not take their origin through natural generation from him as from the first parent of all, or that Adam represents a certain number of first parents. Now it is in no way apparent how such an opinion can be reconciled with that which the sources of revealed truth and the documents of the Teaching Authority of the Church propose with regard to original sin, which proceeds from a sin actually committed by an individual Adam and which, through generation, is passed on to all and is in everyone as his own.”

Pius XII firmly teaches that everyone on Earth is a descendant of Adam and therefore an inheritor of Adam’s sin, but he does not consider the question of alien life – he leaves it open. For perhaps there are “true men” (in the sense that they are rational animals capable of knowing, loving, and serving God) who are not on this Earth and never have been, and who therefore would not be descended from Adam. If they are not descended from Adam, they do not enjoy the benefits of the redemption of the race of Adam. Christ’s death and resurrection allow for our baptism, and our baptism takes away our original sin – inherited from Adam. If the aliens have their own Original Sin, they need their own Christ, descended from their own common sinful ancestor.

Taking the absolute fittingness of “Earth Christology” for granted (meaning specifically that it would always be best for God to fix every instance of Original Sin through an Incarnation), this would mean that the Son of God would have to become incarnate according to their flesh, and so pay their debt of sin. Indeed, St. Thomas teaches in the Summa Theologica III q. 3 a. 7 that a Divine Person may take on multiple human natures at once:

“What has power for one thing, and no more, has a power limited to one. Now the power of a Divine Person is infinite, nor can it be limited by any created thing. Hence it may not be said that a Divine Person so assumed one human nature as to be unable to assume another. For it would seem to follow from this that the Personality of the Divine Nature was so comprehended by one human nature as to be unable to assume another to its Personality; and this is impossible, for the Uncreated cannot be comprehended by any creature. Hence it is plain that, whether we consider the Divine Person in regard to His power, which is the principle of the union, or in regard to His Personality, which is the term of the union, it has to be said that the Divine Person, over and beyond the human nature which He has assumed, can assume another distinct human nature.”

This means that if God planned to save these aliens by an Incarnation in their flesh, the Son could do that in the same way He did for us, the descendants of Adam, regardless of already having done so in Jesus of Nazareth. If God so chose, He could give them a progressive revelation, just like He did for us through the patriarchs, Moses, and the prophets. Maybe these aliens are actively waiting for their own Messiah… Perhaps we could play a kind of prophetic role, insofar as we might give them the teachings of Christ and improve their moral life in this way, but they could never be incorporated into our sacramental economy. They need their own Savior and their own sacramental economy, probably suited to their own kind of flesh, archetypal associations, and any salvific history peculiar to them.

It should be noted that this account takes for granted that the aliens’ sinful common ancestor was graced like Adam with the preternatural gifts and sanctifying grace and was not instead left in the so-called “state of pure nature.” It does not seem there can be a “Fall” for them in the first place if their race did not have at least the gift of integrity (fittingly aided by infused knowledge and perfected by bodily incorruptibility) springing from the gift of sanctifying grace. The free rejection of that grace through a departure from God’s law would initiate the corruption of the soul into what we call “fallen nature.” This state of pure nature, passed on from generation to generation, would render the alien race unable to reach beatitude without a direct, superabundant, and universal act of mercy on the part of God Himself. The aliens could sin and still reach their natural end of an honorable life of the natural love of God, the Creator, with the help of His grace… but not sanctifying grace. They would end up, resurrected or not, in a kind of natural happiness or unhappiness according to their natural merits or demerits, but they could never gain beatitude (Heaven) without that special act of God.

In any case, it seems we won’t ever need to worry about writing the rubrics for RCIA – the Rite of Christian Initiation for Aliens.

 

Post by: Eamonn Clark

Main image: Screenshot from the 1982 film E.T.

The New Albigensianism, PART III: An Existentialism Crisis

See PART I and PART II

Having examined the first part of the “postmodern manifesto,” which is scientistic, we now turn to the second part, which is existentialist. Here it is again:

Real knowledge is only of irreducible information about the material world, and I can manipulate that same material world however I want in order to express myself and fulfill my desires.

The imposition of a spirit onto its flesh and the world is our object of investigation today.

After the Kantian revolution proposed a deontological moralism as a replacement for metaphysics, Schopenhauer took up the reins and ran with the theme: the will reigns supreme over the intellect. This doctrine recalls those first rumblings present in Ockham, Abelard, Scotus, and even St. Bonaventure. (Who could forget Dante’s depiction of Bonaventure and Thomas circling around each other in Heaven debating the primacy of the intellect and will?) Then came Søren Kierkegaard’s deep anxiety over life, together with a suspicion of some kind of opposition between faith and reason. Heidegger, of course, was riddled with anxiety as well, over being and nothingness, and he had an obsession with freedom and authenticity: all characteristic of what was to come. There was no more dramatic precursor to the French existentialists than Nietzsche, who sought to free the world of its nihilism and empower it with the liberation of the will: the Übermensch, or “superman,” would embody a new kind of magnanimity with no regard for the welfare of others or some abstract Aristotelian “flourishing.” Nietzsche apparently couldn’t do it himself and went insane, finally cracking after seeing a horse being mercilessly beaten in a street in Turin. (Here we might pause and recall Durkheim’s observation about happiness and the subjection of the will to a pre-defined role in society… those who have a life already set up for them tend to kill themselves less often.) The penultimate step to mature existentialism came with Michel Foucault, the forerunner of the “rainbow flag” and a staunch opponent of confining the mentally insane. After all, maybe they are just “different,” you know?

Finally, we come to the main event: a Parisian socialite, his lover, and a journalist-turned-philosopher raised on the soccer fields of French Algeria.

The core of the teaching of Jean-Paul Sartre can be summed up in three words: existence precedes essence. In other words, there really is no human nature, only a human condition which must be figured out and made into something of one’s own. He cites Descartes’ cogito in support of this theory, being an “anti-materialist,” and he claims that this is the only dignified vision of man, as this doctrine alone is capable of acknowledging his true power and freedom – which are apparently the characteristics of dignity. Man must go beyond himself to create himself, quite in contrast to the Comtean humanist religion, where humanity is good “just because.” For Sartre, man is nothing without making something of himself. (This would later become the basic teaching of Ayn Rand as well.) Freedom is to choose and conquer resistance present in one’s situation, and one must exercise this freedom according to his authentic self. But what is the “self” without a human nature? It is unclear.

Sartre’s intermittent lover, Simone De Beauvoir, with whom he would frequently seduce unwitting female students for sexual exploitation, held similar ideas and became the first “feminist.” It is from De Beauvoir that we get the now infamous gender-sex distinction: “One is not born but becomes a woman.” The woman is defined socially – and in classical Aristotelico-Thomistic anthropology – in relation to man and therefore does not have her own identity. This is an existential problem for the woman, who must go out and create herself. To postmodern ears it would sound insane to contradict the sense of De Beauvoir’s complaint; and yet we have St. Paul teaching that some kind of superiority of men is rooted in nature and of necessity must flow into ecclesial life (1 Cor. 11:3-16, Eph. 5:21-33, Col. 3:18-19). The Christian must not be a feminist of the De Beauvoir variety. Our friends the Cathars had women clergy; they anticipated the existentialists in their justification for this choice. We will return to that in a future post.

Then we have our Algerian friend. Albert Camus’ most famous contribution to Western thought was the claim that the only serious question a person has to ask himself is whether to end his own life. After all, life is absurd, and if one can find no meaning for himself, then it is better that it end on one’s own terms, rather than in something meaningless like a car crash (which, ironically, was exactly how Camus was killed). Despite explicitly denying the existentialist label and preferring to be an “absurdist” instead, Camus is nonetheless the crystallization of the movement. His interpretation of the Greek myth of Sisyphus, claiming that man must accept his existence as an absurdity in order to find peace, or the anguish of the main character of “The Stranger” over the meaninglessness of his life and what has happened to bring about his execution, provides a fitting capstone to the existentialist project because it shows its end: senselessness. When human nature is removed, purpose is removed. And the frantic search for a self-assigned basic purpose can only end badly, even if it doesn’t feel that way to a “successful existentialist.”

Certainly, more can and should be said about the French existentialists. But this brief and rude treatment suffices to bring to light the critical themes of our own day which were present in the movement, namely: a rejection of human nature as such; a perceived need to define one’s own role to make up for such an absence; and an obsession with “gender” equality.

We have already noted in PART I of this series the shocking fact that the existentialist doctrine on human nature as such has been enshrined in U.S. law by the Supreme Court. That should be enough to show there is a deep-seated existentialist current plaguing the West; but coupled with the wide diffusion of the watered-down scientistic positivism we explored in the last post, disdain for classical Aristotelico-Thomistic anthropology has become its own unspoken rule. It is not unspoken in the way one doesn’t talk about Fight Club; it is unspoken in the way one doesn’t talk about red being a color… it’s just a given.

Our culture is schizophrenic and self-destructive too. But does it give us novelty soap bars?

If there is any admittance of a “human nature” it is a passing nod to the truth that what we call human beings usually have certain kinds of physical characteristics which normally produce certain kinds of effects. The classical meaning of “nature,” however, is alien to this vague and platitudinous physicalism, as there can be no teleology (in-built purpose) for what is merely a random collection of stuff onto which we slap a name. This, I suggest, is the final fruit of Ockham’s Nominalism which we have discussed previously.

Of course, most postmodernists dimly realize their godless worldview poses the “existential problem,” viz., a lack of inherent meaning and purpose in their life, and they seek to solve it through the recommended process of “self-definition.” We are not here critiquing a healthy ambition to “do what one can” or to avoid idleness; rather, the issue is the desperate and necessarily futile attempt to provide altogether one’s own meaning for existing in the first place. There are also many people, not quite full-blown postmodernists, who seek to correct this same inner anxiety with DIY spirituality (moralistic therapeutic deism, usually); this is particularly dangerous as it nominally acknowledges something greater than oneself as grounds for directing one’s life, but it is really the imposition of one’s own ideas onto a divine mouthpiece.

The existentialist paradigm helps make sense of the postmodern millennial’s take on the issues: the life issues, the gender issues, and the sex issues. Since a person’s meaning is basically self-derived, and that meaning is predicated upon desires and the ability to fulfill them, the unborn and the elderly are without their own meaning. Having a certain kind of body with certain powers does not force one to accept that embodied reality as a given identity and direction either within a social framework or even within a physical framework, provided there is a surgeon available. Much less does this God-given gendered bodily existence, constitutive of unique powers with lasting social consequences and everlasting spiritual consequences, provide an individual with rules for how to engage in the use of the organs which are the seat of that power. You must choose to become something. Alternatively, you may disappear into oblivion – either irrelevance, or death. Before it was the American Dream™, it was French philosophical anthropology.

The current of this thought has bored a hole so deep into the subconscious of postmodern America (and many parts of Europe) that it has become impolite, if not outright illegal, to tell a person that he is a he, she is a she, that “No, I will not serve cake at your wedding,” or anything that might emotionally hurt that person, so long as that self-given identity or meaning does not result in “harmful” behavior. Harmful behavior, remember, is reduced to emotional, physical, or financial pain or loss – for those who can already “will to power” and aren’t entirely reliant on help from other people for existence, that is.

The video above, while admittedly a bit cherry-picked, nonetheless demonstrates the existentialist current of millennial postmodernity with breathtaking frankness. No doubt such an experiment could be replicated across the global West with some success, at least in supposedly “elite” institutions of higher education. Note again the criterion of “harm” as constituting the core of the normative ethics for postmodern millennials – as if a person with a wildly erroneous self-perception is doing no real harm. You can tell that these kids become more and more uncomfortable as they are forced by their own premises and sense of political correctness to affirm that what is obviously true is being denied by this person, but that since “it’s not ‘harming’ anyone,” it must be okay and therefore good to support. The lack of awareness that such a departure from the truth of one’s natural constitution as “man,” “white,” etc., does indeed cause harm to that person – and therefore also to society, at least inasmuch as that person’s self-perception is related to his or her function in society – is probably why it doesn’t “bother” the people interviewed. There used to be a word for the self-deception which is now coddled as healthy and normal: mental illness. Now it requires university-sponsored trigger warnings and safe spaces, international awareness campaigns, and even protective laws.

All of this finally ends in a kind of laissez-faire utilitarian relativism, which we might call the postmodernist ethics: “The more a behavior harms the people or things that I like, the more immoral the behavior is; and the more a behavior does good to the people or things that I like, the better the behavior is.” In this normative ethics, I can never do anything wrong, except inasmuch as I might unthinkingly do something harmful to my own cause. Another person is irrelevant insofar as he doesn’t harm my own mostly arbitrary and narrow values.
This must also be understood as occurring within the materialistic framework – both harm and good are all temporal and experiential. (Unless, that is, a little DIY spirituality comes into play… Then all bets are off.) Without a firm understanding of unchanging human nature, and the belief in its authority and power to provide a normative ethics, we are left to define our own values based on whatever we would like to do or become as individuals or collectively as a society.

“Existence precedes essence.” Human beings are now human doings.

Yet clearly, “Some are more equal than others.” Why are some people or things valued over others? The expression of self, and the fulfillment predicated upon it, are the foci around which postmodern value is measured: money, physical pleasure, convenience, emotional pleasure, diversity, equality, progress. Each goal is vaguer – and more dangerous – than the last. If you are not contributing one of these goods to society, how can you be valuable? Maybe you are a “good person,” but you are no longer useful and are therefore of no account. In other words, we may kill you if we would like to… and one day we might realize that we ought to kill you: because you are not capable of doing the kind of things we value, your own existence offers you “no benefit.” It is now charitable to destroy a life that can’t “create itself.” Beyond the obvious cases of killing the unborn and physically sick, Camus’ dilemma is being answered for the mentally ill and elderly in Europe in “assisted suicides” which are a little too assisted.

It has become popular these days to remark on “the science” behind why transgenderism or same-sex marriage or whatever is “bad.” While taking note of the psychological and physical processes and results of these experiments is not irrelevant to forming a right opinion on their goodness (like studying the average harm done to children by “gay parenting”), there is no need, and in fact no possibility, for “science” to provide the answer to the foundational moral questions whose answers are found in a study of the soul and body’s basic purposes which are widely known to all, as St. Paul reminds the Romans (Rm. 1:18-32). You really don’t need an expert biologist to give kids “the talk.” You do need something other than mere biology to infer that deviating from the natural order is wrong, and the obsession with the minutest details of the “is” to justify the “ought” belies at least a touch of the intellectual illness diagnosed in Part II of this series, namely, a weak form of positivism called scientism.

Given that existentialism is historically opposed to the materialistic worldview which positivism relies on, how can the postmodern manifesto combine both elements? For example, how can a person support transgender surgery as an effective means of “expressing the real self” while claiming that there is no such thing as a soul because it’s not an object of scientific observation? We might say it is a simple lack of reflection which allows this cognitive dissonance, and this is indeed true. The deeper problem, however, is that ideology is serving passion, rather than the other way around. This is part of what makes millennials so difficult to reason with: they will shift from one part of the manifesto to the other for the sake of whatever person or group or behavior they feel good about, not realizing that each pole is at least a mild affront to the other. What they tend to sense is that their scientism forces one to create his own meaning since there is no predefined role by a true authority (God, revealed religion, a family or government invested with God-given authority), and that the quest to create meaning for oneself is determined only by what is able to be perceived by oneself, the greatest authority. The poles point back toward each other in this way, even though real positivists would reject the idea that a person can “mean something” at all, and real existentialists are not even attached to the doctrine that there is a real material world in the first place. The details of theory are lost in the practice of the unfortunate and unwitting inheritors of these worldviews.

Whether the French existentialists would be on board with the hashtag gender activists of today is not entirely clear. Sartre would perhaps call transgenderism “bad faith,” that is, a fake expression of oneself wherein one “tries too hard” to be something he or she really is not. This is not “authentic” to Sartre. (How there could be such a thing as the “self” independent of one’s sincere desires begins to strike the central nerve of the existentialist project, however; if one can act in bad faith, then there must be something more to one’s identity than his desires which those desires can be in line with… which sounds an awful lot like an essence preceding existence, so to speak.) Camus might call such people to account as failing to accept that life just does not make sense, and that the only way to be happy is to accept this: providing a physical answer to a spiritual problem is vain, but there is no spiritual answer either, so one must simply be content with madness.

Existentialism is likely to remind the attentive reader of Sacred Scripture of Ecclesiastes. Was Qoheleth the first existentialist? The first absurdist? He does claim that the acceptance of life as vain and meaningless in itself is a condition for peace, like Camus. (Truly, Qoheleth is right – there is nothing new under the sun!) But Qoheleth, despite all of his despair, believes that everyone’s life means something to God, and that there are objective measures of morality by which that God will somehow judge us. That his idea of final judgment is fuzzy can seem odd given this, but in his intellectual humility he did not grasp for what he had not already been given. He knew we would die and that God would somehow render justice, but he will not say more.

Postmodernists avoid the topic of death because it would force them out of their watered down existentialism – protected by a million distractions – into the disquieting bluntness of Camus, which few can stomach: your life really is fundamentally meaningless, and there’s nothing you can do about it, so just get comfortable with that fact like a happy Sisyphus. The suicidal dilemma is also “too harsh” for sensitive millennial minds – let that question be left to poor Hamlet and Hannah Baker.

Next time, we will directly investigate the relationship between the trends of our current culture and the doctrine and praxis of the Cathars, finally making good on the title of this series.

 

Post by: Eamonn Clark

Main image: Simone De Beauvoir, Jean-Paul Sartre, and Che Guevara; Cuba, 1960

The New Albigensianism, PART II: Comte and the Combox

For Part I, click here.

Just as the woman with the hemorrhage reached out to touch the hem of Jesus’ tunic, so do post-modern secular Westerners reach out to touch the hem of scientists’ lab coats. Despite the plain fact that any given scientist or doctor or other “expert” will tend to be specialized in only some tiny sliver of his or her field, hopeless intellectual wanderers will gather at the feet of these people to learn all the mysteries of the universe… which is dumb. How did this happen?

Let’s take a step back.

The manifesto of the post-modern Westerner par excellence is this: “Real knowledge is only of irreducible information about the material world, and I can manipulate that same material world however I want in order to express myself and fulfill my desires.”

Herein we see two strands of thought colliding, one about the mind and one about the will: positivism and existentialism. Historically, they are not friends. How they have become fused together in post-modernity is a strange tale.

Today we will break open the first clause – real knowledge is only of irreducible information about the material world, the positivist element.

From the outset, we must make a distinction between “positivism,” which is an epistemic and social theory, and “logical positivism,” which is something more metaphysically aimed. My goal here is to show the roots of the broader idea of positivism, how it found its academic zenith in logical positivism, then how the aftermath of its fall has affected Western philosophy and science at large as well as in the minds of millennials.

A brief sketch of the positivist genealogy will suffice. We recall Descartes to point out his obsession with certitude, just as we note the empiricist thrust of Bacon, Locke, and Hume. We must mention Kant, both as the originator of the analytic-synthetic distinction (which will become enormously important) and as an influence on Hegel, who is notable for his approach to philosophy as something integral with history. Condorcet and Diderot should be pointed out as influential, being the greatest embodiments of the French Enlightenment, wherein reason and revealed religion are opposing forces. Marx, though he would reject positivism as a social ideology, helped inspire it along the same lines as Hegel had. The penultimate step was Henri de Saint-Simon, whose utopian socialism was all the rage in post-Revolutionary France, as reformers attempted to put his social theory into political practice.

Of course, these men were not positivists. It is Henri de Saint-Simon’s pupil, Auguste Comte, who brings us this unwanted gift of an empiricism so strong it entirely and unabashedly rejects any and all metaphysical knowledge outright. This led Comte to build a reducible hierarchy of the sciences based on their certainty or “positivity,” and he claimed (rightly) that the trend of empirical studies was heading toward a “social science.” This conception of a reducible scientific hierarchy – one where, for instance, biology can be put in terms of chemistry, and chemistry in terms of physics, etc. – was a rather new way of thinking… Previously, it had been more or less taken for granted that each science has its own irreducible terms and methods, even admitting some kind of hierarchy (such as with the classical progression of the liberal arts).

Not only was Comte the first real philosopher of science, he was also the first sociologist. According to Comte, humanity was passing from its first two stages, the theological and the metaphysical, into the third and final “positivist stage” where only empirical data would ground truth-claims about the world. Having evolved to a higher clarity about what the world is, and having built up enough of the more basic physical sciences to explore how that world works, sociology could finally occur. Mathematical evaluation of social behavior, rather than qualitative analysis, would serve as the proper method of the “queen of the sciences” in this new age.

Comte outright jettisoned religion qua supernatural and revelatory, but his intensely Catholic upbringing had driven into him such a habit of ritual that he could not altogether shake the need for some kind of piety. What was a French Revolution atheist to do? Well, start a “religion of humanity,” of course. (The “positivist religion” never became a major force, especially since Freemasonry already filled the “secular religion gap,” but it did catch on in some areas. Take a closer look at the Brazilian flag and its meaning, for example…) We should also note, for the record, that Comte was only intermittently sane.

The epistemic side of positivism almost ended up just as much of a flop as the pseudo-religious side of it. Unfortunately for the West, Durkheim and Littré became interested, and they, being altogether sane, effectively diffused Comte’s ideas and their own additions through the West in the late 19th and early 20th centuries. Eventually, a group of like-minded academics started habitually gathering at a swanky café in Austria to discuss how filthy and naïve metaphysics was compared to the glories of the pure use of the senses and simple mathematical reason – the Vienna Circle was born.

Together with some Berliners, these characters formulated what came to be known as logical positivism. When the shadow of Nazism was cast over Germany, some of these men journeyed westward to England and America, where their ideas were diffused.

The champions of logical positivism were Hans Hahn, Otto Neurath, Moritz Schlick, Rudolf Carnap, A.J. Ayer, and Bertrand Russell. While Russell is no doubt familiar to some readers (think “tea pot”), the others fly lower under the radar. It is Ayer’s formulation of the logical positivist doctrine which we will use, however, for our analysis.

“We say that a statement is factually significant to any given person, if, and only if, he knows how to verify the proposition which it purports to express – that is, if he knows what observations would lead him, under certain conditions, to accept the proposition as being true, or reject it as being false.” (Language, Truth, and Logic, 35)

Got that? What this means, in the context of the whole book, is that in addition to statements which are “analytic” (“all bachelors are unmarried”) being true necessarily, only statements which we can use our five senses to verify can be meaningful – that is, able to be true or false at all. These are “synthetic” statements. If I say that Pluto is made of bacon grease, I am making a meaningful statement, even though I cannot actually verify it; it suffices that it is hypothetically possible to verify it. If I say that the intellect is a power of the soul, this is not meaningful, since it cannot be verified with the senses. For the details, see Ayer’s book, which is rather short.

Needless to say, it is rare that a school of thought truly dies in academia. A thorough search of university philosophy departments in the Western world would yield a few die-hard fans of Plotinus, Al-Gazali, Maimonides, and maybe even Heraclitus. Perhaps the best or even only example of ideological death was logical positivism. W.V. Quine’s landmark paper “Two Dogmas of Empiricism” was such a blow to the doctrine that eventually Ayer actually admitted himself to be in massive error and repudiated his own work.

What was so blindingly erroneous about logical positivism?

First, the analytic-synthetic distinction, as formulated by the logical positivists, is groundless. Analytic statements supposedly don’t need real referents in order to be true, but they are instead simply about the meanings of words. For some kinds of statements which employ basic affirmation and negation, this might work, as it is simply just a dressing up of the principle of non-contradiction. Fine. But if one wants to start using synonyms to take the place of some of the parts of these statements, the distinction begins to disappear… What the relationship is between the synonym’s object and the original word’s object cannot be explained without a reference to real things (synthetic!), or without an ultimately circular appeal to the analyticity of the new statement through a claim of the universal extension of the synonym based on modal adverbial qualifications (like the word “necessarily,” which points to an essential characteristic which must either be made up or actually encountered in reality and appropriated by a synthesis). In other words, it is analytic “just because.” (Thus, the title of Quine’s paper: Two Dogmas of Empiricism.)

Beyond that, logical positivism is a self-refuting theory on its face… If meaningful statements can only be about physically verifiable things, then that statement itself is meaningless: it is not analytic (or is arbitrary if it is, and we are back to the first problem), and it cannot be verified with the senses, so it is not synthetic… How does one verify “meaningfulness” with the senses? Logical positivism is a metaphysical theory that metaphysics is meaningless. Once again, this can only be asserted, not discovered. Except with this dogma, it evidently claims itself to be meaningless.

But the cat was out of the bag: “Metaphysics has completely died at last.” Logical positivism had already made its way from the salons of Austria to the parlors of America and the lecture halls of Great Britain. This poured fuel on the fire started in England by Bertrand Russell and G. E. Moore, who had rejected the dominant British Idealism by creating an “analytic” philosophy that didn’t deal with all those Hegelian vanities that couldn’t be touched with a stick or put in a beaker. Russell’s star pupil, Ludwig Wittgenstein, would also come to be a seminal force in strengthening the analytic ethos, having already inspired much of the discussion in the Vienna Circle. Though Quine did indeed destroy the metaphysical doctrine that metaphysics is meaningless, the force of positivism continued nonetheless within this “analytic” framework – and it is with us to this day en masse in university philosophy departments, leaving several generations of students without a solid education in classical metaphysics and philosophical anthropology.

In sociology there arose the “antipositivism” of Max Weber, which insisted on the need for value-based sociology – after all, how can a society really be understood apart from its own values, and how can a society be demarcated at all without reference to those values, etc.? A liquid does not assign a value to turning into a gas, which it then acts upon, but a group does assign a value to capitalism, or marriage, or birth status which it then acts upon.

In the broader realm of the philosophy of science, the postpositivism of Karl Popper and Thomas Kuhn came to the fore: science in general cannot be explained without regard for some kind of value, and the possibility and/or actuality of the falsification or failure of a scientific theory is the characteristic feature of the sciences – in contrast to the optimism of the positivists that we can “just do science,” and that that will be useful enough.

In “science” itself, an air of independence was diffused. Scientists do “science,” other people do other things, and that’s that; never mind that we have no idea how to define “science” as we understand it today, and never mind that values are always brought to bear in scientific evaluation, and never mind what might actually be done with what potentially dangerous knowledge is gained or tool developed. A far cry from the polymaths, such as St. Albert the Great or Aristotle, who never would have considered such independence.

Then there are the “pop scientists” who try to do philosophy. A few examples of many will have to suffice to show three traits common among the pop scientists who are the go-to sources on religion and philosophy for countless curious millennials and Gen-Xers alike.

The first is an epistemic myopia, which derives immediately from positivism: if you can’t poke it or put it in a beaker, it’s not real. (Yes, it is a little more complicated than that, but you’ve read the section above describing positivism, right? Empirical verification is the only criterion and process for knowledge… Etc.) This is often manifested by a lack of awareness that “continental philosophy” (as opposed to analytic philosophy) often works in totally immaterial terms, like act, or mind, or cause, or God. This immediately creates equivocation – a pop scientist says “act” and thinks “doing something,” for example.

The second is an ignorance of basic philosophical principles and methods, which follows from the first characteristic. If you don’t know how to boil water, don’t go on “Hell’s Kitchen” – everyone will laugh at you and wonder what you are doing there in the first place. We might do well to have a philosophical version of Gordon Ramsay roaming about.

The third is the arrogance to pontificate on philosophy and theology nonetheless, and this of course follows from the second characteristic. They don’t know what they don’t know, but they got a book deal, so they will act like they are experts.

Everyone knows Dr. Stephen Hawking. (They made a movie!) But did you know that the average 6-year-old could debunk the central claim of his most recent book? It is now an infamous passage:

“Because there is a law such as gravity, the universe can and will create itself from nothing.” (From The Grand Design)

I can hear the 1st graders calling out now: “But gravity’s not nothing!” And they would be right. The myopia of Dr. Hawking (and Dr. Mlodinow, his co-author) is evident in the inability to grasp that, as Gerald Schroeder pointed out, an immaterial law outside of time that can create the universe sounds a lot like, well, God. The ignorance of basic philosophical principles – in this case, the most basic – is clear once one realizes that “gravity” can’t be both SOMETHING AND NOTHING. And the arrogance to go on pontificating anyway is evident from the very existence of the book, and of the TV series which aired shortly afterward, wherein we find philosophical reflection which is similarly wanting.

If you really want to do a heavy penance, watch this “discussion” between Hawking, Mlodinow, Deepak Chopra, and poor Fr. Spitzer – I had the displeasure of watching it live several years ago:

Then there are folks like Dr. Michio Kaku. He regularly shows up on those Discovery Channel specials on string theory, quantum mechanics, future technology, yadda yadda. All well and good. But here’s an… interesting quotation for our consideration:

“Aquinas began the cosmological proof by postulating that God was the First Mover and First Maker. He artfully dodged the question of ‘who made God’ by simply asserting that the question made no sense. God had no maker because he was the First. Period. The cosmological proof states that everything that moves must have had something push it, which in turn must have had something push it, and so on. But what started the first push? . . . The flaw in the cosmological proof, for example, is that the conservation of mass and energy is sufficient to explain motion without appealing to a First Mover. For example, gas molecules may bounce against the walls of a container without requiring anyone or anything to get them moving. In principle, these molecules can move forever, requiring no beginning or end. Thus there is no necessity for a First or a Last Mover as long as mass and energy are conserved.” (Hyperspace, 193-195)

The misunderstandings here are as comical as they are numerous: the conflation, found explicitly in the full text, of the first three Ways into “the cosmological proof,” which obscures the issue; the belief that “motion” is a term about something necessarily physical; the thought that only recently did we discover that matter and energy don’t just appear and disappear; and then the most obvious blunder – Thomas does NOT start any of the 5 Ways by saying anything like “God is the First Mover, therefore…” There is no such ungrounded assertion which “dodges the question,” as Kaku puts it. One must wonder if he even bothered to read the original text – which is readily available. Kaku has even weaker arguments (unbelievably) against both the “moral proof” (a characterization of the 4th Way I had never heard until Kaku’s book, which is troubling from the start) and the teleological proof on top of this disastrous critique, but I won’t bore you. (Basically: “Because change and evolution.” Read it for yourself.)

Once again, we see three qualities: epistemic myopia (as evidenced, for example, by the error about “motion”), ignorance of the most basic philosophical principles (albeit these are a little more complicated than the one Hawking whiffed on), and the arrogance to pontificate about God and the act of creation nonetheless.

Next you have a man like Richard Dawkins, one of the nastiest examples of publicly evangelical atheism the world has to offer at present. Here’s one particularly embarrassing quotation from his seminal anti-theistic work, The God Delusion:

“However statistically improbable the entity you seek to explain by invoking a designer, the designer himself has got to be at least as improbable.” (p. 138)


Can you see the three characteristics? Material beings only (or at least “things” with “parts”), no idea what metaphysical simplicity is and how it relates to God in Western philosophy, and yet here we have one book of many which address this theme.

It is not that these folks don’t believe in classical metaphysics – it’s that they don’t understand it in the least. They play a game of solitaire and claim to be winning a game of poker.

We won’t even get into discussing Bill Nye the Eugenics Guy… for now.

Okay, yes, quote-mining is easy. But this is the cream of the crop from a very large and fertile field. I am not sure I recall ever reading an important and sensible argument about religion or metaphysics from a world-renowned scientist who lived in the past 50 or so years. Someone prove me wrong in the comments.

All this leads us to the average “scientism” which one finds in the comboxes of Youtube videos about religion, threads on various websites, and debates on social media. Yes, there are plenty of religious people in those arenas, but the skeptics who try to make wild claims like “science disproves religion” or “evolution means God does not exist” or even just dismiss the idea of revealed religion outright with some kind of mockery ought to be seen as the children of positivism. It is the most probable explanation – the sources of their myopia, ignorance, and arrogance can usually be traced back through intermediate steps to a talking head like Dawkins who ultimately owes his own irrational ramblings to Auguste Comte.

Why is post-modern positivism so naïve? At the combox level, it is because these people, as all others, have an instinctive drive to trust in someone beyond themselves. For many it is due to circumstance and perhaps a certain kind of emotional insecurity and intellectual laziness that they latch on to the confident scientistic loudmouths to formulate their worldview – and it becomes a pseudo-religious dogmatic cult of its own, a little like Comte’s “religion of humanity.” At the pop-science level, it is just plain laziness and/or intellectual dishonesty combined with arrogance, as we have investigated. At the lecture hall level – and I mainly speak of the general closed-mindedness towards classical metaphysics found in analytic circles – it is a deeper kind of blindness which is the result of the academic culture created by the aforementioned ideological lineage. Each level has its own share of responsibility which it is shirking.

The truth is that matter is known by something immaterial – a mind or person – and this reveals to us a certain kind of hierarchy and order, seeing as matter itself does not know us. Man is indeed over all matter and ought to control it and master it, and all without the consent of matter; but this does not mean that there can’t be knowledge of things nobler and/or simpler than man, like substance or causation or God. Not looking at matter as the product of non-matter, and as being ordered to the immaterial in a certain way, is part and parcel of the New Albigensianism.

So there we have the first part of the manifesto explained. Irreducible facts (the ones devoid of metaphysics and value judgments) about the material world constitute the only real knowledge. The less reducible, the less it is really known. Even though the West is full of supposed “relativists,” it would be difficult to find a person who would truly let go of the objectivity of “science.” To say, “Christianity is your truth but not mine” is one thing; it is quite another to say something like, “Geocentrism is your truth but not mine.”

There is yet more to be explored… Next time, we will dive into the second half of the “postmodernist manifesto” with a look at its existentialist roots and how misconceptions about the relationship of the self to one’s bodily life have led to transgender bathroom bills.

Post by: Eamonn Clark

Main image: The Positivist Temple in Porto Alegre, Brazil

Source: Tetraktys – User:Tetraktys, CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=3295600

The Real Reason People Like 13 Reasons Why

There have been plenty of reasonable critiques of the new hit Netflix show, 13 Reasons Why, which follows the story of a community dealing with a young girl’s suicide and the creative “notes” she left behind. Bad acting, bad writing, the “role models” are extraordinarily clueless, suicide is romanticized, etc. Okay… then why is it so popular?

Take a look at the trailer (language warning):

The most powerful moment in the trailer, at least for me, is the revelation that the tapes are from Hannah, at 37 seconds… The following 20 seconds build on this force.

I suggest that the reason people are so intrigued by the show is this: it presents a concrete, realistic example of someone speaking from beyond the grave. Through her tapes, the Hannah Baker character presents a benign version of otherworldly communication, and people find this attractive. We human beings have a deep-seated need to go beyond this world and encounter something greater than ourselves. By committing suicide and leaving voice recordings of herself, Hannah half-accomplishes this – she is half-encountered, and she is half-greater, as she has become “ubiquitous” and commands enormous attention, but… spoiler alert… she’s dead. At any rate, people’s sense of the otherworldly is “turned on” by the show, and since many are not activating that sense adequately through religion, they watch this show to compensate. (This goes hand in hand with Hollywood’s obsession with exorcisms and the occult – a topic which merits its own post.) Hannah takes the place of God, Who, by the way, does not seem to find His way into the screenplay.

The problem is just that. Being convicted by the accusation of a dead girl through an audio tape is painful, important, and final, but she has not necessarily got everything correct (as the show explores at length), nor is anyone’s life truly measured by her judgment. Furthermore, there can be no reconciliation with her… it’s over.

On the other hand, being convicted by an accusation of the living God through Scripture or preaching or conscience is quite different. Because God does not make mistakes, and because He does indeed provide the true measure of our life, His accusations, if seen rightly, are more painful, important, and final. It is no use arguing or rationalizing – we must reconcile, which thankfully we can do. It is even more powerful to find oneself being accused by God due to the fact that He is not just looking to prove a point, or to get some kind of attention, or to show that He’s really upset and can’t take it anymore… He convicts us of sin because He loves us, and reconciling with Him and amending our lives to be in accord with His Will are the best things for us.

Not so with Ms. Baker.

The characters in the show indirectly contributed to the death of Hannah, but she is clearly the one who is actually responsible for taking her life… Christ, however, was really put to death by others; and we ourselves are indirectly responsible for His death, at least insofar as we are sinners standing in need of that death, which He chose for our sake. So each and every one of us is one of His “reasons why.” He speaks to us now, but unlike Hannah Baker, He is alive and is waiting for us to speak back. And once you realize that, it is much more powerful than a suicide note could ever be.

 

Post by: Eamonn Clark

Main image: thumbnail from Netflix’s trailer for its show, 13 Reasons Why

The New Albigensianism, PART I: From Scotus to S.C.O.T.U.S.

For the most part, religious errors are reducible to four basic ideas.

  1. Jesus is not by nature both fully God and fully human (Arianism, Eutychianism, Monothelitism, Agnoetism, Mormonism, etc.)
  2. There are not three Persons in One God (Modalism, Unitarianism, Subordinationism, Partialism, etc.)
  3. Sanctifying grace is not a free and universally available gift absolutely necessary for salvation (Pelagianism, Semi-Pelagianism, Baianism, Jansenism, Calvinism, etc.)
  4. Matter is not essentially harmoniously ordered with spirit (Manichaeism, Buddhism, Albigensianism, etc.)

While the first three ideas are certainly prevalent in our own day, the correct doctrines are only available through the grace of faith. The falsehood of the fourth, however, is evident from a rigorous use of natural reason alone. Therefore, it is more blameworthy to succumb to that error.

We are seeing today the resurgence of the fourth error in four ways: the sexual revolution, radical feminism, the culture of death, and most recently, gender theory.

The three forms mentioned in the first list (Manichaeism, Buddhism, and Albigensianism) more or less say that matter is evil and needs to be done away with. The Manichees thought that matter was created by an evil god, the Buddhists think that matter is only a distraction, and the Albigensians (or “Cathars”) became so enamored with the thought of the spirit escaping its fleshy prison that suicide became a virtue… But we will talk all about the Cathars later, and we will find some striking similarities between this medieval rigorist dualism and some of the most recent value developments in the Western world.

The current manifestations of the fourth error do not quite say “matter is evil,” but they instead say that the determination of human matter (the body) is irrelevant to the good of the spirit, and/or that the spirit is one’s “true self” which can be served by the body according to one’s whims. Some proponents may claim they don’t believe in spirit, that is, immaterial reality (in this case, the “soul,” or formal principle of life), but when they speak of someone being “a woman trapped in a man’s body,” or something similar, they betray their real thoughts. Still, even if a person insists on denying the reality of spirit, it remains the spirit within him who denies it. There can be no “self-determination” without a self to determine, and if the body simply is the self, then how can there be real determination? There could then only be physical events without any meaning. This, of course, is contradicted by the very existence of “experience.” It is not merely a body which acts, but a person who experiences.

The error in its current expressions can be traced to Descartes, whose laudable project of attaining perfect certainty about the world was, ultimately, a disastrous failure. After shedding all opinions about which he did not have absolute certainty, he was left only with one meaningful truth: cogito, ergo sum. “I think, therefore I am.” No person could both think and not exist.

This was not new, as St. Augustine had come to a similar realization over 1,000 years earlier. The difference was the context and emphasis of the thought; to Augustine, it was an interesting idea coming out of nowhere and going nowhere. To Descartes, it was the foundation of every knowable proposition, and it led to the idea that human beings are essentially thinking (spiritual) beings rather than a body-soul composite… Think “soul trapped in body.”

This came after the ruins of the scholastic project. With the combination of the fixation on choice and freedom in Scotus’ work and Abelard’s troubling take on the problem of universals (how to account for similarities between different things), the stage for Ockham’s Nominalism was set. (See Gilson’s detailed description in his wonderful book, The Unity of Philosophical Experience.) It was Ockham who hammered in the last nail of St. Thomas’ coffin and who paved the way for the “cogito” to be intensely meaningful not only to Descartes, but to the entire Western academy. Nominalism’s dissociation of “things” from any real universal natures which would make those things intelligible as members of species was the first step towards overthrowing classical metaphysics. This “suspicion of being” understandably increased exponentially with the publication of Descartes’ Discourse on the Method, as it cast a serious doubt on the reliability of the senses themselves, doubt that many felt was unable to be overcome, despite a sincere effort to do so on the part of Descartes himself.

Descartes: The Movie

The anxiety finally culminated in Kant’s “nervous breakdown”: a total rejection of metaphysics in the denial of the possibility of knowing “the-thing-in-itself” (noumena). From there, much of the academy generally either desperately tried to do without a robust metaphysics or desperately tried to pick up the pieces, and this theme continues today in the strange and fractured world of contemporary philosophy.

Ideas have consequences. As MacIntyre shows so well in his book After Virtue in the case of “emotivism” (the position that ethical statements merely express one’s emotional preference for an action), a powerful idea that spreads like wildfire among the right academic circles can eventually stretch into the average home, even if subconsciously. A very well educated person may never have heard of G. E. Moore, but everyone from the wealthy intellectual to the homeless drunkard has encountered some shade of the emotivism Moore’s work gave rise to. The influence which both Descartes and Kant had on the academic scene in their respective eras was so vast and powerful that it is not unfair to say that Western philosophy after the 17th century was in response to Descartes, and that Western philosophy today is in response to Kant.

The reaction to Descartes’ rationalism was first empiricism, then idealism. The reactions to Kant’s special fusion of rationalism and empiricism (a fusion he called “transcendental idealism”) which here concern us were logical positivism and French existentialism.

Logical positivism is basically dead in academia, although the average militant atheist has taken up a cheapened form of Ayer’s positivism with which to bash theists over the head, and the general inertia of positivism remains in force in a vaguer “scientism” which hangs heavy in the air.

Existentialism, on the other hand, has become a powerful force in the formation of civil law. The following lengthy quotation is from Justice Anthony Kennedy’s majority opinion given in Planned Parenthood v. Casey (my emphases):

“Our law affords constitutional protection to personal decisions relating to marriage, procreation, contraception, family relationships, child rearing, and education. Carey v. Population Services International, 431 U.S., at 685 . Our cases recognize the right of the individual, married or single, to be free from unwarranted governmental intrusion into matters so fundamentally affecting a person as the decision whether to bear or beget a child. Eisenstadt v. Baird, supra, 405 U.S., at 453 (emphasis in original). Our precedents “have respected the private realm of family life which the state cannot enter.” Prince v. Massachusetts, 321 U.S. 158, 166 (1944). These matters, involving the most intimate and personal choices a person may make in a lifetime, choices central to personal dignity and autonomy, are central to the liberty protected by the Fourteenth Amendment. At the heart of liberty is the right to define one’s own concept of existence, of meaning, of the universe, and of the mystery of human life. Beliefs about these matters could not define the attributes of personhood were they formed under compulsion of the State.

“These considerations begin our analysis of the woman’s interest in terminating her pregnancy, but cannot end it, for this reason: though the abortion decision may originate within the zone of conscience and belief, it is more than a philosophic exercise. Abortion is a unique act. It is an act fraught with consequences for others: for the woman who must live with the implications of her decision; for the persons who perform and assist in the procedure; for the spouse, family, and society which must confront the knowledge that these procedures exist, procedures some deem nothing short of an act of violence against innocent human life; and, depending on one’s beliefs, for the life or potential life that is aborted. Though abortion is conduct, it does not follow that the State is entitled to proscribe it in all instances. That is because the liberty of the woman is at stake in a sense unique to the human condition, and so, unique to the law. The mother who carries a child to full term is subject to anxieties, to physical constraints, to pain that only she must bear. That these sacrifices have from the beginning of the human race been endured by woman with a pride that ennobles her in the eyes of others and gives to the infant a bond of love cannot alone be grounds for the State to insist she make the sacrifice. Her suffering is too intimate and personal for the State to insist, without more, upon its own vision of the woman’s role, however dominant that vision has been in the course of our history and our culture. The destiny of the woman must be shaped to a large extent on her own conception of her spiritual imperatives and her place in society.

No doubt, a critical reader will observe some tragic oddities in this passage. We will table an in-depth analysis, but I do want to point out the bizarre idea that our beliefs can determine reality. One might be tempted to call this “relativism,” and there is indeed some relativism in the passage (the evaluation of the fact of whether a life or potential life is taken in abortion “depending on one’s beliefs”). Without denying this, I also assert that beyond a casual relativism, which might be more a product of a lack of reflection than a real worldview, Kennedy is a deeply committed existentialist. (Indeed, it seems that existentialism naturally disposes a person to relativism.) The thought that one’s beliefs define one’s personhood comes almost directly from Jean-Paul Sartre. The doctrine is: existence precedes essence. Essence is determined by beliefs and actions, according to the existentialist. Such an affront to traditional metaphysics would have been impossible without the aforementioned ideological lineage – Scotus, Abelard, Ockham, Descartes, Kant… Seeing Justice Kennedy through the existentialist lens also helps to account for the striking absence of respect for a human being who can’t believe or meaningfully act. After all, how can such a thing really be a person?

Today’s common philosophy of the Western liberal elite (and their spoiled millennial offspring) seems to be a chimera of these two diametrically opposed worldviews: positivism and existentialism. These ideologies have been filtered into the average home, and watered down in the process in such a way that they can appear to fit together. In this series of articles, we will thematically wind through a maze of philosophy, science, hashtag activism, and moral theology to understand the present crisis and to propose possible remedies for it.

After now having given a brief sketch of the ideological history, we begin next time with a look at the positivist roots of the so-called “New Atheism” and how an undue reverence for science has contributed to what I have termed the “New Albigensianism.”

Stay tuned…

 

For Part II, click here.

Post by: Eamonn Clark

Main image: Carcassonne, France… one of the old Albigensian strongholds.

Main image source: http://en.destinationsuddefrance.com/Discover/Must-See/Carcassonne

The Dark Knight of the Soul: Fortitude in the Batman

Behold, a humorous essay I recently wrote for a moral theology class, with some slight edits. Enjoy!

Mr. Bruce Wayne had a troubled childhood. Not only did he lose his parents to a crazed gunman, but he also fell into a deep well full of bats. The former occasioned the inheritance of vast amounts of wealth, while the latter occasioned an intense case of chiroptophobia (fear of bats). Together, these effects would eventually lead him to undertake a massive bat-themed vigilante project which would dominate his life and cause a complicated set of benefits and drawbacks in Gotham City. The question is: whether the act of becoming the Batman was an act of true fortitude on the part of Bruce Wayne?

What is clear is that in Batman’s vigilante project there is matter for fortitude, namely, dangers of death. “Now fortitude is a virtue; and it is essential to virtue ever to tend to good; wherefore it is in order to pursue some good that man does not fly from the danger of death.” (1) Wayne, of course, is choosing to fly toward dangers of death, and literally at that. With countless thugs, gang leaders, and dastardly supervillains, Gotham is anything but safe; and this is not even to mention the means which Wayne adopts for fighting crime, which includes jumping off skyscrapers and careening in between all kinds of obstacles, supported by some mesh wings. He is doing battle with criminals who might kill him, in a way that might kill him. “Dangers of death occurring in battle” are the proper matter for fortitude, beyond lesser evils like bodily pain or the annoyance of standing in line at the DMV. (2)

It seems that Wayne might have gone to a vicious extreme in overcoming his own private chiroptophobia by becoming “half bat.” Yet there is really nothing to fear about bats in themselves, so to fear bats at all seems to be a case of timidity. This means that overcoming such a fear is a good thing to do. In facing his repressed traumatic experience of nearly dying in the well, which became so closely associated with the well’s bats, Wayne becoming Batman would only tend towards a vicious neurosis if his new bat-persona did not serve some purpose beyond itself. That is to say, if Wayne habitually dressed up like a bat in his own house and looked in the mirror, this would be disordered. Taking on the bat-persona for the sake of intimidating criminals, which is his primary motivation, is something else entirely.

Wayne does not become Flowerman or Butterflyman or Puppyman, he becomes Batman. Even if he had had traumatic experiences with flowers and butterflies and puppies, surely he would not want to deal with those memories in the same way. The idea of a vigilante qua bat (or alternatively qua spider) is simply terrifying, which is the point: it is an effective aid to fighting crime. This, however, does not necessarily make it prudent, as prudence means that justice and other virtues are not being violated. Here we will simply mention the possibility that vigilantism is unjustifiable in Gotham, given that there are good cops like Commissioner Gordon around. If Wayne had not considered this, or had not considered the physical risks involved, then the decision would be imprudent regardless of whether it is just. Becoming a vigilante virtuously requires serious counsel and an understanding of the principles of law. (3)

There are certain appearances of fearlessness and daring throughout the career of Batman, but one must wonder if this is merely a result of having mastered the fear of death during his time training in the mountains with the League of Shadows. On the contrary, Wayne goes to great lengths to protect himself, investing in the production and maintenance of extremely sophisticated protective devices, and this could exonerate him at least of fearlessness. Batman, supposing his project is just, certainly ought to fear death, not just for his own sake, seeing as life is a great good, but also for Gotham’s sake: “Death and whatever else can be inflicted by mortal man are not to be feared so that they make us forsake justice: but they are to be feared as hindering man in acts of virtue, either as regards himself, or as regards the progress he may cause in others.” (4) This is also part of why concealing his true identity is so important, for if it was widely known that Batman is Bruce Wayne, he would be easier to destroy.

As for magnanimity, Wayne already has great honors, insofar as honors accrue to a man of enormous wealth such as himself. Ironically, his public identity as a billionaire is a cover for what he really lives for privately, which is the accomplishment of great things like deposing crime bosses and deterring supervillains at great personal risk. He accepts the “unofficial honors” that come with such acts, but he does not care for them for their own sake, so he is not ambitious. He takes on the project to give the city of Gotham hope, which is where he refers the glory given to him as Batman. Therefore, Batman has a degree of magnanimity. (5) There is, however, an element of Wayne’s public life that is pusillanimous, as he purposefully distances himself from seeming great by being an arrogant, dishonest, quarrelsome womanizer. He could gain more honor publicly by being more virtuous, but he rightly fears that this could lead to the suspicion that he is Batman. Insofar as this component of concealing his nocturnal activities is vicious, it is neither magnanimous nor fortitudinous, as sins cannot be called acts of virtue.

The crime fighting skills of Wayne are second to none, and since he has ordered his life and vast wealth towards crime fighting without compromising his fortune or social status, he most certainly deserves to be ascribed the virtue of magnificence. For, “[It] belongs to magnificence not only to do something great, ‘doing’ (facere) being taken in the strict sense, but also to tend with the mind to the doing of great things.” (6) Since Wayne could do almost anything he wants on account of his wealth, the good use of which is the proper object of magnificence, his mind certainly tends with great force toward the accomplishment of masterful crime fighting. (7) Otherwise he would do whatever it is that other billionaires do.

To the question, whether Bruce Wayne’s choice to become Batman was an act of true fortitude, we answer in the affirmative, with two qualifications. The first is that the entire vigilante project is just, which is unclear. The second is that the artificial public persona taken on as part of the condition for the project, which can be assumed to have been part of the means from the start, is at least mildly vicious and therefore reduces the fortitudinous character of the choice.

(1) STh II-II q. 123 a. 5 ans.

(2) Ibid.

(3) Namely, gnome and epikeia would be required. See STh II-II q. 51 a. 4; q. 120 a. 1, a. 2

(4) STh II-II q. 126 a. 1 rep. 2

(5) That his voice is extraordinarily deep is not a sign of greater magnanimity, it is merely another component of his intimidation, as well as a way to conceal his public identity. Furthermore, that he does not walk slowly to accomplish his tasks does not imply a lack of magnanimity, as the particular kind of great things which he seeks to accomplish demand agility.

(6) STh II-II q. 134 a. 2 rep. 2

(7) STh II-II q. 134 a. 2

 

Post by: Eamonn Clark

Christian Rock and Rocky Soil

It used to baffle me. “How can so many of my peers who were so ‘churchy’ and ‘involved’ in high school have just drifted away in college?”

It doesn’t baffle me any more.

If you are a new DRE, youth minister, or high school chaplain in the USA, here’s a sobering reality check: the chances are that a lot of the kids volunteering on the weekend, helping lead retreats, signing up for work camp each year, etc., etc., will fall away when they leave high school. No, not all, and probably not most, but many. Some will eventually find their way back, maybe by a chance encounter with a priest, or a random itch of their conscience, or if and when they get married in the Church and decide it’s time to “get serious.” Some will find their way back, but not all.

Why does this happen, how does this illusion of commitment work, and what can be done to prevent this?

Despite the provocative title of this article, music is only part of the problem, though it is one of the best examples of the core conflict – trying to choose both God and mammon in parishes and ministry programs.

But let’s talk music first.

It is possible for rock music to be authentically Christian and still be good rock. But the Christian message must be indirect, or else there will be a lack of proportion between what is being said and how it is being said. Proportion is an essential element of beauty, and who wants music that isn’t beautiful to be used for worship?

Here is a comparison between two songs with similar themes handled in radically different ways.

This song is a first-person account of someone trying to overcome some life obstacles.

The lyrics are vaguely Christian, but it seems like even if they were more direct it would not help much – it would still be inappropriate for worship, because it is taking a music genre entirely from and for the world and trying to Christify it explicitly. That is why it’s so awkward, at least for me, even just to listen to.

Furthermore, the music itself in this example is just plain second-rate. The message itself also is very self-centered, which would be one thing if it wasn’t marketed as “Christian” and there wasn’t the almost artificial insertion of a mini-prayer in the lyrics, “God, I want to dream again.” I’ve never heard this at church, but I don’t frequent Protestant megachurches. I can certainly imagine it being used.

The next song is about a couple of kids whose lives are going terribly wrong, starting with one who gets shot on his way to school.

This is good rock music. It’s also profoundly moving, albeit in an unexpected way. Nobody would play this at a church, and rightly so, but I argue that this is a much better example of “Christian Rock” than the first song, not only because it is better musically but also because it knows what it is: the artists don’t try to insert the explicitly other-worldly into a worldly genre, apart from a one-off Scriptural reference (“the blind leading the blind”). Instead, they vividly illustrate real world problems and the emotions associated with them. This leads the listener to the simple consideration of the bleakness of sin and the need for something dramatically good to counter young people’s hopelessness. Finally, they suggest that the solution is at least in part our responsibility: “We are, we are, the youth of the nation.” That’s about a thousand times more Christian and artistic than the previous song. (The band, P.O.D., is loosely self-described as Christian, by the way.)

Anyway, as an alternative to Christian Rock at church, we have masterpieces like this available to us:

It’s very hard to pull off something like this well – and it really MUST be recited live – but that is part of what makes it worth so much as an act of worship. It involves serious dedication. Sacrificial worship doesn’t only mean killing goats, of course: it can also mean slaving away for a few dozen hours just to produce one beautiful arrangement for a single Mass. God likes that.

“But I like the churchy Christian Rock. So do lots of other people. In fact, a lot of the people at my church come because we play that kind of music.”

Now we come to the root.

If it were a simple matter of aesthetics, one taste does not rule over other tastes. Chocolate is not inherently better than vanilla, etc. Except we are not talking about ice cream, we are talking about the public worship of Almighty God and spiritually encountering Him in that worship (which is distinct from emotionally encountering Him). There is an objectivity to music and worship, which is why the objection that “classical” music is just the “rock” or “pop” of the 17th century (etc.) does not work. Certain kinds of music do not appropriately resonate with our soul inasmuch as it is ordered toward loving and encountering the otherworldly. As the famous saying goes, “Lex orandi, lex credendi” – as one worships, so one believes. If someone heard a “Christian song” without knowing the language in which it’s being sung, and he thinks it’s probably about some guy’s girlfriend, for example, there is a big problem. If God, as the Author of Grace, is going to be treated directly, He deserves something more than what your girlfriend deserves, as nice as she may be. And the more one treats God like a girlfriend in worship, the more one is likely to think of God that way. It’s just how human beings work. When your girlfriend gets boring or too challenging, you can leave her for someone else. When God or the Mass or the one true Church is treated like a girlfriend in worship, when they get boring or too challenging, they are all too likely to be left for something else. And the more one tries to dress them up like some other “girl,” the more one will realize that it would be easier just to go after that girl instead. We can’t make God in our image, and when we figure that out, the choice is forced upon us: we either destroy our little idol and worship God on His own terms, or we go seek the thing that we were trying to make Him into.

The trumpets that will blare at Our Lord’s return will be playing music closer to Mozart than to Meatloaf, and not for no reason. If I don’t like the Parousia’s music – or even Heaven’s music – will it be because God doesn’t know what’s “relevant,” or will it be because He knows there’s something more objective about transcendence than my fleeting emotional inclinations?

Liking secular-ish Christian-ish music and feeling good about God on its account is not wrong in itself.

Go on ahead! Feelings and emotions are NOT evil. But they are only GOOD if they are in line with reason.

What is wrong is when those things are at the foundation of one’s spiritual life, instead of the imperceptible indwelling of the Holy Spirit and sanctifying grace expressing themselves in the exercise of moral virtue and frequent prayer (even continuous prayer, to the point where instead of talking to yourself to think through the mundane tedium of your daily life, you talk to God). If and when well-performed secular-ish Christian-ish music and/or nice feelings about God become inaccessible for some reason, a person who had seemed to grow up in the spiritual life so quickly is liable to become “withered by the sun and die,” so to speak, just like the seed sown in rocky soil (Mt. 13: 1-23). Such a person will eventually notice that the world (or even some other church) gives quicker and easier nice feelings, and that continuing to pray and go to Mass diligently is really hard when faced with that alternative. And why resist? “If spirituality is all about the feels anyway, when I get them, great, when I don’t get them, then I just won’t kill anyone or rob any banks, and I’ll go to Heaven, or something like that. But maybe the whole ‘organized religion thing’ is all just a psychological prison anyway, and a nondescript ‘spirituality’ is where it’s at.” And down the slope we go. People don’t usually think or express their desires in exactly these terms, but they often act based exactly on the ideas found in them.

If you live in the Western world, this process is almost certainly happening with people in your parish, especially to millennials. The problem, of course, is not limited to music – the approach of condescending indefinitely to worldliness can permeate the air of entire parishes. Let pastors who are looking to “Rebuild” be aware of the lesson of Aaron and the calf… Money and popularity do not make a parish a spiritual success. Your sanctuary may be tricked out with the latest live streaming gear and some nifty projector screens, and your band may make a 6 figure salary due to generous tithing, but if there’s not perpetual or nearly perpetual adoration; if there aren’t vocations; if there aren’t long lines at the confessional; if people are not praying before and after Mass in silence… these deserve more attention.

The Protestant megachurches and the world will always win the game anyway. They produce better, flashier, trendier stuff, including morals and doctrine. They produce better rock music. They condescend to our worldliness better. Therefore, the game ought not to be played. Our Lord did not play the game, though He was invited to by the Devil (Mt. 4:1-11).

Christ condescended to our worldliness by becoming a human being. Beyond that, He used language and images we could understand. He identified with us in our need for food and drink, as with the woman at the well, or with the Eucharist itself. He pointed out the way to perfection to the Rich Young Man and to those wondering about divorce by meeting them where they were, and yet He did not insist on poverty or celibacy as Commandments. All this condescension, however, actually serves the will of the Father by calling people to look beyond the world. Christian Rock, as commonly understood, does not do this, but instead lowers God more than He lowered Himself by putting Him into a worldly genre of music which can certainly make people feel nice feelings but cannot lead one to contemplation as it is understood by the spiritual masters. (In fact, prolonged silence is one of the best things for that.) And of course, some other parochial and ministerial projects fall into the same trap. We must not be in the business of making good novices: we must be in the business of making saints.

The longer one pretends he can find God in the storm, the earthquake, and the fire, the more likely he is to miss the small whispering sound that calls a soul out of the cave. God showed His might on Sinai with signs of His fearsome power, but now, in the invisible life of grace, the signs of His love manifestly prevail – and lovers very often want to be alone together in silence, do they not?

Education in the spiritual life must become a greater priority in parishes, especially in youth ministry programs, if we are to stop the bleeding of parishioners looking “to be fed” somewhere else – back in Egypt, that is, where there were melons and leeks and fleshpots. We especially ought to curb our young people’s enthusiasm for getting chills and thrills on retreats – and certainly for “speaking in tongues” and being “slain in the Spirit,” for goodness’ sake – and instead teach them that the greater effects of prayer and the sacraments are an undying thirst to do what is right out of love for God, and the pursuit of union with Him at the expense of any and all other pleasures. Growth may seem slower, but it will be steadier.

Better, more subdued, more dignified music is just one part of the solution. Christ our Rock is more spiritual than worldly, after all.

Post by: Eamonn Clark

Main image: The Sower, Vincent Van Gogh, 1888