CRISPR: The Eugenitopia is Here

Have you heard of CRISPR? No, it’s not a breakfast cereal… It’s a fast, accurate, and cheap means of changing DNA. It stands for Clustered Regularly Interspaced Short Palindromic Repeats.

If you haven’t heard of it, then you need to: this is a HUGE deal.

There is a protein in bacteria called Cas9 which helps defend against viral attacks. (Yes, bacteria get viruses, too!) A bacterium can take a piece of the viral DNA and store it in a special place in its own genome (the CRISPR region). When the virus attacks a second time, RNA copied from that stored sequence loads Cas9 with the viral DNA. The loaded Cas9 scans the DNA in the cell to find the matching viral sequence that has infiltrated it, and it cuts it out – a little like a DNA antibody. Personified, that means the cell says, “I’ve seen this sequence before, it’s an invader, I need to replace it with the original information.” Then it sends the Cas9 protein off with the bad DNA sequence to modify, and off we go.

The Cas9 protein can be taken out of bacteria, be given a DNA sequence from any kind of living thing, be injected into any other living thing, and it will make changes in that organism based on the information it was “programmed” with.

Got that? You give Cas9 a DNA sequence you want to modify, inject it into an organism, and it will make the changes. It is fast, it is accurate, and it is CHEAP.

Okay, this is a bit of an oversimplification. There’s more to it, and no you can’t just walk into the right lab and get a shot that will make you grow wings… yet.

There are obvious benefits to this kind of procedure. CRISPR might provide us with a cure for cancer, AIDS, any number of genetic diseases, and could help us generally keep healthy (like by increasing our metabolism or improving our eyesight). Once it is really nailed down, it is very likely that a couple of $12 shots at the minute clinic will be able to get rid of your asthma, or Alzheimer’s, or cerebral palsy… forever.

But… With great power comes great responsibility.

Thanks, Uncle Ben. Wait a minute – was Spider-Man CRISPR’d?

Unfortunately, the 21st century West is not very responsible. Where might CRISPR go wrong?

Well, what color eyes would you like your child to have? Should we bump up his IQ while we’re at it? Hey, you’re an athlete, maybe we can give him long legs and enhanced muscular growth as well, so he will be sure to be athletic too. Just an extra $300, please. Oh, you’d like him to have Shiva arms and a third eye, because you’re into that kind of thing right now? You’ll have to go down the hall for that.

Anti-aging cream? Psh. Take the right injection, and your body will actually start DE-AGING. As long as you don’t get hit by a truck or something, you’re good to go for another hundred years… a thousand years… indefinitely, perhaps. Or at least we will try.

Let’s say you’re running a poor nation and, well, need things to go more “smoothly.” So you put something in the water to make all your citizens have a defect that only you can provide the fix for. And you will only provide it to a person if his taxes are paid on time, he doesn’t have too many children, and he votes for you again. (This could be done now, but not with nearly the same ease and dramatic effects.) Meanwhile, you are pumping your soldiers and police full of testosterone 2.0…

It’s only cool if he’s on your side. You might look like an alien once the Great Leader poisons you.

And once such genetically modified people reproduce (whether they have been helped to be healthy or have been “upgraded” or “downgraded” somehow), those screwed up genes get passed along. At that point, there’s no stopping it. And we have no idea what that will actually mean.

Here’s a helpful video for understanding more:

This technology is developing very quickly. The Church needs to get ready with a response, ASAP. Where is the line for modification, and why? If life is a good thing and death is to be avoided, is anti-aging wrong? What is to be done in terms of people who have already changed themselves by addition – how far does the obligation extend to have such a thing undone? Is this technology really worth the risk of irreversible changes to the gene pool which we don’t even know the danger of? Could there be an obligation to use this technology to prevent certain kinds of diseases? These are the kinds of questions we have to begin asking.

Get ready. It’s coming. And once it comes, it is here to stay.

Post by: Eamonn Clark

Main image: Cas9 in the Apo form

Main image source (modified): Ben.lafrance, own rendition of the crystal structure solved by M. Jinek et al., published in Science 2014, CC BY-SA 4.0, https://commons.wikimedia.org/w/index.php?curid=37224108

The New Albigensianism, PART II: Comte and the Combox

For Part I, click here.

Just as the woman with the hemorrhage reached out to touch the hem of Jesus’ tunic, so do post-modern secular Westerners reach out to touch the hem of scientists’ lab coats. Despite the plain fact that any given scientist or doctor or other “expert” will tend to be specialized in only some tiny sliver of his or her field, hopeless intellectual wanderers will gather at the feet of these people to learn all the mysteries of the universe… which is dumb. How did this happen?

Let’s take a step back.

The manifesto of the post-modern Westerner par excellence is this: “Real knowledge is only of irreducible information about the material world, and I can manipulate that same material world however I want in order to express myself and fulfill my desires.”

Herein we see two strands of thought colliding, one about the mind and one about the will: positivism and existentialism. Historically, they are not friends. How they have become fused together in post-modernity is a strange tale.

Today we will break open the first clause – real knowledge is only of irreducible information about the material world, the positivist element.

From the outset, we must make a distinction between “positivism,” which is an epistemic and social theory, and “logical positivism,” which is something more metaphysically aimed. My goal here is to show the roots of the broader idea of positivism, how it found its academic zenith in logical positivism, then how the aftermath of its fall has affected Western philosophy and science at large as well as in the minds of millennials.

A brief sketch of the positivist genealogy will suffice. We recall Descartes to point out his obsession with certitude, just as we note the empiricist thrust of Bacon, Locke, and Hume. We must mention Kant, both as the originator of the analytic-synthetic distinction (which will become enormously important) and as an influence on Hegel, who is notable for his approach to philosophy as something integral with history. Condorcet and Diderot should be pointed out as influential, being the greatest embodiments of the French Enlightenment, wherein reason and revealed religion are opposing forces. Marx, though he would reject positivism as a social ideology, helped inspire it along the same lines as Hegel had. The penultimate step was Henri de Saint-Simon, whose utopian socialism was all the rage during the French Revolution, which was attempting to put his political theory into political practice.

Of course, these men were not positivists. It is Henri de Saint-Simon’s pupil, Auguste Comte, who brings us this unwanted gift of an empiricism so strong it entirely and unabashedly rejects any and all metaphysical knowledge outright. This led Comte to build a reducible hierarchy of the sciences based on their certainty or “positivity,” and he claimed (rightly) that the trend of empirical studies was heading toward a “social science.” This conception of a reducible scientific hierarchy – one where, for instance, biology can be put in terms of chemistry, and chemistry in terms of physics, etc. – was a rather new way of thinking… Previously, it had been more or less taken for granted that each science has its own irreducible terms and methods, even admitting some kind of hierarchy (such as with the classical progression of the liberal arts).

Not only was Comte the first real philosopher of science, he was also the first sociologist. According to Comte, humanity was passing from its first two stages, the theological and the metaphysical, into the third and final “positivist stage” where only empirical data would ground truth-claims about the world. Having evolved to a higher clarity about what the world is, and having built up enough of the more basic physical sciences to explore how that world works, sociology could finally occur. Mathematical evaluation of social behavior, rather than qualitative analysis, would serve as the proper method of the “queen of the sciences” in this new age.

Comte outright jettisoned religion qua supernatural and revelatory, but his intensely Catholic upbringing had driven into him such a habit of ritual that he could not altogether shake the need for some kind of piety. What was a French Revolution atheist to do? Well, start a “religion of humanity,” of course. (The “positivist religion” never became a major force, especially since Freemasonry already filled the “secular religion gap,” but it did catch on in some areas. Take a closer look at the Brazilian flag and its meaning, for example…) We should also note, for the record, that Comte was only intermittently sane.

The epistemic side of positivism almost ended up just as much of a flop as the pseudo-religion side of it. Unfortunately for the West, Durkheim and Littré became interested, and they, being altogether sane, effectively diffused Comte’s ideas and their own additions through the West at the start of the 20th century. Eventually, a group of like-minded academics started habitually gathering at a swanky café in Austria to discuss how filthy and naïve metaphysics was compared to the glories of the pure use of the senses and simple mathematical reason – the Vienna Circle was born.

Together with some Berliners, these characters formulated what came to be known as logical positivism. When the shadow of Nazism was cast over Germany, some of these men journeyed westward to England and America, where their ideas were diffused.

The champions of logical positivism were Hans Hahn, Otto Neurath, Moritz Schlick, Rudolf Carnap, A.J. Ayer, and Bertrand Russell. While Russell is no doubt familiar to some readers (think “tea pot”), the others fly lower under the radar. It is Ayer’s formulation of the logical positivist doctrine which we will use, however, for our analysis.

“We say that a statement is factually significant to any given person, if, and only if, he knows how to verify the proposition which it purports to express – that is, if he knows what observations would lead him, under certain conditions, to accept the proposition as being true, or reject it as being false.” (Language, Truth, and Logic, 35)

Got that? What this means, in the context of the whole book, is that in addition to statements which are “analytic” (“all bachelors are unmarried”) being true necessarily, only statements which we can actually use our 5 senses to verify the truth of can be meaningful – that is, able to be true at all. These are “synthetic” statements. If I say that Pluto is made of bacon grease, I am making a meaningful statement, even though I cannot actually verify it; it suffices that it is hypothetically possible to verify it. If I say that the intellect is a power of the soul, this is not meaningful, since it cannot be verified with the senses. For the details, see Ayer’s book, which is rather short.

Needless to say, it is rare that a school of thought truly dies in academia. A thorough search of university philosophy departments in the Western world would yield a few die-hard fans of Plotinus, Al-Ghazali, Maimonides, and maybe even Heraclitus. Perhaps the best or even only example of ideological death is logical positivism. W.V. Quine’s landmark paper “Two Dogmas of Empiricism” was such a blow to the doctrine that Ayer eventually admitted himself to be in massive error and repudiated his own work.

What was so blindingly erroneous about logical positivism?

First, the analytic-synthetic distinction, as formulated by the logical positivists, is groundless. Analytic statements supposedly don’t need real referents in order to be true; they are instead simply about the meanings of words. For some kinds of statements which employ basic affirmation and negation, this might work, as it is simply a dressing up of the principle of non-contradiction. Fine. But if one wants to start using synonyms to take the place of some of the parts of these statements, the distinction begins to disappear… The relationship between the synonym’s object and the original word’s object cannot be explained without a reference to real things (synthetic!), or without an ultimately circular appeal to the analyticity of the new statement – a claim of the universal extension of the synonym based on modal adverbial qualifications (like the word “necessarily,” which points to an essential characteristic that must either be made up or actually encountered in reality and appropriated by a synthesis). In other words, it is analytic “just because.” (Thus, the title of Quine’s paper: “Two Dogmas of Empiricism.” Read more here.)

Beyond that, logical positivism is a self-refuting theory on its face… If meaningful statements can only be about physically verifiable things, then that statement itself is meaningless: it is not analytic (or is arbitrary if it is, and we go back to the first problem), and it cannot be verified with the senses, so it is not synthetic… How does one verify “meaningfulness” with the senses? Logical positivism is a metaphysical theory that metaphysics is meaningless. Once again, this can only be asserted, not discovered – except that this dogma evidently claims itself to be meaningless.

But the cat was out of the bag: “Metaphysics has completely died at last.” Logical positivism had already made its way from the salons of Austria to the parlors of America and the lecture halls of Great Britain. Fuel was poured on the fire that Bertrand Russell and G. E. Moore had started in England when they decided to reject the British Idealism that dominated the scene by creating an “analytic” philosophy that didn’t deal with all those Hegelian vanities that couldn’t be touched with a stick or put in a beaker. Russell’s star pupil, Ludwig Wittgenstein, would also come to be a seminal force in strengthening the analytic ethos, after having already inspired much of the discussion in the Vienna Circle. Though Quine did indeed destroy the metaphysical doctrine that metaphysics is meaningless, the force of positivism continued nonetheless within this “analytic” framework – and it is with us to this day en masse in university philosophy departments, which has led several generations of students to miss out on a solid education in classical metaphysics and philosophical anthropology.

In sociology there arose the “antipositivism” of Max Weber, which insisted on the need for value-based sociology – after all, how can a society really be understood apart from its own values, and how can a society be demarcated at all without reference to those values, etc.? A liquid does not assign a value to turning into a gas, which it then acts upon, but a group does assign a value to capitalism, or marriage, or birth status which it then acts upon.

In the broader realm of the philosophy of science, the postpositivism of Karl Popper and Thomas Kuhn came to the fore. On this view, science in general cannot be explained without regard for some kind of value; rather, the possibility and/or actualization of the falsification or failure of a scientific theory is the characteristic feature of the sciences – in contrast to the optimism of the positivists that we can “just do science,” and that that will be useful enough.

In “science” itself, an air of independence was diffused. Scientists do “science,” other people do other things, and that’s that; never mind that we have no idea how to define “science” as we understand it today, and never mind that values are always brought to bear in scientific evaluation, and never mind what might actually be done with what potentially dangerous knowledge is gained or tool developed. A far cry from the polymaths, such as St. Albert the Great or Aristotle, who never would have considered such independence.

Then there are the “pop scientists” who try to do philosophy. A few examples of many will have to suffice to show three traits common among the pop scientists who are the go-to sources on religion and philosophy for countless curious millennials and Gen-Xers alike.

The first is an epistemic myopia, which derives immediately from positivism: if you can’t poke it or put it in a beaker, it’s not real. (Yes, it is a little more complicated than that, but you’ve read the section above describing positivism, right? Empirical verification is the only criterion and process for knowledge… Etc.) This is often manifested by a lack of awareness that “continental philosophy” (as opposed to analytic philosophy) often works in totally immaterial terms, like act, or mind, or cause, or God. This immediately creates equivocation – a pop scientist says “act” and thinks “doing something,” for example.

The second is an ignorance of basic philosophical principles and methods, which follows from the first characteristic. If you don’t know how to boil water, don’t go on “Hell’s Kitchen” – everyone will laugh at you and wonder what you are doing there in the first place. We might do well to have a philosophical version of Gordon Ramsay roaming about.

The third is the arrogance to pontificate on philosophy and theology nonetheless, and this of course follows from the second characteristic. They don’t know what they don’t know, but they got a book deal, so they will act like they are experts.

Everyone knows Dr. Stephen Hawking. (They made a movie!) But did you know that the average 6-year-old could debunk the central claim of his most recent book? It is now an infamous passage:

“Because there is a law such as gravity, the universe can and will create itself from nothing.” (From The Grand Design)

I can hear the 1st graders calling out now: “But gravity’s not nothing!” And they would be right. The myopia of Dr. Hawking (and Dr. Mlodinow, his co-author) is evident in the inability to grasp that, as Gerald Schroeder pointed out, an immaterial law outside of time that can create the universe sounds a lot like, well, God. The ignorance of basic philosophical principles, in this case, the most basic, is clear from realizing that “gravity” can’t be both SOMETHING AND NOTHING. Then, the arrogance to go on pontificating anyway is self-evident by the fact of the existence of the book, and then a TV series which aired shortly afterward wherein we find philosophical reflection which is similarly wanting.

If you really want to do a heavy penance, watch this “discussion” between Hawking, Mlodinow, Deepak Chopra, and poor Fr. Spitzer – I had the displeasure of watching it live several years ago:

Then there are folks like Dr. Michio Kaku. He regularly shows up on those Discovery Channel specials on string theory, quantum mechanics, future technology, yadda yadda. All well and good. But here’s an… interesting quotation for our consideration:

“Aquinas began the cosmological proof by postulating that God was the First Mover and First Maker. He artfully dodged the question of ‘who made God’ by simply asserting that the question made no sense. God had no maker because he was the First. Period. The cosmological proof states that everything that moves must have had something push it, which in turn must have had something push it, and so on. But what started the first push? . . . The flaw in the cosmological proof, for example, is that the conservation of mass and energy is sufficient to explain motion without appealing to a First Mover. For example, gas molecules may bounce against the walls of a container without requiring anyone or anything to get them moving. In principle, these molecules can move forever, requiring no beginning or end. Thus there is no necessity for a First or a Last Mover as long as mass and energy are conserved.” (Hyperspace, 193-195)

The misunderstandings here are as comical as they are numerous… The conflation, found explicitly in the full text, of the first 3 Ways as “the cosmological proof,” which obscures the issue; the belief that “motion” is a term about something necessarily temporal; the thought that only recently did we discover that matter and energy don’t just appear and disappear; and then the most obvious blunder – Thomas does NOT start any of the 5 Ways by saying anything like “God is the First Mover, therefore…” There is no such ungrounded assertion which “dodges the question,” as Kaku puts it. One must wonder if he even bothered to read the original text – which is readily available. On top of this disastrous critique, Kaku has even weaker arguments (unbelievably) against both the “moral proof” (a characterization of the 4th Way I had never encountered until Kaku’s book, which troubles me from the start) and the teleological proof, but I won’t bore you. (Basically: “Because change and evolution.” Read it for yourself.)

Once again, we see three qualities: epistemic myopia (as evidenced, for example, by the error about “motion”), ignorance of the most basic philosophical principles (albeit these are a little more complicated than the one Hawking whiffed on), and the arrogance to pontificate about God and the act of creation nonetheless.

Next you have a man like Richard Dawkins, one of the nastiest examples of publicly evangelical atheism the world has to offer at present. Here’s one particularly embarrassing quotation from his seminal anti-theistic work, The God Delusion:

“However statistically improbable the entity you seek to explain by invoking a designer, the designer himself has got to be at least as improbable.” (p. 138)


Can you see the three characteristics? Material beings only (or at least “things” with “parts”), no idea what metaphysical simplicity is and how it relates to God in Western philosophy, and yet here we have one book of many which address this theme.

It is not that these folks don’t believe in classical metaphysics – it’s that they don’t understand it in the least. They play a game of solitaire and claim to be winning a game of poker.

We won’t even get into discussing Bill Nye the Eugenics Guy… for now.

Okay, yes, quote-mining is easy. But this is the cream of the crop from a very large and fertile field. I am not sure I recall ever reading an important and sensible argument about religion or metaphysics from a world-renowned scientist who lived in the past 50 or so years. Someone prove me wrong in the comments.

All this leads us to the average “scientism” which one finds in the comboxes of YouTube videos about religion, threads on various websites, and debates on social media. Yes, there are plenty of religious people in those arenas, but the skeptics who make wild claims like “science disproves religion” or “evolution means God does not exist,” or who just dismiss the idea of revealed religion outright with some kind of mockery, ought to be seen as the children of positivism. It is the most probable explanation – the sources of their myopia, ignorance, and arrogance can usually be traced back through intermediate steps to a talking head like Dawkins, who ultimately owes his own irrational ramblings to Auguste Comte.

Why is post-modern positivism so naïve? At the combox level, it is because these people, as all others, have an instinctive drive to trust in someone beyond themselves. For many it is due to circumstance and perhaps a certain kind of emotional insecurity and intellectual laziness that they latch on to the confident scientistic loudmouths to formulate their worldview – and it becomes a pseudo-religious dogmatic cult of its own, a little like Comte’s “religion of humanity.” At the pop-science level, it is just plain laziness and/or intellectual dishonesty combined with arrogance, as we have investigated. At the lecture hall level – and I mainly speak of the general closed-mindedness towards classical metaphysics found in analytic circles – it is a deeper kind of blindness which is the result of the academic culture created by the aforementioned ideological lineage. Each level has its own share of responsibility which it is shirking.

The truth is that matter is known by something immaterial – a mind or person – and this reveals to us a certain kind of hierarchy and order, seeing as matter itself does not know us. Man is indeed over all matter and ought to control it and master it, and all without the consent of matter; but this does not mean that there can’t be knowledge of things nobler and/or simpler than man, like substance or causation or God. Not looking at matter as the product of non-matter, and as being ordered to the immaterial in a certain way, is part and parcel of the New Albigensianism.

So there we have the first part of the manifesto explained. Irreducible facts (the ones devoid of metaphysics and value judgments) about the material world constitute the only real knowledge. The less reducible, the less it is really known. Even though the West is full of supposed “relativists,” it would be difficult to find a person who would truly let go of the objectivity of “science.” To say, “Christianity is your truth but not mine” is one thing; it is quite another to say something like, “Geocentrism is your truth but not mine.”

There is yet more to be explored… Next time, we will dive into the second half of the “postmodernist manifesto” with a look at its existentialist roots and how misconceptions about the relationship of the self to one’s bodily life have led to transgender bathroom bills.

Post by: Eamonn Clark

Main image: The Positivist Temple in Porto Alegre, Brazil

Source: Tetraktys – User:Tetraktys, CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=3295600

The Real Reason People Like 13 Reasons Why

There have been plenty of reasonable critiques of the new hit Netflix show, 13 Reasons Why, which follows the story of a community dealing with a young girl’s suicide and the creative “notes” she left behind. Bad acting, bad writing, the “role models” are extraordinarily clueless, suicide is romanticized, etc. Okay… then why is it so popular?

Take a look at the trailer (language warning):

The most powerful moment in the trailer, at least for me, is the revelation that the tapes are from Hannah, at 37 seconds… The following 20 seconds build on this force.

I suggest that the reason people are so intrigued by the show is this: it presents a concrete, realistic example of someone speaking from beyond the grave. Through her tapes, the Hannah Baker character presents a benign version of otherworldly communication, and people find this attractive. We human beings have a deep-seated need to go beyond this world and encounter something greater than ourselves. By committing suicide and leaving voice recordings of herself, Hannah half-accomplishes this – she is half-encountered, and she is half-greater, as she has become “ubiquitous” and commands enormous attention, but… spoiler alert… she’s dead. At any rate, people’s sense of the otherworldly is “turned on” by the show, and since many are not activating that sense adequately through religion, they watch this show to compensate. (This goes hand in hand with Hollywood’s obsession with exorcisms and the occult – a topic which merits its own post.) Hannah takes the place of God, Who, by the way, does not seem to find His way into the screenplay.

The problem is just that. Being convicted by the accusation of a dead girl through an audio tape is painful, important, and final, but she did not necessarily get everything correct (as the show explores at length), nor is anyone’s life truly measured by her judgment. Furthermore, there can be no reconciliation with her… it’s over.

On the other hand, being convicted by an accusation of the living God through Scripture or preaching or conscience is quite different. Because God does not make mistakes, and because He does indeed provide the true measure of our life, His accusations, if seen rightly, are more painful, important, and final. It is no use arguing or rationalizing – we must reconcile, which thankfully we can do. It is even more powerful to find oneself being accused by God due to the fact that He is not just looking to prove a point, or to get some kind of attention, or to show that He’s really upset and can’t take it anymore… He convicts us of sin because He loves us, and reconciling with Him and amending our lives to be in accord with His Will are the best things for us.

Not so with Ms. Baker.

The characters in the show indirectly contributed to the death of Hannah, but she is clearly the one who is actually responsible for taking her life… Christ, however, was really put to death by others; and we ourselves are indirectly responsible for His death, at least insofar as we are sinners standing in need of that death, which He chose for our sake. So each and every one of us is one of His “reasons why.” He speaks to us now, but unlike Hannah Baker, He is alive and is waiting for us to speak back. And once you realize that, it is much more powerful than a suicide note could ever be.


Post by: Eamonn Clark

Main image: thumbnail from Netflix’s trailer for its show, 13 Reasons Why

Motherhood and Human Maturity

(Part I in a series on motherhood and fatherhood)

So much of who we are comes from our mothers. We are who we are in relation to others – and the first relationship we had was being nestled nine months in our mother’s womb.

“Male and female He created them” – it is fitting that with these words our first parents are introduced, since our first experience of gender, our first experience of male and female, comes – not from our analysis of gender roles in society – but really and concretely, from our mother and our father.

“God created man in his own image, in the image of God He created him; male and female he created them.” Because we are male, because we are female, we are in the image of God. We are not made in the image of God as mere androgynous souls with consciousness; rather, we are embodied in our masculinity and our femininity. Our lives are circumscribed between motherhood and fatherhood – none of us comes into this world without a natural father, none of us comes into this world without a natural mother.

In a time hidden from our memories, that initial relationship with our mothers forms us at the core of who we are. No person has ever grown to maturity without first passing through their mother’s body. Try as it might, technology has yet to eclipse biology.

(If you want to be overwhelmed with all the particulars of gestational biology, check out this video.)


From the first moment of our existence in the womb of our mother, we are surrounded by her, enveloped in her body. Her body supplies for every one of our needs. As our cells divide and develop, our blood takes nourishment and oxygen from her blood; there is an exchange of life. By the time a mother is aware she is with child, her maternal body has known this already for weeks. Before she feels the budding movements of the child’s limbs, she is already being moved by the child – morning sickness, new diet, the maternal nesting instinct to tackle stale projects. But more than that, her whole life receives a new trajectory; she holds a person within her – two souls in one body.

I recall an experience of a friend of mine when his wife was pregnant with their first child. He came back from work one day to find his pregnant wife lying on her bed with her hands over her womb, filled with wonder. She explained to her husband that she felt her baby move for the first time and was overwhelmed with the realization of her motherhood, explaining to her husband, “I am not alone in my own body.”

A mother, after having her first child, will often remark that had she known beforehand how much of herself motherhood would demand, she would not have thought herself capable of giving so much. Motherhood is an experience that requires all of her. It is a self-emptying love that cares fiercely and intimately for her child.

Maternity, femininity, female-ness – this is our first experience of gender; it is our first experience of life. We are born into – conceived into – this relationship with our mother. It is most natural to us. It is the strongest and longest lasting of human bonds. It is a natural communion. For the rest of their lives, the mother and child will retain something of that intimacy where they were truly two souls in one body.

Beginning from this indescribable intimacy, the child goes through a development. Birth requires a leaving behind of the original closeness of the mother. The dependence of the child on the mother continues – nourishment, locomotion, comfort, bathroom issues – but slowly begins to wane. When the child learns to crawl, a mother is pained to see his reliance on her lessened. When the child takes his first steps, every step is a step away from the mother. Motherhood is tinged with sadness. Watching her child grow apart from her requires all of that self-emptying love.

In my own mother, I’ve seen this self-emptying love every time a sibling leaves my parents’ house to depart for college – fourteen times (I have a big family) one of her children has left home, and fourteen times she has cried.

A mother’s vocation begins in intimacy, and ends in separation.

A mother’s love makes room for the child to grow. All human life takes as its origin the intimacy of motherhood. Fatherhood completes the picture.

***

We see this reality of maternal separation lived out most radically in the life of our Blessed Mother. Jesus shared a hidden intimacy with Mary for nine months. At His birth, the shepherds find Him, not wrapped in the arms of His immaculate mother, but wrapped in swaddling clothes and laid in a manger – apart from her. When He is twelve, after being lost for three days in the Temple, He tells her, “Did you not know that I must be in my Father’s house?” (Luke 2:49) At Cana, He begins His public ministry with what looks like a rebuke: “Woman, what have you to do with me?” (John 2:4) Once, while Jesus is close by, Mary tries to get through the crowd to see her Son, and He says, “Who are my mother and my brethren? Here are my mother and my brethren! Whoever does the will of God is my brother, and sister, and mother.” (Mark 3:33-35) Even at the foot of the cross, when she is with Him again, He gives her away, saying to her, “Woman, behold, your son” and to St. John, “Behold, your mother.” (John 19:26-27) And then He undergoes the ultimate separation, giving up His spirit and dying on the Cross.

Here, we let Blessed John Henry Newman take over, with his reflection on the Thirteenth Station of the Cross:

He is Thy property now, O Virgin Mother, once again, for He and the world have met and parted. He went out from Thee to do His Father’s work – and He has done and suffered it. Satan and bad men have now no longer any claim upon Him – too long has He been in their arms. Satan took Him up aloft to the high mountain; evil men lifted Him up upon the Cross. He has not been in Thy arms, O Mother of God, since He was a child – but now thou hast a claim upon Him, when the world has done its worst. For thou art the all-favoured, all-blessed, all-gracious Mother of the Highest. We rejoice in this great mystery. He has been hidden in thy womb, He has lain in thy bosom, He has been suckled at thy breasts, He has been carried in thy arms – and now that He is dead, He is placed upon thy lap.

Virgin Mother of God, pray for us.

 

Main image: “Virgin of the Angels,” William Adolphe Bouguereau, 1881
Post by: Deacon Peter Gruber

The New Albigensianism, PART I: From Scotus to S.C.O.T.U.S.

For the most part, religious errors are reducible to four basic ideas.

  1. Jesus is not by nature both fully God and fully human (Arianism, Eutychianism, Monothelitism, Agnoetism, Mormonism, etc.)
  2. There are not three Persons in One God (Modalism, Unitarianism, Subordinationism, Partialism, etc.)
  3. Sanctifying grace is not a free and universally available gift absolutely necessary for salvation (Pelagianism, Semi-Pelagianism, Baianism, Jansenism, Calvinism, etc.)
  4. Matter is not essentially harmoniously ordered with spirit (Manichaeism, Buddhism, Albigensianism, etc.)

While the first three ideas are certainly prevalent in our own day, the correct doctrines are only available through the grace of faith. The falsehood of the fourth, however, is evident from a rigorous use of natural reason alone. Therefore, it is more blameworthy to succumb to that error.

We are seeing today the resurgence of the fourth error in four ways: the sexual revolution, radical feminism, the culture of death, and most recently, gender theory.

The three forms mentioned in the fourth item above (Manichaeism, Buddhism, and Albigensianism) more or less say that matter is evil and needs to be done away with. The Manichees thought that matter was created by an evil god, the Buddhists think that matter is only a distraction, and the Albigensians (or “Cathars”) grew so enamored with the thought of the spirit escaping its fleshy prison that suicide became a virtue… But we will talk all about the Cathars later, and we will find some striking similarities between this medieval rigorist dualism and some of the most recent value developments in the Western world.

The current manifestations of the fourth error do not quite say “matter is evil,” but they instead say that the determination of human matter (the body) is irrelevant to the good of the spirit, and/or that the spirit is one’s “true self” which can be served by the body according to one’s whims. Some proponents may claim they don’t believe in spirit, that is, immaterial reality (in this case, the “soul,” or formal principle of life), but when they speak of someone being “a woman trapped in a man’s body,” or something similar, they betray their real thoughts. Still, even if a person insists on denying the reality of spirit, it remains the spirit within him who denies it. There can be no “self-determination” without a self to determine, and if the body simply is the self, then how can there be real determination? There could then only be physical events without any meaning. This, of course, is contradicted by the very existence of “experience.” It is not merely a body which acts, but a person who experiences.

The error in its current expressions can be traced to Descartes, whose laudable project of attaining perfect certainty about the world was, ultimately, a disastrous failure. After shedding all opinions about which he did not have absolute certainty, he was left only with one meaningful truth: cogito, ergo sum. “I think, therefore I am.” No person could both think and not exist.

This was not new, as St. Augustine had come to a similar realization over 1,000 years earlier. The difference was the context and emphasis of the thought; to Augustine, it was an interesting idea coming out of nowhere and going nowhere. To Descartes, it was the foundation of every knowable proposition, and it led to the idea that human beings are essentially thinking (spiritual) beings rather than a body-soul composite… Think “soul trapped in body.”

This came after the ruins of the scholastic project. With the combination of the fixation on choice and freedom in Scotus’ work and Abelard’s troubling take on the problem of universals (how to account for similarities between different things), the stage for Ockham’s Nominalism was set. (See Gilson’s detailed description in his wonderful book, The Unity of Philosophical Experience.) It was Ockham who hammered in the last nail of St. Thomas’ coffin and who paved the way for the “cogito” to be intensely meaningful not only to Descartes, but to the entire Western academy. Nominalism’s dissociation of “things” from any real universal natures which would make those things intelligible as members of species was the first step towards overthrowing classical metaphysics. This “suspicion of being” understandably increased exponentially with the publication of Descartes’ Discourse on the Method, which cast serious doubt on the reliability of the senses themselves, doubt that many felt could not be overcome, despite a sincere effort to do so on the part of Descartes himself.

(Image: poster for The Matrix – “Descartes: The Movie”)

The anxiety finally culminated in Kant’s “nervous breakdown”: a total rejection of metaphysics in the denial of the possibility of knowing “the-thing-in-itself” (noumena). From there, much of the academy generally either desperately tried to do without a robust metaphysics or desperately tried to pick up the pieces, and this theme continues today in the strange and fractured world of contemporary philosophy.

Ideas have consequences. As MacIntyre shows so well in his book After Virtue in the case of “emotivism” (the position that ethical statements merely express one’s emotional preference for an action), a powerful idea that spreads like wildfire among the right academic circles can eventually stretch into the average home, even if subconsciously. A very well educated person may never have heard of G. E. Moore, but everyone from the wealthy intellectual to the homeless drunkard has encountered some shade of the emotivism Moore’s work gave rise to. The influence which both Descartes and Kant had on the academic scene in their respective eras was so vast and powerful that it is not unfair to say that Western philosophy after the 17th century was in response to Descartes, and that Western philosophy today is in response to Kant.

The reaction to Descartes’ rationalism was first empiricism, then idealism. The reactions to Kant’s special fusion of rationalism and empiricism (which inaugurated “transcendental idealism”) that concern us here were logical positivism and French existentialism.

Logical positivism is basically dead in academia, although the average militant atheist has taken a cheapened form of Ayer’s positivism to bash over the head of theists, and the general inertia of positivism remains in force in a vaguer “scientism” which hangs heavy in the air.

Existentialism, on the other hand, has become a powerful force in the formation of civil law. The following lengthy quotation is from Justice Anthony Kennedy’s majority opinion given in Planned Parenthood v. Casey (my emphases):

“Our law affords constitutional protection to personal decisions relating to marriage, procreation, contraception, family relationships, child rearing, and education. Carey v. Population Services International, 431 U.S., at 685. Our cases recognize the right of the individual, married or single, to be free from unwarranted governmental intrusion into matters so fundamentally affecting a person as the decision whether to bear or beget a child. Eisenstadt v. Baird, supra, 405 U.S., at 453 (emphasis in original). Our precedents “have respected the private realm of family life which the state cannot enter.” Prince v. Massachusetts, 321 U.S. 158, 166 (1944). These matters, involving the most intimate and personal choices a person may make in a lifetime, choices central to personal dignity and autonomy, are central to the liberty protected by the Fourteenth Amendment. At the heart of liberty is the right to define one’s own concept of existence, of meaning, of the universe, and of the mystery of human life. Beliefs about these matters could not define the attributes of personhood were they formed under compulsion of the State.

“These considerations begin our analysis of the woman’s interest in terminating her pregnancy, but cannot end it, for this reason: though the abortion decision may originate within the zone of conscience and belief, it is more than a philosophic exercise. Abortion is a unique act. It is an act fraught with consequences for others: for the woman who must live with the implications of her decision; for the persons who perform and assist in the procedure; for the spouse, family, and society which must confront the knowledge that these procedures exist, procedures some deem nothing short of an act of violence against innocent human life; and, depending on one’s beliefs, for the life or potential life that is aborted. Though abortion is conduct, it does not follow that the State is entitled to proscribe it in all instances. That is because the liberty of the woman is at stake in a sense unique to the human condition, and so, unique to the law. The mother who carries a child to full term is subject to anxieties, to physical constraints, to pain that only she must bear. That these sacrifices have from the beginning of the human race been endured by woman with a pride that ennobles her in the eyes of others and gives to the infant a bond of love cannot alone be grounds for the State to insist she make the sacrifice. Her suffering is too intimate and personal for the State to insist, without more, upon its own vision of the woman’s role, however dominant that vision has been in the course of our history and our culture. The destiny of the woman must be shaped to a large extent on her own conception of her spiritual imperatives and her place in society.”

No doubt, a critical reader will observe some tragic oddities in this passage. We will table an in-depth analysis, but I do want to point out the bizarre idea that our beliefs can determine reality. One might be tempted to call this “relativism,” and there is indeed some relativism in the passage (the evaluation of the fact of whether a life or potential life is taken in abortion “depending on one’s beliefs”). Without denying this, I also assert that beyond a casual relativism, which might be more a product of a lack of reflection than a real worldview, Kennedy is a deeply committed existentialist. (Indeed, it seems that existentialism naturally disposes a person to relativism.) The thought that one’s beliefs define one’s personhood comes almost directly from Jean-Paul Sartre. The doctrine is: existence precedes essence. Essence is determined by beliefs and actions, according to the existentialist. Such an affront to traditional metaphysics would have been impossible without the aforementioned ideological lineage – Scotus, Abelard, Ockham, Descartes, Kant… Seeing Justice Kennedy through the existentialist lens also helps to account for the striking absence of respect for a human being who can’t believe or meaningfully act. After all, how can such a thing really be a person?

Today’s common philosophy of the Western liberal elite (and their spoiled millennial offspring) seems to be a chimera of these two diametrically opposed worldviews: positivism and existentialism. These ideologies have been filtered into the average home, and watered down in the process in such a way that they can appear to fit together. In this series of articles, we will thematically wind through a maze of philosophy, science, hashtag activism, and moral theology to understand the present crisis and to propose possible remedies for it.

Having now given a brief sketch of the ideological history, we begin next time with a look at the positivist roots of the so-called “New Atheism” and how an undue reverence for science has contributed to what I have termed the “New Albigensianism.”

Stay tuned…

 

For Part II, click here.

Post by: Eamonn Clark

Main image: Carcassonne, France… one of the old Albigensian strongholds.

Main image source: http://en.destinationsuddefrance.com/Discover/Must-See/Carcassonne

The Dark Knight of the Soul: Fortitude in the Batman

Behold, a humorous essay I recently wrote for a moral theology class, with some slight edits. Enjoy!

Mr. Bruce Wayne had a troubled childhood. Not only did he lose his parents to a crazed gunman, but he also fell into a deep well full of bats. The former occasioned the inheritance of vast amounts of wealth, while the latter occasioned an intense case of chiroptophobia (fear of bats). Together, these effects would eventually lead him to undertake a massive bat-themed vigilante project which would dominate his life and cause a complicated set of benefits and drawbacks in Gotham City. The question is this: whether the act of becoming the Batman was an act of true fortitude on the part of Bruce Wayne.

What is clear is that in Batman’s vigilante project there is matter for fortitude, namely, dangers of death. “Now fortitude is a virtue; and it is essential to virtue ever to tend to good; wherefore it is in order to pursue some good that man does not fly from the danger of death.” (1) Wayne, of course, is choosing to fly toward dangers of death, and literally at that. With countless thugs, gang leaders, and dastardly supervillains, Gotham is anything but safe; and this is not even to mention the means which Wayne adopts for fighting crime, which includes jumping off skyscrapers and careening in between all kinds of obstacles, supported by some mesh wings. He is doing battle with criminals who might kill him, in a way that might kill him. “Dangers of death occurring in battle” are the proper matter for fortitude, beyond lesser evils like bodily pain or the annoyance of standing in line at the DMV. (2)

It seems that Wayne might have gone to a vicious extreme in overcoming his own private chiroptophobia by becoming “half bat.” Yet there is really nothing to fear about bats in themselves, so to fear bats at all seems to be a case of timidity. This means that overcoming such a fear is a good thing to do. As a way of facing his repressed traumatic experience of nearly dying in the well, an experience closely bound up with the well’s bats, Wayne’s becoming Batman would only tend toward a vicious neurosis if his new bat-persona did not serve some purpose beyond itself. That is to say, if Wayne habitually dressed up like a bat in his own house and looked in the mirror, this would be disordered. Taking on the bat-persona for the sake of intimidating criminals, which is his primary motivation, is something else entirely.

Wayne does not become Flowerman or Butterflyman or Puppyman; he becomes Batman. Even if he had had traumatic experiences with flowers and butterflies and puppies, surely he would not want to deal with those memories in the same way. The idea of a vigilante qua bat (or alternatively qua spider) is simply terrifying, which is the point: it is an effective aid to fighting crime. This, however, does not necessarily make it prudent, as prudence requires that justice and the other virtues not be violated. Here we will simply mention the possibility that vigilantism is unjustifiable in Gotham, given that there are good cops like Commissioner Gordon around. If Wayne had not considered this, or had not considered the physical risks involved, then the decision would be imprudent regardless of whether it is just. Becoming a vigilante virtuously requires serious counsel and an understanding of the principles of law. (3)

There are certain appearances of fearlessness and daring throughout the career of Batman, but one must wonder if this is merely a result of having mastered the fear of death during his time training in the mountains with the League of Shadows. On the contrary, Wayne goes to great lengths to protect himself, investing in the production and maintenance of extremely sophisticated protective devices, and this could exonerate him at least of fearlessness. Batman, supposing his project is just, certainly ought to fear death, not just for his own sake, seeing as life is a great good, but also for Gotham’s sake: “Death and whatever else can be inflicted by mortal man are not to be feared so that they make us forsake justice: but they are to be feared as hindering man in acts of virtue, either as regards himself, or as regards the progress he may cause in others.” (4) This is also part of why concealing his true identity is so important, for if it were widely known that Batman is Bruce Wayne, he would be easier to destroy.

As for magnanimity, Wayne already has great honors, insofar as honors accrue to a man of enormous wealth such as himself. Ironically, his public identity as a billionaire is a cover for what he really lives for privately, which is the accomplishment of great things like deposing crime bosses and deterring supervillains at great personal risk. He accepts the “unofficial honors” that come with such acts, but he does not care for them for their own sake, so he is not ambitious. He takes on the project to give the city of Gotham hope, and it is to this end that he refers the glory given to him as Batman. Therefore, Batman has a degree of magnanimity. (5) There is, however, an element of Wayne’s public life that is pusillanimous, as he purposefully distances himself from seeming great by being an arrogant, dishonest, quarrelsome womanizer. He could gain more honor publicly by being more virtuous, but he rightly fears that this could lead to the suspicion that he is Batman. Insofar as this component of concealing his nocturnal activities is vicious, it is neither magnanimous nor fortitudinous, as sins cannot be called acts of virtue.

The crime fighting skills of Wayne are second to none, and since he has ordered his life and vast wealth towards crime fighting without compromising his fortune or social status, he most certainly deserves to be ascribed the virtue of magnificence. For, “[It] belongs to magnificence not only to do something great, ‘doing’ (facere) being taken in the strict sense, but also to tend with the mind to the doing of great things.” (6) Since Wayne could do almost anything he wants on account of his wealth, the good use of which is the proper object of magnificence, his mind certainly tends with great force toward the accomplishment of masterful crime fighting. (7) Otherwise he would do whatever it is that other billionaires do.

To the question, whether Bruce Wayne’s choice to become Batman was an act of true fortitude, we answer in the affirmative, with two qualifications. The first is that the entire vigilante project is just, which is unclear. The second is that the artificial public persona taken on as part of the condition for the project, which can be assumed to have been part of the means from the start, is at least mildly vicious and therefore reduces the fortitudinous character of the choice.

(1) STh II-II q. 123 a. 5 ans.

(2) Ibid.

(3) Namely, gnome and epikeia would be required. See STh II-II q. 51 a. 4; q. 120 a. 1, a. 2

(4) STh II-II q. 126 a. 1 rep. 2

(5) That his voice is extraordinarily deep is not a sign of greater magnanimity, it is merely another component of his intimidation, as well as a way to conceal his public identity. Furthermore, that he does not walk slowly to accomplish his tasks does not imply a lack of magnanimity, as the particular kind of great things which he seeks to accomplish demand agility.

(6) STh II-II q. 134 a. 2 rep. 2

(7) STh II-II q. 134 a. 2

 

Post by: Eamonn Clark