The Greatest Man Who Ever Lived

Today we celebrate the birthday of Jesus of Nazareth, widely considered the greatest man who ever lived. His teachings are arguably the most profound, insightful, and challenging the world has ever seen. He has inspired countless universities, hospitals, and ministries to the sick, poor, and oppressed. He has inspired the world’s greatest literature, art, and music, not to mention the greatest theological treatises and much of the greatest philosophy in history. He has also inspired countless followers to make extreme sacrifices, many even literally giving up their lives, in his service. So great is his influence on human history, in fact, that this itinerant teacher became the reference point for the world’s dating system. That’s remarkable stuff for a man who never held a political office, never led an army, never authored any books or even a single essay, and never traveled more than a few hundred miles from his hometown.

So how did he manage to so profoundly impact human history? For Christians, of course, the answer is that Jesus wasn’t just the greatest man who ever lived. He was something far more than this.

“Now unto him that is able to keep you from falling, and to present you faultless before the presence of his glory with exceeding joy. To the only wise God our Savior, be glory and majesty, dominion and power, both now and ever. Amen.” (Jude 1:24-25)

Anticipating a Trump Presidency

A little more than three weeks after the presidential election, the anti-Trump riots have subsided, at least for now. This might be the calm before the proverbial storm, if some predictions are correct. In any case, extreme negative responses on the left continue, as do exuberant responses on the right. Such strong reactions among Christians are especially dismaying, as they suggest an inordinate hope and trust in political power for human flourishing in this country. We need to heed Augustine’s important reminder that there is only one reasonable Kingdom hope, and that is in the Kingdom to come where Christ is king. Of course, this does not mean we should be apathetic or unengaged in civil matters and political work. But it does mean that we should not be distraught or desperate when those we vote or campaign for do not win elections.

In 2008, many felt a sense of doom when Obama was elected. They expressed the same sort of desperation and distress that some on the left have been experiencing lately (though I don’t recall any rioting as a consequence). Well, those eight years passed, and we’re all still here. Will we survive the next four years under a Trump administration? I think it’s safe to say that we will, unless the populace reacts in severe and destructive ways, which certainly seems possible if anti-Trump sentiments continue to grow.

Often in politics the response to a negative situation can be more dangerous than the situation itself, just as an allergic reaction to a relatively minor health issue can prompt a serious, even fatal condition. As a nation, we need to avoid such a deadly “allergic” response to the Trump presidency. Many of these responses, by the way, seem to be aggravated by media exacerbation of Trump’s vices, which are numerous, for sure, but hardly out of step with those of past presidents, including the severe racism of Woodrow Wilson, Franklin Roosevelt, and Lyndon Johnson and the womanizing of JFK and Bill Clinton. I didn’t vote for Trump and am disappointed that he is poised to be our next President (though I didn’t vote for Hillary Clinton and think she would have been an even worse choice). But I do think we owe Trump a chance to govern, and we should apply the principle of charity when interpreting many of his comments. Just as many conservatives gave Obama a fair chance and responded peacefully while critiquing his policy decisions along the way, liberals should likewise give the Trump administration a fair chance, and this includes responding peacefully even while offering well-reasoned and respectful critiques.

Swinburne, Homosexuality, and the Society of Christian Philosophers

This weekend I attended the Midwest meeting of the Society of Christian Philosophers at Evangel University. The theme of the conference was “Christian Philosophy and Public Engagement.” The keynote speakers included Alexander Pruss (Baylor University), Leigh Vicens (Augustana University) and the eminent Richard Swinburne (Oxford University). Each of the keynoters gave thoughtful and stimulating presentations.

Unfortunately, Swinburne’s talk sparked controversy, though it really shouldn’t have. In his presentation, entitled “Christian Moral Teaching on Sex, Family and Life,” he addressed, among many other moral issues, homosexuality. He noted that the inability of homosexual couples to procreate constitutes a “disability” and referred to those gays and lesbians who are unable to develop heterosexual desires as “incurable.” During the Q&A that followed, an attendee named J. Edward Hackett badgered Swinburne with a Foucauldian rebuke, insisting that Swinburne’s remarks constituted “metaphysical violence.” Hackett never addressed Swinburne’s actual arguments but simply made this indignant accusation, to which Swinburne responded with admirable patience and grace.

The next day Hackett posted about it on the Philosophical Percolations blog. Hackett’s piece is a semi-coherent rant that misconstrues Swinburne’s actual remarks, though he is correct in noting that Swinburne believes—in agreement with Christian scholars throughout church history—that homosexual behavior is morally wrong. This was followed just hours later by disclaimers from both SCP president Michael Rea and executive director Christina Van Dyke, distancing the SCP from Swinburne’s remarks. Rea’s statement, posted on his Facebook page, is as follows:

I want to express my regret regarding the hurt caused by the recent Midwest meeting of the Society for Christian Philosophers. The views expressed in Professor Swinburne’s keynote are not those of the SCP itself. Though our membership is broadly united by way of religious faith, the views of our members are otherwise diverse. As President of the SCP, I am committed to promoting the intellectual life of our philosophical community. Consequently (among other reasons), I am committed to the values of diversity and inclusion. As an organization, we have fallen short of those ideals before, and surely we will again.

Not surprisingly, this prompted a lengthy discussion with opinions expressing both support and criticism of Rea’s disclaimer. Personally, I side with those who are critical of Rea’s approach, and for several reasons.

First, while Rea would likely insist (as some supporting him do) that such a disclaimer does not necessarily constitute a rejection of Swinburne’s view, such a rejection seems to be implied. Disclaimers like this are only issued when an organization regards someone’s views as embarrassing or problematic, and thus they effectively amount to a censure. For an academic society to do this to an invited speaker is really bad form, but it is especially inappropriate when the speaker is someone of the stature of Richard Swinburne, who is one of the top philosophers of religion in the world and whose work for the Society of Christian Philosophers over more than three decades has been immense. If I were Swinburne, I would feel humiliated by this. Talk about “hurt” that is worthy of “regret.”

Second, this disclaimer sets a dangerous precedent and chills the academic air for anyone in the SCP who holds the traditional view on the ethics of homosexuality. Will I be the next one to be called out by an SCP officer if I express the same view at a future conference? While surely not intended to censor the advocacy of the traditional view of Christian sexuality at SCP meetings, from a psychological standpoint Rea’s remarks could be tantamount to this. Some Christian scholars active in the SCP, especially those who are early in their careers, may be intimidated into silence about their traditional views on sexual ethics. This is hardly an atmosphere that is desirable for an academic community where the free and open sharing of ideas is crucial.

Third, it is disturbingly ironic that Rea’s disclaimer distances the SCP from what is an historic Christian conviction regarding the morality of homosexual behavior. Would he have posted a similar disclaimer if a keynote speaker had defended a permissivist view on homosexuality? I doubt it. But now that the traditional view is under fire in our culture, he deems it necessary to disclaim a speaker’s assertion of the view—a stance which, by the way, was not as strong as some assertions in biblical passages such as Romans 1:26-27, 1 Corinthians 6:9-10, and 1 Timothy 1:9-10 (not to mention the language used in Leviticus 18:22 and 20:13). One wonders if Rea would have posted a similar disclaimer if St. Paul himself delivered a keynote address at the SCP. After all, if the apostle simply read the relevant passages from his epistles, his remarks would be no less “hurtful” than Swinburne’s.

I don’t know where this controversy will lead or how this will impact the SCP. But one thing is certain: it is indicative of a dramatic and alarming shift regarding discussions of sexual ethics within the Christian academic community.

Kicking the Caffeine Habit

It was 2 a.m., and I awoke in a literal cold sweat. The room was warm, and I was layered in covers, but I felt freezing cold. My body ached as if wracked by the flu. But it wasn’t the flu. It was withdrawal. Not from heroin or some other hard narcotic. It was caffeine withdrawal, and this was the price of finally kicking a two-decade-long addiction. Well, at least part of the price. The next morning would bring a massive headache, which took several days to finally taper off. It wasn’t until day nine that the withdrawal symptoms fully abated and I felt that I was finally free.

My love affair with caffeine began in the spring of 1983, motivated by my second-semester Organic Chemistry class. (Thank you, Dr. Kelly.) At first it was tea, but then I tried coffee and realized that I actually liked the flavor, unlike my father, who had an innate distaste for the stuff. From there, I drank coffee periodically until the mid-90s, when I slipped into a morning routine that featured coffee and cereal. Then came the Starbucks explosion of the late 90s, and I was on board for the ride, my standard choice being raspberry mocha. A few years later, the dirty chai became a favorite as well. Few things brought me more gustatory delight than these drinks.

So why quit? Why subtract from one’s life such a wonderful culinary aesthetic? It certainly wasn’t because I came to any conviction that caffeine consumption is morally wrong. In fact, now that the dust has settled and I no longer “need” my daily caffeine fix, I do drink the occasional Coke or decaf coffee or tea, which contain small amounts of caffeine. The problem was simply the fact that I was addicted, physiologically dependent on a chemical. The Bible says, “a man is a slave to whatever has mastered him” (2 Pet. 2:19). And caffeine had mastered me, as Amy had pointed out to me several times, usually when I suffered from headaches after going too long without my fix. And reading some classic spiritual works from the early church fathers (e.g., Clement of Alexandria, Athanasius, John Cassian) this summer only reinforced the incentive to challenge my physical appetites. I began to fantasize about the idea of being free from the burden of supplying my body with caffeine every day. What would that be like?

So I decided to do it, beginning with a few days of decreasing my caffeine intake and then quitting altogether, which is what brought on the flu-like symptoms. Now that I’ve been addiction-free for over a month, it still feels strangely liberating. I do sometimes miss the morning ritual, which in recent years has featured tea rather than coffee. And I miss the full-bodied flavor of caffeinated coffee when I occasionally drink a decaf alternative. But it’s been worth it.

Why Tattoos Bug Me (and perhaps you)

For as long as I can remember, I have been annoyed by tattoos. Well, mainly the kind that are prominently displayed, especially on people’s torsos. And for as long as I’ve been bothered by them, I’ve wondered why they bug me. After all, people adorn their bodies in all sorts of other ways—with jewelry, piercings, make-up, etc.—and those things don’t usually annoy me. So why tattoos? My frustration at not knowing why they bother me so much has only been aggravated by the fact that, apparently, they bother many other people too. Yet, as I’ve discussed the matter with a lot of folks who feel the same way, they are almost always at a loss to explain their displeasure and usually chalk it up to inexplicable personal distaste. Well, recently the reason for my distaste finally dawned on me, and so I will now share it with you, gentle reader.


We’ve all had the experience of encountering the elaborate dragon, butterfly, skull, jungle scene, abstract image or geometrical shape tatted on the back of that guy or gal in front of us at the grocery store, and we’ve been impressed by the exquisite detail or perhaps completely perplexed by what we were seeing. Then when s/he suddenly turned around, we looked away just in time before the person caught us gawking at her/him. Whew. But, then again, while staring at the tattoo, was it really her/him we were looking at?

Tattoos are typically intended to be artistic or, one might say, even genuine works of art. And works of art are supposed to be enjoyable objects of aesthetic pleasure. Art works also welcome study and analysis, and in the case of visual arts this means careful and extensive visual examination. However, the bodies of fellow human beings whom we do not know very well are not properly objects of intensive visual study. To stare at or closely examine another person’s body in public is, well, rude and inappropriate.

And therein lies the problem with many tattoos, it seems to me. A prominent, eye-catching tattoo invites or tempts others, even complete strangers, to closely examine the person’s body—that is, to do something that is rude and inappropriate. And the stronger one’s visual aesthetic sensibility, the stronger, it would seem, that temptation will be. Now the tattoo fan will likely respond here by saying “too bad, just ignore it.” Well, of course, that’s just what I struggle to do every time I encounter someone in public who has a prominent tattoo. I restrain myself and look away—anywhere but at that person’s body. But that restraint requires an exertion of mental energy. And the more artistic and aesthetically interesting the tattoo, the more mental energy I must expend and the harder I must work to distract myself from looking at the image.

Now here’s an analogy. We all know—or should know, anyway—that it’s often rude to whistle tunes in public spaces. While the whistler might enjoy the melody s/he is making, it is unwelcome to others and a sonic distraction. Of course, the whistler might say, “too bad, just ignore it.” But that is to give oneself license to sonically impose on others for the sake of their own “art.” Yes, we can ignore it, but many times that’s difficult, and it requires an extra exertion of mental energy to do so. Which is why the whistling is rude and inconsiderate. Similarly, prominent tattoos are an imposition on others—a visual imposition that places a burden on others to exert a bit more mental energy to “ignore” it. And, similarly, it is rude and inconsiderate. But what’s worse in the case of tattoos, the imposition they place on others is also a temptation to do something inappropriate—gawk at a stranger’s body in public. So as annoying as the public whistler is, at least they don’t tempt me to do something inappropriate (assuming their whistling isn’t so bad that I’m tempted to slug them).

In sum, then, prominent tattoos are a rude imposition on others, as they invite people to see the human body as an object and in so doing tempt them to do something inappropriate. That’s why tattoos bug me. And perhaps that’s why they bug you, too.

Disabled to Serve

Recently, my younger kids have become obsessed with the game Five Nights at Freddy’s. I have gathered, through half-listening eavesdropping in the car and around the house, that the premise of this game is walking around a house, scaring yourself silly in that you-know-it’s-coming-but-can’t-keep-yourself-from-jumping-anyway kind of way. It’s described by Google as a “survival horror video game.” Gee, sounds like a barrel of laughs. Who wouldn’t want to play…other than me and most sane adults?

In looking for a silver lining to this otherwise mind-wasting pastime, I guess there is a bit of life wisdom to be gained from a game designed to frighten you despite your being prepared. Sounds a lot like the reality of living in a fallen world. Learning to survive the unexpected. You know something bad is lurking just around the corner. The only question is when and in what form it will pop out and give you a fright.

Last month, I had one of those experiences. It was a sad rather than frightening event that nonetheless reminded me of this world’s utter lack of predictability. My aunt, a godly, loving woman, had a massive stroke and passed away. She had suffered from a brain abnormality all her life, and despite knowing the odds were not in her favor, I was quite shaken by her death. It was one of those moments when, like playing Five Nights at Freddy’s, no matter how much you think you have prepared yourself, it still catches you off guard.

I had the great blessing of growing up surrounded by family. Both my mom’s siblings and her parents lived nearby and we saw them often. My aunt and I were close and in spite of my innumerable failings, she loved me fiercely. When Jim and I got married and had children, this fierce blanket of sometimes near-suffocating love enveloped them as well. I am quite certain she annoyed people on a regular if not daily basis telling them all about our comings and goings. She was this way with all her nephews and nieces not to mention family and friends. She was like Geico—loving people was just what she did.

I knew that when she died, our family would lose our biggest fan. What I didn’t know was the scope of her love for others outside our family circle. Here was a woman who on paper didn’t have a lot to offer the world. Due to a series of strokes, she was no longer able to drive or walk without the assistance of a walker. She had long since retired from her teaching position and for as long as I can remember could not use her left hand. And yet, on the night of her memorial service, we stood for hours while person after person shook our hands and told us of the deep and meaningful impact my aunt had had on their lives. Person after person after person. For hours.

In the eyes of some, my aunt might have seemed to have little value in this world, but through her willingness to serve, she became a humble vessel of God’s love and compassion. She also served as a representation of the brokenness that we all carry through life. She was broken physically, but managed to do mighty things for the Kingdom. She taught me that God’s work in and through us all starts at the place where we admit we can do nothing. That we are nothing without Him. Standing in the receiving line, I came to understand that my perspective on what is and isn’t important in life is often bass-ackwards. It is the phone call you don’t put off or the card you send or the small prayer you pray that makes the world a far better place than any issue you blog about or book you author or check you write. Those things are needed too, but without a sense of humble service, they will all turn to ashes in the refining fire of God’s judgment.

I can’t talk about my aunt without mentioning my mom. If you look up humble service in God’s yellow pages, I am sure my mom has a full-page ad, though of course she would never have placed it herself. Hopefully, she won’t read this post, or I will be in big trouble for putting her in the spotlight. My aunt served and loved many people, but she could never have done so without my mom, quietly balancing her checkbook, driving her to seemingly endless doctors’ appointments, or coming over to clean up after my aunt had had an accident. She could have easily seen my aunt as a burden. And being human, I know she had days when she struggled to be patient or kind. But just as my aunt showed me unconditional love that was blind to many of my flaws, my mom has taught me unconditional love that sees you warts and all and loves you anyway. She has taught me that Christian service isn’t for Pollyannas and Suzy Sunshines. Jesus didn’t wash the disciples’ feet because he thought it would be a fun party game. He got down in the dirt, saw their filth and loved them anyway. He got on that cross because I wasn’t worthy and He wanted to make me so in Him. He died in agony so that I could follow His example and the examples of my mom and aunt, so that I could love as I have been loved. His death made me capable of receiving God’s love and His resurrection makes me capable of showing that love to others.

In the weeks following her death, I have gotten great joy in imagining my aunt, whole in body and mind, doing things that were impossible for her this side of heaven. But I have also come to understand that her disabilities are what made her work here on earth possible. She was able to serve in her unique and God-orchestrated way, not despite her handicaps but because of them.

Her limitations helped her to see the limitations of others and love them anyway. Her limitations also gave others, like my mom, the opportunity to serve. God was glorified in and through and because of her impairment which in the end was not an impairment at all.

I hope to honor my family’s legacy of service by looking for those less capable in whatever way and offering assistance when I can. But I also want to honor them by accepting my weaknesses and looking to see how God might use them to bring Himself greater glory. I want to see where He has disabled me in order that I might serve him more.

You Gotta Believe

Many atheists and agnostics like to declare that only religious people have faith.  However, if by “faith” one means a belief that ventures beyond the evidence or what can be strictly proven, then every (sane) person has significant faith.  In fact, all of us exercise lots of faith in many ways every day.  It is not only the “religious” folk who do so.

from Salvo Magazine

I explain why this is so in my article “You Gotta Believe” in the latest issue of Salvo.  I show how even some of our most basic and common sense beliefs are as unprovable as they are irresistible and that even the most rigorous scientist makes a number of assumptions that are essentially faith ventures.  Faith, it turns out, is unavoidable, despite what popular clichés might suggest to the contrary.

By the way, Salvo is a really cool magazine about society, sex and science.  So after you read my piece, be sure to subscribe.  And encourage all your friends (and enemies) to do so as well.

Four Arguments for Purgatory

The doctrine of purgatory is naturally associated with Roman Catholic theology, but some Protestant philosophers and theologians affirm the doctrine (albeit a version of the view which sees purgatory as serving the function of completing sanctification rather than providing final satisfaction for sin). One of the most prominent of these is Jerry Walls, who has published a trilogy of Oxford monographs on personal eschatology, as well as Heaven, Hell, and Purgatory, a single volume treatment of the subject.

Walls, who is a Wesleyan, defends the doctrine of purgatory beginning with the basic idea that salvation is not just about forgiveness of sins but is mainly about spiritual transformation. So if salvation essentially involves transformation, “what becomes of those who plead the atonement of Christ for salvation but die before they have been thoroughly transformed?” Such people, he says, “do not seem ready for a heaven of perfect love and fellowship with God, but neither should they be consigned to hell.” The standard Protestant view is that for such people full sanctification is accomplished immediately and painlessly “by a unilateral act of God at death.” This view, which some scholars call “provisionism,” is deeply problematic, according to Walls. One of his arguments appeals to human temporality. Since all human moral growth and maturation on earth is a process that takes time, it makes sense to assume that our moral progress in the next world will be a temporal process. This suggests something like a purgatorial completion of our sanctification. Walls also uses an argument from human freedom. The idea here is that all human sanctification on earth involves free human choices as we cooperate in the process of moral growth. A unilateral act of God that instantaneously perfects us would be a radical departure from this basic aspect of our experience.

Another Protestant advocate of purgatory is philosopher Justin Barnard. In his Faith and Philosophy article “Purgatory and the Dilemma of Sanctification,” Barnard emphasizes two further considerations. One of these is the problem of personal identity that provisionists face. With such a radical sudden transformation of one’s moral nature, as provisionists propose, how can one be properly considered the same person afterwards? Preservation of personal identity through time requires more gradual change, Barnard would say, and this suggests a slow purgatorial transformation.

But Barnard’s primary concern regards the problem of evil. If God can perfect us morally in an instant after death, then why doesn’t he do it now? The fact that God waits suggests that there is a lot of evil that God cannot remove “without thereby sacrificing any significant good.” Here some appeal to the idea that God refrains from perfecting us on earth in order to respect our free will. But this implies that God takes away our freedom when he perfects us in heaven. And if that’s not problematic in heaven, then why would it be problematic here? Barnard proposes that the doctrine of purgatory—or his “sanctification” version of the doctrine, anyway—avoids this problem, as it says that the process of moral perfection that we begin on earth is simply completed in the afterlife—gradually and eventually completely.

Personally, I am not a proponent of the doctrine of purgatory, but I must admit that such arguments give me pause. While they might never ultimately persuade me to accept the doctrine, I certainly respect the view and see why it has been affirmed by so many great Christian thinkers down through history.

William James on Human Immortality

William James was one of the leading American pragmatists of the late 19th and early 20th centuries. Although highly empiricistic in his bent, his openness and increasing sympathy with belief in a transcendent reality is remarkable. Especially through his research for his Gifford Lectures on religious experience, which culminated in the classic text The Varieties of Religious Experience, James’s views seemed to shift from openness to bona fide belief.

In his 1898 essay “Human Immortality” James argues that even if we assume the brain-dependence of the mind, this does not rule out the possibility of life after death. James begins with the assumption that “thought is a function of the brain” and so asks, “Does this doctrine logically compel us to disbelieve in immortality?” He answers negatively, and all he needs to do to support his thesis is offer a reasonable way in which such survival is possible even given this assumption. This would show that the functional dependence of the mind on the body “has in strict logic no deterrent power” when it comes to belief in immortality.

“The fatal conclusion of the physiologist,” says James, “flows from his assuming off-hand another kind of functional dependence, and treating it as the only imaginable kind.” The truth is that there are several kinds of functional dependence, only one of which is the productive function that materialists assume characterizes the brain-mind relation.

James asks us to consider two other kinds of functional dependence: (1) a releasing function, as when an obstacle is removed from the bow, allowing the bow to bounce back and, thus, the arrow to be shot away or when a plug is removed from a drain, allowing water to flow into the pipe and (2) a transmissive function, as when a prism or refracting lens allows light to pass through while determining the color, path, and shape of that light as it proceeds. In both of these cases there is functional dependence, but in neither case is the dependence productive.

So the question is whether the functional dependence of the mind on the brain must be productive. James says no, “we are entitled also to consider permissive or transmissive function. And this is what the ordinary psycho-physiologist leaves out of his account.” So James is proposing the possibility that the brain does not produce but rather transmits or releases mental activity, in the sense that there is a realm of consciousness beyond this physical realm—whether a single, monolithic consciousness, as conceived by pantheists or innumerable individual consciousnesses as conceived in orthodox Christian and Jewish traditions—which breaks into the physical realm via our brains.

James writes, “Consciousness in this process does not have to be generated de novo in a vast number of places. It exists already, behind the scenes, coeval with the world. The transmission theory not only avoids in this way multiplying miracles, but it puts itself in touch with general idealistic philosophy better than the production-theory does. It should always be reckoned a good thing when science and philosophy thus meet.” As a Berkeleyan idealist myself, I am especially pleased to see James make this important observation. He continues: “On the production-theory one does not see from what sensation such odd bits of knowledge are produced. On the transmission-theory, they don’t have to be ‘produced,’—they exist ready-made in the transcendental world, and all that is needed is an abnormal lowering of the brain-threshold to let them through.” So, on this view, death doesn’t bring destruction of the person. Rather, “all that can remain after the brain expires is the larger consciousness itself as such,” whether conceived in a pantheistic or traditionally theistic way.

I find James’s perspective here to be refreshing for a couple of reasons. In the first place, the theory he proposes is quite plausible—it has significant explanatory power, and it avoids many philosophical problems related to the two major theories of mind—physicalism and substance dualism. Secondly, James’s approach is refreshing because of his open-mindedness and theoretical adventurousness. James’s views evolved throughout his philosophical career, and as he explored issues in the area of religious experience, he showed an admirable willingness to allow his findings to open his mind to the pervasive reality of the supernatural. The proposal he makes in “Human Immortality” is just one instance of this.

Mill on Immortality

In a posthumously published essay entitled “Theism,” the great 19th century British philosopher John Stuart Mill takes an agnostic stance on life after death. Here I will review some of Mill’s arguments in this essay.

Mill rehearses the standard “correlation argument” against survival (previously deployed by Lucretius, Hume, and others). He observes that “the different degrees of complication of the nervous and cerebral organization, correspond to differences in the development of the mental faculties; and . . . diseases of the brain disturb the mental functions and . . . decay or weakness of the brain enfeebles them. We have therefore sufficient evidence that cerebral action is, if not the cause, at least . . . a condition sine qua non of mental operations.” Mill rightly concedes, however, “these considerations only amount to defect of evidence; they afford no positive argument against immortality.”

Mill further notes that belief in immortality isn’t grounded in philosophical or scientific arguments anyway but rather is inspired by “our own wishes and the general assent of other people.” There is also the fact that immortality is naturally desired, which some (such as Aquinas) have parlayed into an argument in favor of the belief. Mill writes, “We are told that the desire of immortality is one of our instincts, and that there is no instinct which has not corresponding to it a real object fitted to satisfy it.” Mill critiques this argument by comparing it to an inference from the desire for food to the conclusion that we will always have plenty to eat. This is clearly a misunderstanding of the argument, however, since the inference is not just from the presence of a desire to the conclusion that the desire will be fulfilled. Rather, the argument reasons from the fact that there is a natural human desire to the conclusion that that desire can be fulfilled (not that it necessarily will be). So Mill’s criticism doubly misconstrues the argument from desire.

Mill notes another line of argument, which is based in the goodness of God and “the improbability that God would ordain the annihilation of his noblest and richest work . . . and the special improbability that he would have implanted in us an instinctive desire of eternal life and doomed that desire to complete disappointment.” He says the problem with this argument is that it assumes we know more than we do about God’s broader purposes, some element of which might have made it best to give us this desire without its being fulfilled. Mill is certainly correct on this point.

Mill concludes that we have “no assurance whatever of a life after death on grounds of natural religion. But . . . there is no hindrance to his indulging that hope.” This tempered conclusion makes sense given the arguments he discusses, even despite his misconstrual of the argument from desire. Philosophical arguments (or those based on “natural religion,” as he puts it) for life after death, much less human immortality, probably fall short of providing anything like assurance, at least those arguments available in Mill’s day. (Contemporary arguments based on recent near-death experience research, however, might be a different story. I will explore these in some upcoming posts.) But Mill is also wise to allow for the reasonableness of indulging the “hope” of life after death. I would certainly say so, given (1) the nearly universal desire for survival, (2) the Kantian point that ethics depends upon immortality, and (3) the theological grounds for immortality, which are considerable.