Because of intense fear, we refuse to acknowledge that nothing in this world is permanent, that everything perishes, that soon we will be no more. Lodged within every human heart is the blind hope that death comes to others, not to us…

Prometheus was the one Titan to rebel against Zeus’ plan to wipe out the human race and create another. In this Greek myth, as told by Aeschylus, humans lived like ants in sunless caves, possessing neither science nor technology. Because they could foresee the future, they were paralyzed by the fear of death. Prometheus took pity upon these wretched creatures and “caused them to cease foreseeing death,” lodged “blind hopes” in their hearts, and stole fire from Mount Olympus and gave it to humans so they would learn the arts for a good life in this world and thus further blind themselves to the inevitable death that awaits all mortals.[i]

We die; everything perishes.

Yet, we believe the United States of America will last forever; we hope to preserve our memory by founding a college, a business, or a family; we dream of immortalizing ourselves by setting a sports record, making a significant scientific discovery, or writing the great American novel. But the Anasazi, the Hittites, and the Sumerians remain only in name; barely legible inscriptions on granite headstones proclaim the deceased is “gone, but not forgotten”; and Jack “Gentleman” Johnson, Friedrich August Kekulé, and Theodore Dreiser, giants in their own day, are now historical footnotes. Nothing escapes time; everything flows to oblivion.

The eleventh-century Chinese scholar-poet Su Tung-po compared all human endeavors to the footprints left by geese on snow:

To what can our life on earth be likened?
To a flock of geese,
Alighting on the snow.
Sometimes leaving a trace of their passage.[ii]

The writer of Ecclesiastes preaches that everything in this world — wisdom and folly, riches and poverty, happiness and grief — “all is vanity and a striving after wind.” A man dies and nothing remains.

Vanity of vanities, says the
Preacher,
vanity of vanities! All is Vanity.
What does man gain by all the toil
at which he toils under the sun?
A generation goes, and a generation
comes,
but the earth remains for ever.[iii]

Most of us do not despair over our dreadful situation. In the ancient Indian epic poem the Mahabharata, the sage Yudhisthira is asked, “Of all things in life, what is the most amazing?” Yudhisthira answers, “That a man, seeing others die all around him, never thinks that he will die.”[iv] In the East and the West, men and women use amusement and fantasy to divert their attention from the reality of death. Philosopher-mathematician Blaise Pascal observed in his day, the mid-seventeenth century, that the chief diversions were the hunt and the dance. A man whose mind is occupied with where to rightly put his feet cannot reflect upon the inevitable death that awaits him.[v] In our democratic era, the diversions readily at hand far surpass those available to seventeenth-century French aristocrats. With the touch of a finger we change channels on the TV, select a different track on a CD, or go to a new website on the Internet.

What we hate most of all is silence and solitude. Home alone, the TV set drones away in the background, even when we are not watching it. We get into the car by ourselves, and our hand automatically reaches to turn on the radio. We avoid every opportunity to reflect upon where we come from, what we are doing here, and where we are going. When reality intrudes, we reach for the psychotropics, and thank Pfizer, the maker of Zoloft, for taking the edge off life.

In the modern world, particularly in America, we blind ourselves to illness, aging, and dying. The critically ill are moved to intensive care units, where they are hidden beneath cables and tubes connected to monitors and respirators; the old and infirm with cognitive problems are housed in memory-care facilities and are often confined to a single room with a flat screen TV and artificial flowers; the dying spend their last days in a hospice center, staffed by professionals, all strangers.

The Buddha unflinchingly stared Death down. He preached in his first sermon to five ascetics, his old companions, in the Deer Park at Isipatana near Benares that human existence is inseparable from suffering (dukkha), that the cessation of suffering occurs by extinguishing craving, and that liberation from craving results from ardently following the Eightfold Path.[vi]

Dukkha follows from the most fundamental principle the Buddha taught: the impermanence of all compound things. Civilizations rise and fall; species of plants and animals come and go; continents drift and produce mountain ranges; wind and water erode rock and level mountains. Twentieth-century cosmologists discovered that the universe itself is destined to end with the Big Freeze, a cold, eventless state of electrons, neutrinos, antielectrons, and antineutrinos. We are born, walk around for a while, and then disappear. Everything and everyone we love inevitably changes, decays, dies, and vanishes. Nothing lasts. All this is indisputable, although unlike the Buddha we blind ourselves to this reality.

Surprisingly, many American Buddhists are experts at denying death, unlike their Master. Buddhism in our culture verges on being a bourgeois amenity, like Whole Foods, where the production of fruits, vegetables, and meat is sanitized, hidden from view. Robert Wright, in his best-selling book Why Buddhism Is True: The Science and Philosophy of Meditation and Enlightenment, translates dukkha as “unsatisfactoriness.” Life is a series of disappointments; “getting the next job promotion, or getting an A on that next exam, or eating that next powdered-sugar doughnut” does not bring “eternal bliss,”[vii] nor does such unsatisfactoriness amount to suffering like watching your spouse die of brain cancer. Meditation, “an essentially therapeutic endeavor — a way to relieve stress or anxiety, cool anger, or dial down self-loathing just a notch — can turn into a deeply philosophical and spiritual endeavor,”[viii] although Wright confesses that nirvana remains a mystery to him and that so far meditation has made him only somewhat less irritable. In effect, Wright transforms Buddhism into a well-being commodity, drained of all reference to the transcendent.

Many technologists in Silicon Valley are committed to the blind hope that death is a technical problem that scientists will soon solve. Google helped found Singularity University, which is not a university; its name refers to the tipping point “in the history of the race beyond which human affairs, as we know them, could not continue,” in the words of John von Neumann, perhaps the greatest mathematician of the twentieth century.[ix]

Von Neumann gave no details of what the Singularity in human life might be, but numerous science-fiction writers, futurists, computer scientists, physicists, and mathematicians have. Vernor Vinge, a mathematician and science-fiction author, predicted, in 1993, that “within thirty years, we will have the technological means to create superhuman intelligence. Shortly after, the human era will be ended.”[x] Inventor and futurist Ray Kurzweil placed the Singularity at 2045, when the explosive increase in computing power will have produced superintelligence that transcends the “limitations of our biological bodies and brains.”[xi] Computer scientist Jaron Lanier claims in the new religion of information humans will “become immortal by getting uploaded into a computer,”[xii] what believers in the Singularity call “digital ascension.” The blind hope is that humans and computers will soon merge, making old age and death disappear.

Historian Yuval Noah Harari in his book Homo Deus discusses Google’s anti-aging program and claims that the company “probably won’t solve death in time to make Google co-founders Larry Page and Sergey Brin immortal.”[xiii] When Brin, 44 at the time, heard this, he said, “Yes, I was singled out for death; no, I’m not actually planning to die.”[xiv]

Two good friends of mine, both believers in the Singularity and adherents of a calorie-restricted diet, scoffed at my assertion that the pursuit of science-based immortality is akin to the search for a perpetual motion machine: no one has ever beaten entropy; things inevitably fall apart.

Most of us mortals are not believers in the Singularity and are more like Susan Sontag, a great writer, a public intellectual, and an extremely erudite thinker. She was diagnosed with metastatic breast cancer in 1975, at the age of 42, and elected to have, according to her son, David Rieff, the “most radical, mutilating treatment”[xv] for her stage IV cancer. Her elected course of treatment saved her life, and she “became a militant propagandist for more rather than less treatment,”[xvi] even though more medical intervention for cancer can be exceedingly harmful, causing new cancers, heart attacks, and strokes.

Almost thirty years later, Sontag was diagnosed with myelodysplastic syndrome (MDS), a cancer in which immature blood cells in the bone marrow fail to mature. Even today, MDS has no satisfactory treatment and usually converts to acute myelogenous leukemia, which eventually kills the patient.

Sontag loved life — “there was nothing she did not want to see or do or try” — and was “unreconciled to mortality.”[xvii] She sought a reprieve from her death sentence, and may have believed that “death is somehow a mistake, and that someday that mistake will be rectified,” according to Rieff.[xviii] “What she wanted was survival, not extinction — survival on any terms.”[xix]

Sontag convinced herself that the best chance for her survival was a bone marrow transplant at the Fred Hutchinson Cancer Research Center in Seattle. The transplant failed, her leukemia returned, and she died “slowly and painfully.”[xx]

Rieff knew his mother had a “horror of cremation”[xxi] and he had her buried in the Montparnasse Cemetery in Paris. On reflection, he writes, a “black polished slab covers the bones and whatever else now remains of the embalmed corpse that was once an American writer named Susan Sontag, 1933–2004.”[xxii]

Like Sontag, many of us in the modern world fear “extinction above all else,”[xxiii] so how strange to us is Socrates’ claim that “true philosophers make dying their profession, and that to them of all men death is the least alarming.”[xxiv] Socrates argued that if a person fixes his attention on the material world, he falls “prey to complete perplexity and uncertainty”[xxv] for the senses can report that the same object is both hot and cold, big and small. Consequently, he advocated a philosophical training that “consists in separating the soul as much as possible from the body, and accustoming it to withdraw from all contact with the body and concentrate itself by itself.”[xxvi] After such a philosophical training that aims at direct experience of the eternal, the “soul can have no grounds for fearing that on its separation from the body it will be blown away and scattered to the winds, and so disappear into thin air, and cease to exist.”[xxvii]

On the day of his forced suicide, many of Socrates’ young friends arrived at his cell early in the morning to spend the last day with the Master. Socrates saw that all his followers were afraid of death, and he spent the last hours of his life consoling them! He attempted to cast a “magic spell”[xxviii] over his disciples to charm away their fear of death. If he could show them that the soul is immortal and what awaits it after death is a life more abundant, then in a sense he would have slain death for them. (See illustration.[xxix])

He explained to them that through philosophical training their souls would experience the invisible, the divine, and the timeless, and that such experience inspires confidence in the immortality of the soul. His young friends, however, had doubts and demanded a logical proof that the soul is immortal. Socrates must have known that such a proof would never allay the fears of his friends; nevertheless, out of love, he proceeded to spin a web of logic to dispel the fear of death from his young friends, all the time directing them to the deepest experiences of the interior life.

Socrates used philosophy to wind his way into the labyrinth of human existence to slay the monster — death — and thus to rescue Athenian youth. At his trial, he had refused to save himself from death, and while in prison, he did not permit his wealthy friends to arrange for his escape. In effect, Socrates offered himself as a sacrifice to death in order to teach publicly that the soul is immortal.

Many of us in the modern world are awed that Socrates spent his last day on Earth as he spent the days before, pursuing truth with his friends and looking out for their welfare. He calmly drained the cup of hemlock in one breath, prayed to the gods that his “removal from this world to the next may be prosperous,” and asked his friend Crito “to offer a cock to Asclepius,” the god of healing; thus died the “bravest and also the wisest and most upright man” in Athens.[xxx]

In Modernity, self displaces soul, because the basic unit of democracy, capitalism, and the Nation-State is the isolated, autonomous self.[xxxi] As a result, Socrates’ discussion of the nature of the soul is mainly of historical interest to most of us; we want to know if the self is immortal.

While ancient and medieval theologians such as Augustine and Aquinas spoke of the immortality of the soul, New World Christians proclaim that the self is immortal. When a bereaved self asks a priest or pastor, “Will I see my loved one again?”, the answer is invariably “Yes,” with the implication that the desires, habits, and memories of the loved one are either immortal and live on now or will be resurrected in Christ. C. S. Lewis, a New World Christian apologist, even argued (hoped or believed) that his favorite dog would be resurrected with him.[xxxii]

Margaret Guenther, retired director of the Center for Christian Spirituality of the General Theological Seminary of the Episcopal Church, imagines, “Maybe the next life will be a feast for the mind, like great expanses of time in the main reading room in the Library of Congress, only with good lighting and comfortable chairs. Maybe it will be bountiful, like the homecoming picnics at the Martin City Methodist Church, where my father worshiped as a boy.” She confesses, “Sometimes I play with the idea that I will see my grandfather, whom I loved deeply and who died when I was nine, and meet my German grandparents for the first time. . . . Maybe I can have a beer with Meister Eckhart or crochet and chat with Dame Julian of Norwich, while she sews on humble garments, suitable for anchorites.”[xxxiii]

Implicit in Guenther’s picture of the next life is her answer to the most fundamental question a person can ask — “Who am I?” Guenther gives the common answer — I am my memories, a view that does not hold up to scientific or philosophical examination. Neurologist Oliver Sacks reported that a man under his care had suffered a sudden thrombosis in the posterior circulation of the brain, which caused the immediate death of the visual parts of the brain. The patient became completely blind — and did not know it! Sacks’ questioning revealed that the patient had lost all visual images and memories yet had no sense of loss. The patient had no memory of ever having seen; he was unable to describe anything visual and became bewildered when hearing the words “seeing” and “light.” An entire lifetime of visual experience had been erased from memory in an instant.[xxxiv]

The visual memories stored in a person’s brain are nonmaterial[xxxv], but they do not exist separate from his brain; the same is true for all other perceptual memories. My memory of winning the eighth-grade math prize in 1956 at the Middle School in Union Lake, Michigan will die with my brain as will my memory of crossing frozen Wilkins Pond during a full moon in mid-winter of the same year. What is true of memory is also true of imagination. My image of myself — a gypsy outsider — will perish with my death. All my acquired emotional habits, such as the fear of dogs and the love of Bach and Mozart, depend upon brain physiology. Even discursive reason, which moves in time, seems perishable.

Through such reflection, my intellect uncovered a terrible truth — my mortality. I then concluded that the mortality of George Stanciu was a calamity that rendered human life meaningless. To ignore or to forget the reality of death, to force this unbearable truth from my mind, I often turned to a sensual life, to amusing diversions, or to other forms of self-narcosis. What kept me from firing a bullet into my brain was that deep down I thought that I had possibly made an error in my analysis of who I am.

The Self: A Cultural Construct

I was not born speaking English or Romanian, nor did I know Newton’s three laws or the Preamble of the U.S. Constitution. I was not born with a self; even at two years old, there was no Georgie Stanciu. Child psychologists have observed that around 19 months, a child begins to use the words “my,” “mine,” and “me” and his name with a verb — “Georgie eats.”[xxxvi] By 27 months, self-reference is common, although the child is not telling the parents who he is; that requires a narrative. Between three and five years of age, autobiographical memory emerges, and the development of a unique personal history begins.[xxxvii] Even at this young age, the self-narrative pattern that emerges depends upon culture.

I, of course, was born into a world of complex social relations; others instructed me how to behave and taught me what was important in life; in effect, the world gave me a self, assigned me a unique node in a social web. My parents and relatives taught me that I was part of a Romanian community with indissoluble obligations to others. I can still hear my father telling me that the other guy often needs your help.

At school, the lessons I received contradicted my father’s teaching. In the third grade, I sat at my desk, in a row of identical desks; mine was the farthest from the teacher, who sat at a large desk at the front of the classroom. Each one of us occupied a small cubicle with invisible walls. When my best friend, Joey Prinko, reached across the aisle separating our respective rows of desks to hand me a pencil or a crayon, the teacher yelled at him and told him to keep his hands home. In grade school, the arrangement of desks in my classrooms was dictated by the answer to “Who am I?” given to all of us by American culture.

Architecture reveals how members of a particular culture understand themselves. The typical American house is not separated from the rest of the community by a high wall; but inside walls divide the house into private spaces for each individual. Traditional Chinese houses, in contrast, are set off from the community by high walls and screened gates, although their interiors are designed for common living, not partitioned privacy. Each culture puts the wall around the basic unit: for Americans, the individual; for the Chinese, the family.

Contrary to Romanian culture, in America, I was treated as an isolated, autonomous individual. In school, I learned to compete with others and to disregard the emotional pain I caused others on the rare occasions I bested them at academics or on the playing fields. I saw Joey Prinko miss a “stupid word” for the school championship and watched him cry. Shirley Divine won the spelling bee, and everyone held her up as a winner; from the smile on her face, I knew she felt good about herself. Joey’s failure was his problem, not hers.

In the lower grades and in high school, I was instructed to work for gold stars, A’s on report cards, and the honor roll, and, in college, to thrust others aside for a place on the Dean’s List and the possession of a Phi Beta Kappa Key. I was also taught that academic achievement was necessary for affluence, a desirable goal, since material prosperity equaled happiness. Furthermore, to be successful, I learned that a pleasing appearance, stylish clothes, an attractive image, and an inviting bearing were necessary to be accepted by the right people. My education attempted to teach me to regard myself as a potentially valuable commodity that with the correct molding would fetch a substantial price in the marketplace.

Like every person on the planet, I fashioned my desires, achievements, losses, and experiences with others into a coherent whole through a self-narrative. I called my self into existence, as others did, through language. “George Stanciu” merged the Romanian and American aspects of my childhood into the story “Gypsy Outsider,” a narrative that included bits and pieces taken from literature, pop music, and the big screen. I shamelessly stole the plot of Shane, my favorite boyhood movie. Shane comes from nowhere, has no family or last name, possesses his own moral code, needs no help from anyone, rescues the cowardly townspeople from the “bad guys,” and then rides off into the sunset. He is the picture of independence, the embodiment of the isolated, autonomous individual.

From early childhood on, I repeatedly told my self-narrative, adding layer upon layer of storytelling, incorporating new experiences into my story, and often embellishing past events to such a degree that they became distorted or even fictitious. I developed intense attachments to certain personal stories, revisiting them again and again, for weeks, months, and even years. In this way, “George Stanciu” both solidified and changed. Hence, a self-narrative is not a reliable history of a person; memories, often invented and always embellished, are not a person.

Both my parents worked full time in the grocery store they owned and had little time for childcare. My substitute parents were Grandma Rice and Murphy. Grandma Rice hugged me to her full bosom, told me that I was her wonderful little boy, and sheltered me from a threatening world. Murphy lived in a small room in the basement, took his meals with the rest of us, and worked as an all-around handyman to keep the dilapidated grocery-store building from falling apart. I didn’t know that Murphy liked his whiskey and was half in the bag most of the time. I just knew that he sang songs, joked about everything, and was fun to be around. When I wasn’t with Grandma Rice, I was with Murphy.

When I was six years old, Grandma Rice and Murphy died within four months of each other. With the passing of my two principal caregivers, death clung to my shoulder like a pest I could not shake off. I hated and feared death, and as I grew older, death forced me to ask, “Who is this I that dies?”

Much later in life, I realized that “George Stanciu” is an artifact fashioned by culture and personal storytelling, devoid of substance and eternal permanence. Like every isolated, autonomous self, I believed that I was the center of existence to which everything should be ordered and sought to build up myself through the acquisition of knowledge, honor, and love. I laughed when I truly grasped that “George Stanciu” is an illusion, frail and fleeting, with no more permanency than a smoke ring, doomed to disappear into nothingness with the death of my body.

In a roundabout way, I arrived at the central insight of the Buddha — the self is an illusion. In the Deer Park at Isipatana, the Buddha preached his second sermon, The Discourse on Not-Self, and “while this discourse was being spoken, the minds of the monks of the group of five were liberated from the taints by nonclinging.”[xxxviii] Arguably, anattā, a Pāli word that literally means no-self, is the most important and most difficult concept in Buddhism, since it led the five monks to instant enlightenment, to Nirvāṇa, to “the annihilation of the illusion [of self], of the false idea of self.”[xxxix]

But if each one of us were merely a particular compound of body, sense perceptions, memories, and ideas, then no escape from Samsara, the never-ending wheel of birth and death, would be possible. The Buddha told his disciples, “There is, monks, an unborn, not become, not made, uncompounded . . . therefore an escape can be shown for what is born, has become, is made, is compounded.”[xl] I had no idea what the Buddha meant by the unborn, so I suspected that my answer to “Who am I?” missed an essential element of who I am.

In some mysterious way, I was more than my memories, which by themselves without storytelling were disconnected, and more than my self-narrative, whose main plot was my adolescent rebellion against all authority. Looking back on my life, I saw there were two high points that I had ignored in my self-narrative.

I studied music with Mr. Bach — honestly, I am not making this up. Under his stern discipline, I became more than a passable clarinet player in the Union Lake High School Band. I loved music, and if I had possessed a better auditory memory, I would have become a musician, content to play the last desk in the Santa Fe Community Orchestra. What I loved about playing music was that “unique suspended moment when you actually become the emotional or sensory quality of the music — the colors, the water, the love,”[xli] that moment when the musician becomes the music, the dancer becomes the dance, the rock climber becomes the climb. The best moments of human life happen when the ego disappears, when the self, in a sense, dies.[xlii]

The other high point of my life that I excluded from my self-narrative was that in the tenth grade, I was changed forever by Euclid’s proof that the prime numbers are infinite, an exquisite proof that surprisingly shows an eternal truth in six lines of text. Until that point in my life, I thought truth did not exist; everything around me changed: the seasons, my body, and people. My experience of the human world was that everything was in flux, sometimes bordering on the absurd. Suddenly, mathematics presented me with one thing that was unchanging, a timeless truth, demonstrated in an exceedingly beautiful way; the 2,500 years between Euclid and me were of no consequence. My encounter with Euclid was not unique. Bertrand Russell described his first encounter with serious mathematics: “At the age of eleven, I began Euclid, with my brother as my tutor. This was one of the great events of my life, as dazzling as first love. I had not imagined that there was anything so delicious in the world.”[xliii]
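For the reader who wishes to see it, Euclid’s argument (Elements, Book IX, Proposition 20) can be sketched in modern notation:

```latex
% Euclid's infinitude of primes (Elements IX.20), a modern paraphrase
Suppose the primes were finitely many: $p_1, p_2, \ldots, p_n$.
Form the number $N = p_1 p_2 \cdots p_n + 1$.
Some prime $p$ divides $N$; but no $p_i$ divides $N$,
since dividing $N$ by any $p_i$ leaves remainder $1$.
Hence $p$ is a prime missing from the list, a contradiction;
therefore the primes cannot be finitely many. $\blacksquare$
```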

Virtually every mathematician and theoretical physicist believes that mathematical objects exist and that relations between them are independent of human beings. From his experience of doing mathematics, Godfrey Harold Hardy, recipient of the Copley Medal of the Royal Society for his distinguished work in mathematical analysis, was convinced that “mathematical reality lies outside us, that our function is to discover or observe it, and that the theorems which we prove, and which we describe grandiloquently as our ‘creations,’ are simply the notes of our observations.”[xliv] Alain Connes, holder of the Chair of Analysis and Geometry at the Collège de France, agrees: “There exists, independently of the human mind, a raw and immutable mathematical reality. . . . The working mathematician can be likened to an explorer who sets out to discover the world.”[xlv] Roger Penrose, emeritus Rouse Ball Professor of Mathematics at Oxford, maintains that the Mandelbrot set used extensively in chaos theory and fractal analysis is not an invention of Benoit Mandelbrot and did not come into existence when the Polish-American mathematician wrote down its definition: “Like Everest, the Mandelbrot set is just there.”[xlvi]

Many years after the tenth grade, as a theoretical physicist, I discovered timelessness in a different way. How strange that Homo sapiens, a flash in the pan in the history of the universe, can intellectually grasp the beginning and the end of everything, the Big Bang and the Big Freeze. Unlike zebras, kangaroos, and chimpanzees, human beings in some mysterious way transcend space and time.

All biological life is locked into time, into the cycle of birth, growth, reproduction, senescence, and death. Perceptual life is of the here and now; memories are of past particulars, and imagination in the restricted meaning of the word is the faculty for rearranging perceptual images and in the extended use is the ability to form interior images, to envision eventful scenes and peopled places.

Clearly, then, some special element or aspect of the mind allows Homo sapiens to transcend space and time. Without this special element, science would be impossible, for the goal of science is to find unchanging causes of phenomena, that is, first principles, or laws, that apply universally, not merely to one or two particular instances in space and time. The principle of inertia states that matter resists a change in motion, not that one James Brown found it difficult to move his suitcase from his front porch to his car in Atlanta, Georgia, on July 17, 1958.

In my first encounter with the truth and beauty of mathematics, my fourteen-year-old mind saw dimly that I was being called to the transcendent, in Christian terms to God. Socrates, from his experience of beauty, said beauty calls us to the source of beauty; for this reason, beauty is named kalos in Greek, which bears an uncanny resemblance to the verb kaleô, “to call.” Socrates as a young man, obviously much deeper than I am now as an old person, saw that when a person comes to know the source of beauty, “he shall be called a friend of god, and if ever it is given to man to put on immortality, it shall be given to him.”[xlvii] When I was in my thirties and still dim-witted, Socrates pointed out to me that since I could grasp the eternal truths of mathematics, there had to be something in me that was eternal, something deathless. For Socrates, the deepest answer to “Who am I?” is arrived at by stripping away the accidents of birth and culture until the true self appears, timeless and in some mysterious way connected to all existence.

For me, of course, this true self was a complete mystery, so after stumbling around for years exploring Hinduism and Buddhism, I turned to the deepest understanding of the human person that Christianity offers.

The Patristic Fathers embraced the theological insights of Pseudo-Dionysius the Areopagite[1]: God is not any of the names used in the Hebrew Bible and the New Testament, not God of gods, Holy of holies, Cause of the ages, the still breeze, cloud, and rock.[xlviii] God is not Mind, Greatness, Power, or Truth in any way we can understand, for He “cannot be understood, words cannot contain him, and no name can hold him. He is not one of the things that are, and he is no thing among things.”[xlix] God is the Unnamable.

Surprisingly, Thomas Aquinas, the most rational of theologians, also reached the conclusion that God is beyond our comprehension: “It is because human intelligence is not equal to the divine essence that this same divine essence surpasses our intelligence and is unknown to us: wherefore man reaches the highest point of his knowledge about God when he knows that he knows Him not, in as much as he knows that that which is God transcends whatsoever he conceives of Him.”[l]

According to St. Gregory Palamas, we know the energy of God, not His essence: “Not a single created being has or can have any communion with or proximity to the sublime nature [of God]. Thus, if anyone has drawn near to God he evidently approached Him by means of His energy.”[li] A person becomes close to God by participating in His energy, “by freely choosing to act well and to conduct [himself] with probity.”[lii] Here, Gregory distinguishes between God’s essence, or substance (ousia), and His activity (energeia) in the world. The energy of God is experienced as Divine Light, such as the light of Mount Tabor or the light that blinded St. Paul on the Road to Damascus.

The Church Fathers, however, warned that for most devout Christians visions and heavenly voices do not come from God but from a fevered imagination and are distractions in the spiritual quest. Modern spiritual directors issue the same warning about imagined experiences. Thomas Merton, for example, told his novices that “since God cannot be seen or imagined, the visions of God we read of the saints having are not so much visions of Him as visions about Him; for to see any limited form is not to see Him.”[liii]

Given this understanding of God, the image of God within us means that the essence of each one of us is unnamable and that we are known to others only through our activity in the world, that is, through a socially constructed self. We are unknowable to ourselves, although through meditation, or what the Patristic Fathers called contemplation, we can witness our thoughts, memories, and storytelling, and thus know that we are not what we witness. Through more advanced contemplation, we may experience Divine Light, the presence of God.

At the core of our being is the unnamable, the “empty mind” of Zen Buddhism, the “pure consciousness” of Hinduism, and the “spirit” of Christianity, although all words ultimately fail to capture our true self. We, the unenlightened, believe that the false self given to us by culture is permanent and fail to see that this false self is an illusion, with no more permanency than a smoke ring, destined to vanish with the death of the body. Because we take our culturally given self for our true self, we fail to experience who we truly are. Our true self is always present, completely perfected with no need of development from us; we must merely step aside. Every spiritual master calls for the death of self and a spiritual rebirth beyond egoistic desires, beyond religious practices, beyond any given culture, beyond the dictates of society, into the law of love, into compassion for every living being.

The image of God within us also means that each one of us has the energy to transform the physical and social worlds we inhabit, either for good or evil. For instance, we are free to use the fruits of science for the benefit of life or the destruction of humanity, for creating the polio vaccine or thermonuclear weapons, aids for life or instruments of death. Through such free choices, we either draw closer to God or grow more distant from Him.[liv] Each one of us becomes what we choose.

Our freedom is virtually unlimited; we can thumb our nose at God, refuse to become who we truly are, and embrace a self of our own choosing. However, to freely abandon God, to exist in oneself, and to seek satisfaction in one’s own being is not quite to become a nonentity but to verge on non-being.[lv] Hell is not the fiery pit of received Christianity, but the complete separation from God — forever. Heaven is not the reuniting with one’s favorite dog or the blissful meeting with one’s unknown relatives or the pleasure of conversing with the saints, not such “enthusiastic fantasies,” but to “know more deeply the hidden presence by whose gift we truly live.”[lvi] Heaven is joining the Holy Trinity in Love, as wonderfully expressed in the icon The Trinity, also called The Hospitality of Abraham, by Andrei Rublev. The three angels in the icon are metaphors for the three persons of the Holy Trinity. The figures are arranged so that the lines of their bodies form a full circle. In motionless contemplation, each angel gazes into eternity. Because of inverse perspective, the focal point of the icon lies in front of the painting, on the viewer, inviting the viewer to complete the circle of angels, to join in a union with the Holy Trinity. (See illustration.)

Like every person, I live in two worlds, the temporal and the eternal. I love the pungency of Stilton cheese, the softness of cashmere, the dance of cherry blossoms, and the smell of the ocean salt air; I wonder at the abundant beauty of Nature, where nothing is not beautiful, either to the eye or to the mind; and I am enthralled by the poetry, drama, and music that touch the transcendent. Yet, this physical world, like “George Stanciu,” is transient and eventually vanishes without leaving a trace.

I live among the rich and the poor, the powerful and the weak, the ambitious and the lazy, the good and the bad, the loving and the hateful. Grappling with death taught me how to live in this world. I now see that every person I meet in ordinary, daily affairs — the mailman, the bank teller, the butcher at Whole Foods, the obnoxious teenager down the street with his blaring boom box — is part human and part divine: a storytelling self, often confused, dislikable, and in pain, but always transient, and a mysterious self, deathless, an image of God, worthy of unconditional love.


Editor’s Note: The featured image is “The Wards at Saint John’s Hospital” by Jan Beerblock.

[1] Pseudo-Dionysius the Areopagite is the anonymous theologian of the late 5th to early 6th century whose works were erroneously ascribed to Dionysius the Areopagite, the Athenian convert of St. Paul mentioned in Acts 17:34.

[i] Aeschylus, Prometheus Bound, trans. David Grene in Aeschylus I: The Persians, The Seven Against Thebes, The Suppliant Maidens, Prometheus Bound (Chicago: University of Chicago Press, 2013), lines 230-255.

[ii] Su Tung-po, “Remembrance.”

[iii] Eccl. 1:2-4, RSV.

[iv] Quoted by Sushila Blackman, ed., Graceful Exits: How Great Beings Die (Death stories of Hindu, Tibetan Buddhist, and Zen masters) (Boston: Shambhala, 2005), p. 7.

[v] See Blaise Pascal, Pensées, trans. W. F. Trotter (Grand Rapids, MI: Christian Classics Ethereal Library, 2005), p. 30.

[vi] “The Buddha’s First Sermon, Known as the Foundation of the Kingdom of Righteousness or the Setting in Motion of the Wheel of Dharma,” in Buddhism: A Religion of Infinite Compassion, ed. Clarence H. Hamilton (Indianapolis, IN: Bobbs-Merrill, 1952), pp. 28-29 and “The Sermon at Benares” in The Teachings of the Compassionate Buddha, ed. E. A. Burtt (New York: New American Library, 1955), p. 30.

[vii] Robert Wright, Why Buddhism is True: The Science and Philosophy of Meditation and Enlightenment (New York: Simon & Schuster, 2017), pp. 6, 55.

[viii] Ibid., p. 55.

[ix] John von Neumann in conversation with Stanislaw Ulam, “Tribute to John von Neumann, 1903-1957,” Bulletin of the American Mathematical Society 64 (May 1958).

[x] Vernor Vinge, “The Coming Technological Singularity: How to Survive in the Post-Human Era.”

[xi] Ray Kurzweil, The Singularity Is Near: When Humans Transcend Biology (New York: Penguin Books, 2006), p. 9.

[xii] Jaron Lanier, You Are Not a Gadget: A Manifesto (New York: Vintage, 2011), p. 29.

[xiii] Yuval Noah Harari, Homo Deus: A Brief History of Tomorrow (New York: Harper, 2017), p. 28.

[xiv] Sergey Brin, quoted by Tad Friend, “Silicon Valley’s Quest to Live Forever,” The New Yorker (April 3, 2017).

[xv] David Rieff, Swimming in a Sea of Death: A Son’s Memoir (New York: Simon & Schuster, 2008), p. 38.

[xvi] Ibid., p. 39.

[xvii] Ibid., pp. 15, 13.

[xviii] Ibid., p. 61.

[xix] Ibid., p. 17.

[xx] Ibid., p. 150.

[xxi] Ibid., p. 173.

[xxii] Ibid., p. 171.

[xxiii] Ibid., p. 163.

[xxiv] Plato, Phaedo, trans. Hugh Tredennick, in The Collected Dialogues of Plato, ed. Edith Hamilton and Huntington Cairns (Princeton: Princeton University Press, 1961), 67e.

[xxv] Plato, Seventh Letter, trans. L. A. Post, in The Collected Dialogues of Plato, ed. Edith Hamilton and Huntington Cairns (Princeton: Princeton University Press, 1961), 343c.

[xxvi] Plato, Phaedo, 67c.

[xxvii] Ibid., 84b.

[xxviii] Ibid., 77d.

[xxix] Jacques-Louis David, The Death of Socrates. Metropolitan Museum of Art, public domain.

[xxx] Plato, Phaedo, 117c-118.

[xxxi] George Stanciu, “Individualism: The Root Error of Modernity.”

[xxxii] C. S. Lewis, The Problem of Pain (New York: The Macmillan Company, 1962), Ch. 9.

[xxxiii] Margaret Guenther, “God’s plan surpasses our best imaginings.”

[xxxiv] Oliver Sacks, The Man Who Mistook His Wife for a Hat (New York: Summit, 1987), p. 39.

[xxxv] See George Stanciu, “Wonder and Love: How Scientists Neglect God and Man.”

[xxxvi] Jerome Kagan, Unstable Ideas: Temperament, Cognition, and Self (Cambridge: Harvard University Press, 1989), p. 233.

[xxxvii] Qi Wang and Jens Brockmeier, “Autobiographical Remembering as Cultural Practice: Understanding the Interplay between Memory, Self and Culture,” Culture & Psychology (2002) 8:52.

[xxxviii] Anatta-lakkhana Sutta: The Discourse on the Not-self in In the Buddha’s Words: An anthology of Discourses from the Pāli Canon, trans. Bhikkhu Bodhi (Boston: Wisdom Publications, 2005), p. 342.

[xxxix] Walpola Rahula, What the Buddha Taught (New York: Grove Press, 1974), p. 37.

[xl] Burtt, ed., Teachings of the Compassionate Buddha, p. 113.

[xli] Barry Green, The Inner Game of Music (New York: Doubleday, 1986), p. 14.

[xlii] For a fuller discussion of the loss of ego, see George Stanciu, “The Best Moments of Human Life.”

[xliii] Bertrand Russell, The Autobiography of Bertrand Russell: 1872-1914 (Boston: Atlantic Monthly Press, 1967), pp. 37-38.

[xliv] G. H. Hardy, A Mathematician’s Apology (London: Cambridge University Press, 1941), pp. 123-124.

[xlv] J.-P. Changeux and A. Connes, Conversations on Mind, Matter and Mathematics (Princeton: Princeton University Press, 1995), pp. 26, 12.

[xlvi] Roger Penrose, The Emperor’s New Mind, with Foreword by Martin Gardner (Oxford: Oxford University Press, 1989), p. 95. Italics in the original.

[xlvii] Plato, Symposium, 212a.

[xlviii] Pseudo-Dionysius, The Divine Names in Pseudo-Dionysius: The Complete Works, trans. Colm Luibheid (New York: Paulist Press, 1987), 596A.

[xlix] Ibid., 872A.

[l] Aquinas, Quaestiones Disputatae De Potentia Dei, trans. English Dominican Fathers (Westminster, MD: Newman Press, 1952, [1932]), Q. VII: Article V.

[li] Gregory Palamas, Topics of Natural and Theological Science and on the Moral and Ascetic Life: One Hundred and Fifty Texts in The Philokalia, Vol. IV, ed. and trans. G. E. H. Palmer, Philip Sherrard, and Kallistos Ware (London: Faber and Faber, 1984), p. 382.

[lii] Ibid., p. 383.

[liii] Thomas Merton, New Seeds of Contemplation (New York: New Directions, 1962), p. 132. Italics in original.

[liv] See Palamas, p. 382.

[lv] See Augustine, City of God, Bk. 14, Ch. 13.

[lvi] Joseph Ratzinger, Eschatology: Death and Eternal Life, 2nd ed., trans. Michael Waldstein (Washington, D.C.: Catholic University of America Press, 1988), pp. 233-234.

Editor’s Note: The featured image is “Charon crossing the Styx” (1520-1524), by Joachim Patinir, courtesy of Wikipedia Commons; the second image is “The Death of Socrates” (1787), by Jacques-Louis David, courtesy of Wikipedia Commons; and the last image is “The Trinity” (1411 or 1425-27), by Andrei Rublev, courtesy of Wikipedia Commons.

