How do you know?

We live in an age that craves precision and proposes to measure information scientifically. Yet, it is precisely in an environment dominated by junk information that it is most appropriate to ask: how do I know what I think I know?

We’re born with instinct and quickly develop an ingrained trust of our senses and other faculties. We naturally believe that reality is what our eyes show us. For, how can we know how to act in a given situation unless it’s clear what the situation is? Yet, reality is not naturally clear. We see things a definite way because ambiguous perception would be useless, leaving one confused, wavering in doubt. We trust our feelings because they provide an instant assessment and ready-made behavior, especially when delay could be fatal. But precisely because they are hasty and automatic, perception, instinct, and feeling can be wrong. Fortunately we have reason as well. But how reliable is it?

One likes to think that logical reasoning leads to truth. That is because the truths of logic are independent of particular facts, which are always subject to error. However, the sure steps of logical reasoning are merely stepping stones from presumed facts to action. They are only reliable if we know where to step in the first place. To arrive with certainty, you have to begin with certainty. Much of our reasoning is now done for us by computers, which are logic machines. But like us, their output is only as reliable as their input.

Logic forces the issue of certainty, because it is based on language rather than on reality. Logical propositions are statements defined to be either true or false. By formulating propositions, reason misleads us to believe that we can know things to be clearly one way or another. We are even trained in school to answer “true or false” questions on examinations. But only statements are true or false, not reality itself. It may seem evident, in a given time and place, that the statement “it is raining” is either true or false. But when you step outdoors and feel a single, barely discernible tingle of cool moisture on the skin of your face, is it then raining or not? The question may only matter if you are trying to decide whether to take your umbrella or even whether to go out at all. And that is the crux of the matter: the point of certainty is to decide, to know what to do next.

We are all familiar with the maddening limitations of public surveys, which ask us to rate our degree of accord with various propositions. In effect, these are “multiple choice” questions, also familiar from school days. Expanding the number of categories beyond the true/false dichotomy may seem like an improvement, but in fact all categories are arbitrary divisions. In the case of surveys our replies are used by others to make decisions that matter to them. Perhaps such information is reliable to the extent that people actually behave in ways that correspond to how they answer. But, haven’t you felt that the questions are misleading in the first place, and wished you could give more nuanced answers? In some ways, the public survey is an apt metaphor for our own internal thought processes. We query ourselves in order to decide some issue that could require action. How we reason, the questions we pose, the options we imagine, and the sort of answers we expect are shaped by language—our self-talk. We tend to think in words, which means in propositions and categories.

The digital age reflects this inborn propositional thinking. The essence of digital processing is ‘yes’ or ‘no’, ‘either/or’. In logic this is formally known as the law of excluded middle: there is no ground between true and false. But true and false are artificially sharp categories designed to generate the certainty upon which to act decisively. In many cases this works and serves us well. Even if we cannot predict the weather perfectly, we send spacecraft millions of miles to rendezvous precisely with a location as elusive as the proverbial needle in a haystack. The mathematics based on the law of excluded middle, and digital computers in particular, enable us to do this because the truths of mathematics, as of logic, are certain by definition. Yet, no plan or theory ever corresponds perfectly to reality, which is always more nuanced and may include unforeseeable surprises. Calculations are no more accurate than the data on which they are based. If you start with a false assumption, only by sheer luck can you arrive at a true conclusion.

Probability, statistics, and “fuzzy” logic have developed to compensate for the limitations of conventional reasoning as it applies to the naturally ambiguous real world. The probability that it will rain in the next minute refers in a fundamental way to similar situations in the past, of which a record has been kept. If it rained in 60 out of 100 past situations where similar conditions prevailed, then it is fair to claim there is a “60% chance” that it is about to rain now. Yet, even statistics deals with definable events, which are presumed either to have happened or not. (Was it indeed raining in each of those 60 cases? On what criterion was that decided?) And any logic, even fuzzy, depends on concepts, operations and conditions that are clearly defined to begin with. Thought aims toward clarity, but also presupposes it.
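
(An aside for readers who think in code: below is a minimal sketch, in Python, of the point just made. The counts and the membership function are invented for illustration, not drawn from any real data.)

    # Illustrative sketch only: invented counts, not a weather model.
    # Each past case has already been forced into a yes/no judgment about rain.
    past_cases = [True] * 60 + [False] * 40   # 60 of 100 similar situations "rained"

    # The "60% chance" is simply the relative frequency of those judgments.
    p_rain = sum(past_cases) / len(past_cases)
    print(f"Empirical probability of rain: {p_rain:.0%}")   # prints 60%

    # Even "fuzzy" logic presupposes sharp definitions. Here an arbitrary
    # membership function maps droplet intensity (0 to 1) to a degree of "raining".
    def raininess(intensity: float) -> float:
        return max(0.0, min(1.0, (intensity - 0.1) / 0.8))

    print(raininess(0.15))   # a barely discernible tingle: about 0.06, neither true nor false

Notice that the sharp yes/no judgments are smuggled in before the counting even begins, and the fuzziness itself rests on a threshold chosen in advance; the code only restates the paragraph's point.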

The truth is that truth is not a property of reality, but of statements or thoughts. Certainty is a state of mind, not a state of the world. We hope to feel certain, especially when we need to act, because being wrong (or failing to act) can have dire consequences for which we are loath to be responsible. Yet, however certain we feel, mistakes with dire consequences are possible. Sometimes (but not always, of course) it is better to do nothing than to act prematurely. In some situations, especially when time allows, it is wise to doubt what the situation actually is, because the reality is never as clear and simple as human accounts of it. There is room for a middle ground between true and false, which are categories that unrealistically presuppose well-defined situations. Yet, navigating the no-man’s-land between true and false is psychologically challenging—perhaps especially for action-oriented men. Remaining in doubt goes against the fundamental instinct to be decisive and ready to act. Not acting in that instance requires a different sort of action: to take the stance of unknowing.

As a septuagenarian male, I like to think I know my way around. While I generally trust my mind and my perceptions, I’ve also learned to know better. Experience has demonstrated that I can be wrong, and I sometimes find myself misjudging situations. I consider myself lucky when these errors are revealed before they are compounded by further action. I do not envy people in positions of responsibility who must make weighty decisions, which are unavoidably based on imperfect information. My own errors of judgment usually involve false assumptions. These may be based on poor information (for example, rumor), but the underlying problem is that I place more trust in my judgment than it deserves. And sometimes that is because I haven’t sufficiently questioned my own motivations, which underlie how I perceive the situation.

This is where the stance of unknowing comes in. It insists on taking time to self-examine and to question appearances. I have to remind myself that I may be making assumptions that are unjustified or that I am not even aware of making. I need to clearly understand and acknowledge my emotional stake in the situation. I have to ask myself how I know what I think I know. I may still be required to act, to come to a decision. But honoring the place of doubt may result in a better decision. In most cases, little is lost but time and a spurious sense of certainty. Henry Ford observed that time wasted is forever lost. But time lost is not necessarily wasted. Ford was a headstrong man whose self-confidence led to great success in early life. The same certainty led to regret in later life.

 

Going gentle

Dylan Thomas’ famous “Do not go gentle into that good night” is one of the great poems of the English language. It expresses an oddly ironic sentiment, however. The author was a tortured alcoholic, who never lived to old age, writing about his dying father who was probably just trying to come to peace. It pleads that “old age should burn and rave at close of day.” Like many of my generation, I encountered this poem in high school, where we thought that good romantics die at twenty-nine (still burning and raving), and thirty seemed impossibly old. (Later, in college, we learned to tune in, turn on, and drop out, and never to trust anyone over that age!) I now turn seventy-five, having somehow survived these prescriptions. Earlier in life I wondered why old people dodder. Now I know.

And I know quite well that “my words had forked no lightning,” but I don’t entirely blame myself for this failure. There is a glut of words out there, after all, and a monstrous cultural machine that decides what is worthy of note. I continue to write, to do my tiny part in the face of obscurity and now oblivion. Should I view death as a defeat, against which to rail, simply because I have not yet made a brilliant name for myself? Is recognition the right reason to do anything?

Ageing is a catastrophe if you view it as a disease. And there is certainly a close alliance of ageing and disease, both at the cellular level and as an experience of the elderly. The old are more vulnerable to illness, in more pain, less able. Ageing is a process of degeneration, and dying “of old age” generally means dying of some degenerative disease. Heroic efforts are underway to identify the molecular and genetic causes of ageing and to find the fountain of youth. While I don’t disapprove, I don’t particularly care either. If a pill or procedure is eventually discovered that enables people to halt ageing and even reset their biological clock, it will probably be too late for me.

Let me propose a different point of view: old age is a stage of development. We age as a function of unfolding, which only happens to coincide with the passage of time. From the get-go, the processes of degeneration are inseparable from those of development. The cells of the embryo, multiplying and differentiating in the womb, have already begun a countdown toward their final destiny. The tens of trillions of cells that make up the adult human body began as a single cell, which divided again and again to form the specialized types of cells of the various organs and tissues. For complex and not entirely understood reasons, there is a limit to how many times these cells can divide: roughly the minimum needed to make an adult body that can reproduce itself, and not much more. That limit sets a lifespan for the body. If it could be extended, so could the lifespan, and that is the focus of much research on longevity.

My point here, however, is to put ageing in the context of development—as part of the disambiguation of the individual that begins in the womb and culminates in death. Experience and the process of psychological development make you you rather than someone else you might have been. The choices we make are part of this development and are largely irreversible. We begin as quite plastic and general beings and end up somewhat rigid and quite specific beings. We crawl out onto a limb from which there is no return. Viewed this way, old age is not a disease or catastrophe but a developmental stage of life, part of the unfolding. If—through drugs or gene therapy—it becomes possible to have a thirty-year-old body again, what would it mean for my brain to be rejuvenated that way? If my brain’s youthful plasticity were restored, would I be the same person? Surely I would want to retain the memories since age thirty, and the development of my personality and understanding that occurred since then. I am not convinced that a brain can function as though young, yet with the outlook and wisdom of the old. Good upbringing seems more promising toward that end than therapeutics late in life. And there is no way to undo the choices I have made along the way, which have defined who I am.

The survival instinct, built into every creature, is a necessary feature without which it could not have come to exist. Those creatures lacking such an instinct would not live long enough to reproduce and would be excluded through the filter of natural selection. In other words, the drive to keep on living is particularly appropriate to young organisms until they have replaced themselves with the next generation. It may be less biologically relevant later on. As conscious beings, of course, we have other reasons to be attached to living than the body’s natural ones. But to me it is helpful to keep in mind my biological context. It makes more sense for thirty-year-olds to “rage against the dying of the light” than for septuagenarians. The metaphor itself is suggestive. The light dies every twenty-four hours, then also resumes for a new dawn. I might wish the day were longer, like a young child who rebels against bedtime, but I’ve gotten to live my day and others will get to live theirs.

Everyone may have a different idea of the phases of their life. I recognize my childhood as probably the happiest epoch (and certainly most carefree). Adolescence turned darker, more serious and introverted. Young adulthood was about intellectual and emotional discovery, personal growth, and career. Age fifty seemed a continental divide, all downhill on the far side. But it began a productive period of consolidation and “giving back” more than personal expansion. By age seventy I had begun to think of wrapping up loose ends in a phase of completion. Currently, I think about “making peace” with myself. Whatever that turns out to mean, it seems like a necessary preparation for a graceful finale. Of course, I only need to make peace because I have been at war. In my case, inner conflicts have mostly not had dramatic outer consequences. But I feel them no less.

I think of this program, of coming to inner peace, as an aspect of living deliberately, as though my life were a work of art I am trying to finish before a literal deadline. Every artist knows that a medium has its own properties that resist control. One’s life is no exception. There is nothing I can do to change the past, for example, though I could lie to myself or change it in memory. But art is not only about imposing one’s will on materials according to a preconceived vision; it’s also a kind of experimental research. Artists welcome accidents or unexpected effects. The game then becomes seeing what can be done with them. So it is with the course of one’s life, in many ways a series of inadvertent events, felicitous or no. What can I do now with what I have become? And now?

Of course, artists sometimes get frustrated and want to tear up the work and start over. Alternatively, they may just want to put final touches on something with which they are basically satisfied. Or maybe they are not as satisfied as they would like but understand the risk of over-working the piece, ironically spoiling it. Artworks are a literal canvas on which artists paint their inner struggles. Perfecting a work is a matter of coming to an inner resolution about it. And that is more than a matter of the objective product or of what someone else thinks of it, as important as those may be. I am now one of those grave men, nearing death, whom Dylan Thomas admonishes to rage. But frankly I do not see the benefit now of such impetuosity. It may provide momentary release, helpful in mid-life. What I seek instead is resolution before the final release.

How to resolve conflict by changing your mind

We all find ourselves in conflict from time to time. Conflict with another person usually entails some inner conflict as well. We feel unsettled, upset, perhaps angry or depressed by a situation and our involvement in it. We wish it weren’t happening and that we felt some other way. We want to resolve the outer situation and with it the inner one; but sometimes professional resources are not readily available and sometimes the negotiations leave us still feeling disturbed or discontent. Apart from the outer situation, what can you do by yourself and for yourself to return to inner equilibrium?

First, let us assume that it is indeed your desire and your focus to resolve the issue within yourself. That already sets the right tone, and it is entirely different from the desire to be justified or to get your way. The goal of getting one’s way is a frequent stumbling block in outer conflict resolution. I do not mean that it is wrong; sometimes it is necessary and right to seek justice, to have a certain point of view prevail, even to be uncompromising. I simply mean that we must distinguish between the goal of outer vindication and the goal of regulating one’s inner environment. The latter is the option we will explore here. The distinction is simple, but it is not so easy to make. Let’s see why.

Back up a few million years. Your mind is naturally oriented toward the outside world. You could not exist if it were not so. Natural selection has made you a creature who pays close attention to the reality outside your skin, upon which your survival depends. It has equipped you with the ability to make snap judgments in order to react quickly, and with emotions that help you do this. Indeed, “gut feelings” directly express that ability, and so does the feeling of certainty—natural confidence in perception. Our default position is to believe our eyes and intuitions, to assume that our experience is a transparent window on the world. In philosophy, this position is known as naïve realism. For the most part, it serves us well in daily life. Yet, evolution has also provided other abilities to deal with those times when naïve realism does not serve us well. For, we are not only individuals competing with other individuals; we are also highly social creatures, who must get along in order for society to function. Hence, we have communication, reason, law, empathy, morality. Underlying them all is the ability to be conscious of our own role in producing the experience of the situation we are in. It is this consciousness of an inner realm, alongside the outer one, that opens our eyes beyond the naïve state. We then no longer take for granted that the world is literally as we see it. We no longer believe that events outside our skin make us see them a certain way or make us feel the way we do. We are now in a position to acknowledge our co-responsibility even in how the world appears to us. In particular, we know that any conflict takes two.

Even if you are resolved to take responsibility for your perceptions and feelings, there will be a constant and nagging temptation to revert to the default stance of simply believing your eyes. Again, I do not mean to say that is wrong. The point is rather to hold a frame of mind in which you can question appearances and feelings without dismissing them. I call this the stance of unknowing. It’s the willing suspension of belief. It enables you to view your experience as experience instead of as reality. You can stand back from your own awareness to view it as a personal creative effort, an art form, or an investigation. Your dissatisfaction with your personal experience during conflict becomes an artist’s dissatisfaction with a brushstroke, a color, a composition. It becomes about the esthetic quality of your own consciousness instead of about settling an account in the outer world. Or it becomes the curious scientist’s dissatisfaction with the current theory, the detective’s insistence on proper evidence, “just the facts, ma’am.”

That is one way to stand back. Here’s another. We are naturally concerned not only with the quality of our experience but also with the quality of our behavior. We want to be happy and also to be good. (Because we are social creatures, responsible to and for others, it is very difficult to feel happy unless we are able to feel that we are “good.”) That means to meet with the approval of society and of our own internal judge, which is usually closely aligned with the judgments of others, beginning with our parents. Much of the reason for blaming the other person, and insisting that we are right in a conflict, is to justify the belief that we are good. Much of our “self-talk” is concerned with replaying a scenario in ways that either put us in a good light or at least allow us to explore our doubts. It’s a kind of inner courtroom session, with a prosecutor and a defense—and ultimately a judge or jury. We hope to be exonerated or to get off easy, but if it’s a fair trial it may not go that way. We cannot judge others without judging ourselves. If we hope for mercy from our own inner judge, practicing on others makes perfect.

From the principle of co-responsibility, we can reasonably assume that we are not entirely innocent in any conflict. But just as there is the temptation to take experience naively at face value, there is the temptation to presume our own innocence and the guilt of the other. It takes some discipline to resist that temptation. Moral courage is not defending our innocence but embracing the prospect of guilt. To admit our complicity allows us to see more clearly the whole picture of what is going on and what to do about it.

This has the benefit that it can defuse the aggressiveness of one’s intent. But what if it is the other who is aggressive and we feel ourselves to be merely on the defensive? In that case, we go back to square one and ask whether our perceptions about the other’s actions and intentions are correct. Because all experience involves both self and world, subject and object, we can never be sure what in our experience derives from the other person and what from within ourselves. Human beings are always in this unenviable position of having to interpret events that are inherently ambiguous. Rather than assuming that the other person is attacking us, we can strategically give them the benefit of the doubt; we can take the stance of not presuming to know the meaning of their actions. I’m not advocating ignorance, naiveté, or even trust, but only a temporary suspension of belief, for the sake of inquiring more deeply. If it turns out the other person is unwavering in their intent to do us harm, we can still defend ourselves appropriately. As often as not, however, it turns out to be a misunderstanding.

Human beings are intuitively tuned to one another. They know how to rub salt in each other’s old wounds. People appearing to attack us may well have found our vulnerable places, even unconsciously. If we feel attacked, it may well be because someone has discovered our Achilles heel—the place where we are apt to feel guilty, blameworthy, or vulnerable—where we judge and attack ourselves. We have the choice then either to blame them for wanting to hurt us (which is secretly a defense against self-attack) or to explore the useful information thus gained about ourselves. The challenge is to confront that undesirable trait within oneself rather than confronting the other person who is pointing it out. If you cease to attack yourself for this trait, then possibly the other person will also stop attacking you for it. (If not, the attacks will be more like water off a duck’s back.) This is not a matter of denying the trait, nor of feeling guilty about it. It is not about deciding who is in the wrong. Rather, one must admit it frankly to oneself in order to negotiate the conditions for self-acceptance. That may require some reparation to the world, some admission to the other, some change in one’s behavior. Yet, the question is no longer how I want the other person to behave toward me, but how I would like to see myself behaving.

To sum up, if you want to resolve conflict ask yourself two questions. First: How would I like to feel in this situation? And second: How must I behave in this situation to be the kind of person I want to be? The first requires a shift of focus onto your own experience per se, to view it as an inner tableau rather than as a sequence of contentious events in the drama of the outer conflict. The shift is from getting your way to having the quality of experience you want. The second question gives priority to your self-image rather than your image of the other person. You must ask yourself what sort of person you wish to be, with what priorities and values, instead of who you want them to be. You may find that your anger, fear, sadness, desire, or judgments in regard to the situation or toward others simply do not correspond to your present goals and values. Some things may no longer really matter to you, or matter less. Sometimes we are simply caught in old habits and need to update ourselves. As human creatures, we are mercifully blessed with the ability to change our minds.

 

Why I Don’t Want to Live “Forever”

Is it feasible to defeat natural ageing and mortality through technology? The answer would depend on a clear understanding of why senescence occurs naturally and the role of mortality in the larger evolutionary scheme. That is a very interesting and complex topic I hope to take up in future blog posts. A different question is whether it is desirable to cheat ageing and death. And that question can be further subdivided. One might wonder, for example, about the negative social and ecological consequences of accumulating immortal bodies. But the usual rationale for life extension is to prolong personal consciousness. Here I will talk about why the prospect of living indefinitely has little appeal for me personally.

For some people the end of life and consciousness is terrifying. Others simply assume that consciousness doesn’t end with death because it doesn’t depend on the life of the body. Or, they may believe it can be artificially distilled from the brain and kept going somehow after the brain dies. For these people, their consciousness is intrinsically rewarding, an end in itself. I agree that consciousness is a marvel of the universe to be cherished, integral with the mystery of existence. I am pleased to think of it as a phenomenon that will continue to exist in the brains of other people after I am long gone. But such appreciation is distinct from any personal wish for my individual consciousness to persist indefinitely. As with fine art and beautiful women, one can appreciate consciousness without feeling driven to possess it.

The traditional notions of immortality make little sense to me, for they have nothing to do with the life of the body. On the contrary, I believe that all perception, feeling, and thought are functions of the body and have no meaning without it. What could pain or pleasure be to a bodiless spirit? What would a disembodied spirit do, and why would it do anything? If such a consciousness were possible, it could be nothing like the experience of living human beings. No, I prefer to think that consciousness is a biological function like breathing, which ceases with the death of the body.

Perhaps technology will one day enable us to indefinitely renew the body (which includes the brain, of course). After all, some body parts can already be replaced, as in organ transplants and hip replacements. The brain too can deteriorate and malfunction, and perhaps it will be possible to replace parts of it or augment it with artificial add-ons. But would this renovated or enhanced person still be me? If not, why should it be preferred to a brand new person who starts life afresh? What is so special about this “me”? Would it be worthy of preservation because of accumulated knowledge, experience, and wisdom from which society could benefit? (Who listens presently to the wisdom of elders, if in their dotage they are wise at all?) Or would it seek to carry on merely out of habit or instinct?

Perhaps technology will offer some way to preserve consciousness in a disembodied state. I doubt it, however, because neither the body nor consciousness can be captured in a computer program. Hence, I doubt that it will ever be feasible to upload a mind or personality (one’s consciousness) to live a perpetual virtual life in cyberspace (i.e., in a computer). But even if it were possible, would that existence be heaven or hell? Since I hold my consciousness to be a sort of natural virtual reality that serves my real body, I cannot see the point of an artificial consciousness that serves nothing real.

The question of personal continuity obviously hinges on what it means to be a conscious self. In my view, my consciousness is no thing that can live apart from my body, but a bodily function like digestion or breathing. It serves cognition and control by interfacing with the world. “I” am that function of my body, and “you” are that function of yours! There is no “me” apart from this body and no basis to believe in life for it after death; but neither is there a reason to fear either personal disappearance or punishment after death. There is no soul in a metaphysical sense, with past or future lives, only DNA that carries forth into future generations. Since one’s consciousness and sense of self are no more than bodily functions, selfhood grants no special entitlement to exist forever or apart from the body. For its own reasons, society might value some bodies or some minds more than others. In nature’s scheme, however, a body is temporary and so is the consciousness that one calls oneself.

Fear of death has a rational basis: it serves to protect the body. It makes biological sense to be attached to the survival and well-being of one’s physical body for its duration. I doubt that it makes biological sense for human bodies to live indefinitely. Nor would liberation from a built-in shelf life confer psychological relief from anxiety over mortality. A longer lifespan could still meet with an accidental end, and there would simply be more years at stake to worry over. In any case, it is usually personal consciousness with which one identifies and that one hopes to prolong. In the view presented here, it makes no sense to be attached to one’s consciousness as though it were separate from the life of the body or could continue without it. The quest for immortality is a case of mistaken identity.

So much for rational arguments. The desire—or not—to extend one’s life is a more personal matter than reason can capture. At my stage of life, I find myself increasingly weary in mind as well as in body, and weary of the world. I look forward to a good long rest at day’s end. Perhaps if my body could be renewed I would feel differently and more optimistic about my mental prospects in a life that could still unfold. As it is, I feel it is time to begin wrapping things up. A life at its start is open-ended. Even apart from physical ageing and accruing cell damage, the natural course of a life is a narrowing process, weeding out possibilities in order to focus, to become both physically and psychologically the particular person that one is, to play a specific role. In my case, that person has developed particular skills that are useful in some ways more than others. I am no genius. If my brain could be enhanced to genius level, even with a body reset to perfect health I would actually be someone else! I have no objection to that happening, but no attachment to it either!

Perhaps I just don’t value the person I have become enough to wish for him to carry on indefinitely. Perhaps there is no clear line between a graceful resignation and a death wish. I admit I could have made better choices, wasted less time, accomplished more. Though I generously accept the doddering fool that I am, why would I want him to continue forever to take up valuable space? Merely for the chance to redeem myself at the eleventh hour? Do I honestly think I could achieve in the next fifty years what I failed to in the last? Would it not be better to let the next generation have their go at it?

While I ask myself such questions, wouldn’t I be curious to see how things turn out in fifty years, or in five hundred? Perhaps technology will make it possible for my shelved consciousness to reawaken and “drop in” from time to time to see how things are going. After all, there is already cryonics and the sci-fi dream of suspended animation during voyages to the stars. But wouldn’t that be a selfish luxury, a sort of voyeurism? It’s not that I want to participate in life fifty years from now—only to sneak a post-view, from which I could retreat back into eternal slumber if I didn’t like what I saw. Such a fantasy implies consciousness without a cumbersome body to oblige participation.

Even if my achievements had been grander, my mind more adept, my energy and enthusiasm keener, my compassion, curiosity and self-love greater, I would not lack for respectable company in thinking that I am part of the natural world where life must come to an end. At age 65, David Hume passed away cheerfully without a belief in an afterlife, which he considered no more plausible than that a lump of coal on the fire would not burn. At age 71, Socrates declined the offer to escape death. Dying in the hospital at age 76, Albert Einstein refused a surgery that might have extended his life another decade. “It is tasteless to prolong life artificially,” he said. “I have done my share; it is time to go.” Though I lack the laurels of such company, I too accept that there is a time to go. I don’t wish to keep this body alive beyond its natural time, whether for its own sake or for the sake of an illusory self.

 

Infodemia

I like to think that the quarantine encouraged a period of sober reflection on issues ordinarily upstaged by the usual demands of daily living—reflection that now demands action. The pandemic gave society literal pause, in which to re-evaluate priorities. It signalled an opportunity to reset expectations, to raise the bar. It created a situation in which we were obliged and willing to listen to our leaders; in exchange we now insist that they listen to us. Yet I can’t help wondering: how is it that for weeks on end the news and social media were obsessed with nothing but Covid-19, then suddenly were obsessed with nothing but police brutality and racial discrimination? Does the mind really have a single track, able to focus only on one theme at a time? Given that reality is multifaceted, what are the implications of this single-track attention?

Of course, during the shut-down there was a genuine reduction of the activity that could lead to notable events competing with health-related news. And unquestionably, police brutality and racism are such serious chronic issues that people passionately want to bring them to a final reckoning, now that they feel freer to emerge from isolation. Many of the people taking to the streets have actually been victims of racism or police brutality or know someone personally who has been. They know what they are talking about. Yet, I remain concerned about the bandwagon effect in which the media seem to have a one-track mind. However appropriate and urgent, the fixation on pandemic-related matters (and now on race-related matters) also raises the question of what is newsworthy, and what is the proper role of newscasting. Far from being a comprehensive and in-depth account of world events, the principal role of the news seems to be to spread “memes,” which are the mental equivalent of viruses. The expression “go viral” is no coincidence.

The term infodemic has been used to describe the widespread misinformation and fake news surrounding the coronavirus pandemic. More broadly, it characterizes this era of social media. If the news and social media are vectors of that disorder, what are its background causes, and what might be the cure? What makes society (people) susceptible to rumor, hoax, and fad—a condition that could be called infodemia? Basically, it’s the challenge of sifting the true or worthy from the confusing chaff of information with which we are daily bombarded. Infodemia is the downside of modern connectivity.

Part of the problem is the proliferation of new information channels, where anything goes. Social media and self-publishing lack the editorial functions built into traditional modes. On the other hand, part of the problem lies with the old channels—the news media and the traditional publishing industry. These have morphed into commercial giants whose interests, motives, methods, and dominating point of view are suspect. Consolidation renders them top-down and hyper-vetted. Concentration of ownership means power to exclude all but a few voices. Ironically, it is technologically easier than ever to print a book or article, but harder than ever for a new or marginal author to emerge. Monopoly makes a monoculture. Alternative channels have emerged to compensate, and they are rapidly displacing the conventional ones. But, if the latter are too selective, the former offer no discrimination at all. We used to rely on editors to provide us with quality literature. Now they simply defer to our taste (or lack of it) to determine what the market will bear. We used to rely on journalists and newscasters to vet the information with which we are daily glutted. Now they seem all too eager to jump on the bandwagon of what they think the public wants to hear about. This is the kind of feedback loop that becomes a vicious circle.

But that is hardly news. In fact, it resembles the crisis of faith introduced by the printing press. For centuries, religious ideas had been vetted by a priestly hierarchy, who had privileged access to the written word. The “one true church” carefully guarded its advantage by regimenting one true doctrine. Then suddenly printing gave wide and easy access to scripture and diverse commentary on it, as well as to the pagan writings of the ancients, bypassing Church control of knowledge. The Reformation was not only a religious revolution and a protest against the corruption of the Church. It was also a media revolution. The more orthodoxy was challenged, the more literature attacking or defending it proliferated. The medieval worldview was shattered; but from the debris emerged the synthesis we know as the modern worldview.

The other side of burgeoning information was the burden placed on the individual to sort through it. Your beliefs could determine your fate, both in this life and the next. While the Church had long told its flock what to believe (its own version of fake news), suddenly people now had the means and the need to decide for themselves. But, the burden of choice is anxiety. The stakes were high, since you could go to hell or be burnt alive! We are living through a similar media revolution, with its burden of choice and attendant anxieties. We have similarly cut loose from orthodoxies and authorities in which we formerly had faith, released to fend for ourselves in a wilderness of information and conflicting opinions. The challenge is once more to know what to believe—where to stand or not to stand. The potential is once again for a new vision to emerge from this chaos. Eternal hellfire may no longer seem a threat to most people. But the stakes remain high, for hell on earth remains a real possibility.

Perhaps the only cure for infodemia is common sense. If there is truly such a thing, it surely has a dimension of empathy and fellow-feeling, as well as of keeping an even keel. Unfortunately, common sense did not prevail in the Reformation, which was plagued by extremism and civil wars. Despite the ideology of progress, it did not fare well in the 20th century. It did (barely) prevail during the Cold War, to the extent that no nuclear attack was actually launched—despite a few close calls, inquisitorial repression on both sides, and the inflammatory rhetoric of the period. The world seems now on the brink of a new unity, occasioned perhaps by the pandemic as a common cause. It seems to be moving forward to redress the deep roots of disunity. These include racism (and more generally Otherness) and police brutality (and more generally the unequal distribution of wealth). Hopefully common sense will prevail to guide us, wiser now, into a more equitable future.