Why nature should be accorded the rights of personhood

Let us first understand what a person is and why nature is not regarded as a person in our culture. But, even before that, let us acknowledge that nature once was so regarded. Long before industrial society became obsessed with the metaphor of mechanism, ancient peoples conceived the world as a sort of living organism. It was peopled with creatures, plants, spirits, and other entities who were treated as personalities on a par with human beings. Then along came the scientific revolution.

Nature for us moderns is impersonal by definition. For us, physical things are inert and incapable of responding to us in the way that fellow human beings can. Our science defines nature as composed only of physical entities, not of moral agents with responsibilities and obligations. Humans have formed an exclusive club, which admits only members of our kind. Members are accorded all sorts of extravagant courtesies, such as human rights, equality before the law, the obligation to relieve (human) suffering and save life at any cost, the sentiment that (human) life is infinitely precious. We have appropriated half the earth as our clubhouse and grounds, expelling other inhabitants toward whom we feel no need to extend these courtesies. Having eliminated most of the formidable competitors who posed a threat and demanded respect, we treat the remaining creatures as no more than raw materials, to use and enslave as we please, or to consume as flesh.

We have done all this because we have been able to, with little thought for the rightness of such an attitude. In the Western hemisphere, our society was founded on domination of the peoples who occupied the land before the European invasion (the “discovery” of the New World). Significantly, it went without saying that this included the assumed right to dominate the other species as well. In other words, plundering these lands for their resources, and disregarding their native human inhabitants, went hand in hand to express an attitude that depersonalized both. The Conquest, as it has been called, was simultaneously about conquering people and conquering nature.

Probably this attitude started long before, perhaps with the domestication of wild animals and plants. The relationship to prey in the traditional hunt was fierce but respectful, a contest between worthy opponents, especially when large creatures were involved. (Can it be coincidence that losers of this contest are called “game”?) Often an apology of gratitude was made to the murdered creature, no part of whose valued carcass was wasted. When animals were enslaved and bred for human purposes, the tenor of the relationship changed from respect to control. The war on animals concluded in a regime of permanent occupation. A similar shift occurred when plants were no longer gathered but cultivated as crops. Both plants and animals were once personified and treated as the peers of humans. That relationship of awe gave way to a relationship of managing a resource. The growing numbers of the human invaders, and their lack of self-restraint, led eventually to the present breakdown of sustainable management.

Today we embrace a universal biological definition of human being, and a notion of universal human rights and personhood. Throughout history, however, this was hardly the inclusive category that it now is. Groups other than one’s own could be considered non-human, reduced to the status of chattel, or even literally hunted as prey. In other words, the distinction between person and thing remained ambiguous. Depersonalization is still a tactic used to justify mistreating people that some group considers undesirable, inferior, or threatening. Personhood is still a function of membership in the exclusive human Club. But since the qualifications for that membership were unclear throughout most of human history, perhaps it works both ways. Instead of excluding animals, plants, and nature at large, we could give them the benefit of the doubt and welcome them into our fellowship.

Now, what exactly is a person? In fact, there is no universally accepted legal or even philosophical definition. The word derives from the Latin term for the theatre mask (persona), which indicated the stage character as distinguished from the mere actor wearing it. By one traditional etymology it means literally the sound coming through the mask, implying a transcendent origin independent of the physical speaker. Personhood so understood is intimately tied to language and hence limited to human beings as the sole known users of “fully grammatical language.” Other creatures do obviously communicate, but not in a way that entitles them to membership in the Club.

Various features of personhood—loosely related to language use—include reason, consciousness, moral and legal responsibility, the ability to enter contractual relationships, etc. Personhood confers numerous legal rights in human society. These include owning property, making and enforcing contracts, the right to self-govern and to hire other agents to act on one’s behalf. It guarantees civil rights to life and liberty within human society. One could argue, on behalf of non-human creatures, that they meet some of these criteria for personhood. In many cases, like aboriginal peoples, they were here before us, occupying the lands and waterways. Western society is now judicially and morally preoccupied with the historical treaties and land claims of aboriginal groups—and more generally with redressing social wrongs committed by former generations. Certainly, our exploitive relation to nature at large is among these wrongs, as is our checkered relation to particular species such as whales, and to ecological entities such as forests, rivers, oceans and atmosphere. Thinking on the subject tends to remain human-centric: we are motivated primarily by threats to our survival arising from our own abusive practices. But such a self-serving attitude remains abusive, and morally ought to give way to a consideration of victims that are not yet extinct, including the planet as a whole. If restorative justice applies to human tribes, why not to animal tribes? Why not to nature at large?

Animals reason in their own ways. Although insects may be little more than automatons, vertebrates are clearly sentient; many display concern for their own kin or kind and sometimes for members of other species. While they do not make written contracts, at one time neither did people! Animals (and even plants) maintain measured relationships, not only among their own ranks but also with other species, in an intricate system of mutual symbiosis and benefit. These interdependencies could be regarded as informal contracts, much like those which human beings engage in quite apart from written contract and law. Deer, for example, tend to browse and not over-forage their food supply. Even viruses are reluctant to kill their hosts. It could be argued that human beings, in the course of substituting formal legalities, have lost this common sense and all restraint, and thereby forfeited their claims.

Of course, we now say that natural ecological balancing acts are not “intentional,” in the conscious human sense, but the result of “instinct” or “natural selection,” and ultimately of inanimate “causal processes.” This is a very narrow and human-centric understanding of intentionality. Fundamentally, it is circular reasoning: intending is what people do, in the human world, while causality is held to be the rule in nature. However, both intention and cause (and even the concept of ‘nature’) are human constructs. Causality is no more an inherent fact of nature than intention is absent from it. It is we humans who have imposed this duality, thereby seceding from the animal kingdom and from nature at large. Humans have formalized their observations of natural patterns as laws of nature, and formalized their own social conduct and conventions as laws of jurisprudence. Other creatures simply do what works for them within the system of nature without articulating it. Are we superior simply because we gossip among ourselves about what other creatures are doing?

Are we superior because might makes so-called right? We chase other creatures (and fellow human beings) from a habitat and claim to own it. But private property, like money, is a legalistic convention within the Club that has no clear connection with any right of ownership outside of human society. “Ownership” in this broader sense is more deserved by those whose proven behavior adheres to the unwritten laws that keep an ecosystem in balance with their fellow creatures. It is least deserved by those who destroy that balance, while ironically being permitted to do so through written laws. If stewardship is a justification for aboriginal land claims, why not for any right of ownership of the planet? Almost any species has a better claim to stewardship than humans. After all, we have rather botched the unceded territory that the Club now occupies.

Nature is self-regulating by nature; it doesn’t need to be granted the right to self-govern by human beings. It cannot speak for itself in human language, but expresses its needs directly in its own condition. Human scientists appoint themselves to interpret those expressions, largely for human purposes. The State appoints a public defender to represent people who cannot afford counsel or who are deemed unable to defend themselves. So, why not public defenders of the planet? Their job would be to ensure, in the eyes of human beings, nature’s right to life, liberty, and well-being. That could only benefit us too, since it has never been more than a delusion that we can live segregated from the rest of nature.

 

An Objective Science of Economics?

Physicists and even biologists can pretend to an objective understanding of the phenomena they study, to the degree that they can stand outside the systems concerned. Is that possible for economists? While the economist is perhaps more akin to the anthropologist, even the latter is traditionally an outsider to the cultural systems studied. The ubiquity of modern global capitalism poses for the economist now the same dilemma that the anthropologist will face when all indigenous cultures have been absorbed into modernity.

This has not stopped generations of economists from holding forth on the “true” nature of such concepts as value, wealth, capital, and growth. Biologists, and even anthropologists, have been able to analyze living systems in their own human terms—as distinct from terms proper to the living systems themselves. That is, scientists dwell in a world apart from the world they observe. No such separation is possible for economists, who inevitably think in the terms characteristic of their society, which is increasingly a monoculture alienated from nature.

Economic concepts reflect values current in modern capitalist society. Despite rival and contradictory theories, on some level this renders all economic theory fundamentally self-serving, even self-confirming. At worst it is misleading and deceptive, often embraced by governments to legitimize their policies. For example, the “trickle down” notion endorsed by Reagan and Thatcher was used to justify an unregulated market, on the promise that profit going to the elite would ultimately benefit society as a whole, if not equally. This, when it was already widely apparent that inequality of wealth was expanding far more rapidly than wealth itself! The tacit assumption is that no one should object to the accelerating concentration of wealth, provided their own situation is moderately improving or at least not getting worse. By hiding behind averages, however, economic analysis can obscure what people on the ground are actually experiencing, which in any case is subjective and includes things like envy and perceived unfairness. Concepts like ‘national income’, ‘GNP’, ‘total capital’ (private or public) are statistical aggregates that usefully reveal inequalities between countries or regions, and over time. Yet, they mask huge inequalities among individuals within a given country.
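How averages can hide what individuals experience is easy to show with a toy comparison (the figures are made up, purely for illustration):

```python
# Two imaginary countries with identical average income.
import statistics

country_a = [50, 50, 50, 50, 50]    # equal shares
country_b = [10, 10, 10, 10, 210]   # one person holds most of the income

for name, incomes in [("A", country_a), ("B", country_b)]:
    print(name, "mean:", statistics.mean(incomes),
          "median:", statistics.median(incomes))
# Both means are 50, yet in B the typical (median) person has only 10.
```

Per-capita figures report the mean; the lived reality of most people is closer to the median.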

Life in consumer society has in many ways been visibly democratized. We are theoretically now equal before the law, each with a single vote. The modern upper class blends in with the rest of society more easily than traditional aristocrats, who flaunted their class status. The modern gated community is far less conspicuous than the castle on the hill. The Jaguar or Tesla is not as flagrant a status symbol as the gilded horse-drawn coach; it must travel on the same public highways as the rest of us and obey the same traffic laws. First class air travel is only moderately less uncomfortable than “economy” class; most of us have not seen the inside of a private jet and probably not the outside either. The extreme wealth of the very rich consists less of evident status symbols than invisible stock portfolios and bank accounts. Yet, while the modern rich may be unobtrusive, they are increasingly powerful. And the poor are increasingly numerous, obvious, and disenfranchised.

Most of economics since Marx has been little more than an apology for systemic inequality. One reason for this may be that “the economy” is as complex as human beings themselves. The stock market and modern investment “instruments” are accordingly abstruse, with those in the know privileged to gain from them. It is little wonder that the theories devised by economists (mostly academics) have been so esoteric. But this makes it all the harder, even for them, to call a spade a spade or to declare the emperor nude. Yet, millions of people possess enough common sense to see that the rich are getting richer and the poor even poorer for reasons that can hardly be historical accident. The gains of the middle class in the mid-twentieth century are quickly eroding. Their very assets (such as mutual or pension funds) are being used against them to line the pockets of corporate executives. The modern capitalist system has been honed by those who control it to siphon wealth from the now shriveling middle class. This can hardly be surprising, since—to put it bluntly—the purpose of capital has always been to create inequality! What is surprising is how efficient it has become in the last few decades at undoing the hard-won gains of the middle class.

Modern economists may or may not be capitalist dupes, and the top-heavy capitalist system itself may or may not be doomed to collapse. But is there another role possible for a science of economics? The traditional focus of economic theory has been anthropocentric by definition. It is about the production and distribution of wealth in very human terms, and in many cases from the perspective of the ruling class. In particular, it does not consider business and growth from a planetary point of view, in terms of their effects on nature. Such a perspective would be nearly a negative image of the human one: all that is considered an asset in human terms is largely a deficit in the natural economy. This deficit has become so large that it can no longer be ignored by the human economy; it threatens to break the bank of nature. If an objective economics is possible at all, it would at the least have to take the planet as the true stage upon which the economic drama plays out, redefining all economic concepts accordingly. By this I do not mean carbon credits or assigning a market value to natural resources or to services provided by nature, such as recycling air and water. I mean rather a complete shift from the human perspective to that of nature. From that perspective, what would have value would be services provided to nature by human beings, rather than the other way around.

No doubt, from the natural point of view, the greatest service we could perform would be to simply disappear—gracefully, of course, without wreaking further havoc on the planet in the process! I don’t think people are going to agree to this, but they may ultimately be forced to concede at least some rights to the rest of the biosphere as a co-participant in the game. A more objective economics would at least give parity to human and natural interests and would focus on the interaction between them. Concepts like “value” and “growth” would shift accordingly. In human economics, value has been assigned a variety of meanings, reflecting the daunting convolutions of the human world in which economists are perforce enmeshed: labor-value, use-value, exchange-value, surplus value, market value, etc. Yet, an economics that focuses equally on the needs of nature could simplify matters by moving the human player from center stage. It could become an actual science at last, insofar as it would study the world independent of human values, like other sciences are supposed to do.

The various anthropocentric concepts of value could fold into a single parameter representing an optimum that maximizes both human and natural interests. This approach is used in game theory, where contending human players vie strategically, and the optimum looks like a compromise among opponents trying to maximize their individual “utility” and minimize their losses. Why not consider nature a player in the game? And why not consider strategies based on cooperation rather than competition? Suppose the aim is to arrive at the best possible world for all concerned, including other species and the planet as a whole. Value then would be measured by how much something advances toward that common goal, rather than how much it advances individual human players at the expense of others and nature. Growth would be the increase of overall well-being.
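What that could mean is easier to see in a toy model. The sketch below is illustrative only: a single production level, a made-up human utility curve, and a made-up damage curve standing in for nature’s interest. None of the numbers are empirical; the point is simply that the optimum shifts once nature’s utility enters the objective.

```python
# A toy one-dimensional "game" with nature as a player.
# All curves and numbers are illustrative assumptions.

def human_utility(x):
    return 10 * x - x ** 2            # rises, peaks, then declines

def nature_utility(x):
    return -1.5 * x ** 2              # damage grows with production

xs = [i / 100 for i in range(1001)]   # candidate production levels 0..10

selfish = max(xs, key=human_utility)  # humans optimize alone
joint = max(xs, key=lambda x: human_utility(x) + nature_utility(x))

print(f"production when humans optimize alone: {selfish:.2f}")  # 5.00
print(f"production at the joint optimum:       {joint:.2f}")    # 2.00
```

Even in so crude a model, the cooperative optimum (2.00) sits well below the level humans would choose for themselves alone (5.00): the “compromise” described above, with nature seated at the table.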

Money in such a world would be a measure of duties to perform rather than privileges to exercise. It is often said that money confers freedom to its possessor. In an objective economics, this freedom would be put in the context of how it affects other beings, human and non-human. The value of a “dollar” would be the net entitlement it represents: its power to benefit its holder minus its power to damage the environment, which includes other beings (written out schematically below). Put more positively, money would measure the power to contribute to the general welfare—not the power to consume but the power to do good. The individual would benefit only statistically and indirectly from spending it, as a member of an improved world. As in prehistory, the individual would again figure more as an element of a community than as a competitor with other individuals. Admittedly, this would be communism—perhaps on steroids. But communism as we’ve known it is passé, both as a failed social system and as a dirty word in the capitalist lexicon. There are many who wish to continue to believe in the opportunism of modern capitalism, bumbling toward ecological and social collapse. But they are obliged to find a new scare word to castigate the threat to their individual liberty, which is little more than freedom to hasten the destruction of the planet.
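Schematically, and only as a sketch of the idea (the functions are hypothetical, and assume such quantities could be measured at all), the net entitlement of a unit of money d might be written:

    V(d) = B(d) − E(d)

where B(d) is its power to benefit its holder and E(d) its power to damage the environment, other beings included. Money with positive V would do more good than harm in the spending.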

The Problem with Money

Capitalism is the practice of using wealth to collect interest or rent. That means that those who possess capital can have an income not from their labor but from their property. It’s important to grasp that “real” productivity comes from work that provides a service or transforms materials into something humanly useful—whether the exertion comes from muscles or from machines, and whether the materials are physical or intangible, such as concepts. Income from capital means more spending power, but not necessarily more productivity. In other words, your spending power can grow on its own, but only your labor can make real value. While capital can increase your share of total wealth, it cannot by itself increase the supply of goods in the world that have real value.

Some economists consider it a problem when global capital is not productive enough. Being super-rich already, its owners may not be motivated to use it as effectively as someone with less means. However, productive of what? Effective toward what end? In the face of ecological disaster, production that results in pollution, exhaustion of non-renewable resources, and global warming is not objectively desirable. Consumption that requires such production should be discouraged. Money sitting idle is at least money not being used to destroy the planet!

An individual is both producer and consumer, according to how their personal wealth and energy are disposed. The extremely rich have excessive spending power as consumers; yet, only so much personal consumption can be directly satisfying. (A person can only sleep in one bed at a time, drive one car, fly one plane, eat one meal, etc.) Spending power represents choice for the wealthy more than direct consumption. Of course, they may cause more environmental fallout in their quest to establish that range of choice (several houses around the world, several cars, private jets and boats, etc.) Yet, it could be telling to imagine the cumulative effect of the same total spending power if it were distributed over many poorer individuals, who might be more motivated to use their comparatively meager resources as capital to further increase their wealth. Would the redistribution result in more or less global warming? In other words, might the capital of a large middle class have a worse ecological effect than the same capital in the hands of a small super-rich elite? If so, would it not be ecologically better to maintain the unequal distribution? That is a big “if,” and I am not justifying inequality, which many perceive as the world’s foremost social problem. Rather, I want to point out that merely redistributing wealth in the name of fairness would not solve our global ecological challenges. Productivity and growth must be altogether redefined, in such a way as to eliminate production detrimental to humanity’s long-term interests.

Laissez-faire means one can buy what one wants if it is available and produce what one wants in the hopes someone will buy it. But nature is no longer going to let us do just what we want. Government could intervene to limit production to certain “goods” that are indeed good (or less bad) for our human future. It could also limit what can be purchased, by specifying what credits can be used for. While libertarians would object to such impositions, they are free to embrace and advocate voluntary restraint. One could argue, under the circumstances, that one is morally obliged to do so, since the other side of freedom is responsibility. In effect, we turn to government to make us do what we know is necessary, but which we resist—in part because we fear others will not do their fair share. The job of government is to impose fairness along with order, like a parent who must deal with squabbling children.

The whole advantage of money is that it can be used for any purpose that others accept, regardless of who it belongs to or where or in what form it is stored. But, what if the dollar were not this anonymous and abstract unit of power, but could be used only for specified purposes—for example, for basic necessities or for “green” projects? This is entirely feasible in the digital economy. A unit of wealth could be spendable only on certain products or services, or used to finance only certain projects or activities. Each dollar (or euro or yen) would have its own earmarked identity, could only be used for its designated purpose, and would be a non-transferable credit. No doubt a black market would arise to get around such restrictions. But that loophole effectively already exists, in the form of untaxed offshore accounts and other unfair advantages for the ultra-rich.
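To make the mechanism concrete, here is a minimal sketch of such an earmarked credit. The code is hypothetical (no real payment system is implied); the class, fields, and categories are invented for illustration.

```python
# A hypothetical earmarked, non-transferable unit of wealth.
from dataclasses import dataclass

@dataclass(frozen=True)
class EarmarkedCredit:
    holder: str     # the only party entitled to spend it
    purpose: str    # e.g. "basic necessities", "green projects"
    amount: float

    def spend(self, spender: str, category: str) -> float:
        if spender != self.holder:
            raise PermissionError("credit is non-transferable")
        if category != self.purpose:
            raise ValueError(f"credit is earmarked for {self.purpose!r}")
        return self.amount

credit = EarmarkedCredit(holder="alice", purpose="green projects", amount=100.0)
credit.spend("alice", "green projects")  # allowed
# credit.spend("bob", "green projects")  # PermissionError: non-transferable
# credit.spend("alice", "luxury goods")  # ValueError: wrong purpose
```

In a digital currency the same constraints would be enforced by the ledger rather than by a class definition, but the principle is the one described above: identity and purpose travel with the unit of value.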

If total global wealth were distributed evenly, an individual typically might not have enough savings to finance an enterprise beyond a certain scale. And where wealth is distributed unevenly (as it almost always is), the vast majority will not have the needed resources—which is why they borrow from those who do. This is a fundamental unresolved problem of social organization. At every level of community, people have always needed to pool resources to get some things done. When a few control the lion’s share of resources, however, the temptation is to charge the community for their use. From the community’s point of view, this is a form of extortion. “Usury” was once forbidden by religions for good reason. We have gotten used to it, so that it seems normal and necessary, even if unfair. The result is a mounting burden of private and public debt, with crippling interest payments and skyrocketing rents and real estate prices.

If 70% of global wealth belongs to 10% of global population, then those wealthy individuals are 70% responsible for how the world’s resources are used. The fate of the planet lies proportionately in their hands. Quite apart from the question of redistribution, fairness dictates that the greatest pressure to reform should fall upon them through taxation and legislation. But rather than being an incentive to use capital more “efficiently,” these measures should be used to foster production that does not damage the biosphere. The rich could take the lead in setting an example concerning the sort of investment to make. Rather than simply reducing inequality of wealth, the focus should also be to reduce its environmental effects.

To reduce the environmental impact of production, manufactured products must be designed to last for generations. They must satisfy real needs. Public and private property must be thought of as a patrimony to be passed on generation to generation. Nothing must be produced for the sake of making money—that is, merely to establish or justify one’s slice of the economic pie. On the contrary, a basic living should be everyone’s right! An economy is a game with many players, in which each person’s share is decided and justified. Yet, the game is not only about the individual’s winnings in a contest with others. For, every society is a collective enterprise, in which something is achieved beyond the holdings of the individual. The human enterprise as a whole is far more cooperative than competitive.

Granted that we need some material stuff, it should at least be durable and of high quality. However, since the beginning of the industrial revolution, production has only indirectly been aimed at satisfying real human need. The immediate goal has been to increase the investor’s share of the economic pie: make more widgets to get more money. It was never to make lasting stuff, or stuff so excellent that it invited no improved version. Quite the contrary, it didn’t take long for manufacturers to discover built-in obsolescence. With the development of plastics, durability and quality for the most part went out the window. The real value of “goods” (that is, their ability to satisfy genuine human need) was displaced by their symbolic or token value (that is, as a pretext to make money). The irony is that the quality of available goods was lowered for the rich along with everyone else. Spending power is thereby diluted for everyone in terms of its ability to purchase things and services of real value. As always, in compensation, the ultra-rich have sustained a cadre of artisans who can still produce the quality they alone can now afford. Otherwise, what would they have to show for their wealth?

It is largely the production of material goods that results in the carbon footprint. To some extent, the modern economy is already being de-materialized by information technologies. But Man cannot live by information alone. Material goods remain the gold standard of economic value, because we remain physical beings with material needs. In some sense, de-materializing the economy inflates the unit of value and slows “real” growth. And that growth must slow down for the sake of the planet. Yet low growth has a counterintuitive consequence in addition to the obvious belt-tightening for all: it increases inequality. It does so because the rate of return on capital (possessed disproportionately by the rich) has always been higher than the rate of growth (produced by everyone collectively), a gap whose compounding effect is sketched below. This has mattered less in exceptional times of high growth, because there is then a surplus to benefit everyone (so-called trickle-down) even though the rich benefit more. But a high growth rate is neither historically normal nor ecologically sustainable. We have to find a way of life that is both fair and minimal in its planetary impact.
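The arithmetic behind that claim is simple compounding. A toy illustration with assumed rates (the 5% and 1.5% figures are placeholders, not data):

```python
# When the return on capital r exceeds the growth rate g, wealth held
# as capital outpaces the income of the economy as a whole.
r, g = 0.05, 0.015          # assumed: 5% return on capital, 1.5% growth
capital, income = 100.0, 100.0

for year in range(50):
    capital *= 1 + r        # capital compounds at r
    income *= 1 + g         # total income grows at g

print(f"capital/income ratio after 50 years: {capital / income:.1f}")
# ~5.4: the capital share swells even though nothing was "taken"
```

Nothing in the loop redistributes anything; the divergence follows from the gap between r and g alone.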

 

DAN’S NEW BOOK: Holy Terror and the Beauty of It All

This is to announce the launch of my latest book, Holy Terror and the Beauty of It All: how to live with existential anxiety. The text can be downloaded at no cost from the Archive: Collected Writings. (Note: the first two pages are intentionally blank—scroll down for the title page and text.) Or, if you prefer to hold a book in hand, a physical copy can be ordered from this link: https://store.canambooks.com/holy-terror-and-the-beauty-of-it-all-how-to-live-with-existential-anxiety.html

From the back-cover blurb:

Human beings have always felt insecure in the world. Despite confident proclamations by science and religion, no one can be absolutely certain what is really going on here in this drama we call existence. In contrast to the eternity and boundlessness intimated in our consciousness, we are haunted by the realization that we are finite, vulnerable, mortal—and perhaps meaningless—creatures. The ambiguity of all experience leaves us in a state of fundamental uncertainty, with a buried anxiety underlain by the fear of mortality.

Holy Terror and the Beauty of It All is a reflection on the human condition. The first part of the book explores personal and cultural defenses against existential anxiety. The second part presents a theory of consciousness as a guided hallucination created by the brain. The third part proposes a stance of appreciation as an antidote to anxiety: the other side of “holy terror” is “the beauty of it all.” The ability to consider consciousness as a personal creation, rather than a window on the world, enables us to appreciate experience for its own sake, despite uncertainty. It is the basis of art, and also of tolerance, responsibility, and the creative ability to think, perceive, and act outside the box. Available at all times, this stance can provide special consolation in old age and in situations of deprivation or despair. The book concludes with a discussion of the complementary roles of belief and doubt, reaffirming the value of standing back to appreciate the grandeur of the whole. A Postscript follows up on the topical relevance of these ideas in the age of pandemics and fake news.

Uncertainty is nothing new

Much is made these days of the uncertain times we live in. In particular, we seem to be in the midst of an information crisis. Paradoxically, a glut of available words, images, and numbers floods daily life—from news, social media, websites, etc. At the same time, there is growing distrust of traditional sources of information. The two are clearly related. The internet provides instant and easy access to facts and opinions that formerly required a lot more trouble to encounter. Specialized experts used to be trusted to establish knowledge and to locate and vet sources for reliability on our behalf. Scientists and academics, medical professionals, librarians and editors, critics and censors, peer reviewers, journalists, newscasters, and professional writers of all sorts mediated information for us, advising us of what was worthy of consideration. The new glut of unmediated voices is a modern tower of Babel that confounds us and creates an atmosphere of suspicion. For, the sheer abundance of choice gives the deceptive impression that all claims might be of equal merit, that everything boils down to opinion. Even before the advent of social media, the religious right had popularized this notion with its platform that Creationism should be taught in public schools as a legitimate alternative to Darwinism. And then, perhaps, consumer advertising conditioned us to judge ideas less by their content than by their packaging.

The political climate has indeed dovetailed with religion, and overlapped with suspicion, at least in the United States. But there is a long history of the struggle between secular and religious ideas leading up to the present situation. There was a similar information crisis when the printing press changed European society forever, just as the digital revolution is now doing worldwide. The printed word was instrumental in the Reformation, prior to which a literate elite controlled the flow of information. The distribution of pamphlets and the translation of the Bible into the vernacular meant that literacy increased and lay people could study and evaluate the written basis of their ideology. And that meant that the prevailing order could be and was questioned. In particular, one could see for oneself that the Church did not follow the teachings of the scripture upon which it was supposedly founded. Expanding literacy and availability of the printed word led to a crisis of faith in authority and the keepers of knowledge. Opinions were multiplied, much as in the present crisis. The burden of interpreting the truth then fell upon the common citizen, as it now does once again. What to believe was dangerously up for grabs. The stakes were high in the Reformation period, because one’s eternal spiritual fate hinged on what one believed, and in many cases one’s physical safety did too. As now, this created deep divisions in society, with tensions that could (and did) erupt into violence.

It is no coincidence that the major religions came into being at a time when relatively isolated societies began to come into more frequent contact with each other. Ethical precepts emerged to govern the relation among individuals newly recognized as members of an expanded human tribe. That is, religion advised how to treat one’s “neighbor” in a broader sense, when traditional identity within a social (often racial) group was confused by the intrusion of outsiders. The forceful imposition of religion upon “pagan” or “infidel” groups in conquest could be viewed as a desperate measure to assimilate them into one’s own society. Unfortunately, the converted were typically never more than second-class citizens. Often they were literal slaves, in ironic contradiction to the ethical teachings of the religion concerned, which were applied only to those fully recognized as the group’s own kind. Such hypocrisy, when perceived, has frequently been a stimulus to reform. On the other hand, tribal identity has always shaped our behavior toward others, despite law and religious ethics.

In post-Civil War America, a large racially distinct population of slaves had suddenly to be assimilated into the citizenry. In post-colonial Europe, people with an ambiguous status as subjects of the former empire began to infiltrate the home society in a kind of reverse osmosis. Though the U.S. was scarcely a literal colonial power, economically it certainly has been. Many from its virtual empire have returned to roost within U.S. borders—especially from Mexico, much of whose early territory had been stolen by the U.S. in the first place. Most recently, an influx of political and economic refugees from Africa to Europe constitutes a global migration that will only be accentuated by climate change. In all these cases, ethics has largely failed to guide society with a universal understanding of how to behave toward one’s fellows, according to a more inclusive definition of fellowship (the lesson of the Good Samaritan).

The struggles of the Reformation against the corruption of the Church are mirrored in the struggles of popular movements against the corruption of ostensibly democratic government, which is really oligarchy. Such movements can be left- or right-leaning, with little recognition of their common ground. (The left-leaning movements tend to be secularist, while the right-leaning ones tend to be fundamentalist.) Humanists forget the origins of humanism in religious values; nominal Christians forget Christ’s teaching of love and humility. This is only to say that we have been here before. During the Reformation, as again now, society was deeply divided because the basis for a common understanding had collapsed. That is a dangerous situation, as history has shown, because without a common understanding there might be no common acceptance of authority; and without that, no rule of law.

However, it is not uncertainty itself that is the problem. Life has always been uncertain, for all creatures. (Imagine how the dinosaurs would have felt if they had understood astronomy and the real catastrophe bearing down on them from outer space!) The problem is our low tolerance for uncertainty, which translates as the need to believe. That is: the need to believe something—anything at all—even when it lacks a basis in fact. The need for certainty is the root of the conspiracy theories and dividedness that now beset America and many places in the world. For, the very notion of fact presumes a common ground of which we can all be assured, a common recognition of what qualifies as reality. The problem chases its own tail: without a common understanding, there can be no facts, and hence no agreement, and hence no certainty.

That situation is not new either. It was the situation on the eve of the scientific revolution, which displaced medieval scholasticism. Fact was the missing ingredient in scholastic argument, which drew mostly on hearsay and referred only to the writings and claims of other thinkers—never to actual observations of the natural world. The human world had enveloped itself in its own closed realms of circular reasoning. In those realms, imagination had free sway, unguided by reality, at the same time that it purported to be an account of reality. It was therefore a hotbed of contentious nitpicking. Science broke through this bubble of self-confirming opinion by insisting that nature had to be consulted. Nature lay outside the human world, as the stage upon which the human drama was played out. It is the literal common ground of all creatures, the common basis for fact. Unlike scholastic philosophers, scientists could come to agreement because nature was the ultimate authority, the arbiter of disputes.

The current rejection of the authority of experts includes suspicion of science—all the more when scientists appear to be “for hire” to support positions on issues outside science. All the more when the technology brought to us by experts seems also to be ruining the world! In throwing the baby out with the bathwater, however, we revert to the medieval ethos of contention without a grounding in the possibility of mutually recognized fact. In part, this may reflect the migration of humanity away from contact with nature to live in urban environments, which has recreated a new self-enclosed cultural bubble. The concept of the wild has lost all meaning for most of humanity. The human transformation of the planet is now so extensive that there hardly seems a nature left for science to study. Scientific concepts and theories have become so abstruse, and the experimental evidence on which they are based so tenuous and esoteric, that they seem to resemble scholastic arguments. In other words, they are perceived as little more than opinions.

Once again, everything seems up for grabs in a free-for-all of opinion. In some ways, we’ve come full circle since the days of the Renaissance. Though we seem to be sinking back into a morass of blind faith and superstition, perhaps this means that we are on the threshold of a new Reformation, and a new rebirth of thought. As nature was the common ground upon which science first arose, perhaps it will be again the basis for a more unified world, as its demands on us become more urgent, unmistakable, and indisputable. In the first Renaissance, the early scientists called upon nature to provide a common understanding for human benefit. Now nature is calling upon us to come together for nature’s benefit. One hopes for the best.

Tragedies of the Internet commons

A specter is haunting America. It is the culture of divisiveness and hate fostered by social media. The combination of ubiquitous personal devices with social media platforms has precipitated an epidemic: a new opiate of the people. The internet as a whole started out as a commons—the Information Highway—only to become a plundered landscape, polluted by advertising and trivialized by misuse. Beginning as a great boon to the common welfare, it has degenerated into a weapon for mischief makers, a mere marketplace, or a new entertainment.

Social media platforms may rightly be held accountable for the nefarious consequences to society of their “services,” which include the potential to shape public discourse and even derail elections. But the user, too, is accountable. While it is easy to blame technology and successful corporations for society’s woes, here I want to explore the internet user’s responsibility. After all, it is consumers who enable consumerism. The domination of the world by large corporations is only possible because people buy their products. Just so, social media are only successful because people misuse and overuse them—in particular, to relieve frustration and boredom. Commercial advertising has literally filled cyberspace. But, advertising is only profitable because it is presumed to work. Underlying the consumer mindset is a kind of passivity and emptiness, a lack of proactivity and self-possession. The first simple truth I propose is this: those who are not driven sensibly from within will be driven senselessly from without. Social media have stepped in to fill a vacuum of will.

Teens seem to be particularly vulnerable to exploitation by social media. That is because they are by definition not yet completely formed. They may not yet know fully who they are or what they value and intend in their lives and in the world. That is as it should be in a period of experimentation and self-discovery. Education is supposed to help you discover who you are and to provide the thinking skills to find, evaluate, and use information. It is very challenging for educators when students cannot focus because mobile phones dominate their attention. All the more when their attention span has shriveled because they had the misfortune to grow up in the digital society. Exploiting the vulnerability of the under-aged is child abuse. Exploiting the vulnerabilities of those who are old enough to know better may be legal but is unethical. It may have begun with good intentions; yet, it points back at the exploiters as sociopaths, who are either unable to properly foresee the consequences of their actions or simply don’t care. Good intention is not sufficient to guarantee a good result.

A second simple truth is this: Information is of little positive use to those with no clear intention of their own. At best it is entertainment, distraction, noise. At worst, it means giving oneself over to be manipulated. If you are not pursuing your own goals (which might include the deliberate goal of having no goal), chances are you are being used to further someone else’s. There is no excuse for that state of affairs in the adult of the species. But there is a name for it. Anomie is “a condition of instability resulting from a breakdown of standards and values or from a lack of purpose or ideals.” It is the vacuum inside, which makes a society prey to demagoguery and media manipulation.

While America has always been divided (it fought a bloody civil war), it has not always been so vacuous. Just as education has been disrupted by social media and an atrophied attention span, so has the political culture. Neil Postman notes that people who gathered at the Lincoln-Douglas debates stood listening attentively for hours. They would adjourn for dinner and come back for hours more of political debate. Media now typically assume that attention can be maintained for seconds only. But attention is relative to intention. If you are seeking information for your own purposes (such as to know whom to vote for), you will have (or you will acquire) the discipline and patience necessary to accomplish your goal. If you have no goal, your mind will likely wander until it is grabbed by the most seductive bauble, the most quotable tweet or persistent ad—or by anything at all simply to fill the void.

Here is a third simple truth: even a conscientious tool user may be raw material for someone else. In our high-tech society, we love our tools and apps. Though they may seem to be free of charge, however, there are hidden costs. (As critics of social media are fond of pointing out, we are the resource they are mining.) If your data profile is commercially worth something to someone, it is because they believe they can influence you in some way, usually to buy something. In other words, they hope you have no will of your own! If you know your own mind, you know what you need and will actively shop for it of your own accord without solicitation (though you can still be influenced by misinformation and how your access to information is controlled). You also know what you don’t need and you will not be a victim of irrelevant temptation or gossip. Advertising and other attempts to manipulate people work best on those who are unclear about what they need and don’t need. The more intentional you are about your tools and your goals, the less likely you are to be (even literally) sold a bill of goods.

And here is a fourth simple truth: profit is no justification. The bottom line is no basis on which to run a decent society. Corporate profit comes down to personal gain of executives and shareholders. But personal gain, personal status, or personal influence is simply a poor reason to do anything at all—even commerce. A better reason (yes, even in business) is to promote the general good—to make the world an objectively better place. Since the “world” is all of us, this can only be attempted in respectful dialogue and cooperation with others. This is the common truth that America is missing in its infatuation with individualism and freedom of expression.

The internet began as a giant show-and-tell, where people could share information with others for their mutual benefit. It didn’t take long for it to degenerate into a marketplace. Just as nature was once a commons, in principle shared by all life, the internet started out as a commons for non-commercial use. There were infrastructure costs involved from the outset, of course, and ongoing maintenance, which we continue to pay as fees for internet access, just as we pay taxes to maintain our roadways and parks.

We enjoy the view along the Internet Highway better without the clutter of billboards. The Web was (and should be again) a non-commercial public utility, free of advertising. It is only greed that saw it as a New World to pillage, the biggest post-industrial resource. As with conventional mining operations, there is pollution and environmental destruction. In the case of the internet, the groundwater of useful information is poisoned by self-interest. Ironically, most web sites are frustrating to visit because of the intrusion of annoying ads and notices about “cookies” for the purpose of collecting your supposedly valuable data. One would think the increasing saturation by ads would mean diminishing returns, as the visual clutter renders websites unusable.

One would also think that the atmosphere of invective in social media would turn people off to their use. The ethos of goodwill and confidence is destroyed by vituperation and unsupported claims. Once again the tree of knowledge has been violated—and this time not because it was forbidden—quite the contrary. Again, we have been expelled from a paradise, yet it is the same old serpent at work.

Those of us who know how to find and share information may simply refuse to use for-profit social media platforms or search engines that are biased or track and sell our data. We may avoid websites cluttered with distracting ads. We may refuse to frequent online podiums for venom and hate. Beyond these protests of refusal, there may be some creative and well-motivated souls who will invent new formats for communication that reassert the ideal of the friendly non-commercial internet commons. Indeed, some of the existing platforms were founded on such ideals, which did not prevent them from being eventually corrupted.

So, here is a fifth simple truth: evil always has the advantage in its contest with good. This is because it is a social phenomenon: as the adage goes, a rotten apple can spoil the barrel, whereas a good apple can hardly preserve it. Those who do not know who they are, or what they intend, may conclude that they may as well join the rotten apples who, as another adage tells us, cannot be beaten (“if you can’t beat them, join them”). Those who do know themselves will persist, despite unfavorable odds, to reinvent a better world, because that is the only thing really worth doing.

The subject is no object

The parts of speech recognized in many languages reflect an inescapable duality: the fundamental divide between subject and object. To be a subject is to have a point of view. To be an object is to appear (to a subject) in and from a point of view. One cannot see where one is looking from, and what one does see is not the seer. This difference is more than a matter of location, space or physicality. Rather, it is a categorical difference so fundamental that nothing can undercut it because it underlies every act of cognition including imagination. I say this advisedly, because there have been many attempts to transcend this dualism. In my opinion, they do not succeed.

The subject can never see itself, nor can another subject see it. It is not, in fact, an ‘it’. It cannot even be an object of thought (as opposed to a physical object) except as a kind of abstraction. Any way you slice it, the subject is here—the point of view of the mind in question—and the object is there, some content of a mind’s consciousness. One can look in a mirror, but it is the physical body that one sees, not the subject who is doing the seeing. One might feel a configuration of sensations one is tempted to take to be one’s self, the seeming locus of one’s consciousness; but these are no more than subtle objects of attention. They are not the witnessing subject. One might nevertheless imagine this sort of experience to imply a “presence” underlying consciousness; but that is an act of imagination and not a logical inference. (For this reason, I have no more confidence in the hypothetical enduring Self of Vedanta than in the immortal Christian soul.)

Consciousness requires objects to be aware of. But do objects require subjects? Does the tree fall in the forest whether or not anyone sees or hears it? Naïve realism (or materialism) is the belief that objects can exist without subjects. Naïve idealism is the belief that subjects can exist without a physical basis or “real” objects to be aware of. It includes the notion that there are no physical objects, or that mind creates them. Such beliefs are extreme responses to the conundrum of subject-object dualism, trying to wish away the dualism itself. Materialism denies or ignores the subject. Idealism denies or ignores the independent existence of the object. It is more sensible to realize that subject and object enter conjointly and inseparably into what we know as consciousness.

This mutual involvement means there is no way the object “really” is, apart from how observers see it. Observation is an interaction of subject and object. Observers are necessarily physical agents, subjects embodied as objects. Human beings routinely assume that the way the world appears to them is just how it objectively is. In truth, how it appears to them depends on their state and their needs as living creatures.

Such subtleties do not stand in the way of human affairs. But they do have consequences. The subject/object dualism is at the core of certain problems in science, for example. Observation in classical physics presumes a clear divide between observer and observed: the act of observation ideally has little physical effect on the system observed. Moreover, the observer is supposed to be a fly on the wall, whose considerations or state ideally have no bearing on the world’s appearance. The naiveté of this standard was finally challenged by phenomena that could not be accounted for with classical models. Yet, the subject-object relationship undercuts any possible account, including quantum models. (Entanglement, for example, refers to ties between objects, not the entanglement of object with subject we are discussing.) Science is still implicitly about subjects trying to make sense of objects, even when the subject goes unmentioned.

Self-reference is spoken of as though it were actually possible. (One seems to self-refer in speaking of oneself, for example.) Strictly speaking, however, a subject cannot refer to itself. Logical paradoxes that involve self-reference arise from mishandling the categorical dissimilarity of subject and object. The putative self-reference is actually the reference of a subject to a linguistic or conceptual object. In the famous Liar Paradox, the statement “this sentence is false” appears to involve self-reference. (It contradicts itself because it also involves negation. The statement “this sentence is true” seems comparably self-referential, without being contradictory. In effect, it asserts that “this sentence is true” is true.) However, as the logician Tarski pointed out, “this sentence” is not interchangeable with “this sentence is true,” nor with “{‘this sentence is true’} is true,” and so forth. (Similarly for “this sentence is false.”) Rather, each reference is a separate object of a separate assertion, iterated on a different logical level, ad infinitum. The asserting subject always escapes to a realm logically distinct from what it asserts, just as the classical observer always stands outside the system observed.
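Tarski’s way out can be sketched schematically (a compressed gloss of his hierarchy of languages, not his full formalism). A truth predicate for the sentences of one level can only be stated at the next level up:

    Level 0:  s                      (a sentence about the world)
    Level 1:  True1("s")             (a sentence about s)
    Level 2:  True2("True1('s')")    (a sentence about that sentence)
    ...

Since True-at-level-n applies only to sentences from levels below n, “this sentence is false” never succeeds in being about itself; each attempt generates a new sentence one level higher.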

While science and logic are not immune to dualism, religion is dualism writ large. The child innocently asks, “if God created the world, who or what created God?” I doubt that any learned theologian has ever given a satisfying answer. A subject can create an object, since every artifact was made by someone. But a subject cannot create itself. The notion that God is self-creating (‘I am that I am’) is an evasion. So is the notion of a Prime Mover, which simply refuses to consider an endless causal chain. This dilemma has been inherited by modern cosmology, which asks how something (the universe) could bootstrap itself from nothing (the unstable vacuum?) through some series of causal transitions. It aspires to show how an object (the universe) can self-create, without invoking a subject.

We are familiar with objects that can self-replicate, and phenomena that can self-organize. There are computer programs that can copy or modify themselves. And, of course, DNA copies itself, and an organism can self-modify and reproduce. (That machinery evolved somehow in the first place.) A self-replicating device, even if only a string of computer code, must include instructions about how to copy itself in its entirety, which must include the instructions themselves. Von Neumann addressed the problem by separating functions: there must be a description (blueprint), a constructor mechanism, and a further mechanism to copy the description—which the constructor then installs into the new self-replicator. Presumably a dividing cell does something similar. Strictly speaking, however, these entities are not subjects. They are theoretical objects in the purview of biologists or programmers, who are the actual subjects involved.
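Von Neumann’s division of labor can be seen in miniature in a quine, a program that prints its own source. Here is a minimal sketch in Python; the mapping onto his scheme is my gloss, and, comments aside, the final two lines output an exact copy of themselves.

```python
# - s is the passive description (von Neumann's "blueprint")
# - print(...) is the constructor that assembles the offspring text
# - the %r substitution is the copier that reproduces the blueprint
#   inside the copy, so the instructions include themselves
s = 's = %r\nprint(s %% s)'
print(s % s)
```

The trick is the one von Neumann identified: the description is used twice, once interpreted (to build the copy) and once merely transcribed (to be passed on inside it).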

Causal explanation involves only interactions among inanimate objects, not the actions or interrelations of subjects. This categorical distinction may seem of interest only to philosophers and brain scientists. In everyday life, however, we are continually beset with the challenge to imagine the subjective experiences of others when all that is presented to our senses is their form as objects. We are used to dealing with inanimate things, along with the physical bodies of people and creatures. Sometimes we fail to make an essential distinction among such objects, for it is our biological nature to treat all objects as potential resources to exploit or threats to avoid. As Buber pointed out, from that fundamental position everything is object to the subject, including some objects that might also be sentient beings like oneself. Yet, another sort of relationship is possible and necessary for human beings. This is the relationship between I and thou, which is categorically different from the relationship of a subject to sensible objects.

Underlying the ethical problem of behavior toward other subjects is the physiological fact that one's nervous system serves one's own body uniquely. Empathy is a deliberate effort to compensate for this isolation of subjects in separate bodies. It attempts to imagine an experience that animates the other's body and resembles one's own experience—to walk in the other's shoes, as we say. It is difficult to imagine the experience of another body when one has in mind one's own use for that body as an object. It is particularly difficult to imagine the pain in another body when not in pain oneself, or to imagine another's feelings and motivations when one feels quite differently or feels no sympathy at all. The temptation is to shrink from this challenge and identify only with one's own point of view as a subject—opposed to the other as mere object, whose experience then recedes from view or simply does not count.

This retreat is aptly named depersonalization. While deemed morally vicious in extreme cases, fundamentally it amounts to a mundane failure of nerve. Through an unbecoming yet commonplace cognitive trick, one withdraws back into the subject-object relationship that naturally dominates the life of the organism. However, the joke is ironically upon the subject who thus retreats, who is thereby depersonalized along with the victim. For, the depth, the mystery, and the privilege—indeed, the very distinction—of being a conscious subject lies in the relationship to other subjects. Granted, the relationship to objects is necessary and inevitable for biological beings. But human beings have never settled for being merely biological or merely objects.

I, Thou, and It

Martin Buber underlined the deep significance of two complementary ways of relating: I to It and I to Thou. It is no coincidence that these correspond to parts of speech. For, the relationship of subject to subject (me to you) is above all a function of language, of two-way communication. It is fundamentally unlike the relation of subject to object (me to that), which is unilateral and potentially exploitative. One perceives, uses, and physically interacts with objects; but one communicates with other subjects.

A basic survival mode of organisms is to treat everything as an object to consume, manipulate, or avoid. Human beings live in that biological context. So, for us too, the I/it relationship is fundamental and unavoidable in the material world. Yet, we are social animals that have developed fully grammatical language, and have developed ideals and codes of social conduct as well. One of these ideals (which implies certain conduct) is the notion of person. A person is a subject who gives and receives communications. One is a perceiver (a mind) in relation to various objects of perception and thought. But one is a person only in relationship to other persons.

Buber goes further, claiming the I/thou relation as a stance that can apply wherever one wills. At the least, it abstains from the I/it stance. By default, it should apply at least to human subjects (though their bodies are objects). But it might be applied as well to animals, plants, non-living things, and abstractions like "nature" or "the universe." It is a way of relating that does not depend on what is related to. For the mystic Buber, the epitome of this stance is the relationship to God—who is always transcendentally a Thou and never an it. I am content to have this stance be essential within human affairs, without bringing God into it. I understand the motive: to write the principle as large as possible. That might work for Buber, who had the mental fortitude to refuse to conceive God as an object (that is, to refuse to conceive "Him" at all!). In general, however, religious believers have demonstrated again and again their willingness to treat God—and other persons—as mere objects to manage.

The I/thou relationship is a dynamic of mutual respect between peers. The dynamic of the I/it relationship is not respect, but control over objects. It is not a symmetrically mutual relationship, but a one-way stance of mastery. For modern man, "objects" include animals, trees, minerals, and nature-at-large—anything to be used as a resource. Even when the object is incomparably vaster and more powerful than the insignificant subject (such as a planet or the cosmos at large), it is cut down to size by the stance of control. That the universe deigns to tolerate our puny existence absurdly gives us courage to mount an offensive.

Reason and analysis are frequently brought to bear to prosecute the I/it stance. But the human psyche is much deeper than reason. Underneath the presumption to dominate nature lurks a primordial fear that we are out of our league. It is the fear that we are dealing with a stupendous Subject, who might be angered by our hubris, rather than a mere object under one's thumb. I suspect this is where the "fear" of God comes from; for God projects the idea of an all-powerful person in relation to whom one is necessarily subordinate. We address the divine familiarly as "Thou," but hardly as a peer. The wrath we fear is retribution for the insolence of presuming to trifle with a powerful superior. At the same time, the ideal of power we humanly aspire to is projected as divine control over everything, including nature and us. This is likely a displaced version of nature's power over us, personified as a being one could attempt to placate, as a child learns to manipulate its parents. In this way, we vicariously trump nature, on which we are utterly dependent. (If God created the world, didn't we create God?) However, such a relationship with the Creator does not qualify as I/thou in Buber's sense. Rather, it merely turns the tables on nature.

The idea of legal rights—especially of universal human rights—hinges on the I/thou dynamic. Persons have rights and responsibilities; objects do not. In patriarchal society, women, children, slaves, and animals fell into an intermediate category. While children have a will of their own, adults generally do not consider them peers, but persons in training, who must be controlled for their own good. In class society, only the ruling class had full rights. Equality is not automatic even within the group, let alone accorded to other groups. It used to be routinely denied on the basis of race, gender, or class. The idea of universal "suffrage" is relatively new. In a long painful process, some groups within society demanded that the dominant group grant them equal status, as though it were not to be assumed. This was resisted because, in theory, personhood with its rights also exempts the bearer from being treated as a useful object. Slavery, for example, was justified on the basis of race, by denying to some people status as persons before the law.

Buber admits that the default stance is I/it. He does not invoke biology, but clearly this is the organism’s natural stance in regard to its environment. Otherwise, natural selection would not have allowed the organism to establish itself. He attributes a spiritual side to humanity, which consists essentially in embracing the relation of I to thou. This does not replace the fundamental relation of I to it, but augments it and puts it in perspective. It reclaims the freedom of the subject to choose an alternative stance. It exercises the ability to transcend mental limits, since one must abandon any preconception, plan, desire, scheme, or intention to use the other. In effect, the other becomes literally useless. To truly meet, one must temporarily lose one’s mind.

This may seem to set a nearly unattainable standard for human relationship. It requires us to push beyond appearances (the literal content of experience) toward some unseen essence of the other, which seems to pull the strings while coyly hiding, as it were, behind the physical puppet we recognize as their body. This is metaphor, but it needn’t mystify. It does not propose an animating soul or a mystical quest. It merely reminds us to treat our fellows as we would have them treat us. It asks us to leave our weapons at the door and our tools as well. It invites us to take the stance of unknowing, at least in regard to any thou. And it is this that enables us to be fully persons ourselves.

Left to our own devices

Social media are paradoxically anti-social. A common scene goes like this: a group of people around the dinner table ignore each other, having wandered off in separate worlds on their respective “devices.” Even before the pandemic, Facebook had displaced face-to-face contact. Ironically, modern connectivity is disconnecting us. How can this be?

While this phenomenon is an aspect of modern technology, perhaps it all began with theater and the willing suspension of disbelief. On the stage are flesh and blood actors, but the characters they play belong to another realm, another time and place, which draws attention away from the immediate surroundings. The story they tell is not the here-and-now in the theater hall with the others in the audience. Thus, we enjoy a dual focus. Theater morphed into cinema, which morphed into the pixelated screen. But the situation is the same: there is a choice between worlds. One attends either to the proximal here-and-now reality that includes the physical interface (iPhone, TV, computer) or to the distal reality transmitted through it. Even the telephone presents a similar alternative. Is the locus of reality my ear pressed against the receiver or is it the person speaking on the other end? In the case of the real-time conversation, one focuses on the person at the other end as though they were nearby, which is a sort of hallucination. In the case of the digital device, the distal focus involved in texting chronically engulfs the participants in a sort of online role-playing game that has little to do with the here and now.

Our own senses offer a parallel choice. For, our natural senses are the original "personal devices." Like a laptop or mobile phone, they serve as an interface with the world. Yet, they can be used for other purposes. Our sensations monitor and present a reality beyond the nervous system; yet we have the ability and option to pay attention to the sensations themselves. We can focus on the interface or on the reality it is supposed to transmit. We thus have the ability to entertain ourselves with our own sensory input. Even the visual field can be regarded either as a sort of proximal screen one observes or as a transparent window on the real world beyond. Hence, the fascination of painting as a pure visual experience, whether or not it represents a real scene. Hence, the even greater fascination of the cinema. I suggest it is this ambiguity, built into perception, that renders us especially susceptible to the addictiveness of screens.

Modern technology provides the current metaphor for understanding how perception works. The brain computes for us a virtual reality to represent the external world. In that metaphor, the world appears to us as a "show" on the "screen" of consciousness. The participatory nature of perception underlined by this metaphor gives us some control over, and responsibility for, what we perceive. This has a social advantage. Subjectivity is a capacity evolved to help us get along in the real world, by realizing that we are active co-producers of the "show," not just passive bystanders looking through a transparent window. Yet, by its very nature, this realization permits us to focus on the screen rather than on the supposed reality it transmits. This freedom, afforded by self-consciousness, is a defining human ability. But, like all freedoms, it can be misused. We turn away from reality at our own risk.

There is a difference between the senses as information channels and digital devices as information channels. The senses are naturally oriented outward. We survive by paying attention to the external world, and natural selection guarantees that only those organisms survive that get reliable information about their environment. There is little guarantee that electronic channels give us reliable information, however. They can do that, of course, but it depends on how they are used and who actually controls them. One can use a cell phone to contact someone for a specific purpose, just as one can use a computer to seek important information. But a mobile “phone” is not just a cordless telephone whose advantage is convenience. It functions more as pocket entertainment. The computer screen, too, is as likely to be used for video games or other entertainment that has nothing to do with reality. Even “googling” stuff online is sometimes no more than a trivial entertainment.

Worse, digital devices are not part of our natural equipment, and thus not under our control in the way parts of one's own body are. Back in the 17th century, Descartes was perhaps the first to recognize this dilemma. He realized that the information flowing into the brain could be faked—even information coming from the natural senses, if someone malevolent were in a position to tamper with your nervous system. If your sensory input could be manipulated, then so could your experience and behavior. He reassured himself that God (today we might say nature) would not permit such deception by the natural senses. But there is no such guarantee against deception by those who control the information that comes to us through our artificial senses—our external devices.

In the ancestral environment of the wild, being distracted by your own experience as a form of self-entertainment would have been frivolous, and potentially lethal, if at that moment you needed to pay close attention to the predator stalking you or the prey you were stalking. (Daydreaming, you could miss your meal or become the meal!) People then had designated times and venues for enjoying their imagination and subjectivity—such as story-telling around the campfire. In the modern age we have blurred those boundaries. We now carry an instant "campfire" around in our pockets. While mobile phones would have been highly useful for coordinating the hunt, as currently used they would have dangerously disrupted it. They could have been used by a malicious tribe to steer you away from your quarry or over a cliff.

Of course, we no longer live in the wild but in man-made environments, where traditional threats have mostly been eliminated (no saber-tooth tigers roaming the streets). We can now enjoy our entertainments at the push of a button. One can further argue that the human world has always been primarily social and that social media are merely the modern extension of an ancient connectivity. However, external threats have hardly ceased. (If they had, there would be no basis for interest in the "news".) The modern world may be an artificial environment, a sort of virtual reality in which we wish to wander freely; but it is still immersed in the natural order, which still holds over us the power of life and death. It may seem to us, at leisure in the well-off parts of the world, that no distinction is really needed anymore between the virtual and the real. However, this will never be the case. Even if hordes of transhumanists or gamers migrate to live in cyber-space, ignoring political realities, the computers and energy required to maintain the illusion will continue to exist in the real world. Someone will have to stay behind to mind the store, and that someone will have enormous power. (See the TV series Upload, or for that matter The Matrix films.)

The new virtual universe no doubt has its own rules, but we don't yet know what they are. We knew what a stalking tiger looked like. We still know what a rapidly approaching bus looks like. But we are hard pressed to recognize the modern predators seeking our blood and the man-made entities that can run us down if we don't pay attention. In the Information Age, ironically, we no longer know the difference between what informs us and what deforms us. Instant connectivity and fingertip information have rendered obsolete the traditional institutions for vetting information, such as librarians, peer review, and trusted news reporting. Everything has become a subjective blur, a free-for-all of competing opinions and suspected propaganda. Yet, because we are genetically conditioned to deal with reality, mere opinion must present itself as fact to get our attention and acceptance. This provides a perfect opportunity for those who control information channels to manage what we think.

We are challenged to know what to believe. The downside of the glut of information is the task of sorting it. It is for the convenience of avoiding that task that we rely on trusted sources to do it on our behalf. We're used to such conveniences in the consumer culture, where information is appealingly packaged as just another consumer product, much as we've come to trust what is on the supermarket shelf, seduced in part by its glitzy over-packaging. When the trust is no longer there, however, the burden falls back on the consumer to read the fine print on the labels, so to speak. (More extremely, one may learn to grow one's own food. But then, too, weeding is necessary!) In a divided world, with no rules of engagement, the challenge of sifting information is so daunting and tedious that it is tempting to throw up one's hands in despair and regard the clash of viewpoints as ultimately no more than another harmless entertainment. That would be throwing the baby out with the bath water, however, since presumably there is a reality, a truth or fact of the matter, that can affect us and which we can affect.

It is convenient, but hardly reasonable, to believe a claim because one has decided to trust its source. By the same token, it is convenient, but not reasonable, to automatically dismiss claims because one distrusts their source. (These fallacies are known in logic as the appeal to authority and the argumentum ad hominem, respectively.) The reasonable approach is to evaluate all claims on their own merit. But that means a lot of time-consuming work cross-checking apparent facts and evaluating arguments. It means taking responsibility for deciding what to believe—or, alternatively, admitting that one doesn't know yet. It is far easier to simply take a side that has some emotional appeal in a heated controversy, and be done with the rigors of thinking. It's easier to be a believer than a doubter, but easier still to sit back and be entertained.

La Belle Planète

The discovery that planets are actually typical in the universe is one of the great achievements of modern astronomy. From an understanding of how solar systems form, it follows that in our galaxy there could be on average at least one planet for each of more than a hundred billion stars. Most of these would not resemble the Earth and could not foster the development of life. But with such staggering numbers, the odds are that life in some form should actually be abundant in the galaxy. Somewhere there must be civilizations capable of space travel or of sending trans-galactic messages. This raises the question of why ours is, so far, the only planet we know to support life. Unless you are one of those folks who believe the aliens are already here, walking disguised among us, the question then also arises: where are they? If aliens are so probable, why have they not made their presence known to us? This question is known as the Fermi paradox, after the physicist Enrico Fermi, who famously posed it in the 1950s, and it has stimulated many creative answers. I will focus on some that seem most relevant to our time.
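
The usual way to make such odds concrete is the well-known Drake equation, which multiplies a chain of estimated factors to yield N, the number of detectable civilizations in the galaxy. Here is a minimal sketch in Python; every value is an illustrative guess of mine, not a measurement, which is precisely the trouble with the equation:

    # Drake equation: N = R* x fp x ne x fl x fi x fc x L
    # All parameter values below are illustrative assumptions.
    R_star = 2.0     # new stars formed in the galaxy per year
    f_p    = 0.9     # fraction of stars with planets
    n_e    = 0.5     # habitable planets per planet-bearing star
    f_l    = 0.1     # fraction of habitable planets where life arises
    f_i    = 0.01    # fraction of those where intelligence evolves
    f_c    = 0.1     # fraction of those producing detectable technology
    L      = 10_000  # years such a civilization remains detectable

    N = R_star * f_p * n_e * f_l * f_i * f_c * L
    print(f"Detectable civilizations in the galaxy: about {N:.1f}")

With these guesses N comes out near one, that is, us alone; nudge f_l or L up or down by a factor of ten and N swings from effectively zero to the thousands. The paradox invites so many answers because every factor is uncertain by orders of magnitude.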

One sobering explanation is that technological civilization is doomed to self-destruct, or is inherently so self-limiting that it cannot penetrate the great distances between stars. Of course, any such thinking presumes a great deal about the possible manifestations of intelligence. For example, a "technological civilization" would be a form of biological life—and thus a product of ruthless natural selection—or derived from such a form (robots). Like us, aliens would not be angels but craven animals who exploit other life forms. We need only look at slaughterhouses, and at how ideals of progress are regularly thwarted by human nature.

We owe our civilization to our great numbers and to fossil fuels, which are the remnants of past organisms. We have nearly used up this resource in ways not directly connected to the goal of space travel, in the process polluting and changing our world in ways that may limit future efforts to enter space. The sheer success of our species at dominating this planet has created living conditions that favor the spread of diseases, which may ultimately cripple our civilization to such an extent that space travel is not feasible. SpaceX notwithstanding, we may already be experiencing the beginning of this with the coronavirus pandemic. We still live under the shadow of nuclear annihilation, not to mention biological warfare. Climate change may disrupt society with migrations and armed conflicts to such an extent that it cannot afford space exploration. Rampant nationalism and factionalism may preclude the cooperative global effort required. The violent and competitive nature at the base of our existence, reflected in the very notion of conquering space, may be the thing that prevents it. Our counterparts on other planets could face similar obstacles.

Another possible explanation is that aliens might simply not be interested in space travel. They could even lose interest in the external world altogether. This could happen if, like modern humans, they became too self-absorbed in entertainment, or decided to migrate to cyberspace, where consciousness can dwell in a virtual world. We have long had spiritual and meditative traditions that focus inward rather than outward, as well as myths and narratives of non-physical realms. Especially in a circumstance of limited or dwindling resources, and in the face of mortality, an alien civilization might choose some form of mental life over physical participation in the material world. Already our own civilization is moving in this direction, with digital entertainments and communication devices, not to mention the perennial resurgence of religion.

There is also the possibility that we have simply not been searching for extra-terrestrial signals long enough. We have only been able to receive and transmit radio emissions for little more than a century, and optical signals not much longer. Even the telltale heat signature of civilization has existed for but a few millennia. Technological development has been exponential, with the most rapid change in the past couple of centuries. We can expect the change in the next couple of centuries to be far greater. Extrapolating, some people predict an imminent "singularity," a point of no return when automation takes technology beyond human control. Perhaps the forms of AI that will control this world (or others) will have no interest in space travel and may even make the planet inhospitable to biological life. Some transhumanists foresee artificial life as an advancement over humanity. Equally likely seems the prospect that a post-human world could be dominated by artificial versions of viruses, bacteria, and insects, rather than by a high-level artificial intelligence capable of space travel.

The fanciful vision of inter-galactic space travel, which fired our imaginations in Star Trek and Star Wars, projects characteristic human ambitions, values, and social dynamics on a cosmological scale. There is no sound reason to expect aliens to have a humanoid form, however, much less to gather in saloons at galactic outposts. Yet, if alien intelligence is a product of natural selection, as on this planet, there would be every reason to expect it to follow the fundamentally opportunistic patterns we see on Earth. Civilized aliens could only develop on other planets in the context of their biosphere, upon which they would be as dependent as we are here. Granted that we humans attempt to set ourselves apart from our biosphere, we might expect space-faring civilizations likewise to have developed ideals and codes of behavior that facilitate cooperation, and a science that facilitates control of nature. And, of course, mathematics.

There is an understanding among scientists and science-fiction writers that the math glorified on this planet would be a lingua franca among galactic civilizations—not the same symbols, of course, but similar concepts deemed universal. However, the biological basis and parochial nature of our mathematics are even less recognized than those of our physics. The basis of our math is the natural numbers, which abstract the integrity of discrete "objects" such as human beings perceive in their physical or mental environment, and which they themselves exemplify as individuals. The finite steps of a proof, calculation, or verification exemplify discrete acts of an agent upon a world of objects it can manipulate. This corresponds to primate experience and action in an environment consisting of countable things, whether tangible or abstract. What if, at some level perceivable to aliens, the world does not consist of discrete objects and actions? Would another concept of mathematics and of computation be more suitable? What about an alien whose body does not amount to a discrete object? Of course, the technology of space travel may require manipulating and assembling what we take to be countable objects. Yet, an amorphous or non-localized creature might have ways to change its environment analogically—for example, through chemical emissions. (Indeed, this is how most self-regulation works within the organisms we know.) Computation for such an alien would not be digital processing, but direct covariance with environmental changes. What need or use could such a being have for natural "laws" as algorithmic compressions of input? Would such a math, and the biology and mentality behind it, lend itself to space travel?
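
The contrast can be hinted at with a toy model, though any such code is, ironically, itself digital. In this speculative Python sketch (all names invented here), the same environmental signal is handled two ways: counted as discrete events, and tracked by an internal level that simply covaries with it, the way a concentration inside an organism tracks one outside:

    import math

    # A smoothly varying environmental signal (arbitrary toy function).
    def signal(t):
        return 1.0 + math.sin(0.1 * t)

    # Digital stance: sample, threshold, and count discrete "events."
    events = sum(1 for t in range(200) if signal(t) > 1.5)

    # Analog stance: a leaky integrator whose internal level covaries
    # with the signal (dy/dt = k(signal - y), crudely Euler-stepped).
    y, k = 0.0, 0.2
    for t in range(200):
        y += k * (signal(t) - y)

    print(events, round(y, 3))  # a count versus a tracked level

The counting branch presupposes the very notions of discreteness and number in question; the tracking branch needs only proportionality. Whether the latter alone could ever scaffold a technology of space travel is exactly the open question.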

I had an epiphany while watching the original Star Wars movie. I realized that this planet—our home in space—is an alien world and that we are the bizarre aliens who inhabit it! Whether or not any of those fancied interstellar watering holes exist, the vision of them was created here, modeled on what is familiar to us. Just as children are attached to their mothers, we naturally find beautiful the place of our origins, without which we could not have come forth into existence. That first Star Wars film was released when the Province of Quebec still issued license plates with the motto: "La Belle Province." Just to let all that alien traffic know how we feel about home, perhaps one day our missions to the stars will bear license plates that read "The Beautiful Planet," in whatever language then prevails.