The Ideology of Individualism and the Myth of Personal Freedom

Individuals are tokens of a type. Each person is a living organism, a mammal, a social primate, an example of Homo sapiens, and a member of a race, linguistic group, religion, nation or tribe. While each “type” has its characteristic features, individuality is relative: variety within uniformity. In the West, we raise the ideal of individuality to mythical status. The myth is functional in our society, which is more than a collection of individuals—a working whole in its own right. The needs of society always find a balance with the needs of its individual members, but that balance varies widely in different societies over time. How does the ideology of individualism function within current Western society? And why has there been such resistance to collectivism in the United States in particular?

The actual balance of individual versus collective needs in each society is accepted to the degree it is perceived as normal and fair. Social organization at the time of human origins was stable because little could change from one generation to the next. In the case of life organized on a small-scale tribal basis, the social structure might be relatively egalitarian. Status within the group would simply be perceived as the natural order of things, readily accepted by all. Individuals would be relatively interchangeable in their social and productive functions. There would be little opportunity or reason not to conform. To protest the social order would be as unthinkable as objecting to gravity. For, gravity affects all equally in an unchanging way. There is nothing unfair about it.

Fast forward to modernity with its extreme specialization and rapid change, its idea of “progress” and compulsive “growth.” And fast forward to the universal triumph of capitalism, which inevitably allows some members of society to accumulate vastly more assets than others. The social arrangement is now at the opposite end of the spectrum from equality, and yet it may be perceived by many as fair. That is no coincidence. The ideology of individualism correlates with high social disparity and is used to justify it. Individualism leads to disparity since it places personal interest above the interest of the group; and disparity leads to individualism because it motivates self-interest in defense. Selfishness breeds selfishness.

While society consists of individuals, that does not mean that it exists for the sake of individuals. Biologists as well as political philosophers might argue that the individual exists rather for the sake of society, if not the species. Organisms strive naturally toward their own survival. Social organisms, however, also strive toward the collective good, often sacrificing individual interest for the good of the colony or group. While human beings are not the only intelligent creatures on the planet, we are the only species to create civilization, because we are the most intensely cooperative of intelligent creatures. We are so interdependent that very few modern individuals could survive in the wild, without the infrastructure created by the collective effort we know as civilization. Despite the emphasis on kings and conquests, history is the story of that communal effort. The individual is the latecomer. How, then, have we become so obsessed by individual rights and freedoms?

The French Revolution gave impetus and concrete form to the concept of personal rights and freedoms. The motivation for this event was an outraged sense of injustice. This was never about the rights of all against all, however, but of one propertied class versus another. It was less about the freedoms of an abstract individual, pitted against the state, than about the competition between a middle class and an aristocracy. In short, it was about envy and perceived unfairness, which are timeless aspects of human and even animal nature. (Experiments demonstrate that monkeys and chimps are highly sensitive to perceived unfairness, to which they may react aggressively. They will accept a less prized reward when the other fellow consistently gets the same. But if the other fellow is rewarded better, they will angrily do without rather than receive the lesser prize.)

In tribal societies, envy could be stimulated by some advantage unfairly gained in breach of accepted norms; and tribal societies had ways to deal with offenders, such as ostracism or shaming. Injustice was perceived in the context of an expectation that all members of the group would have roughly equal access to goods in that society. Justice served to ensure the cooperation needed for society at that scale to function. Modern society operates differently, of course, and on a much larger scale. Over millennia, people adapted to a class structure and domination by a powerful elite. Even in the nominally classless society of modern democracies, the ideology of individualism serves to promote acceptance of inequalities that would never have been tolerated in tribal society. Instead, the legal definitions of justice have been modified to accommodate social disparity.

The French Revolution failed in many ways to become a true social revolution. The American “revolution” never set out to be that. The Communist revolutions in Russia and China intended to level disparities absolutely, but quickly succumbed to the greed that had produced the disparities in the first place. This corruption simply resulted in a new class structure. The collapse of corrupt communism left the way open for corrupt capitalism globally, with no alternative model. The U.S. has strongly resisted any form of collective action that might decrease the disparity that has existed since the Industrial Revolution. The policies of the New Deal to cope with the Depression, and then with WW2, are the closest America has come to communalism. Those policies resulted in the temporary rise of the middle class, which is now in rapid decline.

America was founded on the premise of personal liberty—in many cases by people whose alternative was literal imprisonment. The vast frontier of the New World achieved its own balance of individual versus collective demands. While America is no longer a frontier, this precarious balance persists in the current dividedness of the country, which pits individualism against social conscience almost along the lines of Republican versus Democrat. The great irony of the peculiar American social dynamic is that the poor often support the programs of the rich, vaunting the theoretical cause of freedom, despite the fact that actual freedom in the U.S. consists in spending power they do not have. The rich, of course, vaunt the cause of unregulated markets, which allow them to accumulate even more spending power without social obligation. The poor have every reason to resent the rich and to reject the system as unfair. But many do not, because instead they idolize the rich and powerful as models of success with whom they seek to identify. For their part, the rich take advantage of this foolishness by preaching the cause of individualism.

Statistics can be confusing because they are facts about the collective, not the individual. Average income or lifespan, for example, does not mean the actual income or lifespan of a given individual. One could mistake the statistic for individual reality—thinking, for example, that the average income in a “wealthy” society represents a typical income (which it rarely does because of the extreme range of actual incomes). For this reason, the statistics indicating economic growth or well-being do not mean that most people are better off, only that the fictional average person is better off. In truth, most are getting poorer!
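
To see concretely how the average can diverge from the typical, here is a minimal sketch in Python using invented figures (the incomes and variable names are hypothetical, chosen only to make the point):

```python
# Toy figures only: how a rising "average" income can mask stagnation
# or decline for the typical household.
from statistics import mean, median

incomes_before = [20, 22, 25, 28, 30, 32, 35, 40, 60, 200]   # thousands per year
incomes_after  = [18, 20, 24, 27, 29, 31, 34, 39, 65, 400]   # a decade later

print(mean(incomes_before), median(incomes_before))  # 49.2  31.0
print(mean(incomes_after), median(incomes_after))    # 68.7  30.0
# The mean jumps by roughly 40% while the median (the typical household)
# actually slips: "growth" here describes the fictional average person.
```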

Nowadays, social planning in general embraces a statistical approach to the common good. Statistics is inherently a group concept, as is epidemiology. Official strategies to deal with the current pandemic are necessarily oriented toward the collective good more than the individual. Due respect is paid, of course, to the plight of individuals, and medical treatment is individual; yet strategies concern the outcome for society as a whole. A balance is sought between the habitual satisfactions of life and the collective actions needed to stem the disease. Demands on the individual to exercise self-restraint for the sake of the collective are bound to strain tolerance in a society used to individualism. Issues of fairness arise when some are seen disregarding, in the name of freedom, the very rules that others obey in the name of the common good. But, as we have seen, fairness is a matter of what we have grown used to.

One thing we can be sure of: the more populated and interconnected the world becomes, the more the individual will have to give way to the common good. That may not mean a return to communism, but it will require more willingness to forfeit personal freedoms to the degree that we are truly “all in it together.” Individualists should be realistic about the stands they take against regulation, to be sure the liberties they seek are tangibly important rather than merely ideological. Social planners, for their part, should recall that no one wants to be merely an anonymous statistic. Individualism will have to be redefined, less as the right to pursue personal interest and more as the obligation to use individual talents and resources for the common good.

E pluribus unum: the fundamental political dilemma

Any political body composed of more than one person faces the question of who decides. In a dictatorship, monarchy, or one-party system, a single agency can decide a given issue and come forth with a relatively uncontested plan. From the viewpoint of decisiveness and efficiency, ideally a single person is in control, which is the basis of chains of command, as in the military. At the other extreme, imagine an organization with a hundred members. Potentially there are one hundred different plans and one hundred commanders, with zero followers. Without a means to come to agreement, the organization cannot pursue a consistent course of action or perhaps any action at all. Even with a technical means such as majority vote, there is always the possibility that 49 members will only nominally agree with the decision and will remain disaffected. Their implicit choice is to remain bound by the majority decision or leave the organization. This is a basic dilemma facing all so-called democracies.

While the 100 members could have as many different ideas, in practice they will likely coalesce into a few factions. (Unanimity means one faction.) In many representative democracies, including Canada, political opinion is divided among several official political parties, whose member representatives appear on a ballot and generally adhere to party policy. In the U.S., there have nearly always been only two major political parties.

Any political arrangement has its challenges. Unless it represents a true unanimity of opinion, the single-party system is not a democracy by Western standards, since it severely constricts the scope of dissent. On the other hand, a multi-party system can fail to achieve a majority vote, except through coalitions that typically compromise the positions of the differing factions. Either the two-party system is unstable because the parties cannot agree even that their adversaries are legitimate; or else it is ineffective in the long run because the parties, which agree to take turns legitimately, end up cancelling each other out. The U.S. has experienced both those possibilities.

The basic challenge is how to come to agreement that is both effective and stabilizing. The ideal of consensus is rarely achieved. Simple majority rule allows decisions to be made and action taken, but potentially at the cost of virtually half the people dragged along against their better judgment: the tyranny of the majority. The danger of a large disaffected minority is that the system can break apart, or that it descends into civil war, in which roughly equal factions try to conquer each other by force. A polarized system that manages to cohere in spite of dividedness is faced with a different dysfunction. As in the U.S. currently, the parties tend to alternate in office. A given administration will try to undo or mitigate the accomplishments of the previous one, so that there is little net progress from either’s point of view. A further irony of polarization is that a party may end up taking on the policies of its nemesis. This happened, for example, at the beginning of American history, when Jefferson, who believed in minimal federal and presidential powers, ended by expanding them.

The U.S. was highly unstable in its first years. The fragile association among the states was fraught with widely differing interests and intransigent positions. As in England, the factions that later became official political parties were at each other’s throats. The “Federalists” and the “Republicans” had diametrically opposed ideas about how to run the new country and regularly accused each other of treason. Only haltingly did they come to recognize each other’s positions as legitimate differences of opinion, and there arose a mutually accepted concept of a “loyal opposition.” Basically, the price paid for union was an agreement to take turns between regimes. This meant accepting a reduction in the effectiveness of government, since each party tended to hamstring the other when in power. This has been viewed as an informal part of the cherished system of checks and balances. But it could also be viewed as a limit on the power of a society to take control of its direction—or to have any consistent direction at all.

Another, quite current, problem is minority rule. The U.S. Constitution was designed to avoid rule by a hereditary oligarchic elite. For the most part, it has successfully avoided the hereditary part, but hardly rule by oligarchy. American faith in democracy was founded on a relative economic equality among its citizens that no longer exists. Far from it: the last half-century has seen a return to extreme concentration of wealth (and widespread poverty) reminiscent of 18th century Europe. The prestige of aristocratic status has simply transferred to celebrity and financial success, which are closely entwined. Holding office, like being rich or famous, commands the sort of awe that nobility did in old Britain.

A country may be ruled indirectly by corporations. (Technically, corporations are internally democratic, though voter turn-out at their AGMs can be small. Externally, in a sense, consumers vote by proxy in the marketplace.) While the interests of corporations may or may not align with a nation’s financial interests in a world market, they hardly coincide with that nation’s social well-being at home. The electorate plays a merely formal role, as the literal hands that cast the votes, while the outcome is regularly determined by corporate-sponsored propaganda that panders to voters. Government policy is decided by lobbies that regularly buy the loyalties of elected representatives. When it costs a fortune to run for office, those elected (whatever their values) are indebted to moneyed backers. And, contrary to reason, the poor often politically support the rich—perhaps because they represent an elusive dream of success.

People can always disagree over fundamental principles; hence, there can always be irreconcilable factions. Yet, it seems obvious that a selfless concern for the objective good of the whole is a more promising basis for unity than personal gain or the economic interests of a class or faction or political party. Corporate rule is based on the bottom line: maximizing profit for shareholders, with particular benefit to its “elected” officers. It embodies the greed of the consumer/investor society, often translated into legalized corruption. Contrast this with the ancient Taoist ideal of the wise but reluctant ruler: the sage who flees worldly involvement but is called against his or her will to serve. This is the opposite of the glory-seeking presidential candidate; but it is also the opposite of the earnest candidate who believes in a cause and seeks office to implement it. Perhaps the best candidate is neither egoistic nor ideologically motivated. The closest analogy is jury duty, where jurors are selected by random lottery.

The expedient of majority rule follows from factionalism, but also fosters it. To get its way, a faction needs only 51% approval of its proposal, leaving the opposition in the lurch. The bar could be set higher—and is, for special measures like changing a constitution. The ultimate bar is consensus, or a unanimous vote. This does not necessarily mean that everyone views the matter alike or perfectly agrees with the course of action. It does mean that they all officially assent, even with reservations, which is like giving your word or signing a binding contract.

The best way to come to consensus is through lengthy discussion. (If unanimity is required, then there is no limit to the discussion that may ensue.) Again, a model is the jury: in most criminal cases—but often not in civil cases—unanimity is required for a “conviction” (a term that implies the sincere belief of the jurors). The jury must reach its conclusion “beyond a reasonable doubt.” A parliament or board of directors may find this ideal impractical, especially in timely matters. But what is considered urgent and timely is sometimes relative, or a matter of opinion, and could be put in a larger perspective of longer-term priorities.

The goal of consensus is especially relevant in long-term planning, which should set the general directions of a group for the far future. Since such matters involve the greatest uncertainty and room for disagreement, they merit the most thorough consideration and involve the least time constraint. A parliament, for example, might conduct both levels of discussion, perhaps in separate sessions: urgent matters at hand and long-term planning. Discussing the long-term provides a forum for rapprochement of opposing ideologies, resolving misunderstandings, and finding common ground. Even on shorter-term issues, it may turn out that wiser decisions are made through lengthier consideration.

In any case, the most productive way to approach any group decision is through careful listening to all arguments, from as objective and impersonal a point of view as possible. That means a humble attitude of mutual respect and cooperation, and an openness to novel possibilities. Effectively: brainstorming for the common good rather than barnstorming to support a cause or special interest. Democracy tends to align with individualism, and to fall into factions that represent a divergence of opinions and interests. What these have in common is always the world itself. When that is the common concern, there is an objective basis for agreement and the motive to cooperate.

Why the rich get richer

Since about 1970, there has been a world-wide trend toward increasing concentration of wealth. That means, for example, that ten percent of a population own far more than ten percent of wealth, and that ten percent of that group possess a disproportionate amount of its wealth—and so on up. It is not surprising that this trend is global, since capitalism is global. Yet it is accentuated in the English-speaking world and in the United States in particular. Why?
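
The pattern scales recursively, which a short simulation can make vivid. The sketch below is illustrative only: it draws synthetic wealth from a heavy-tailed (Pareto) distribution, an assumption of mine rather than real data, and reports the share held by each successive top decile:

```python
# Synthetic data only: the top decile holds a disproportionate share, and the
# top decile of that group holds a disproportionate share of its own wealth.
import random

random.seed(1)
wealth = sorted((random.paretovariate(1.3) for _ in range(100_000)), reverse=True)

def top_decile_share(values):
    """Fraction of the total held by the richest tenth of a descending list."""
    k = len(values) // 10
    return sum(values[:k]) / sum(values)

print(f"Share of all wealth held by the top 10%: {top_decile_share(wealth):.0%}")
top10 = wealth[: len(wealth) // 10]
print(f"Share of that group's wealth held by its own top 10%: {top_decile_share(top10):.0%}")
```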

Thomas Piketty (Capital in the Twenty-First Century and Capital and Ideology) attributes the trend to the fact that CEOs are awarded (or award themselves) outrageous salaries and other benefits, out of line with any service they could actually perform. In Europe and elsewhere, social norms limit society’s acceptance of such practices, and there is also more support for higher wages for labour in relation to management. Historically, the U.S. has had a low minimum wage. The ethos of that country also glorifies “winners” of all sorts, from celebrities to billionaires. Americans, it would seem, prefer a slim chance at extreme wealth over a reasonable chance at moderate wealth. Fairness is perceived in terms of equal opportunity rather than equal benefit. You could call that the lottery system, where the big prize comes out of the pockets of many small ticket holders. You could also call it the star system, where fame is promised as well as fortune. Like the economy, even the political system is a “winner-takes-all” sweepstakes. Winning per se is the measure of success and the valued skill, so that those who know how to play the system to advantage are romanticized as heroes even while they are robbing you.

Perhaps one reason corporate executives can command such high rewards is that they are scarcely held accountable to shareholders. Even though boards are elected, voter turnout at AGMs is probably small, and appropriate executive pay may not even be on the agenda. We could expect corporate leadership to act in our interests, and without our supervision, just as we expect government officials to run a well-oiled machine. Yet the complacent general attitude is that shareholders don’t (and shouldn’t) care about excessive remuneration for those at the very top so long as the bottom line for shareholders is a positive gain. The same attitude spills over toward political leadership. If it’s too much trouble for citizens to track politics in the public sector, it is even more bothersome to track it in the private sector, where the issues and contenders are relatively invisible. As a result, CEOs have carte blanche to pay themselves whatever they like. They are in a position to use the system they control for their own benefit. If that breaks the corporation, the shareholders will pay, not the retiring executive; or else the public at large will pay through a government bailout.

There may be further explanations. As the nature of products has evolved, so has the method of valuation. Primary manufacturing made tangible products that entailed a recognizable amount of labour. Economic growth in modern times in developed countries has involved making less tangible products, which are more like services—for example, intellectual property such as computer programs and apps. (Primary manufacturing has migrated to countries where labour is cheaper.) In addition, investment itself has evolved from financing primary manufacturing or resource extraction to meta-investments in “commodities” and other financial abstractions. Speculation then promises a possible gain without a tangible product or expended effort. That has always been the real purpose of capital: to get a larger slice of the pie without doing any actually productive work. An effective form of gambling, the speculative game has evolved to great sophistication in modern times, requiring knowledge and skill. But it does not produce a real increase in wealth, only a redistribution upward.

For the most part, the modern rich have acquired or increased their wealth by applying certain skills dignified as business acumen. In many cases it has been a matter of being in the right place at the right time, as in the dotcom and social media booms. Often, though, these are skills for milking the system more than for producing “genuine” wealth, or for providing services of dubious benefit. Let us suppose there is an objective quantity of real goods and services in the world—things that actually improve people’s lives. It is this objective value which, like gold, ultimately backs up the currency in circulation. Anything that is falsely counted as an increase in genuine wealth can only dilute that objective quantity of real value—which makes GNP misleading. Money represents spending power—the ability to exchange symbolic units for real goods and services. Ultimately money buys other people’s time and effort. Getting money to flow into one’s pocket from someone else’s pocket increases one’s spending power, but does not produce more real wealth. Nor does simply printing more money. It takes effort (human or machine) to produce value that any and all can use. However, some efforts produce things that only the parties doing them can use. Stealing is such an effort. So is war. So is a lot of financial gain that is considered return on investment.

Historically, return on capital to its owners has always exceeded the amount of real growth in the economy. That has always been its very reason for being. But, only real growth can be shared in such a way as to benefit humanity as a whole. That means an increase in the infrastructure of civilization, the common wealth of humanity, potentially enjoyed by all. Spending power is a more private matter. The surplus generated as return on capital accumulates in certain hands as the exclusive ability to access and command the benefit resulting from real growth. Spending power trickles upward because certain individuals have figured out how to make that happen. Modern capitalism, with its complex abstractions, is a sophisticated machine to pump spending power into particular coffers.
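
A back-of-envelope compounding sketch shows why that gap matters; the rates below are assumptions for illustration, not empirical estimates:

```python
# Illustrative rates only: when the return on capital (r) exceeds real
# economic growth (g), a fortune pulls away from the average wage by itself.
r, g = 0.05, 0.015
wealth, wage = 100.0, 1.0   # a fortune initially worth 100 years of the average wage

for _ in range(50):
    wealth *= 1 + r
    wage *= 1 + g

print(f"After 50 years the fortune is worth {wealth / wage:.0f} years of the average wage.")
# Roughly 545 years of wages, up from 100, with no productive effort by the owner.
```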

The distribution of wealth becomes the distribution of spending power; and the growing inequality means unequal access to the common resources of humanity. I do not mean only natural resources or the land, but also products of human enterprise (“improvements” as they are called). This includes things like buildings and art, but also technology and all that we find useful. Perhaps there is no intrinsic reason why there should be equal distribution of either property (capital, which is earning power) or spending/buying power. I am not advocating communism, which failed for the same reason that capitalism is faltering: selfish greed. The problem is not only that some players take advantage of others, which has always been the case. There is also a larger ecological fall-out of the game, which affects everyone and the whole earth. A lot of production is not rationally designed to better the human lot but only to gain advantage over others: widgets to make money rather than real improvement. Thoughtlessly, such production damages the biosphere and our future prospects.

There are ways to redistribute wealth more fairly, short of outright revolution, which never seems to be permanent anyway. Income tax can be made more steeply progressive. Assets and property of all sorts can be taxed progressively as well, so that wealth is redistributed to circulate more freely and does not accumulate in so few hands. There could be a guaranteed living, a guaranteed inheritance, and a guaranteed education for all. The wealthy should not dismiss these ideas as utopian. For, as things are, we are indeed on the historical track to violent revolution. Yet, along with redistribution of wealth as conventionally measured, we must also revolutionize our ideas about what constitutes real wealth—that is, what we truly value as improving life. Upon that our collective survival depends.

We would be far better off to think in terms of real value instead of spending power. However, those who seek personal gain above human betterment will no doubt continue to promote production that gives them what they opportunistically seek rather than what is objectively needed. Perhaps it is fortunate that much of modern economic activity does not involve material production at all. It may be a boon to the planet that humankind seems to be migrating to cyberspace. It may even be a boon that the pandemic has shut down much of the regular economy. What remains to be seen is how we put Humpty together again.

Is Mortality Necessary? Part Two: the demographic theory of senescence

In an earlier posting I raised the question of whether death is theoretically necessary for life—especially in a way that would thwart desires to extend healthful human longevity (“Is Mortality Necessary?” Aug 23, 2020). I pointed out that to answer this question would involve an evolutionary understanding of why mortality exists, and how the senescence of the multi-celled organism relates to that of its cells. In this second article I pursue these questions further, in relation to the prospect for life extension.

Evolution depends on natural selection, which depends on the passing of generations: in other words, on death. Each individual life is a trial, favoring those with traits better adapted to the current environment in a way that increases reproductive success. It seems counterintuitive that mortality itself could be an adaptive trait, though without it evolution by natural selection could never have taken place. However, the pace of natural evolution of the human species is less relevant now that our technological evolution exponentially outstrips it. Perhaps, then, evolution’s incidental shortcomings (mortality and aging of the individual) can and should be overcome. At the least, this would mean disabling or countering the mechanisms involved.

The history of culture amounts to a long quest to improve the human lot by managing environmental factors and creating a sheltering man-made world. (See my book, Second Nature, archived on this site.) This has been an accelerating process, so that most of the increase of human longevity has occurred only in the past couple of centuries, largely through medicine, sanitation, and technology. Infant and child mortality in particular had heavily weighted average lifespan downward for most of human existence. Their recent mitigation caused life “expectancy” to double from 40 to 80 or so years. Because it is a statistic, this is misleading. It is not about an individual lifespan, which (barring death by external causes) has not changed much over the centuries. Nevertheless, diseases of old age have replaced diseases of childhood as the principal threat to an individual life. These diseases may not be new, but formerly most people simply did not live long enough to die of them. Because they typically occur after reproductive age, there was no way for natural selection to weed them out. Heart disease, cancer, diabetes, Alzheimer’s, etc., are now deemed the principal causes of death. While these may be in part a product of our modern lifestyle, increasing vulnerability to such diseases is considered a marker of aging.
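
Since the doubling of life expectancy is a statistical artifact of averaging, a toy calculation can show how it works; the proportions below are invented for illustration:

```python
# Invented proportions: life "expectancy" at birth is an average over the whole
# cohort, so cutting child mortality can nearly double it even though the
# lifespan of those who survive childhood barely changes.
def life_expectancy(child_mortality, age_at_child_death=2, adult_lifespan=72):
    """Average age at death, given the fraction of the cohort dying in childhood."""
    return child_mortality * age_at_child_death + (1 - child_mortality) * adult_lifespan

print(life_expectancy(0.45))  # ~40.5 years when 45% die in early childhood
print(life_expectancy(0.01))  # ~71.3 years once child deaths are rare
```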

Yet, some scientists are coming to view aging itself as the primordial disease and cause of mortality. It was the triumph against external causes that resulted in the huge gain in the longevity statistic over the past century. However, to increase it much further would require eliminating internal causes. There has been a lot of research about particular mechanisms associated with aging, with optimism that these can be manipulated to increase an individual lifespan. Yet, there remains the possibility that aging is deeply built-in for evolutionary reasons. If so, aging might resist mere relief of its associated disease symptoms and might require different strategies to overcome.

The intuitive notion that the aged must die to make place for the young dates from antiquity. However, it goes against the modern idea that natural selection maximizes the fitness of the individual. (Shouldn’t a more fit individual live longer?) To make place for new generations, a built-in mechanism would be needed to cause the “fit” organism to senesce and die, if it had not already succumbed to external causes. This mechanism would have to prove advantageous to the species or to a sub-population, else it would not be selected. Such mechanisms have been identified at the cellular level (apoptosis, telomere shortening), but not at the higher level of the organism as an individual or a population. If there is some reason why individuals must die, it is not clear how this necessity relates to the cellular level, where these mechanisms do serve a purpose, or why some cells can reproduce indefinitely and some telomeres either do not shorten or else can be repaired.

Some creatures seem programmed to die soon after reproducing. Unlike the mayfly and the octopus, a human being can live on to reproduce several times and continue to live long after. Humans can have a post-reproductive life equal at least to the length of their reproductive stage. But the other side of that question is why the reproductive phase comes to an end at all. If evolution favored maximum proliferation of the species, shouldn’t individuals live longer in vigor to reproduce more? This gets to the heart of the question of why there might be an evolutionary reason for aging and mortality, which—though not favoring the interests of the individual—might favor the interests of a population.

The Demographic Theory of Senescence is the intriguing idea that built-in aging and mortality serve to stabilize population level and growth. (See the writings of Joshua Mitteldorf.) Without them, populations of predator and prey could fluctuate chaotically, driving one or both to extinction. Built-in senescence dampens runaway population growth in times of feast, without inhibiting survival in times of famine (brought on, for example, by overpopulation and resource depletion). In fact, individuals hunker down, eating and reproducing less and living longer under deprivation, to see a better day when reproduction makes more sense. In other words, mortality is a trait selected for at the group level, which can override selection against it at the individual level. Mortality also favors the genetic diversity needed to defend against ever-mutating disease and environmental change (the very function of sexual reproduction). For, a population dominated by immortals would disfavor new blood with better disease resistance. Yet, theorists have been reluctant to view senescence as a property that could be selected for, because it seems so contrary to the interests of the individual. In western society, sacrifice of individuality is a hard pill to swallow—even, apparently, for evolutionary theorists. Yet the whole history of life is founded on organisms giving up their individuality for the sake of cooperating toward the common good within a higher level of organization: mitochondria incorporated into cells, cells into differentiated tissue and organs, organs into organism, individuals into colonies and communities.
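
A minimal numerical sketch can illustrate the dampening effect. The model below is not Mitteldorf's; it is just the standard logistic map with an added per-generation mortality term standing in for built-in senescence, and all parameter values are arbitrary choices of mine:

```python
# Logistic-map toy model: a high intrinsic growth rate produces chaotic
# boom-and-bust swings; a fixed "senescent" mortality each generation lowers
# the effective growth rate and the same population settles to a steady level.
def simulate(r, senescent_mortality=0.0, generations=60, x0=0.2):
    """Population as a fraction of carrying capacity, one value per generation."""
    x, history = x0, []
    for _ in range(generations):
        x = r * x * (1 - x)              # density-dependent reproduction
        x *= 1 - senescent_mortality     # built-in aging/mortality, if any
        history.append(x)
    return history

chaotic = simulate(r=3.9)                            # no senescence: chaotic swings
damped = simulate(r=3.9, senescent_mortality=0.3)    # effective r = 2.73: stabilizes
print([round(x, 2) for x in chaotic[-6:]])
print([round(x, 2) for x in damped[-6:]])            # converges near 0.63
```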

Apoptosis (programmed cell death) has been documented in yeast cells, for example, as an “altruistic” adaptation during periods of food scarcity. In the multi-celled organism, it serves to sculpt development of the embryo, and later to prevent cancer, by pruning away certain cells. In the demographic theory, it is also a mechanism naturally selected to stabilize population. Since telomerase is available in the organism to repair telomeres as needed, the fact that it can be withheld might also be an adaptation to reduce life span for the sake of stabilizing population. Pleiotropy (when a gene serves multiple functions that are sometimes contradictory) is selected, then, because it promotes aging and thus acts against runaway population growth.

If this is the right way of looking at it, on the one hand specific mechanisms are selected that result in mortality as a benefit to populations though at a cost to individuals. It might be possible to counteract these mechanisms in such a way as to prolong individual life. On the other hand, following this path successfully would defeat the evolutionary function of built-in mortality unless intentional measures are taken to ensure genetic diversity and to control population growth. The lesson is that if we are to deliberately interfere with aging and mortality (as we are certainly trying to do), we must also deliberately do what is required in their place: limit human population while providing artificially for the health benefits of genetic diversity. If the goal is a sustainably stable population, people cannot be born at current rates when they are no longer dying at natural rates.

These are global political and ethical issues. Who would be allowed to reproduce? Who would be kept artificially alive? The production and rearing of children could be controlled, indeed could become a state or communal function, divorced from sexual intercourse. The elderly could be educated to let go of an unsatisfying existence—or be forced to. The diversity issue is a technological problem, which genetic medicine already proposes to engage; it too raises the unsavory prospect of eugenics. All this brings to mind the transhumanist project to assume total conscious control of human nature, evolution, and destiny.

At present, of course, we do not have the institutions required for humanity to take full charge of its future. Science and technology attempt to control nature for piecemeal, often short-sighted purposes, or as rear-guard actions (as we have experienced in the pandemic). But far more would be involved to wrest from nature control over evolution and total regulation of the earth’s biosphere (or of an artificial one somewhere else). Immortality would require a lot more information than we now have. To be sustainable, it would also require a level of organization of which humanity is currently incapable. It would need a selfless spirit of global cooperation to act objectively for the common good, which does not yet exist and perhaps never will. (Imagine instead a tyrannical dictator who could live forever!) Immortality would require a serious upgrading of our values. At the very least, it would require rethinking the individualism on which modern humanism and science itself were founded, let alone the consumer culture. Science, politics, religion, and philosophy will have to finally merge if we are to take charge of human destiny. (Plato may have grasped this, well ahead of our time.)

Ecological thinking is about populations rather than individuals. Communism was perverted by the societies that embraced it and rejected by the societies that cling to individual property rights—in both cases out of individualist self-obsession. In the current pandemic, however, we are forced to think of populations as well as individuals. Yet, the strategy so far has been oriented toward “saving lives at any cost.” If the demographic theory of senescence holds water, our social development will have to keep pace with strategies to increase lifespan if we are not to breed ourselves into oblivion. Individuals will have to voluntarily redefine personal identity and their relation to the collective. And, of course, their relationship to aging and death.

Why nature should be accorded the rights of personhood

Let us first understand what a person is and why nature is not regarded as a person in our culture. But, even before that, let us acknowledge that nature once was so regarded. Long before industrial society became obsessed with the metaphor of mechanism, ancient peoples conceived the world as a sort of living organism. It was peopled with creatures, plants, spirits, and other entities who were treated as personalities on a par with human beings. Then along came the scientific revolution.

Nature for us moderns is impersonal by definition. For us, physical things are inert, incapable of responding to us in the way that fellow human beings can. Our science defines nature as composed only of physical entities, not of moral agents with responsibilities and obligations. Humans have formed an exclusive club, which admits only members of our kind. Members are accorded all sorts of extravagant courtesies, such as human rights, equality before the law, the obligation to relieve (human) suffering and save life at any cost, the sentiment that (human) life is infinitely precious. We have appropriated half the earth as our clubhouse and grounds, expelling other inhabitants toward whom we feel no need to extend these courtesies. Having eliminated most of the formidable competitors who posed a threat and demanded respect, we treat the remaining creatures as no more than raw materials, to use and enslave as we please, or to consume as flesh.

We have done all this because we have been able to, with little thought for the rightness of such an attitude. In the Western hemisphere, our society was founded on domination of the peoples who occupied the land before the European invasion (the “discovery” of the New World). Significantly, it went without saying that this included the assumed right to dominate the other species as well. In other words, plundering these lands for their resources, and disregarding their native human inhabitants, went hand in hand to express an attitude that depersonalized both. The Conquest, as it has been called, was simultaneously about conquering people and conquering nature.

Probably this attitude started long before, perhaps with the domestication of wild animals and plants. The relationship to prey in the traditional hunt was fierce but respectful, a contest between worthy opponents, especially when large creatures were involved. (Can it be coincidence that losers of this contest are called “game”?) Often an apology of gratitude was made to the murdered creature, no part of whose valued carcass was wasted. When animals were enslaved and bred for human purposes, the tenor of the relationship changed from respect to control. The war on animals concluded in a regime of permanent occupation. A similar shift occurred when plants were no longer gathered but cultivated as crops. Both plants and animals were once personified and treated as the peers of humans. That relationship of awe gave way to a relationship of managing a resource. The growing numbers of the human invaders, and their lack of self-restraint, led eventually to the present breakdown of sustainable management.

Today we embrace a universal biological definition of human being, and a notion of universal human rights and personhood. Throughout history, however, this was hardly the inclusive category that it now is. Groups other than one’s own could be considered non-human, reduced to the status of chattel, or even literally hunted as prey. In other words, the distinction between person and thing remained ambiguous. Depersonalization is still a tactic used to justify mistreating people that some group considers undesirable, inferior, or threatening. Personhood is still a function of membership in the exclusive human Club. But since the qualifications for that membership were unclear throughout most of human history, perhaps it works both ways. Instead of excluding animals, plants, and nature at large, we could give them the benefit of the doubt to welcome them into our fellowship.

Now, what exactly is a person? In fact, there is no universally accepted legal or even philosophical definition. The word derives from the Latin term for the theatre mask (persona), which indicated the stage character as distinguished from the mere actor wearing it. It means literally the sound coming through the mask, implying a transcendent origin independent of the physical speaker. Personhood so understood is intimately tied to language and hence limited to human beings as the sole known users of “fully grammatical language.” Other creatures do obviously communicate, but not in a way that entitles them to membership in the Club.

Various features of personhood—loosely related to language use—include reason, consciousness, moral and legal responsibility, the ability to enter contractual relationships, etc. Personhood confers numerous legal rights in human society. These include owning property, making and enforcing contracts, the right to self-govern and to hire other agents to act on one’s behalf. It guarantees civil rights to life and liberty within human society. One could argue, on behalf of non-human creatures, that they meet some of these criteria for personhood. In many cases, like aboriginal people, they were here before us, occupying the lands and waterways. Western society is now judicially and morally preoccupied with historical treaties and land claims of aboriginal groups—and more generally with redressing social wrongs committed by former generations. Certainly, our exploitive relation to nature at large is among these wrongs, as is our checkered relation to particular species such as whales, and ecological entities such as forests, rivers, oceans and atmosphere. Thinking on the subject tends to remain human-centric: we are motivated primarily by threats to our survival arising from our own abusive practices. But such a self-serving attitude remains abusive, and morally ought to give way to a consideration of victims that are not yet extinct, including the planet as a whole. If restorative justice applies to human tribes, why not to animal tribes? Why not to nature at large?

Animals reason in their own ways. Although insects may be little more than automatons, vertebrates are clearly sentient; many display concern for their own kin or kind and sometimes for members of other species. While they do not make written contracts, at one time neither did people! Animals (and even plants) maintain measured relationships, not only among their own ranks but also with other species, in an intricate system of mutual symbiosis and benefit. These interdependencies could be regarded as informal contracts, much like those which human beings engage in quite apart from written contract and law. Deer, for example, tend to browse and not over-forage their food supply. Even viruses are reluctant to kill their hosts. It could be argued that human beings, in the course of substituting formal legalities, have lost this common sense and all restraint, and thereby forfeited their claims.

Of course, we now say that natural ecological balancing acts are not “intentional,” in the conscious human sense, but the result of “instinct” or “natural selection,” and ultimately of inanimate “causal processes.” This is a very narrow and human-centric understanding of intentionality. Fundamentally, it is circular reasoning: intending is what people do, in the human world, while causality is held to be the rule in nature. However, both intention and cause (and even the concept of ‘nature’) are human constructs. Causality is no more an inherent fact of nature than intention is absent from it. It is we humans who have imposed this duality, thereby seceding from the animal kingdom and from nature at large. Humans have formalized their observations of natural patterns as laws of nature, and formalized their own social conduct and conventions as laws of jurisprudence. Other creatures simply do what works for them within the system of nature without articulating it. Are we superior simply because we gossip among ourselves about what other creatures are doing?

Are we superior because might makes so-called rights? We chase other creatures (and fellow human beings) from a habitat and claim to own it. But private property, like money, is a legalistic convention within the Club that has no clear connection with any right of ownership outside of human society. “Ownership” in this broader sense is more deserved by those whose proven behavior adheres to the unwritten laws that maintain an ecosystem in balance with their fellow creatures. It is least deserved by those who destroy that balance, while ironically permitted to do so through written laws. If stewardship is a justification for aboriginal land claims, why not for any right of ownership of the planet? Almost any species has a better claim to stewardship than humans. After all, we have rather botched the unceded territory that the Club now occupies.

Nature is self-regulating by nature; it doesn’t need to be granted the right to self-govern by human beings. It cannot speak for itself in human language, but expresses its needs directly in its own condition. Human scientists appoint themselves to interpret those expressions, largely for human purposes. The State appoints a public defender to represent people who cannot afford counsel or who are deemed unable to defend themselves. So, why not public defenders of the planet? Their job would be to ensure, in the eyes of human beings, nature’s right to life, liberty, and well-being. That could only benefit us too, since it has never been more than a delusion that we can live segregated from the rest of nature.


An Objective Science of Economics?

Physicists and even biologists can pretend to an objective understanding of the phenomena they study. That is true to the degree that they can stand outside the systems concerned. Is that possible for economists? While the economist is perhaps more akin to the anthropologist, even the latter is traditionally an outsider to the cultural systems studied. The ubiquity of modern global capitalism poses the same dilemma to the economist now that the anthropologist will face when all indigenous cultures have been absorbed into modernity.

This has not stopped generations of economists from holding forth on the “true” nature of such concepts as value, wealth, capital, and growth. Biologists, and even anthropologists, have been able to analyze living systems in their own human terms—as distinct from terms proper to the living systems themselves. That is, scientists dwell in a world apart from the world they observe. No such separation is possible for economists, who inevitably think in the terms characteristic of their society, which is increasingly a monoculture alienated from nature.

Economic concepts reflect values current in modern capitalist society. Despite rival and contradictory theories, on some level this renders all economic theory fundamentally self-serving, even self-confirming. At worst it is misleading and deceptive, often embraced by governments to legitimize their policies. For example, the “trickle down” notion endorsed by Reagan and Thatcher was used to justify an unregulated market, on the promise that profit going to the elite would ultimately benefit society as a whole, if not equally. This, when it was already widely apparent that inequality of wealth was expanding far more rapidly than wealth itself! The tacit assumption is that no one should object to the accelerating concentration of wealth, provided their own situation is moderately improving or at least not getting worse. By hiding behind averages, however, economic analysis can obscure what people on the ground are actually experiencing, which in any case is subjective and includes things like envy and perceived unfairness. Concepts like ‘national income’, ‘GNP’, ‘total capital’ (private or public) are statistical aggregates that usefully reveal inequalities between countries or regions, and over time. Yet, they mask huge inequalities among individuals within a given country.

Life in consumer society has in many ways been visibly democratized. We are theoretically now equal before the law, each with a single vote. The modern upper class blends in with the rest of society more easily than traditional aristocrats, who flaunted their class status. The modern gated community is far less conspicuous than the castle on the hill. The Jaguar or Tesla is not as flagrant a status symbol as the gilded horse-drawn coach; it must travel on the same public highways as the rest of us and obey the same traffic laws. First class air travel is only moderately less uncomfortable than “economy” class; most of us have not seen the inside of a private jet and probably not the outside either. The extreme wealth of the very rich consists less of evident status symbols than invisible stock portfolios and bank accounts. Yet, while the modern rich may be unobtrusive, they are increasingly powerful. And the poor are increasingly numerous, obvious, and disenfranchised.

Most of economics since Marx has been little more than an apology for systemic inequality. One reason for this may be that “the economy” is as complex as human beings themselves. The stock market and modern investment “instruments” are accordingly abstruse, with those in the know privileged to gain from them. It is little wonder that the theories devised by economists (mostly academics) have been so esoteric. But this makes it all the harder, even for them, to call a spade a spade or to declare the emperor nude. Yet, millions of people are possessed of enough common sense to see that the rich are getting richer and the poor even poorer for reasons that can hardly be historical accident. The gains of the middle class in the mid-twentieth century are quickly eroding. Their very assets (such as mutual or pension funds) are being used against them to line the pockets of corporate executives. The modern capitalist system has been honed by those who control it to siphon wealth from the now shriveling middle class. This can hardly be surprising, since—to put it bluntly—the purpose of capital has always been to create inequality! What is surprising is how efficient it has become in the last few decades at undoing the hard-won gains of the middle class.

Modern economists may or may not be capitalist dupes, and the top-heavy capitalist system itself may or may not be doomed to collapse. But is there another role possible for a science of economics? The traditional focus of economic theory has been anthropocentric by definition. It is about the production and distribution of wealth in very human terms, and in many cases from the perspective of the ruling class. In particular, it does not consider business and growth from a planetary point of view, in terms of its effects on nature. Such a perspective would be nearly a negative image of the human one: all that is considered an asset in human terms is largely a deficit in the natural economy. This deficit has become so large that it can no longer be ignored by the human economy; it threatens to break the bank of nature. If an objective economics is possible at all, it would at the least have to encompass the planet as the true stage upon which all economic concepts must be redefined to play out. By this I do not mean carbon credits or assigning a market value to natural resources or to services provided by nature, such as recycling air and water. I mean rather a complete shift from the human perspective to that of nature. From that perspective, what would have value would be services provided to nature by human beings, rather than the other way around.

No doubt, from the natural point of view, the greatest service we could perform would be to simply disappear—gracefully, of course, without wreaking further havoc on the planet in the process! I don’t think people are going to agree to this, but they may ultimately be forced to concede at least some rights to the rest of the biosphere as a co-participant in the game. A more objective economics would at least give parity to human and natural interests and would focus on the interaction between them. Concepts like “value” and “growth” would shift accordingly. In human economics, value has been assigned a variety of meanings, reflecting the daunting convolutions of the human world in which economists are perforce enmeshed: labor-value, use-value, exchange-value, surplus value, market value, etc. Yet, an economics that focuses equally on the needs of nature could simplify matters by moving the human player from center stage. It could become an actual science at last, insofar as it would study the world independent of human values, like other sciences are supposed to do.

The various anthropocentric concepts of value could fold into a single parameter representing an optimum that maximizes both human and natural interests. This approach is used in game theory, where contending human players vie strategically, and the optimum looks like a compromise among opponents trying to maximize their individual “utility” and minimize their losses. Why not consider nature a player in the game? And why not consider strategies based on cooperation rather than competition? Suppose the aim is to arrive at the best possible world for all concerned, including other species and the planet as a whole. Value then would be measured by how much something advances toward that common goal, rather than how much it advances individual human players at the expense of others and nature. Growth would be the increase of overall well-being.
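
As a toy of what that shift in the objective function means, consider the sketch below; the strategies and payoff numbers are invented purely for illustration:

```python
# Invented payoffs: the strategy that maximizes the human payoff alone differs
# from the one that maximizes the joint payoff once nature counts as a player.
strategies = {
    # strategy: (payoff to humans, payoff to nature)
    "aggressive extraction": (10, -8),
    "sustainable production": (6, 2),
}

best_for_humans = max(strategies, key=lambda s: strategies[s][0])
best_for_all = max(strategies, key=lambda s: sum(strategies[s]))

print("Human-only optimum:", best_for_humans)   # aggressive extraction (10)
print("Joint optimum:     ", best_for_all)      # sustainable production (6 + 2 = 8 > 2)
```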

Money in such a world would be a measure of duties to perform rather than privileges to exercise. It is often said that money confers freedom to its possessor. In an objective economics, this freedom would be put in the context of how it affects other beings, human and non-human. The value of a “dollar” would be the net entitlement it represents: its power to benefit its holder minus its power to damage the environment, which includes other beings. Put more positively, money would measure the power to contribute to the general welfare—not the power to consume but the power to do good. The individual would benefit only statistically and indirectly from spending it, as a member of an improved world. As in prehistory, the individual would again figure more as an element of a community than as a competitor with other individuals. Admittedly, this would be communism—perhaps on steroids. But communism as we’ve known it is passé, both as a failed social system and as a dirty word in the capitalist lexicon. There are many who wish to continue to believe in the opportunism of modern capitalism, bumbling toward ecological and social collapse. But they are obliged to find a new scare word to castigate the threat to their individual liberty, which is little more than freedom to hasten the destruction of the planet.
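
To make the arithmetic of that definition explicit, here is a minimal sketch; the function and the example figures are hypothetical, simply restating benefit minus damage:

```python
# Hypothetical figures: the "net entitlement" of a unit of money is the benefit
# its spending delivers minus the damage that spending does to others and nature.
def net_value(benefit_to_holder, damage_to_others_and_nature):
    return benefit_to_holder - damage_to_others_and_nature

print(net_value(1.0, 0.1))   # a dollar spent benignly: net value 0.9
print(net_value(1.0, 1.4))   # a dollar spent destructively: net value -0.4
```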

The Problem with Money

Capitalism is the practice of using wealth to collect interest or rent. That means that those who possess capital can have an income not from their labor but from their property. It’s important to grasp that “real” productivity comes from work that provides a service or transforms materials into something humanly useful—whether the exertion comes from muscles or from machines, and whether the materials are physical or more intangible, such as concepts. Income from capital means more spending power, but not necessarily more productivity. In other words, your spending power can grow on its own, but only your labor can make real value. While capital can increase your share of total wealth, it cannot by itself increase the supply of goods in the world that have real value.

Some economists consider it a problem when global capital is not productive enough. Being super-rich already, its owners may not be motivated to use it as effectively as someone with less means. However, productive of what? Effective toward what end? In the face of ecological disaster, production that results in pollution, exhaustion of non-renewable resources, and global warming is not objectively desirable. Consumption that requires such production should be discouraged. Money sitting idle is at least money not being used to destroy the planet!

An individual is both producer and consumer, according to how their personal wealth and energy are disposed. The extremely rich have excessive spending power as consumers; yet, only so much personal consumption can be directly satisfying. (A person can only sleep in one bed at a time, drive one car, fly one plane, eat one meal, etc.) Spending power represents choice for the wealthy more than direct consumption. Of course, they may cause more environmental fallout in their quest to establish that range of choice (several houses around the world, several cars, private jets and boats, etc.). Yet, it could be telling to imagine the cumulative effect of the same total spending power if it were distributed over many poorer individuals, who might be more motivated to use their comparatively meager resources as capital to further increase their wealth. Would the redistribution result in more or less global warming? In other words, might the capital of a large middle class have a worse ecological effect than the same capital in the hands of a small super-rich elite? If so, would it not be ecologically better to maintain the unequal distribution? That is a big “if,” and I am not justifying inequality, which many perceive as the world’s foremost social problem. Rather, I want to point out that merely redistributing wealth in the name of fairness would not solve our global ecological challenges. Productivity and growth must be altogether redefined, in such a way as to eliminate production detrimental to humanity’s long-term interests.

Laissez-faire means one can buy what one wants if it is available and produce what one wants in the hopes someone will buy it. But nature is no longer going to let us do just what we want. Government could intervene to limit production to certain “goods” that are indeed good (or less bad) for our human future. It could also limit what can be purchased, by specifying what credits can be used for. While libertarians would object to such impositions, they are free to embrace and advocate voluntary restraint. One could argue, under the circumstances, that one is morally obliged to do so, since the other side of freedom is responsibility. In effect, we turn to government to make us do what we know is necessary, but which we resist—in part because we fear others will not do their fair share. The job of government is to impose fairness along with order, like a parent who must deal with squabbling children.

The whole advantage of money is that it can be used for any purpose that others accept, regardless of who it belongs to or where or in what form it is stored. But, what if the dollar were not this anonymous and abstract unit of power, but could be used only for specified purposes—for example, for basic necessities or for “green” projects? This is entirely feasible in the digital economy. A unit of wealth could be spendable only on certain products or services, or used to finance only certain projects or activities. Each dollar (or euro or yen) would have its own earmarked identity, could only be used for its designated purpose, and would be a non-transferable credit. No doubt a black market would arise to get around such restrictions. But that loophole effectively already exists, in the form of untaxed offshore accounts and other unfair advantages for the ultra-rich.
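As a rough sketch of the feasibility claim (hypothetical Python code, not any existing payment system), an earmarked unit of wealth might be represented as a record that carries its designated purpose and simply refuses any other use:

    # Hypothetical sketch of an earmarked, non-transferable digital credit.
    from dataclasses import dataclass

    @dataclass
    class EarmarkedCredit:
        amount: float
        purpose: str      # e.g. "basic_necessities" or "green_project"
        holder: str
        spent: bool = False

        def spend(self, category: str) -> bool:
            # The credit can be used once, and only for its designated purpose.
            if self.spent or category != self.purpose:
                return False
            self.spent = True
            return True

    credit = EarmarkedCredit(amount=100.0, purpose="green_project", holder="alice")
    print(credit.spend("luxury_goods"))    # False – not the earmarked purpose
    print(credit.spend("green_project"))   # True  – accepted
    print(credit.spend("green_project"))   # False – already used, cannot circulate further

The enforcement problem, of course, lies not in the data structure but in the institutions around it.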

If total global wealth were distributed evenly, an individual typically might not have enough savings to finance an enterprise beyond a certain scale. And where wealth is distributed unevenly (as it almost always is), the vast majority will not have the needed resources—which is why they borrow from those who do. This is a fundamental unresolved problem of social organization. At every level of community, people have always needed to pool resources to get some things done. When a few control the lion’s share of resources, however, the temptation is to charge the community for their use. From the community’s point of view, this is a form of extortion. “Usury” was once forbidden by religions for good reason. We have gotten used to it, so that it seems normal and necessary, even if unfair. The result is a mounting burden of private and public debt, with crippling interest payments and skyrocketing rents and real estate prices.

If 70% of global wealth belongs to 10% of global population, then those wealthy individuals are 70% responsible for how the world’s resources are used. The fate of the planet lies proportionately in their hands. Quite apart from the question of redistribution, fairness dictates that the greatest pressure to reform should fall upon them through taxation and legislation. But rather than being an incentive to use capital more “efficiently,” these measures should be used to foster production that does not damage the biosphere. The rich could take the lead in setting an example concerning the sort of investment to make. Rather than simply reducing inequality of wealth, the focus should also be to reduce its environmental effects.

To reduce the environmental impact of production, manufactured products must be designed to last for generations. They must satisfy real needs. Public and private property must be thought of as a patrimony to be passed on from generation to generation. Nothing must be produced for the sake of making money—that is, merely to establish or justify one’s slice of the economic pie. On the contrary, a basic living should be everyone’s right! An economy is a game with many players, in which each person’s share is decided and justified. Yet, the game is not only about the individual’s winnings in a contest with others. For, every society is a collective enterprise, in which something is achieved beyond the holdings of the individual. The human enterprise as a whole is far more cooperative than competitive.

Granted that we need some material stuff, it should at least be durable and of high quality. However, since the beginning of the industrial revolution, production has only indirectly been aimed at satisfying real human need. The immediate goal has been to increase the investor’s share of the economic pie: make more widgets to get more money. It was never to make lasting stuff, or stuff so excellent that it didn’t invite an improved version. Quite the contrary, it didn’t take long for manufacturers to discover built-in obsolescence. With the development of plastics, durability and quality for the most part went out the window. The real value of “goods” (that is, their ability to satisfy genuine human need) was displaced by their symbolic or token value (that is, as a pretext to make money). The irony is that the quality of available goods was lowered for the rich along with everyone else. Spending power is thereby diluted for everyone in terms of its ability to purchase things and services of real value. As always, in compensation, the ultra-rich have sustained a cadre of artisans who can still produce the quality they alone can now afford. Otherwise, what would they have to show for their wealth?

It is largely the production of material goods that results in the carbon footprint. To some extent, the modern economy is already being de-materialized by information technologies. But Man cannot live by information alone. Material goods remain the gold standard of economic value, because we remain physical beings with material needs. In some sense, de-materializing the economy inflates the unit of value and slows “real” growth. And that growth must slow down for the sake of the planet. Yet low growth has a counterintuitive consequence in addition to the obvious belt-tightening for all: it increases inequality. Low growth increases inequality because the rate of return on capital (possessed disproportionately by the rich) has always been higher than the rate of growth (produced by everyone collectively). This has mattered less in exceptional times of high growth, because there is then a surplus to benefit everyone (so-called trickle-down) even though the rich benefit more. But a high growth rate is neither historically normal nor ecologically sustainable. We have to find a way of life that is both fair and minimal in its planetary impact.
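The mechanism is easy to see in a toy simulation (Python, with invented numbers chosen only for illustration): if the return on capital exceeds the growth rate, the capital-owning minority’s share of total wealth climbs steadily, whatever the starting point.

    # Toy simulation of "return on capital > growth rate". All figures are hypothetical.
    r = 0.05    # assumed annual return on capital
    g = 0.015   # assumed annual growth rate of the rest of the economy

    capital_owners = 30.0   # initial wealth of the capital-owning minority
    everyone_else = 70.0    # initial wealth of everyone else, growing only at g

    for year in range(0, 51, 10):
        rich = capital_owners * (1 + r) ** year
        rest = everyone_else * (1 + g) ** year
        share = rich / (rich + rest)
        # share rises from 30% at year 0 to roughly 70% by year 50
        print(f"year {year:2d}: capital owners' share of total wealth = {share:.0%}")

In the toy model, only a growth rate that matches the return on capital, or an outright transfer, halts the drift.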

 

DAN’S NEW BOOK: Holy Terror and the Beauty of It All

This is to announce the launching of my latest book, Holy Terror and the Beauty of It All: how to live with existential anxiety. The text can be downloaded at no cost from the Archive: Collected Writings. (Note: the first two pages are intentionally blank—scroll down for the title page and text.) Or, if you like to hold a book in hand, a physical copy can be ordered from this link: https://store.canambooks.com/holy-terror-and-the-beauty-of-it-all-how-to-live-with-existential-anxiety.html

From the back-cover blurb:

Human beings have always felt insecure in the world. Despite confident proclamations by science and religion, no one can be absolutely certain what is really going on here in this drama we call existence. In contrast to the eternity and boundlessness intimated in our consciousness, we are haunted by the realization that we are finite, vulnerable, mortal—and perhaps meaningless—creatures. The ambiguity of all experience leaves us in a state of fundamental uncertainty, with a buried anxiety underlined by fear of mortality.

Holy Terror and the Beauty of It All is a reflection on the human condition. The first part of the book explores personal and cultural defenses against existential anxiety. The second part presents a theory of consciousness as a guided hallucination created by the brain. The third part proposes a stance of appreciation as an antidote to anxiety: the other side of “holy terror” is “the beauty of it all.” The ability to consider consciousness as a personal creation, rather than a window on the world, enables us to appreciate experience for its own sake, despite uncertainty. It is the basis of art, and also of tolerance, responsibility, and the creative ability to think, perceive, and act outside the box. Available at all times, this stance can provide special consolation in old age and in situations of deprivation or despair. The book concludes with a discussion of the complementary roles of belief and doubt, reaffirming the value of standing back to appreciate the grandeur of the whole. A Postscript follows up the topical relevance of these ideas in the age of pandemics and fake news.

Uncertainty is nothing new

Much is made these days of the uncertain times we live in. In particular, we seem to be in the midst of an information crisis. Paradoxically, a glut of available words, images, and numbers floods daily life—from news, social media, websites, etc. At the same time, there is growing distrust of traditional sources of information. The two are clearly related. The internet provides instant and easy access to facts and opinions that formerly required a lot more trouble to encounter. Specialized experts used to be trusted to establish knowledge and to locate and vet sources for reliability on our behalf. Scientists and academics, medical professionals, librarians and editors, critics and censors, peer reviewers, journalists, newscasters, and professional writers of all sorts mediated information for us, advising us of what was worthy of consideration. The new glut of unmediated voices is a modern tower of Babel that confounds us and creates an atmosphere of suspicion. For, the sheer abundance of choice gives the deceptive impression that all claims might be of equal merit, that everything boils down to opinion. Even before the advent of social media, the religious right had popularized this notion with its platform that Creationism should be taught in public schools as a legitimate alternative to Darwinism. And then, perhaps, consumer advertising conditioned us to think of ideas less for their content than for their packaging.

The political climate has indeed dovetailed with religion, and overlapped with suspicion, at least in the United States. But there is a long history of the struggle of secular and religious ideas leading up to the present situation. There was a similar information crisis when the printing press changed European society forever, just as the digital revolution is now doing worldwide. The printed word was instrumental in the Reformation, prior to which a literate elite controlled the flow of information. The distribution of pamphlets and translation of the Bible into common language meant that literacy increased and lay people could study and evaluate the written basis of their ideology. And that meant that the prevailing order could be and was questioned. In particular, one could see for oneself that the Church did not follow the teachings of the scripture upon which it was supposedly founded. Expanding literacy and availability of the printed word led to a crisis of faith in authority and the keepers of knowledge. Opinions were multiplied, much as in the present crisis. The burden of interpreting the truth then fell upon the common citizen, as it now does once again. What to believe was dangerously up for grabs. The stakes were high in the Reformation period, because one’s eternal spiritual fate hinged on what one believed, and in many cases one’s physical safety did too. As now, this created deep divisions in society, with tensions that could (and did) erupt into violence.

It is no coincidence that the major religions came into being at a time when relatively isolated societies began to come into more frequent contact with each other. Ethical precepts emerged to govern the relation among individuals newly recognized as members of an expanded human tribe. That is, religion advised how to treat one’s “neighbor” in a broader sense, when traditional identity within a social (often racial) group was confused by the intrusion of outsiders. The forceful imposition of religion upon “pagan” or “infidel” groups in conquest could be viewed as a desperate measure to assimilate them into one’s own society. Unfortunately, the converted were typically never more than second-class citizens. Often they were literal slaves, in ironic contradiction to the ethical teachings of the religion concerned, which were applied only to those fully recognized as the group’s own kind. Such hypocrisy, when perceived, has frequently been a stimulus to reform. On the other hand, tribal identity has always shaped our behavior toward others, despite law and religious ethics.

In post-Civil War America, a large racially distinct population of slaves had suddenly to be assimilated into the citizenry. In post-colonial Europe, people with an ambiguous status as subjects of the former empire began to infiltrate the home society in a kind of reverse osmosis. Though the U.S. was scarcely a literal colonial power, economically it certainly has been one. Many from its virtual empire have returned to roost within U.S. borders—especially from Mexico, much of whose early territory had been stolen by the U.S. in the first place. Most recently, an influx of political and economic refugees from Africa to Europe constitutes a global migration that will only be accentuated by climate change. In all these cases, ethics has largely failed to guide society with a universal understanding of how to behave toward one’s fellows, according to a more inclusive definition of fellowship (the lesson of the Good Samaritan).

The struggles of the Reformation against the corruption of the Church are mirrored in the struggles of popular movements against the corruption of ostensibly democratic government, which is really oligarchy. Such movements can be left- or right-leaning, with little recognition of their common ground. (The left-leaning movements tend to be secularist, while the right-leaning ones tend to be fundamentalist.) Humanists forget the origins of humanism in religious values; nominal Christians forget Christ’s teaching of love and humility. This is only to say that we have been here before. During the Reformation, as again now, society became deeply divided because the basis for a common understanding had collapsed. That is a dangerous situation, as history has shown, because without a common understanding there might be no common acceptance of authority; and without that, no rule of law.

However, it is not uncertainty itself that is the problem. Life has always been uncertain, for all creatures. (Imagine how the dinosaurs would have felt if they had understood astronomy and the real catastrophe bearing down on them from outer space!) The problem is our low tolerance for uncertainty, which translates as the need to believe. That is: the need to believe something—anything at all—even when it lacks a basis in fact. The need for certainty is the root of conspiracy theories and the dividedness that now besets America and many places in the world. For, the very notion of fact presumes a common ground of which we can all be assured, a common recognition of what qualifies as reality. The problem chases its own tail: without a common understanding, there can be no facts, and hence no agreement, and hence no certainty.

That situation is not new either. It was the situation on the eve of the scientific revolution, which displaced medieval scholasticism. Fact was the missing ingredient in scholastic argument, which drew mostly on hearsay and referred only to the writings and claims of other thinkers—never to actual observations of the natural world. The human world had enveloped itself in its own closed realms of circular reasoning. In those realms, imagination had free sway, unguided by reality, at the same time that it purported to be an account of reality. It was therefore a hotbed of contentious nitpicking. Science broke through this bubble of self-confirming opinion by insisting that nature had to be consulted. Nature lay outside the human world, as the stage upon which the human drama was played out. It is the literal common ground of all creatures, the common basis for fact. Unlike scholastic philosophers, scientists could come to agreement because nature was the ultimate authority, the arbiter of disputes.

The current rejection of the authority of experts includes suspicion of science—all the more when scientists appear to be “for hire” to support positions on issues outside science. All the more when the technology brought to us by experts seems also to be ruining the world! In throwing the baby out with the bathwater, however, we revert to the medieval ethos of contention without a grounding in the possibility of mutually recognized fact. In part, this may reflect the migration of humanity away from contact with nature to live in urban environments, which has recreated a new self-enclosed cultural bubble. The concept of the wild has lost all meaning for most of humanity. The human transformation of the planet is now so extensive that there hardly seems a nature left for science to study. Scientific concepts and theories have become so abstruse, and the experimental evidence on which they are based so tenuous and esoteric, that they seem to resemble scholastic arguments. In other words, they are perceived as little more than opinions.

Once again, everything seems up for grabs in a free-for-all of opinion. In some ways, we’ve come full circle since the days of the Renaissance. Though we seem to be sinking back into a morass of blind faith and superstition, perhaps this means that we are on the threshold of a new Reformation, and a new rebirth of thought. As nature was the common ground upon which science first arose, perhaps it will be again the basis for a more unified world, as its demands on us become more urgent, unmistakable, and indisputable. In the first Renaissance, the early scientists called upon nature to provide a common understanding for human benefit. Now nature is calling upon us to come together for nature’s benefit. One hopes for the best.

Tragedies of the Internet commons

A specter is haunting America. It is the culture of divisiveness and hate fostered by social media. The combination of ubiquitous personal devices with social media platforms has precipitated an epidemic: a new opiate of the people. The internet as a whole started out as a commons—the Information Highway—only to become a plundered landscape, polluted by advertising and trivialized by misuse. Beginning as a great boon to the common welfare, it has degenerated into a weapon for mischief makers, a mere marketplace, or a new entertainment.

Social media platforms may rightly be held accountable for nefarious consequences to society of their “services,” which include the potential to shape public discourse and even derail elections. But the user, too, is accountable. While it is easy to blame technology and successful corporations for society’s woes, here I want to explore the internet user’s responsibility. After all, it is consumers who enable consumerism. The domination of the world by large corporations is only possible because people buy their products. Just so, social media are only successful because people misuse and overuse them—in particular, to relieve frustration and boredom. Commercial advertising has literally filled cyberspace. But, advertising is only profitable because it is presumed to work. Underlying the consumer mindset is a kind of passivity and emptiness, a lack of proactivity and self-possession. The first simple truth I propose is this: those who are not driven sensibly from within will be driven senselessly from without. Social media have stepped in to fill a vacuum of will.

Teens seem to be particularly vulnerable to exploitation by social media. That is because they are by definition not yet completely formed. They may not yet know fully who they are or what they value and intend in their lives and in the world. That is as it should be in a period of experimentation and self-discovery. Education is supposed to help them discover who they are and to provide the thinking skills to find, evaluate, and use information. It is very challenging for educators when students cannot focus because mobile phones dominate their attention. All the more when their attention span has shriveled because they had the misfortune to grow up in the digital society. Exploiting the vulnerability of the under-aged is child abuse. Exploiting the vulnerabilities of those who are old enough to know better may be legal but is unethical. It may have begun with good intentions; yet, it points back at the exploiters as sociopaths, who are either unable to properly foresee the consequences of their actions or simply don’t care. Good intention is not sufficient to guarantee a good result.

A second simple truth is this: Information is of little positive use to those with no clear intention of their own. At best it is entertainment, distraction, noise. At worst, it means giving oneself over to be manipulated. If you are not pursuing your own goals (which might include the deliberate goal of having no goal), chances are you are being used to further someone else’s. There is no excuse for that state of affairs in the adult of the species. But there is a name for it. Anomie is “a condition of instability resulting from a breakdown of standards and values or from a lack of purpose or ideals.” It is the vacuum inside, which makes a society prey to demagoguery and media manipulation.

While America has always been divided (it fought a bloody civil war), it has not always been so vacuous. Just as education has been disrupted by social media and an atrophied attention span, so has the political culture. Neil Postman notes that people who gathered at the Lincoln-Douglas debates stood listening attentively for hours. They would adjourn for dinner and come back for hours more of political debate. Media now typically assume that attention can be maintained for seconds only. But attention is relative to intention. If you are seeking information for your own purposes (such as to know whom to vote for), you will have (or you will acquire) the discipline and patience necessary to accomplish your goal. If you have no goal, your mind will likely wander until it is grabbed by the most seductive bauble, the most quotable tweet or persistent ad—or by anything at all simply to fill the void.

Here is a third simple truth: even a conscientious tool user may be raw material for someone else. In our high-tech society, we love our tools and apps. Though they may seem to be free of charge, however, there are hidden costs. (As critics of social media are fond of pointing out, we are the resource they are mining.) If your data profile is commercially worth something to someone, it is because they believe they can influence you in some way, usually to buy something. In other words, they hope you have no will of your own! If you know your own mind, you know what you need and will actively shop for it of your own accord without solicitation (though you can still be influenced by misinformation and how your access to information is controlled). You also know what you don’t need and you will not be a victim of irrelevant temptation or gossip. Advertising and other attempts to manipulate people work best on those who are unclear about what they need and don’t need. The more intentional you are about your tools and your goals, the less likely you are to be (even literally) sold a bill of goods.

And here is a fourth simple truth: profit is no justification. The bottom line is no basis on which to run a decent society. Corporate profit comes down to personal gain of executives and shareholders. But personal gain, personal status, or personal influence is simply a poor reason to do anything at all—even commerce. A better reason (yes, even in business) is to promote the general good—to make the world an objectively better place. Since the “world” is all of us, this can only be attempted in respectful dialogue and cooperation with others. This is the common truth that America is missing in its infatuation with individualism and freedom of expression.

The internet began as a giant show-and-tell, where people could share information with others for their mutual benefit. It didn’t take long for it to degenerate into a marketplace. Just as nature was once a commons, in principle shared by all life, the internet started out as a commons for non-commercial use. There were infrastructure costs involved from the outset, of course, and ongoing maintenance, which we continue to pay as fees for internet access, just as we pay taxes to maintain our roadways and parks.

We enjoy the view along the Internet Highway better without the clutter of billboards. The Web was (and should be again) a non-commercial public utility, free of advertising. It is only greed that saw it as a New World to pillage, the biggest post-industrial resource. As with conventional mining operations, there is pollution and environmental destruction. In the case of the internet, the groundwater of useful information is poisoned by self-interest. Ironically, most web sites are frustrating to visit because of the intrusion of annoying ads and notices about “cookies” for the purpose of collecting your supposedly valuable data. One would think the increasing saturation by ads would mean diminishing returns, as the visual clutter renders websites unusable.

One would also think that the atmosphere of invective in social media would turn people off to their use. The ethos of goodwill and confidence is destroyed by vituperation and unsupported claims. Once again the tree of knowledge has been violated—and this time not because it was forbidden—quite the contrary. Again, we have been expelled from a paradise, yet it is the same old serpent at work.

Those of us who know how to find and share information may simply refuse to use for-profit social media platforms or search engines that are biased or track and sell our data. We may avoid websites cluttered with distracting ads. We may refuse to frequent online podiums for venom and hate. Beyond these protests of refusal, there may be some creative and well-motivated souls who will invent new formats for communication that reassert the ideal of the friendly non-commercial internet commons. Indeed, some of the existing platforms were founded on such ideals, which did not prevent them from being eventually corrupted.

So, here is a fifth simple truth: evil always has the advantage in its contest with good. This is because it is a social phenomenon: as the adage goes, a rotten apple can spoil the barrel, whereas a good apple can hardly preserve it. Those who do not know who they are, or what they intend, may conclude that they may as well join the rotten apples, who (as another adage tells us) may seem unbeatable. Those who do know themselves will persist, despite unfavorable odds, to reinvent a better world, because that is the only thing really worth doing.