Left to our own devices

Social media are paradoxically anti-social. A common scene goes like this: a group of people around the dinner table ignore each other, having wandered off into separate worlds on their respective “devices.” Even before the pandemic, Facebook was displacing face-to-face contact. Ironically, modern connectivity is disconnecting us. How can this be?

While this phenomenon is an aspect of modern technology, perhaps it all began with theater and the willing suspension of disbelief. On the stage are flesh-and-blood actors, but the characters they play belong to another realm, another time and place, which draws attention away from the immediate surroundings. The story they tell is not the here and now of the theater hall, shared with the others in the audience. Thus, we enjoy a dual focus. Theater morphed into cinema, which morphed into the pixelated screen. But the situation is the same: there is a choice between worlds. One attends either to the proximal here-and-now reality that includes the physical interface (iPhone, TV, computer) or to the distal reality transmitted through it. Even the telephone presents a similar alternative. Is the locus of reality my ear pressed against the receiver, or is it the person speaking on the other end? In a real-time conversation, one focuses on the person at the other end as though they were nearby, which is a sort of hallucination. With the digital device, the distal focus involved in texting chronically engulfs the participants in a sort of online role-playing game that has little to do with the here and now.

Our own senses offer a parallel choice. For our natural senses are the original “personal devices.” Like a laptop or mobile phone, they serve as an interface with the world. Yet they can be used for other purposes. Our sensations monitor and present a reality beyond the nervous system; yet we have the ability, and the option, to pay attention to the sensations themselves. We can focus on the interface or on the reality it is supposed to transmit. We can thus entertain ourselves with our own sensory input. Even the visual field can be regarded either as a sort of proximal screen one observes or as a transparent window on the real world beyond. Hence the fascination of painting as a pure visual experience, whether or not it represents a real scene. Hence the even greater fascination of the cinema. I suggest it is this ambiguity, built into perception, that renders us especially susceptible to the addictiveness of screens.

Modern technology provides the current metaphor for understanding how perception works. The brain computes for us a virtual reality to represent the external world. In that metaphor, the world appears to us as a “show” on the “screen” of consciousness. The participatory nature of perception underlined by the metaphor gives us some control over, and responsibility for, what we perceive. This has a social advantage. Subjectivity is a capacity that evolved to help us get along in the real world, by letting us realize that we are active co-producers of this “show,” not just passive bystanders looking through a transparent window. Yet, by its very nature, this realization permits us to focus on the screen rather than the supposed reality it transmits. This freedom, afforded by self-consciousness, is a defining human ability. But, like all freedoms, it can be misused. We turn away from reality at our own risk.

There is a difference between the senses as information channels and digital devices as information channels. The senses are naturally oriented outward. We survive by paying attention to the external world, and natural selection guarantees that only those organisms survive that get reliable information about their environment. There is little guarantee, however, that electronic channels give us reliable information. They can, of course, but that depends on how they are used and on who actually controls them. One can use a cell phone to contact someone for a specific purpose, just as one can use a computer to seek important information. But a mobile “phone” is not just a cordless telephone whose advantage is convenience; it functions more as pocket entertainment. The computer screen, too, is as likely to be used for video games or other entertainment that has nothing to do with reality. Even “googling” stuff online is sometimes no more than trivial entertainment.

Worse, digital devices are not part of our natural equipment, and thus not under our control the way the parts of our own bodies are. Back in the 17th century, Descartes was perhaps the first to recognize this dilemma. He realized that the information flowing into the brain could be faked, even if it came through the natural senses, should someone malevolent be in a position to tamper with the nervous system. If your sensory input could be manipulated, then so could your experience and behavior. He reassured himself that God (today we might say nature) would not permit such deception by the natural senses. But there is no such guarantee against deception by those who control the information that comes to us through our artificial senses: our external devices.

In the ancestral environment of the wild, being distracted by your experience as a form of self-entertainment would have been frivolous, even lethal, if at that moment you needed to pay close attention to the predator stalking you or the prey you were stalking. (Daydreaming, you could miss your meal or become the meal!) People then had designated times and venues for enjoying their imagination and subjectivity, such as storytelling around the campfire. In the modern age we have blurred those boundaries. We now carry an instant “campfire” around in our pockets. While mobile phones would have been highly useful for coordinating the hunt, as currently used they would have dangerously disrupted it. They could also have been used by a malicious tribe to steer you away from your quarry, or over a cliff.

Of course, we no longer live in the wild but in man-made environments, where traditional threats have mostly been eliminated (no saber-toothed tigers roaming the streets). We can now enjoy our entertainments at the push of a button. One can further argue that the human world has always been primarily social and that social media are merely the modern extension of an ancient connectivity. However, external threats have hardly ceased. (If they had, there would be no basis for interest in the “news.”) The modern world may be an artificial environment, a sort of virtual reality in which we wish to wander freely; but it is still immersed in the natural order, which still holds over us the power of life and death. It may seem to us, at leisure in the well-off parts of the world, that no distinction is really needed anymore between the virtual and the real. But that will never be the case. Even if hordes of transhumanists or gamers migrate to live in cyberspace, ignoring political realities, the computers and energy required to maintain the illusion will continue to exist in the real world. Someone will have to stay behind to mind the store, and that someone will have enormous power. (See the TV series Upload, or for that matter The Matrix films.)

The new virtual universe no doubt has its own rules, but we don’t yet know what they are. We knew what a stalking tiger looked like; we still know what a rapidly approaching bus looks like. But we are hard-pressed to recognize the modern predators seeking our blood and the man-made entities that can run us down if we don’t pay attention. In the Information Age, ironically, we no longer know the difference between what informs us and what deforms us. Instant connectivity and fingertip information have rendered obsolete the traditional institutions for vetting information, such as librarians, peer review, and trusted news reporting. Everything has become a subjective blur, a free-for-all of competing opinions and suspected propaganda. Yet, because we are genetically conditioned to deal with reality, mere opinion must present itself as fact to get our attention and acceptance. This provides a perfect opportunity for those who control information channels to manage what we think.

We are challenged to know what to believe. The downside of the glut of information is the task of sorting it. To avoid that chore, we conveniently rely on trusted sources to do it on our behalf. We’re used to such conveniences in the consumer culture, where information is appealingly packaged as just another consumer product; we’ve come to trust what is on the supermarket shelf, seduced in part by its glitzy over-packaging. When the trust is no longer there, however, the burden falls back on the consumer to read the fine print on the labels, so to speak. (More radically, one may learn to grow one’s own food. But then, too, weeding is necessary!) In a divided world, with no rules of engagement, the challenge of sifting information is so daunting and tedious that it is tempting to throw up one’s hands in despair and regard the clash of viewpoints as ultimately no more than another harmless entertainment. That would be throwing the baby out with the bathwater, however, since presumably there is a reality, a truth or fact of the matter, that can affect us and which we can affect.

It is convenient, but hardly reasonable, to believe a claim simply because one has decided to trust its source. By the same token, it is convenient, but not reasonable, to dismiss a claim automatically because its source seems suspect or because other parties have misused the information. (Both moves are known in logic as the argumentum ad hominem: judging a claim by who makes it rather than by what it says.) The reasonable approach is to evaluate all claims on their own merits. But that means a lot of time-consuming work cross-checking apparent facts and evaluating arguments. It means taking responsibility for deciding what to believe, or, alternatively, admitting that one doesn’t know yet. It is far easier to simply take a side that has some emotional appeal in a heated controversy and be done with the rigors of thinking. It’s easier to be a believer than a doubter, but easier still to sit back and be entertained.