The Neuroscience of the Spectocracy
What happens to our brains when we spend our days looking at screens? Might this explain why Americans have gone berserk? Part III of an exposition by Adam Garfinkle.
By Adam Garfinkle
Part I: The Spectocracy: Connecting the dots of American Political Dysfunction
Part II: Spectacle defined and illustrated
1. Brainwaves
Brainwave frequencies, measured using electroencephalography, are patterns of neural activity in the brain. They are typically categorized into five bands based on their range in cycles per second (Hz). Each is associated with specific mental states and cognitive functions. These frequencies may operate simultaneously in different regions of the brain, but one frequency tends to be dominant depending on the subject’s mental state or activity.
Delta waves (0.5-4 Hz) are the slowest brainwaves, usually observed during unconsciousness or deep, dreamless sleep. They dominate our brains during our first year of life. When delta dominates, we have almost no awareness of the outside world, but we can access stored information. Delta waves appear to play a key role in memory organization by consolidating newly acquired information, pruning unnecessary synaptic connections, replaying memories for reinforcement, and processing emotions associated with those memories.
Theta waves (4-8 Hz) are associated with light sleep, deep relaxation, prayer, hypnosis, and meditative states. This is slow brain activity compared to alpha, beta, and gamma, but it isn’t boring; it’s associated with dreaming during sleep, and when awake, with creativity, imagination, intuition, pattern recognition, deep memory, and emotional arousal. Theta-dominance is rare in wide-awake adults, but it is the normal state of a child engaged in typical imaginative play. Typically, when adults are in a theta-dominant state, they are seated or lying down. Their eyes don’t actively move and scan the scene.
Alpha waves (8-12 Hz) are associated with relaxed wakefulness. This is the default state of the waking adult brain. They’re our normal, get-it-done frequency, characteristic of a calm, alert, and receptive state of mind. When alpha dominates, our mood is even, we’re reasonably relaxed, we take in the world and respond to cues, we move our bodies, and our eyes scan the scene and shift visual focus.
Beta waves (12-30 Hz) are associated with active thinking, problem-solving, and focused attention. When performing cognitive tasks that require attention, memory, and reasoning, beta-wave activity increases, as it does when we’re anxious, stressed, or excited. In this state, the object of our attention may be external—a predator, say—or internal, as when we recall the distinction between a mean and a median. In high beta, we are very alert and focused, and we may feel a certain amount of agitation or stress, as when we’re solving a problem for a clear and immediate purpose.
Gamma waves (30-100 Hz) are the rarest brainwaves, related to high-level cognition and intense concentration. Here the brain is not just thinking but integrating and synthesizing knowledge and information at a high level. In this state, unlike others, every part of the brain registers at or above 30 Hz. Gamma states tend to be of medium length: We don’t slip into them for just a few moments, and rarely do we have the energy to stay there for more than a few hours.
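For the technically inclined, this taxonomy can be made concrete. The sketch below is my own illustration, not drawn from any clinical tool or from the studies discussed later: it estimates which band dominates a raw EEG trace, using the frequency ranges given above. The function name, parameters, and synthetic test signal are all hypothetical.

```python
# A minimal illustrative sketch, not a clinical tool: encode the five bands
# described above and estimate which one dominates a raw EEG trace.
import numpy as np
from scipy.signal import welch

# Band boundaries in Hz, taken from the ranges given in the text.
BANDS = {
    "delta": (0.5, 4),
    "theta": (4, 8),
    "alpha": (8, 12),
    "beta": (12, 30),
    "gamma": (30, 100),
}

def dominant_band(samples, fs):
    """Return the band holding the most spectral power in an EEG trace.

    samples: raw EEG voltage samples; fs: sampling rate in Hz.
    Welch's method estimates the power spectral density; each band's
    power is then the sum of the PSD bins inside its frequency range.
    """
    freqs, psd = welch(samples, fs=fs, nperseg=min(len(samples), 4 * int(fs)))
    band_power = {
        name: psd[(freqs >= lo) & (freqs < hi)].sum()
        for name, (lo, hi) in BANDS.items()
    }
    return max(band_power, key=band_power.get)

# Quick check with a synthetic signal: a 10 Hz sine (alpha range) plus noise.
fs = 256.0
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
trace = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(len(t))
print(dominant_band(trace, fs))  # expected: "alpha"
```

The point of the exercise is only to show that “dominance” is a measurable property of the frequency spectrum, not a metaphor.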
Our brainwave state responds to our environment. A busy urban area that confronts us with novelty induces high-alpha and low-beta brainwaves. If we take a walk in the woods, however, our brains will typically be in low-alpha, tending to theta. If we must solve a problem that requires creativity and integrating different areas of knowledge on a deadline, our brainwaves shoot up to high beta and sometimes into gamma.
Importantly, the technology we use also induces specific brainwave states. If we watch an entertaining show on a screen in a darkened room, our brains will typically be in low-alpha, tending to theta. If we are addicted to wallowing in social media chatter, Instagram shorts, and YouTube feeds on smartphones, that matters, too. The way mediated images affect our brains and our evolving neural pathways is different from the way unmediated images (that is, natural or real images) affect them.
2. The medium really is the message
It is not just that brainwave frequencies correlate to different kinds of activities. When we walk in the woods, for example, we apprehend reality unmediated, through all of our senses. Time is open-ended and real. When we watch a fantasy movie on a screen, however, humans and technology mediate between us and reality. The script has long since been written, and in this sense, normal time and its infinite possibilities do not exist. The difference between the way we apprehend a real object and an image of that object is quite profound. Commercial, mediated images are to reality what junk food is to food.
As Robert Hass observed in his 1984 book, Twentieth Century Pleasures:
Images are not quite ideas, they are stiller than that, with less implication outside themselves. And they are not myth, they do not have explanatory power; they are nearer to pure story. Nor are they always metaphors; they do not say this is that, they say this is.
An image does not ask that we interpret it. It is self-explanatory. No cognitive work is required to take it in. We don’t absorb the image so much as the image absorbs us.
If we spend the bulk of our waking hours walking in the woods, our brains will be populated by images drawn from reality and remembered in that richer way. Our dreams and our imagination will feature these images and the connections among them. If we spend the bulk of our time watching fantasy movies, our dreams and waking imaginations will instead tend to employ this comparatively flat material.
What we remember and how we remember it are also significantly affected by the technology we use. Memories acquired from photographs, for example, are not stored the way those acquired from reality are.1 When we consider an image, we focus exclusively on visual cues; our memories of the object in the image are less involved and detailed than they would have been had we encountered it in real life. We lack the data from our other senses, emotions, and context that we would have had we really seen the object.
When we watch mediated images—on television, movie screens, or on a smartphone, especially in a darkened room—our brains, dominated by alpha and tending toward theta the longer we watch, are unable to apply focused attention to the information we receive. This is why advertisers love television. The medium of communication, irrespective of its content, primes the brain to receive its message, thus Marshall McLuhan’s famous but cryptic aphorism. (To understand it, you have to read his book.)
McLuhan was right that the medium shapes our understanding of the message, and in the case of screen-mediated images, the medium is not only the message but its recipient’s receptiveness to that message. But he was wrong about the mechanics of it. In fact, he had it exactly backward.
McLuhan held that television was a “cool” medium. Because of its visual emphasis and low-resolution images, he argued, it required more audience participation and engagement than media forms like print or radio. (In his defense, images at the time were of a lower resolution.)
But neurologically speaking, print is the cool medium. For example, when we learn of a scandal involving a government minister by reading a newspaper, our brains are generally in high alpha tending toward low beta. If we learn of it from television, our brains are more likely to be in low alpha, depending on the time of day and whether we’re alone. In this brain state, we will be more receptive and less critical of the message.
This is not the only reason print is cool and television is hot. Images come at us fast, whereas we need time to process the written word. What’s more, images are received in an older part of the brain, a seat of emotion: Our retinas convert images to electrical signals; these are transmitted, via the optic nerve, directly to our hypothalamus, which controls our hormones. Print, on the other hand, lights up the more evolved frontal cortex because that’s where our literacy wiring is thickest. Reading about the ill-behaved politician therefore won’t produce quite the same geyser of hormones as seeing his arrogant face.
These hormones and the emotions with which they’re associated are important, too, for the way we store memories. Experiences imprint more readily in our memories when accompanied by a strong emotion. We will remember the minister’s travails in a different way if we see a show about them than we would have had we read about them.
In light of this, it is aggravating to hear friends debate which television news shows are the most biased or the least trustworthy. “It’s a screen,” I interpose. “It’s put you in low alpha, and anyway it’s infotainment, not news. You can’t understand, let alone master, a significant political or public policy issue by watching a screen, whether the station in question is biased or not. You must read about it, with your brain preferably somewhere in beta.” For some reason, these constructive remarks are rarely as welcome as one would expect.
3. A hypothesis
What do these neurological effects imply about the way we cope with the real world? Is it possible that people with less formal education, through no fault of their own—as well, perhaps, as people whose jobs don’t require sustained cognitive work—are more vulnerable to conspiracy theories and simplistic ideologies because their brains get stuck, so to speak, in alpha? Note carefully that we are not arguing that relatively unintelligent people have characteristic brainwave patterns based on genetic fate, but that behavioral choices induce brainwave states that, no matter our raw intelligence endowment, affect our ability to assess, analyze, and think about our circumstances rationally and functionally. It follows that when the behavior of the majority of a society shifts, over a relatively short time, in such a way that their modal brainwave state changes, it will affect the larger culture and the politics that are in turn shaped by it.
So if you’re naturalized, so to speak, to low-alpha states because of what you do at work and because outside of work you don’t deep-read or really read much at all, you may become partial to media and experiences that are readily and seamlessly consumed when this brain state dominates. You may even come to ignore or dislike experiences that require more beta brainwave energies and efforts.
This could be significant to the way we understand and perhaps help the large number of American males who in recent years have absented themselves from the work force.2 According to one analysis, unemployed men who have ceased looking for work spend four hours more per day watching screens than working women do, almost four hours more than working men, and at least an hour more than men who are still seeking employment.3 What precisely they are watching probably matters at the margins: “Dancing with the Stars” may not evoke the same brain states as gaming or marinating in porn. But no matter the program, if the viewer is alone in a relatively dark room and his orientation to the screen keeps his eyes from moving and changing focus, the brainwave result will be more or less the same.
A significant percentage of men who are not in the labor force are unmarried or live alone. Spending more than seven hours a day mostly alone, entertained by a screen, may well lock the brain into a vulnerable and impressionable la-la-land, ripe for any ideological entrepreneur who comes along. How many of the January 6 insurrectionists, one wonders, were unemployed? How many were recruited to the cause while sunk in a lonely, screen-mediated funk?
It’s a commonplace, and probably an accurate one, that demagogues, grifters, and conspiracy theorists prey on the isolated, offering them an ersatz community to relieve the ache of their loneliness. Loners, by definition, lack peers to confront them with reality, and pay no penalty for saying crazy things. Atomized, they are often on the verge of depression or sunk in it. “Loneliness,” wrote Milton in the Tetrachordon, “was the first thing which God’s eye named not good.” Saul Bellow’s character Herzog, echoing Hannah Arendt’s analysis of fascism, says, “a terrible loneliness throughout life is simply the plankton on which Leviathan feeds.” For a host of reasons—changing social norms, increased life expectancy, urbanization, delayed marriage—more Americans now live alone than ever. Loneliness sires desperation, escapism, and strange obsessions. Isolated, atomized individuals have good reason to want to escape from reality. Reality is not their friend. As Arendt wrote, the isolated can’t bear the “accidental, incomprehensible aspects” of reality because they lack company with whom to bear it.
Now return to the problem of the loss of deep literacy: Who could be lonelier than the loner who cannot even find fellowship with others through the written word? And if, as Arendt theorized, totalitarianism is “organized loneliness,” what society could be more vulnerable than one made up of lonely young semi-literates who spend their days watching images flicker across a screen?
4. Suffer the children
There is no longer any doubt that spending long hours before screens harms brain development in young children. After their first birthday, children should be in theta, immersed in play and imagination. If screens tend to downshift adults from beta to alpha, they shift kids up from theta to alpha. The destination is the same, but not the point of departure. Theta is the key mode of mental development in children, and by yanking them out of it we may be depriving them of the chance to develop, through play, a mature imagination and empathy. The stunted minds this produces could become easy prey for charlatans and demagogues.
In 2019, a study of 47 prekindergarten children published in JAMA Pediatrics found that high screen use was associated with diminished organization and myelination of the white matter fibers in the brain that support language, executive function, and literacy.4 The authors, alarmed, emphasized the as-yet-unknown neurobiological risks of screen exposure during this critical period of early brain development.
In 2023, another study published in JAMA Pediatrics correlated excessive screen time in young children with impaired brain function, which, the authors wrote, “may have detrimental effects beyond early childhood and impair future learning.” Specifically, excessive screen time was associated with an abundance of low-frequency brainwaves—alpha tending toward theta. The longer children spent before screens, the researchers found, the greater their cognitive deficits, and these effects persisted as the children grew older. As Evelyn Law, the lead author, told Singapore’s Straits Times: “The study provides compelling evidence [in addition] to existing studies that our children’s screen time needs to be closely monitored, particularly during early brain development.”5
This study, like others before it, correlated excess screen time with retarded prefrontal maturation, which affects the development of executive function. The afflicted have difficulty regulating emotional stress, paying attention, following directions, and controlling their impulses. The authors offered an important hypothesis about why this occurs:
When watching a screen, the infant is bombarded with a stream of fast-paced movements, ongoing blinking lights and scene changes, which require ample cognitive resources to make sense of and process. The brain becomes “overwhelmed” and is unable to leave adequate resources for itself to mature in cognitive skills such as executive function.
This child, lacking any accumulation of experience that aids him in interpreting these images, gives up and becomes passive, much the way very young children will sometimes fall asleep in response to insistent loud noises. Screen bombardment causes a child to experience the astounding complex we discussed in the first part of this series. The novelty bias draws his attention to the screen, but if it persists, it generates anxiety. If he isn’t yet screen-addicted, he’ll look away to get his bearings again in the real world.
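For the statistically minded, the dose-response finding reported above (the longer the exposure, the greater the deficit) is, at bottom, a correlation between screen hours and low-frequency dominance. Here is a toy sketch of that kind of analysis; every number in it is invented for illustration, and nothing comes from the studies’ actual data.

```python
# A toy sketch of a dose-response analysis: correlate daily screen hours
# with the share of EEG power in the low-frequency (theta/alpha) bands.
# All values below are hypothetical, invented purely for illustration.
import numpy as np
from scipy.stats import pearsonr

screen_hours = np.array([0.5, 1.0, 2.0, 3.0, 4.5, 6.0, 7.5])           # hypothetical
low_freq_share = np.array([0.21, 0.24, 0.30, 0.34, 0.41, 0.46, 0.52])  # hypothetical

r, p = pearsonr(screen_hours, low_freq_share)
print(f"Pearson r = {r:.2f}, p = {p:.4f}")
# A positive r of this kind is what "more screen time, more low-frequency
# dominance" cashes out to in a study's results table.
```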
We once thought that brain development and the maturation of cognitive skills were plastic only in children, and particularly in infants. We now know that it takes much longer than we imagined for the frontal lobe to reach maturity—in males, it occurs in the mid-twenties. The frontal lobe manages risk perception, among other things, which is why auto insurance premiums for young male drivers are so high. The studies cited above offer no clue about the impact of this technology on older humans, but neuroscience generally is concluding that there is no age at which the impact of mediated images is negligible or nonexistent. The more sensory input of certain types we expose ourselves to, the bigger the impact, no matter how old we are. Worse, perhaps, as cognitive metabolisms slow with aging, we may return to a higher level of vulnerability over time.
All of this suggests a major shadow effect of saturation in cyber-gadgets. People who induce in themselves a habitual alpha-theta brainwave pattern may not be able to shift into low-beta, let alone high-beta or gamma, when environmental stimuli require it, as readily as those who more often direct their unmediated attention to reality. Our comfort zone becomes defined, in due course, by what we habitually do—not “locked” precisely, but certainly inclined.
So it isn’t just that the blue light of our screens inhibits the downshift to theta and healthful sleep. Our screens retard or inhibit upshifts to cognitive alertness, and may do so at all ages. This is in part because realistic but unreal visual images ask little of the viewer, being neither ideas nor myths nor metaphors. Reading, of course, shifts us up into more rapid brainwave patterns—it does so even when we’re reading non-fiction, and certainly when we’re reading an exciting novel or sci-fi thriller.
If there is now a general cultural proclivity toward alpha states induced by our gadgets, and if alpha-to-theta is the brainwave zone suited to enjoying spectacles, transmitting conspiracy theories, and absorbing highly simplified but emotionally appealing ideologies along with other irrational forms of discourse, perhaps this explains the change in our mentality and political culture. Much empirical work is needed to examine the proposition, but this is the basic neurobiological premise, in simple form, that undergirds my thesis.
More on the broader cultural and political implications in the next and final installment.
Adam Garfinkle is a member of the Cosmopolitan Globalist editorial board.
1. Many researchers have explored this point. See, e.g., Alison Winter, Memory: Fragments of a Modern History (2012).
2. See Richard Reeves, Of Boys and Men (2022).
3. Nicholas Eberstadt and Evan Abramsky, “What Do Prime-Age ‘NILF’ Men Do All Day? A Cautionary on Universal Basic Income,” Institute for Family Studies, February 8, 2021.
4. J.S. Hutton et al., “Associations between screen-based media use and brain white matter integrity in preschool-aged children,” JAMA Pediatrics, 2020, 174(1).
5. Ng Wei Kai, “Screen time linked to impaired brain function, may affect learning beyond childhood: Study,” Straits Times, January 31, 2023.
6. Confirming what Plato wrote in the Republic about democracy, spectacle, and changes in the modes.
7. Now I regret the afternoons I wasted as a kid watching the Three Stooges, Godzilla, Francis the Talking Mule, and RKO monster flicks on those pesky UHF channels!