🔴 Viewpoint: The Machine’s Tempo

We are transitioning, with breathtaking and heartbreaking speed, from a species that contemplates to a species that merely processes.

How Speed is Eroding Critical Faculties and the Human Imagination

Part I: The Catastrophe of the Instantaneous

The Moment of Surrender

It begins not with a thought, but with a reflex—a synaptic surrender, a neurological genuflection to the glass oracle in my pocket.

I want you to understand the precise phenomenology of this moment, because in its minutiae lies the whole catastrophe. The thumb, that evolutionary triumph that separated us from our primate cousins, that digit that allowed us to craft tools and build civilisations, now hovers over the screen like a diviner's pendulum seeking water in a desert. But we are not seeking water. We are seeking something we cannot name—a feeling, a fix, a fleeting sense of connection to a network that connects us to everything and nothing simultaneously.

The micro-movement begins in the motor cortex, travels down the median nerve, activates the flexor pollicis brevis, and the thumb descends those fatal millimetres to make contact with the capacitive surface. In that touch—that Michelangelo finger-of-God moment perverted—the world doesn't refresh. It replaces itself. Reality Version 1.0 is overwritten by Reality Version 2.0, and the previous world, the one that existed half a second ago, is gone forever, consigned to digital oblivion.

The dopamine loop that governs this action is tighter than a garrotte wire, and like a garrotte, it is slowly strangling something essential in us. Neuroscientists have mapped this pathway: from the ventral tegmental area to the nucleus accumbens, flooding our reward circuits with the same neurotransmitters that once motivated our ancestors to seek food, shelter, and mates (Montag et al., 2017). But those ancestral rewards required effort, planning, delay. This new reward requires only a twitch.

Each scroll is a small death of presence, a micro-forsaking of the here for the elsewhere.

The Theft of Time

Last Tuesday—or was it Wednesday? The days blur together in the age of infinite scroll—I experienced a temporal rupture that shook me to my core. It was 11:35 PM. I know this with certainty because I had just looked at the clock on the news channel. I picked up my phone to check one thing—just one thing—the weather for the following days.

The next time I looked at that clock, it was 1:43 AM.

The time between these two anchors had not passed in any way I could recognise as passage. It had been annihilated, vaporised, deleted from my personal history. Einstein told us time was relative, but he never imagined it could be stolen so completely. In this temporal black hole—this digital delirium tremens—I had consumed (or been consumed by) an extraordinary array of content. The tragic minutiae of a war in Eastern Europe, where real humans bled real blood while I swiped past their suffering. A recipe for "egg-and-rice pizza" that I saved, knowing with absolute certainty I would never cook it. A frantic, fractious debate about artificial intelligence that generated more heat than light. A video of a cat startled by a cucumber, its expression of pure existential terror at the vegetable intruder somehow perfectly capturing my own relationship with modernity.

I remember the cat's expression with photographic clarity—the dilated pupils blown wide in cartoon alarm, the arched back forming a perfect parabola of feline panic, the bottle-brush tail rigid as a medieval mace. I can reconstruct the precise angle of the kitchen lighting, the grain of the wooden floor, even the brand of the cucumber (organic, I think, though what does it matter?). I remember thinking it was "kind of funny". I almost shared it. I bet at least three people would have "liked" my share. This, I ruefully remember. Every trivial particular, preserved as if in amber.

I do not recall the location where the soldiers fell dead—was it Pokrovsk? Sumy? Zaporizhzhia? Perhaps some village whose name I cannot pronounce?—or how many there were, or whether they were young conscripts or hardened veterans, or if the footage showed them falling or already fallen. I cannot tell you if it was winter or summer, day or night, urban rubble or open field. I feel I should remember with absolute clarity the dramatic and horrific expression of humanity, that terrible convergence of surprise and understanding, as if death had arrived both too suddenly and exactly when expected. But I don't. The image passed through my consciousness like water through a sieve, leaving nothing behind but a vague sense of having witnessed something terrible—something I immediately submerged beneath the next wave of content.

The cucumber startled the cat. The war killed the men. One lodged in memory with crystalline precision. The other dissolved into the digital ether, as ephemeral as morning mist.

This is not hyperbole. This is documentary evidence of a species undergoing metamorphosis in real time. We are watching—no, we are experiencing—the transformation of Homo sapiens into something else, something unprecedented in evolutionary history: a creature that has outsourced its consciousness to machines, that has willingly surrendered its interiority to the exterior, that has traded depth for breadth and wisdom for information.

The Architecture of Amnesia

The phenomenon we are witnessing—and I use "witnessing" loosely, because to witness requires a kind of attention we are rapidly losing—is not merely a bad habit, like nail-biting or procrastination. It is not a failure of individual discipline that can be corrected with the right app or life hack. It is a neurological coup d'état, a hostile takeover of consciousness by forces that profit from our distraction.

We are living through what I can only describe as a quiet but violent geological shift in the bedrock of human cognition. Imagine, if you will, the tectonic plates of consciousness slowly grinding against each other, building pressure, until suddenly there's a rupture, an earthquake, and the entire landscape of the mind is reformed. The machine's tempo—that relentless, pitiless millisecond latency of silicon logic—is not just influencing but overwriting the biological rhythms that have governed human consciousness since we first looked up at the stars and wondered what they were.

For 300,000 years, Homo sapiens has been a creature of contemplation (Hublin et al., 2017). We are the only species we know of that can think about thinking, that can step outside the immediate present and imagine alternative futures, that can hold abstract concepts like justice and beauty and meaning in our minds. This metacognitive ability—this miraculous evolutionary accident—is what allowed us to build civilisations, create art, develop science, fall in love with ideas.

And we are abandoning it. Willingly. Eagerly. For what? For the next dopamine hit, the next notification, the next micro-dose of digital stimulation.

We are transitioning, with breathtaking and heartbreaking speed, from a species that contemplates to a species that merely processes. And here's the truly terrifying realisation that keeps me awake at night: we are not just losing our ability to think deeply. We are losing our ability to want to think deeply. The machine has colonised not just our time but our desires. We are beginning to prefer the surface to the depths, the quick to the slow, the simple to the complex.

Part II: The Dromological Imperative

Virilio's Prophecy Fulfilled

In 1977, Paul Virilio, that prescient philosopher of velocity, introduced the world to "dromology"—the logic of speed, from the Greek dromos, meaning race or running (Virilio, 1977/2006). Writing in the shadow of the Cold War, watching missiles that could circle the globe in minutes, Virilio understood something fundamental: speed is not merely a measurement of distance over time. Speed is power. Speed is politics. Speed is theology.

But even Virilio, brilliant as he was, could not have imagined the total victory of velocity we are now witnessing. He wrote about the speed of missiles and automobiles, of telegraphs and telephones. He could not have conceived of a world where thoughts themselves would be subject to acceleration, where consciousness would be uploaded, downloaded, refreshed at gigahertz frequencies.

"The speed of light does not merely transform the world," Virilio wrote. "It becomes the world" (Virilio, 2000, p. 143).

We have crossed that threshold. We now live inside the speed of light, prisoners in an electromagnetic cage of our own construction.

In our digital ecosystem, speed has achieved something beyond totalitarian control—it has become theological, a new religion with its own dogma, its own rituals, its own promises of transcendence. The algorithm is its high priest, the notification its call to prayer, the infinite scroll its promise of eternal life. We worship at the altar of acceleration, sacrificing our attention—that most precious resource—to gods made of code.

Consider the language we use: we "worship" certain apps, we're "addicted" to our devices, we have "religious" devotion to certain platforms. This is not metaphorical language; it's diagnostic. We have created a new form of the sacred, one measured in milliseconds and megabits.

The Punishment of Pause

The architecture of the internet doesn't just favour speed—it punishes slowness with the ruthlessness of natural selection. Every platform, every app, every digital interaction is optimised for velocity. The content that survives, that thrives, that colonises our collective consciousness, is content that can be consumed instantly, that requires no pause, no reflection, no metabolisation.

I think of the great writers I love—Proust with his meandering sentences that stretched like cathedral corridors, Joyce with his labyrinthine wordplay that demanded rereading, Woolf with her streams of consciousness that flowed like honey, slow and golden, Whitman with his sprawling catalogues that contained multitudes and sang like the open road, Dostoevsky with his fevered descents into the soul's darkest chambers where guilt and grace wrestled until dawn. Could they exist today? Could In Search of Lost Time find readers in the age of TL;DR? Could Ulysses compete with a tweet?

The answer breaks my heart: Probably not. They would be algorithmic failures, their engagement metrics too low, their bounce rates too high. They would be buried beneath an avalanche of listicles and hot takes, their profound meditations on time and memory and meaning lost in the noise.

The Degradation of My Own Mind

I need to confess something, and this confession costs me greatly because it reveals the extent of my own degradation. Twenty years ago—two decades that feel like a century and also like yesterday—I could lose myself in a book for entire afternoons. I remember reading War and Peace one summer, sprawled on a blanket in Prospect Park, the sun tracking across the sky as I tracked across Russia with Pierre and Natasha. Hours would pass unnoticed. My consciousness would merge with Tolstoy's, and I would think his thoughts, feel his feelings, see his visions. This was not entertainment; it was transmigration of souls.

Today, when I attempt to read serious literature, something terrible sometimes happens. It begins as a physical sensation—a phantom itch behind my eyes, a restlessness in my fingers, a peculiar hollowness in my chest where my phone usually rests in my shirt pocket. My brain, rewired by years of digital stimulation, rebels against the slowness of literary time. It searches for the "skip intro" button that doesn't exist. I suspect it might want the Wikipedia summary, not the actual experience.

When I encounter a paragraph of complex syntax—when Henry James asks me to hold seven dependent clauses in my mind simultaneously, when Kant demands I follow a chain of logic that extends over pages—my consciousness fragments. Part of me is reading, but other parts are wandering and wondering: What's happening on that WhatsApp group? Has anyone responded to my email? What's the weather tomorrow? What was that actor's name from that movie?

This is not a mere distraction. This is cognitive mutiny. My own mind has seemingly become foreign territory, colonised by impulses that are not fully my own.

More on that later.

The Surface Tension of Modern Consciousness

What we've created is what I call the "surface tension" of modern consciousness—a psychological state where we skim across the meniscus of meaning like water striders, never breaking through to the depths below. We mistake the reflection for the thing itself, the headline for the article, the summary for the work, the reaction for the thought.

The web-reading pattern that researchers have documented—the infamous F-pattern—is a map of our new literacy. Or rather, our new illiteracy. Users scan horizontally across the top, then halfway down, then abandon the effort entirely. The bottom right quadrant of any text—traditionally where writers place their conclusions, their revelations, their deepest insights—has become cognitive terra incognita, an undiscovered country that might as well not exist.

But this structural deformation in reading precipitates a structural deformation in thinking itself. Maryanne Wolf, the neuroscientist who has become the Cassandra of the digital age, warns us with increasing urgency: the "reading brain" is not genetically guaranteed (Wolf, 2018). It is a cultural construction, built over millennia, maintained through practice, and it can be lost in a generation.

We are witnessing—in fact, we are living—the first generation in human history that may be fundamentally less literate than their parents. Not in the mechanical ability to decode symbols—our youth can read words. But in the capacity to construct meaning from complexity, to follow extended arguments, to hold contradictory ideas in productive tension, to engage in what Wolf calls "deep reading"—that miraculous process where we think thoughts we've never thought before, guided by another mind through marks on a page.

Part III: The Colonisation of the Gap

The Anthropology of Empty Time

To understand what we've lost, we must first understand what boredom was—not as absence but as presence, not as emptiness but as potential. Boredom, that state we now treat as a disease to be cured, was once the fertile void from which all creativity emerged.

Consider the anthropology of empty time. For the entirety of human history until roughly twenty years ago, life was punctuated by gaps—lacunae of unstimulated consciousness. Our ancestors knew boredom intimately. They knew the weight of empty Sunday afternoons, the stretch of winter evenings by the fire, the long walks between villages with nothing but their own thoughts for company. These were not dead zones but generative spaces, creative wombs where the mind, left to its own devices, would begin to play, to wonder, to imagine.

I remember my own childhood boredom with something approaching reverence. Summer afternoons that stretched like taffy, each minute an hour, each hour a day. I would lie on my back in the grass, watching clouds morph from dragons to ships to faces of forgotten gods. My mind, unstimulated by constant external input, would turn inward and begin its strange work: inventing stories, solving problems I didn't know I was working on, asking questions that had no answers yet: Why is there something instead of nothing? What does blue look like to other people? If I had been born in another century, would I still be me?

This was not idle time. This was the mind's maintenance mode, its defragmentation process, its dream state while awake. Neuroscientists now know that these moments of boredom activate what they call the "default mode network"—a constellation of brain regions that light up when we're not actively engaged in tasks (Raichle, 2015). This network is associated with creativity, self-reflection, moral reasoning, future planning, and the construction of personal identity. It is, quite literally, where we become ourselves.

Good times.

The War on the Gap

But the machine has declared total war on the gap. It has militarised every moment, weaponised every pause, colonised every instant of potential boredom with what the tech industry, with characteristic Orwellian doublespeak, calls "content."

There is no longer any truly "dead time." The thirty seconds waiting for coffee to brew—once a moment to stare out the window and notice the particular quality of morning light—is now filled with a quick scroll through Instagram. The elevator ride—once an opportunity for a brief meditation on the day ahead—is now a chance to delete emails. The red light when crossing the street—once a pause in the momentum of life—is now an opportunity to check notifications.

I observe myself with the horrified fascination of an entomologist studying a particularly disturbing insect. My hand reaches for my phone with the automaticity of a reflex, before my conscious mind has even registered boredom. The movement is pre-cognitive, instinctual, as if my phone were not a device but an organ, an external appendage that my body needs to touch to confirm its own existence.

Last week, I conducted an experiment on myself. I counted how many times I reached for my phone in a single day. The number was 112. One hundred and twelve times my hand moved toward that black mirror. That's more than six times per hour of waking consciousness. But here is where my horror deepens: I cannot truthfully claim full agency over those 112 touches. At least sixty of them—more than half—were not idle curiosity or weakness of will. They were work-related. Messages from colleagues. Urgent emails that could not wait. Notifications demanding immediate response. The digital leash by which my job maintains constant access to my attention, my time, my presence.

This is the insidious genius of our current technological arrangement: it has made the voluntary indistinguishable from the compulsory. The architecture of constant connectivity serves both my compulsions and my obligations, until I can no longer tell where one ends and the other begins. Am I checking my phone because I am addicted, or because I am employed? The answer, of course, is both—and this dual nature makes resistance nearly impossible.

When I subtract those work-mandated interruptions, I'm left with approximately fifty-two voluntary checks. Still catastrophic—still more than three times per hour—but the distinction matters. It reveals how contemporary capitalism has weaponised our devices, transforming them from tools of potential distraction into instruments of mandatory engagement, chains we cannot remove because they are also our lifelines.

The Creativity Famine

Boredom, I now understand, was not a problem to be solved but a precondition for creativity. Mind-wandering facilitates creative incubation—undemanding tasks that allow the mind to drift actually improve subsequent creative performance (Baird et al., 2012). And without the negative space of unstimulated time, we cannot create anything genuinely new. We can only recombine, remix, recycle.

Look at the cultural evidence. Hollywood, once the dream factory of the world, now produces almost nothing but sequels, prequels, reboots, and reimaginings. The Marvel Cinematic Universe—that ultimate expression of creative capitalism—is the same story endlessly retold in different costumes. Our popular music, increasingly written by algorithms analysing previous hits, has converged on a kind of sonic average, every song sounding vaguely like every other song. Our political discourse consists of the same arguments repeated with increasing volume but decreasing nuance, as if we're trapped in a temporal loop, doomed to relitigate the same issues forever.

Even our memes—those supposedly spontaneous eruptions of collective creativity—follow predictable patterns, templates, formats. We don't create; we fill in the blanks. We don't imagine; we iterate.

When there is no time to dream, we can only remix. When there is no silence, we can only echo. When there is no solitude, we can only mirror.

The Neuroscience of Absence

The research on this is unequivocal and terrifying. Studies show that people would rather self-administer electric shocks than sit alone with their thoughts for fifteen minutes—67% of men and 25% of women in one study chose to shock themselves rather than simply sit and think (Wilson et al., 2014). Think about that, even if only for a minute. We have become so uncomfortable with our own consciousness that we prefer physical pain to mental silence.

What have we become? What are we becoming?

Dr. Marcus Raichle, who discovered the default mode network, describes it as the brain's "dark energy"—invisible but essential, the background processing that maintains the coherence of consciousness itself (Raichle, 2006). When we never allow this network to activate, when we fill every gap with stimulation, we are essentially preventing our brains from performing basic maintenance. It's like never allowing a computer to run its cleanup processes, never defragmenting the hard drive, never clearing the cache. Eventually, the system begins to fail.

I see this failure in my own cognitive functioning. My attention span, once measured in hours, is now measured in minutes. My memory, once crystalline, is now clouded. I find myself forgetting why I walked into rooms, losing track of conversations mid-sentence, unable to recall what I was told yesterday or even an hour ago. My mind feels increasingly like a browser with too many tabs open, everything running slowly, nothing working quite right.

Part IV: The Commodification of Consciousness

The New Inequality

If speed is the currency of the masses, slowness has become the ultimate luxury good—a form of conspicuous consumption that would make Thorstein Veblen weep with recognition. We have created a world where the ability to think clearly, to focus deeply, to be present fully, is increasingly reserved for the wealthy.

Consider the grotesque inequality of attention in late capitalism. On one side, we have the attention-poor—the gig economy driver whose survival depends on responding instantly to the app's ping, the warehouse worker whose every movement is tracked and optimised, the call centre employee whose bathroom breaks are monitored to the second. These are the new proletariat, selling not just their labour but their consciousness, their very ability to be present in their own lives.

On the other side, we have the attention-rich—the tech executives who strictly limit their children's screen time while designing apps to be maximally addictive for yours, the CEOs who hire armies of assistants to filter their inputs, the celebrities who can afford to disappear for months-long "digital detoxes" in Big Sur or Bali, paying thousands of dollars a day to rediscover what humans knew for free for millennia: the sound of their own thoughts.

Consider the phenomenon of digital detox retreats—those €3,000 sanctuaries where phones are confiscated at the gate, where discipline is purchased because it can no longer be summoned from within. The participant roster reads like a directory of complicity: investment bankers, tech executives, Hollywood producers—the very architects of our attention economy, now paying handsomely to escape their own creation.

They sit in circles, these refugees from connectivity, relearning the lost art of conversation without documentation. They walk through redwood groves, marvelling at their capacity to notice birds and trees when not mediating the experience through a screen. They treat presence as a rare vintage to be sipped slowly, savoured, committed to memory—a luxury good available only to those who can afford its price.

The ritual would be touching if it weren't so obscene. For while these elites perform their carefully curated disconnection, millions beyond the retreat's perimeter continue their fragmented existence: delivering food for algorithm-determined wages, driving strangers for star ratings, performing their lives for likes and follows. Their attention remains fractured into a thousand pieces, each piece monetised, commodified, sold to the highest bidder. They cannot purchase respite. They cannot afford to disconnect.

This is the ultimate privilege: not merely to have destroyed one's own attention, but to have the resources to temporarily repair it while the destruction of others' attention continues unabated—and continues, moreover, to fund the very retreats where the destroyers seek absolution.

The Privatisation of Silence

Silence—that fundamental human inheritance, that primordial soup from which consciousness emerged—has been privatised. It is no longer the background radiation of existence but a premium product to be purchased, like bottled water in a world where we've poisoned all the wells.

"I have often said that the sole cause of man's unhappiness is that he does not know how to stay quietly in his room," wrote Pascal in his Pensées, that masterwork of contemplation written in an age when sitting quietly in a room was the default state, not an achievement (Pascal, 1670/1995, p. 37). Pascal could not have imagined a world where sitting quietly in a room would cost €500 an hour, where solitude would require a subscription service, where silence would be sold back to us by the very companies that stole it in the first place.

The meditation app industry—valued at over €2 billion and growing—is perhaps the perfect symbol of our predicament. We pay monthly subscriptions to learn how to sit still, to be guided through the basic human act of breathing, to have someone else's voice tell us how to find our own inner voice. Headspace, Calm, Ten Percent Happier—these apps promise to solve the problems created by apps, to use technology to escape technology, to find peace through the very devices that destroyed our peace in the first place.

It's like paying someone to stop punching you in the face.

The Empathy Gap

This commodification of consciousness creates a cascading crisis that extends far beyond individual psychology. When attention becomes a luxury good, empathy becomes a luxury emotion. Deep empathy—the kind that changes hearts and minds, that builds bridges across difference, that enables genuine understanding—requires temporal investment. It requires sitting with discomfort, metabolising another's pain without immediately converting it to a take or a stance or a hashtag.

But in the age of speed, empathy is replaced by its performance. We signal virtue through instant reactions—the angry emoji at injustice, the heart emoji for tragedy, the reshare of the correct opinion. These are empathy's substitutes, its simulacra, its fast-food version. We feel we have "done something" about suffering because we've tapped a screen, but we haven't done the difficult internal work of genuinely understanding another's reality.

I think of my younger self, who would sit with grieving friends for hours, saying little, just being present. I didn't post about it. I didn't signal my virtue. I simply showed up, again and again, with my full attention, my complete presence. This kind of empathy is becoming extinct, replaced by what the philosopher Byung-Chul Han calls "the violence of positivity"—the mandatory optimism and instant response that prevents genuine emotional processing (Han, 2010/2015).

The empathy gap mirrors the wealth gap, and both are symptoms of the same disease: the prioritisation of efficiency over humanity, speed over depth, reaction over reflection.

Part V: The Neuroplasticity of Impatience

The Architecture of Addiction

To truly comprehend the machine's colonisation of consciousness, we must descend into the neurological basement, where the architecture of addiction is being rebuilt, synapse by synapse, in silicon and dopamine. This is not metaphorical. This is measurable, observable, as real as a broken bone or a diseased liver.

The internet operates on what behavioural psychologists call a "variable ratio reinforcement schedule"—the most powerful mechanism for creating addiction ever discovered (Ferster & Skinner, 1957). B.F. Skinner discovered this phenomenon when studying operant conditioning: when rewards come unpredictably rather than consistently, the behaviour seeking those rewards becomes more obsessive, more compulsive, more resistant to extinction. The rats pressed the lever frantically, unable to stop.

We are Skinner's rats, and our phones are the lever.

Every scroll is a pull of that lever. Most of the time, we get nothing—a mediocre meme, an ad for something we don't need, a post from someone we don't really care about. But occasionally, unpredictably, we hit the jackpot—breaking news that confirms our worldview, a notification that someone important has acknowledged our existence, a piece of content that triggers that perfect neurochemical cocktail of outrage and superiority.
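The asymmetry Skinner found can be sketched in a few lines of Python. This is an illustrative toy, not a model of any particular platform, and the function names (`variable_ratio_rewards`, `fixed_ratio_rewards`) are my own. Both schedules below pay out roughly the same number of rewards over a thousand pulls; the difference is that the fixed schedule is perfectly predictable, while the variable one makes every single pull a gamble—which is precisely what keeps the lever being pressed.

```python
import random

random.seed(42)  # reproducible illustration

def variable_ratio_rewards(pulls, mean_ratio=10):
    """Variable-ratio schedule: each pull pays off with probability
    1/mean_ratio, so rewards arrive unpredictably."""
    return [random.random() < 1 / mean_ratio for _ in range(pulls)]

def fixed_ratio_rewards(pulls, ratio=10):
    """Fixed-ratio schedule: every ratio-th pull pays off, predictably."""
    return [(i + 1) % ratio == 0 for i in range(pulls)]

pulls = 1000
vr = variable_ratio_rewards(pulls)
fr = fixed_ratio_rewards(pulls)

# Comparable totals: both schedules deliver on the order of 100 rewards.
print("rewards:", sum(vr), "variable vs", sum(fr), "fixed")

# But the gaps between variable-ratio rewards swing wildly:
# a payoff may follow immediately, or only after a long drought.
gaps, last = [], -1
for i, hit in enumerate(vr):
    if hit:
        gaps.append(i - last)
        last = i
print("gap between rewards ranges from", min(gaps), "to", max(gaps), "pulls")
```

The unpredictable gaps are the point: under a fixed schedule the gap is always exactly ten, so the brain can relax between payoffs; under the variable schedule the next scroll is always, potentially, the jackpot.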

This intermittent reinforcement doesn't just spike dopamine; it fundamentally reorganises the brain's reward architecture. Dr. Anna Lembke, who runs the Stanford Addiction Medicine Clinic, describes smartphones as "the modern-day hypodermic needle, delivering digital dopamine 24/7 for a wired generation" (Lembke, 2021, pp. 7-8). The same neural pathways that govern substance addiction—the mesolimbic dopamine system—are hijacked by our devices.

People's neurochemistry is compromised. The anxiety that builds when they haven't checked their phone for an hour—the tightness in their chest, the restlessness in their limbs, the intrusive thoughts about what they might be missing. These are withdrawal symptoms, as real as those experienced by any addict. The sweet relief when they finally check, the flood of calm as they scroll, even when there's nothing interesting to see. This is the fix, the hit, the dose that gets them through to the next one.

The Intolerance for Friction

The neurological changes go deeper than simple addiction, though. We are witnessing the development of what I call a profound intolerance for cognitive friction. Our brains, marinated in the frictionless user experience of modern technology, have lost the ability to tolerate difficulty, complexity, ambiguity.

We demand frictionless commerce—one-click purchasing, same-day delivery, instant gratification. We demand frictionless communication—texts instead of calls, emojis instead of words, likes instead of conversations. We demand frictionless relationships—swipe right for intimacy, ghost when things get complicated, unfriend rather than work through conflict.

But here's what we've forgotten: friction is not a bug in consciousness; it's a feature. Critical thinking is friction. It is the mental labour of stopping, examining, testing, doubting. It is the grain of sand that creates the pearl. It is the resistance that builds strength. When we eliminate friction, we eliminate the traction necessary for the mind to gain purchase on reality.

This is why conspiracy theories metastasise in the age of speed. A conspiracy theory is the path of least cognitive resistance—a frictionless explanation for a complex reality that would otherwise require patient, careful analysis. QAnon, flat earth, anti-vax—these aren't just failures of education. They're symptoms of minds that have lost the ability to tolerate complexity, to sit with uncertainty, to say "I don't know" and be comfortable with not knowing.

The conspiracy theory offers the dopamine hit of sudden comprehension without the work of actual understanding. It connects dots that should remain unconnected, creating constellations of meaning from random stars. It provides the satisfaction of secret knowledge, of being "awakened," of seeing patterns that others miss. It's the intellectual equivalent of a slot machine jackpot—random events aligned into the appearance of meaning.

The Rewiring of Memory

Perhaps most disturbing is what the machine's tempo is doing to memory—that fundamental faculty that makes us who we are. Memory is not just storage; it's the story we tell ourselves about ourselves. It's the thread that connects our past to our present to our imagined future. Without memory, we are not fully human.

But digital technology is restructuring memory in ways we're only beginning to understand. We've outsourced our memories to our devices—phone numbers, addresses, birthdays, and even the faces of our loved ones, all stored in silicon rather than in our synapses. This "Google effect" or "digital amnesia" means we no longer remember information; we remember where to find information (Sparrow et al., 2011).

More troubling still is how the constant documentation of life—the photographing, the posting, the sharing—is replacing actual memory formation. Studies show that when we photograph an experience, we remember it less clearly than if we had simply paid attention (Henkel, 2014). Interestingly, this "photo-taking impairment effect" disappears when we zoom in on specific details—suggesting that engagement, not documentation, is what matters for memory. We're so busy creating the external record that we fail to create the internal one. We have perfect documentation of lives we can't actually remember living.

I think of my own photographed life—thousands of images on my phone, each one supposedly capturing a moment worth preserving. But when I scroll through them, they sometimes feel like someone else's memories, distant and disconnected. The meals I photographed instead of tasting, the sunsets I captured instead of watching, the moments I recorded instead of living. I have the receipts but not the experience, the evidence but not the memory.

Part VI: A Moment of Self-Excoriation

The Hypocrite at the Keyboard

I must pause here, in this comfortable artisan bakery where I write on a laptop, surrounded by other people staring at their own screens, to acknowledge the profound hypocrisy of my position. I am not a prophet standing outside the system, warning of its dangers. I am a participant, a collaborator, an addict criticising the dealer while standing in line for my next fix.

During the composition of the previous section—that high-minded meditation on memory and meaning—I checked my email seven times. I responded to three WhatsApp messages. I looked up the weather in Tokyo, a city I sadly won't be visiting anytime soon. I watched a short video about traditional Japanese sword-making that I will have forgotten by tomorrow. I checked my Instagram notifications twice, my heart sinking slightly each time at the low engagement on my last posts, which I thought were profound but which the algorithm deemed insufficiently viral.

This is the truth: I have been drowning in the same digital tsunami I'm describing. My own attention is as fractured as anyone's, my own mind as colonised, my own life as mediated. I write about the importance of presence while being absent. I advocate for slowness while rushing. I praise depth while skimming surfaces.

I am rebelling against it, though. Hold that thought.

The Seduction of Connection

But here's what makes it so hard to resist: the machine offers us something we desperately need—connection. Or at least the simulation of connection, which, in our loneliness, we accept as the real thing.

I think of the pandemic years, when the screens that divide us became the only thing connecting us. Zoom calls with family, FaceTime with friends, Discord servers replacing dinner parties. We learned that digital connection, however inadequate, was better than no connection at all. The machine saved us from total isolation even as it deepened our existential loneliness.

And there's this: through these devices, I've found communities of people who share my obscure interests, who understand my particular struggles, who offer support and insight and occasionally genuine wisdom. I've read brilliant essays by people I'll never meet, discovered music that moves me from bedrooms on the other side of the world, participated in conversations that have genuinely changed my mind about important things.

The machine's tempo is not inherently evil—this must be said and said clearly. Speed enables the rapid dissemination of life-saving medical information during pandemics. It allows marginalised voices to bypass traditional gatekeepers and reach global audiences. It connects human rights activists across borders, enables real-time documentation of injustices, facilitates the organisation of resistance against tyranny.

The Arab Spring, #MeToo, Black Lives Matter—these movements were enabled by the very technologies I'm critiquing. The speed of information spread faster than the speed of suppression. The democratisation of communication broke through centuries of enforced silence.

The Biological Mismatch

And yet—and yet—there is something fundamentally anti-biological about the speed we've achieved. We are carbon-based lifeforms trying to keep pace with silicon-based systems. We evolved for a world where information travelled at the speed of speech, where news moved at the pace of horses, where change happened gradually enough for adaptation.

Our brains have roughly the same processing power they had 50,000 years ago. Our neurons fire at the same speed. Our working memory still holds, by Miller's classic estimate, about seven items, plus or minus two (Miller, 1956). We still need eight hours of sleep, regular meals, physical movement, sunlight, human touch. These needs are not negotiable; they're encoded in our DNA.

But we're trying to run biological software on digital hardware, or perhaps it's the other way around—trying to run digital software on biological hardware. Either way, there's what engineers call an impedance mismatch—a fundamental incompatibility between two systems trying to work together. When you have an impedance mismatch, you get distortion, noise, system failure.

The distortion is everywhere: in our epidemic of anxiety and depression, in our crisis of meaning and purpose, in our inability to be present with the people we love, in our constant, nagging sense that something essential is slipping away but we can't quite name what it is. 

Part VII: The Archive of the Unread

The Library of Babel Realised

Jorge Luis Borges, in his prescient story "The Library of Babel," imagined a universe consisting of an infinite library containing every possible book—every combination of letters, every permutation of meaning (Borges, 1941/1962). The librarians in this universe go mad searching for meaning in the infinite chaos of information. They develop cults around certain books, wage wars over interpretations, eventually realise that in a library containing everything, nothing has meaning.

We have built Borges's library. We call it the internet.

We are the first generation in human history to have access to essentially all of human knowledge, instantaneously, for free. Every great work of literature, every scientific paper, every philosophical treatise, every piece of music ever recorded—it's all there, waiting in our pocket. This should be a golden age of learning, a renaissance of human consciousness.

Instead, we have what I call information obesity: we are stuffed but malnourished, gorged but starving, drowning in data while dying of thirst for wisdom.

In the pre-digital age, information scarcity created value. A personal library was a treasure accumulated over a lifetime. Books were expensive, precious, passed down through generations. People memorised entire poems not as a party trick but as a way of carrying beauty with them always. They could recite passages from religious texts, the words worn smooth by repetition like river stones. Knowledge was internalised, metabolised, integrated into the self until you couldn't distinguish between what you'd learned and who you were.

My grandfather could recite hundreds of poems from memory. He carried Keats and Yeats and Frost in his mind like a portable library. When I asked him why he bothered memorising them when he could just look them up, he looked at me with incomprehension. "But then they wouldn't be mine," he said.

Now, information is infinite and therefore worthless. We hoard it like digital dragons sitting on piles of gold we'll never spend. My "Read Later" folder contains 3,847 articles. My browser has 147 tabs open across five windows. My phone contains 23,000 photos. My computer has folders within folders within folders, digital sedimentary layers of good intentions and abandoned projects.

The Paradox of Perfect Memory and No Recollection

We have created machines with perfect memory and humans with none. Every moment is documented, archived, backed up to the cloud. Our phones remember everywhere we've been, everyone we've talked to, everything we've searched for. Our social media preserves every thought we've shared, every photo we've taken, every opinion we've expressed.

But this perfect external memory has atrophied our internal memory. We don't remember phone numbers, addresses, birthdays. We don't remember the plots of movies we watched last week, the articles we read yesterday, the conversations we had this morning. We live in a state of perpetual present tense, every moment immediately replaced by the next, nothing sticking, nothing accumulating into meaning.

This is what Paul Connerton calls "cultural amnesia"—not the gradual forgetting that comes with time, but an active erasure of memory by the very technologies meant to preserve it (Connerton, 2009). We are so busy recording the present that we have no time to process the past. We are so focused on the next thing that we never integrate the last thing.

The news cycle exemplifies this amnesia. Scandals that would have defined decades now evaporate in days. Mass shootings that should traumatise us into action are forgotten by the next mass shooting. Political outrages that should topple governments are replaced by new outrages before we can process our anger. We exist in what Douglas Rushkoff calls "present shock"—a state of constant emergency where everything is happening now, nothing happened before, and the future is unimaginable (Rushkoff, 2013).

The Death of Deep Learning

What we've lost is not just memory but the entire apparatus of deep learning—the slow, patient accumulation of knowledge that transforms information into understanding, understanding into wisdom.

I think of how I learned calculus in high school. It took months of struggle, confusion, practice. I did problem sets until my hand cramped. I stared at equations until they swam before my eyes. I dreamed about derivatives. And then one day, suddenly, it clicked. The symbols transformed into meaning. I could see how change could be measured, how infinity could be approached, how the universe could be described in mathematics. That knowledge became part of me, changed how I saw the world.

Compare that to how we "learn" now. We Google the answer. We watch a three-minute YouTube video. We skim the Wikipedia summary. We get the information we need for the moment and immediately forget it, knowing we can always look it up again. But this isn't learning; it's just-in-time information retrieval. It doesn't change us, doesn't become part of us, doesn't alter our consciousness.

The machine promises us all the world's knowledge, but it delivers only information. And information without integration is noise.

Part VIII: Toward a Sanctuary of Slowness

The Politics of Pace

How then shall we live? How do we maintain human consciousness in an inhuman tempo? How do we resist without becoming Luddites, engage without being consumed, use these tools without becoming tools ourselves?

The answer is not withdrawal—I've learned this the hard way. Complete digital abstinence is both impossible and irresponsible in our current world. To completely disconnect is to abandon the public sphere, to cede the conversation to those still plugged in, to make yourself irrelevant to the discussions that shape our collective future. The digital hermit is not a revolutionary but a dropout.

Nor is the answer better "digital wellness" apps or productivity hacks or life optimisation strategies. These are bandages on a severed artery, individual solutions to systemic problems. They ask us to manage our addiction better rather than questioning why everything is designed to be addictive in the first place.

What I propose instead is what I call "temporal sovereignty"—the conscious, deliberate, political act of reclaiming your own time as a form of resistance. This is not about productivity or optimisation or self-improvement. This is about recognising that the control of time is the ultimate form of power, and that reclaiming that power, moment by moment, breath by breath, is a radical act.

Temporal sovereignty begins with the recognition that your attention is not a resource to be harvested but a sacred faculty to be protected. It requires viewing every notification, every invitation to engage, every demand for immediate response as what it really is: an attempt to colonise your consciousness.

Practices of Resistance

The practice of temporal sovereignty manifests in small acts of rebellion that accumulate into revolution:

The Sacred No: Learning to say no without justification, without apology, without offering alternative times or compensatory gestures. No is a complete sentence. No is a declaration of independence. No is the foundational word of autonomy.

The Digital Sabbath: Choosing regular periods—an evening, a day, a week—where you completely disconnect. Not because you're burned out, not because you need a break, but as an assertion of your right to exist outside the network. This is not self-care; it's self-determination.

The Slow Read: Deliberately choosing difficult texts that require sustained attention. Reading them with a pencil, making notes in the margins, looking up words you don't know, rereading passages that confuse you. Treating reading not as consumption but as conversation, not as information transfer but as transformation.

The Unmeasured Life: Refusing to quantify everything—your steps, your sleep, your heart rate, your productivity. Accepting that not everything that matters can be measured, that not everything that can be measured matters. Living by rhythm rather than metrics, by feeling rather than data.

The Gift of Presence: When with others, being fully with them. Not documenting the moment, not sharing the experience, not performing your life for an invisible audience. Offering your complete attention as the radical gift it has become.

The Practice of Boredom: Deliberately creating spaces of nothingness. Sitting in waiting rooms without your phone. Walking without podcasts. Commuting without distraction. Learning to be alone with your thoughts, however uncomfortable they might be.

Building Structures of Slowness

But individual practices aren't enough. We need to build collective structures that protect and promote temporal sovereignty. This means:

Redesigning Technology: Demanding "humane technology" that respects human limitations rather than exploiting them. Interfaces designed for contemplation rather than compulsion. Algorithms that promote depth rather than engagement. Platforms that facilitate genuine connection rather than addictive interaction.

Reclaiming Education: Fighting for educational systems that teach sustained attention as a skill, that value depth over breadth, that recognise thinking as a slow process that can't be accelerated without losing its essence. We need to teach children not just to read but to read deeply, not just to think but to think slowly.

Creating Sacred Spaces: Establishing physical and temporal spaces where the machine's tempo is not allowed—restaurants where phones are checked at the door, parks with no WiFi, libraries that block signals. We need refuges from the network, sanctuaries of silence, places where we can remember what it feels like to be human.

Changing Work Culture: Challenging the expectation of constant availability, the valorisation of busyness, the equation of response time with dedication. We need to recognise that the best thinking happens slowly, that creativity requires downtime, that productivity and presence are not the same thing.

Part IX: A Manifesto for Depth in the Age of Surface

The Amphibious Consciousness

The question that haunts our age is not whether technology will slow down—it won't. Moore's Law and its successors promise that processing power will keep compounding, networks will get faster, and the gap between human tempo and machine tempo will only widen. The question is whether we can maintain a human rhythm within the machine, whether we can be temporal amphibians—creatures capable of moving between two different temporalities without losing ourselves in either.

I think of musicians who can play incredibly fast passages—Coltrane's sheets of sound, Glenn Gould's prestissimo Bach—without losing the underlying rhythm, the deeper music. The notes blur into a cascade of sound, but beneath the speed is structure, intention, meaning. This is what we need: not to match the machine's tempo but to maintain our own rhythm within it.

The erosion of critical faculties is not inevitable; it is a choice we make thousands of times a day. Each time we reach for the phone instead of sitting with discomfort, each time we skim instead of read, each time we react instead of reflect, we vote for the world we're creating. These micro-decisions accumulate into macro-consequences, individual choices that become collective destiny.

The Muscle of Attention

To think deeply in the age of speed is not just an intellectual exercise—it is an athletic event, a form of resistance training for consciousness. Just as muscles atrophy without use, our capacity for sustained attention weakens without practice. But like muscles, it can be rebuilt, strengthened, made more resilient.

This requires what Susan Sontag exemplified throughout her career: what might be called a fierce, unapologetic seriousness—not the dour rejection of pleasure but the determined protection of depth (Sontag, 2012). It means treating your attention like the finite, precious, irreplaceable resource it is. It means recognising that every hour spent scrolling is an hour not spent reading, creating, connecting, becoming.

I've started training my attention like an athlete trains their body. I began with twenty minutes of single-pointed focus—sometimes looking inside myself for memories of a more productive time, sometimes reading a few pages of meaningful literature, sometimes simply watching the light change on the wall. I build up slowly, adding minutes like adding weight to a barbell. Some days I fail, fall back into the scroll, lose hours to the feed. But I return, again and again, to the practice of presence.

The Fertility of Silence

We must remember—must never forget—that silence is not absence but presence. It is not empty but gravid with possibility. It is the dark matter of consciousness, invisible but essential, the negative space that gives shape to thought.

Silence is where ideas gestate, where insights crystallise, where the unconscious does its mysterious work. It is the soil in which imagination grows, the darkness seeds require to germinate. If we pave over that darkness with the harsh light of constant stimulation, we will create a world that is bright, loud, networked, and utterly barren—a cognitive desert where nothing new can grow.

The composer John Cage understood this when he created 4'33"—four minutes and thirty-three seconds of "silence" that revealed silence doesn't exist, that what we call silence is actually full of sound, of life, of possibility (Cage, 1961). The audience, forced to sit without structured sound, began to hear—really hear—the world around them: breathing, rustling, heartbeats, the hum of existence itself.

This is what we've lost: the ability to hear the background music of existence, the subtle frequencies drowned out by the noise.

The Speed of Wonder

The machine offers us the world at the speed of light. Every question answered instantly, every desire fulfilled immediately, every connection made at the touch of a screen. This is its promise, its seduction, its poison.

But we must find the wisdom—and it will take wisdom, and courage, and collective will—to receive the world at the speed of wonder. Wonder cannot be rushed. It requires the slow accommodation of the mind to the miraculous, the gradual opening of consciousness to awe. You cannot Google wonder. You cannot download awe. You cannot stream the sublime.

Wonder happens at biological speed, at the pace of breath and heartbeat, at the rhythm of seasons and tides. It happens when we're not looking for it, when we've stopped trying to capture it, when we've released our grip on experience long enough for experience to grip us.

Coda: The Falling Leaf

The Moment of Return

I close my laptop. The screen goes dark, and in that black mirror, I see my own face reflected—tired, older than I remember, marked by hours of screen-glow, the blue light having carved new lines around my eyes. For a moment, I don't recognise myself. This face has been so long turned toward screens that it has forgotten how to turn toward the world.

I look away, deliberately, consciously, with the kind of effort that should be effortless. Outside the artisan bakery window, the afternoon light has shifted to that particular gold that only comes in late autumn, when the sun angles low and the air itself seems to turn amber. This is the light that photographers call "magic hour," but no photograph has ever captured its actual magic—the way it makes everything look blessed, temporary, irreplaceable.

A single leaf detaches from a tree—not dramatically but gently, almost reluctantly, as if it has finally accepted what it has been resisting all autumn. It doesn't plummet. It performs a complex dance with gravity, rocking back and forth like a boat on invisible waves, spinning slowly, catching the light, turning gold to green to gold again.

I watch its entire journey. This takes perhaps twenty seconds—an eternity in internet time, barely a moment in tree time. The leaf traces a pattern through the air that will never be repeated, a unique pathway through space and time, a one-time performance for an audience of one.

The Physics of Falling

There is complex physics in this falling leaf—fluid dynamics, angular momentum, chaotic systems. Scientists have tried to model it, to predict the path a leaf will take, and have found precise prediction practically impossible. Too many variables: the exact shape of the leaf, its moisture content, the micro-currents of air, the temperature gradients. Each falling leaf is a unique solution to an equation too sensitive ever to be solved in advance.

This is what we miss when we move at machine speed: the irreducible complexity of the actual world, the patterns that can't be compressed into algorithms, the beauty that emerges from slowness itself.

The leaf continues its descent. I could photograph it, could film it, could share it with the caption "Nature is healing" or "Autumn vibes" or some other reduction of the irreducible. But I don't. I just watch, and in watching, something in me that has been clenched begins to uncurl. I am relearning.

The Recovery of Time

As I said earlier, I am rebelling against the loss of my own time. Not through grand gestures of digital abstinence or proclamations of technological retreat, but through small, deliberate acts of reclamation. I close the unnecessary tabs. I silence the notifications. I return, again and again, to the single window where sustained thought might still be possible. Each act of focus becomes an insurgency against the algorithmic logic that seeks to fragment my attention into ever-smaller parcels of consumption. This is not about rejecting technology wholesale—such Luddite nostalgia holds no appeal—but about refusing to surrender the deeper rhythms of consciousness that make us fully human. The battle is fought in minutes: those moments when I resist the impulse to check, to refresh, to flee from the discomfort of sustained concentration. In these small victories, I reclaim inches of territory. My mind, slowly, becomes mine again.

This is how we save ourselves: one moment of presence at a time, one act of attention at a time, one small refusal to surrender consciousness to the machine. Not through grand gestures or dramatic renunciations, but through these tiny recoveries of time, these micro-resistances to the tempo that would consume us.

The leaf lands, settles among its fellows, becomes part of the carpet of autumn. Soon it will decompose, become soil, become tree again. This is the tempo that preceded us and will outlast us—the rhythm of growth and decay, of seasons and cycles, of breath and heartbeat.

I sit for another moment, then another. No one is texting me anything urgent. No email requires my immediate response. The world will not end if I don't check my notifications. The only thing that will happen if I sit here watching the light fade is that I will have watched the light fade, will have been present for the daily miracle of transition from day to night, will have remembered what it means to be a creature of carbon rather than silicon, of flesh rather than data.

The Choice

Every moment, we face the same choice: the screen or the world, the feed or the field, the network or the earth. This is not a choice between technology and nature—that's too simple, too binary. It's a choice between consciousness and its absence, between presence and its simulation, between depth and surface, between the human tempo that creates meaning and the machine tempo that erases it.

The artisan bakery is closing. The baker is wiping down the counter, preparing to lock up. I gather my things slowly, reluctantly—the crumbs of a croissant, the book I've been trying to read, the silence I've been hoarding. I have to reenter the stream, the flow, the feed. But something has shifted. I've remembered something essential, something I keep forgetting and have to keep remembering: that I am not a user or a consumer or a node in a network. I am a consciousness, capable of wonder, capable of resistance, capable of choosing my own tempo.

The scent of sourdough lingers—that ancient fermentation, that slow transformation of grain and water and time into sustenance. It's a reminder that not everything worth having can be accelerated, optimised, or delivered instantly. Some processes demand their own duration, refuse to be rushed. As I step toward the door, I carry this knowledge like bread wrapped in cloth, still warm, still nourishing, a small rebellion against the tyranny of speed.

I walk home without earbuds, without podcasts, without the insulation of manufactured sound. I hear actual birds, actual wind, actual voices. The world is louder than I remembered, more complex, more present. My phone buzzes in my pocket—once, twice, repeatedly. No one is calling; it's merely more notifications. I let it buzz. Whatever it is can wait.

I am walking at three miles per hour, the speed humans have walked for hundreds of thousands of years. This is the speed at which we think best, solve problems, have insights. This is the speed of philosophy—Aristotle's peripatetic school, Kant's daily walk, Thoreau's sauntering. This is the speed at which the human mind naturally works when not forced to work faster.

The Leaf's Teaching

That falling leaf taught me something, or reminded me of something I keep forgetting: that the most profound things happen slowly. Birth is slow. Growth is slow. Learning is slow. Healing is slow. Love is slow. Death is slow. Only machines are fast, and they are fast because they are not alive.

We are not machines, though we're trying hard to become them. We are biological, temporal, mortal beings, blessed and cursed with consciousness, capable of experiencing beauty, meaning, connection—but only if we slow down enough to let these experiences happen.

The machine's tempo will not slow. If anything, it will accelerate, widening the gap between human time and machine time until they become entirely different categories of existence. But we don't have to match its pace. We can't match its pace. And in that impossibility lies our salvation.

We can choose to live at the speed of life rather than the speed of light. We can choose presence over productivity, depth over data, wisdom over information. We can choose to be temporal rebels, consciousness guerrillas, attention activists. We can choose to protect the human tempo as we would protect an endangered species—because that's what it is.

The Final Frame

I reach my apartment as darkness falls. Before I go inside, I stand for a moment on the threshold, looking back at the day that's ending, forward to the evening that's beginning. This is the liminal moment, the space between, the gap the machine hasn't yet colonised.

In my pocket, my phone continues its electronic pleading—buzzing, chiming, demanding attention. I pull it out, look at the notification screen—3 unread messages, 17 emails, 64 updates on social media. All of it requiring attention, none of it critical. I could spend the next three hours responding, reacting, engaging. Or I could turn it off, step in, kiss my wife, pet my dog, eat dinner, pick up a book, think my own thoughts, dream my own dreams.

The choice is mine. The choice is always mine. The choice is ours.

I turn off the phone. The screen goes dark. The buzzing stops. The silence that follows is not empty but full—full of possibility, full of potential, full of the promise that I might yet think a thought that is truly my own, feel a feeling that hasn't been algorithmically induced, have an experience that won't be shared, documented, commodified.

This is how we remain human in the age of algorithms: by remembering that we are creatures of carbon, not silicon, of flesh, not data, of breath, not bandwidth. By choosing, again and again, in the face of tremendous pressure to do otherwise, to live at the tempo of our own heartbeats rather than the frequency of our devices.

By remembering that we have something the machines will never have: the capacity to watch a leaf fall and be changed by it, to sit in silence and hear the universe, to be bored and from that boredom create something beautiful, to be slow and from that slowness discover what it means to be alive.

This is our resistance. This is our revolution. This is our hope.

One falling leaf at a time. One moment of presence at a time. One small refusal at a time.

Until we remember who we were. Until we become who we might yet be. Until we find our way back to the human tempo that beats beneath all the noise, patient and persistent as a heart, waiting for us to listen, to return, to remember, to begin again.

References

Baird, B., Smallwood, J., Mrazek, M. D., Kam, J. W. Y., Franklin, M. S., & Schooler, J. W. (2012). Inspired by distraction: Mind wandering facilitates creative incubation. Psychological Science, 23(10), 1117-1122. https://doi.org/10.1177/0956797612446024

Borges, J. L. (1962). The library of Babel (J. E. Irby, Trans.). In Labyrinths: Selected stories and other writings (pp. 51-58). New Directions. (Original work published 1941)

Cage, J. (1961). Silence: Lectures and writings. Wesleyan University Press.

Connerton, P. (2009). How modernity forgets. Cambridge University Press. https://doi.org/10.1017/CBO9780511627187

Ferster, C. B., & Skinner, B. F. (1957). Schedules of reinforcement. Appleton-Century-Crofts.

Han, B.-C. (2015). The burnout society (E. Butler, Trans.). Stanford University Press. (Original work published 2010)

Henkel, L. A. (2014). Point-and-shoot memories: The influence of taking photos on memory for a museum tour. Psychological Science, 25(2), 396-402. https://doi.org/10.1177/0956797613504438

Hublin, J.-J., Ben-Ncer, A., Bailey, S. E., Freidline, S. E., Neubauer, S., Skinner, M. M., Bergmann, I., Le Cabec, A., Benazzi, S., Harvati, K., & Gunz, P. (2017). New fossils from Jebel Irhoud, Morocco and the pan-African origin of Homo sapiens. Nature, 546(7657), 289-292. https://doi.org/10.1038/nature22336

Lembke, A. (2021). Dopamine nation: Finding balance in the age of indulgence. Dutton.

Miller, G. A. (1956). The magical number seven, plus or minus two: Some limits on our capacity for processing information. Psychological Review, 63(2), 81-97. https://doi.org/10.1037/h0043158

Montag, C., Markowetz, A., Blaszkiewicz, K., Andone, I., Lachmann, B., Sariyska, R., Trendafilov, B., Eibes, M., Kolb, J., Reuter, M., Weber, B., & Markett, S. (2017). Facebook usage on smartphones and gray matter volume of the nucleus accumbens. Behavioural Brain Research, 329, 221-228. https://doi.org/10.1016/j.bbr.2017.04.035

Pascal, B. (1995). Pensées (A. J. Krailsheimer, Trans.; Rev. ed.). Penguin Classics. (Original work published 1670)

Raichle, M. E. (2006). The brain's dark energy. Science, 314(5803), 1249-1250. https://doi.org/10.1126/science.1134405

Raichle, M. E. (2015). The brain's default mode network. Annual Review of Neuroscience, 38, 433-447. https://doi.org/10.1146/annurev-neuro-071013-014030

Rushkoff, D. (2013). Present shock: When everything happens now. Current.

Sontag, S. (2012). As consciousness is harnessed to flesh: Journals and notebooks, 1964-1980 (D. Rieff, Ed.). Farrar, Straus and Giroux.

Sparrow, B., Liu, J., & Wegner, D. M. (2011). Google effects on memory: Cognitive consequences of having information at our fingertips. Science, 333(6043), 776-778. https://doi.org/10.1126/science.1207745

Virilio, P. (2000). The information bomb (C. Turner, Trans.). Verso. (Original work published 1998)

Virilio, P. (2006). Speed and politics: An essay on dromology (M. Polizzotti, Trans.). Semiotext(e). (Original work published 1977)

Wilson, T. D., Reinhard, D. A., Westgate, E. C., Gilbert, D. T., Ellerbeck, N., Hahn, C., Brown, C. L., & Shaked, A. (2014). Just think: The challenges of the disengaged mind. Science, 345(6192), 75-77. https://doi.org/10.1126/science.1250830

Wolf, M. (2018). Reader, come home: The reading brain in a digital world. Harper.


🔴 Viewpoint is a random series of spontaneous considerations about subjects that linger in my mind just long enough for me to write them down. They express my own often inconsistent thoughts, ideas, assumptions, and speculations. Nothing else. Quote me at your peril.