#9 in the series on the book I am writing with Richard Russell
This week's newsletter is on the profound shifts that come with our digital lives. If you're old enough, you might remember the focused experience of reading a whole book without distraction. That was me, but I read everything digitally these days. Books, papers... streaming services have replaced TV for me. And yes, I'm getting sucked into short-form videos more and more 😬.
This transformation has its ups and downs. So much info, yet my attention's fragmented. Before, I'd vanish into a book; now it's bursts of chat conversations, flitting from topic to topic. It's a different way of thinking.
The deep vs. the flow -- it's like a cultural shift from 'taking stock' of the world to 'taking flow.' We grab snippets, react, share fleeting thoughts. It's rewiring our brains, making sustained focus harder.
At the same time, there's an opportunity for AI to step in as the 'stock-taker,' sifting through the endless info stream. It becomes this weird necessity since our brains struggle to keep up. Scary, but also fascinating.
The big question: are we losing our ability to create deep meaning if we're always in the moment? And how much will AI shape (or control) the very flow it thrives on? This quote about capitalism got me thinking -- replace 'capitalism' with 'AI':
"Although capitalism is global… it deprives people of any meaningful cognitive mapping… There is… no global 'capitalist world view,' no 'capitalist civilization' proper." -- Žižek
So, where are we heading? One thing's for sure: AI/computing is going to force some major value shifts.
I wrote my bachelor's thesis by hand. I didn't type a word until graduate school, and even there, my professional use of text - for writing equations and proofs - was done by hand for several years. Not because I didn't have access to computing technologies, but because I didn't feel the need to adopt them for work. I was late to the mechanical revolution, let alone the computer one. Oh, the places I have gone since my life became digital.
As a scholar, I spend many of my waking hours reading. But reading how? I stopped printing out journal articles after I got my first iPad. I still love holding hardcovers in my hands and concentrating on their words, but in practice, even books are iPad-first now. I stopped watching network TV a long time ago. I don't go to movie theaters. Streaming services are my only source of long-form entertainment. Increasingly, I am finding short-form video hard to resist. I don't use TikTok yet, but the threat is looming.
As a child, I would crack open a book and disappear into it for hours, sometimes days. I find that hard to do now. Adult responsibilities, yes, but also the nature of my attention has changed. Some of my most enriching interactions are on chat apps, where conversation flows organically from topic to topic. No one owns these conversations; there's no sense of 'copyright' when insights and arguments emerge from many people riffing on each other. Every once in a while, if the chat is work-related, my co-workers and I will turn it into a document, but distillation is a secondary act; the flow of conversation is the thing.
This digital transformation has been both exhilarating and unsettling. The sheer volume of information I consume daily is vastly greater than in my pen-and-paper days. Yet there's a gnawing sense that how I process and engage with knowledge has changed irrevocably. Deep immersion in a book, with the single-minded focus it demanded, feels like a fading memory. The distractions are relentless -- notifications, pings, endless rabbit holes just a tap away. My attention span has fragmented, shaped by the rhythms of short-form videos and fleeting social media updates. Even the most enriching conversations often happen in fragmented bursts, in chat apps where insights and arguments emerge collaboratively, with no single authorial voice. Deep, sustained focus on a single text seems at odds with a "flow" culture that thrives on snippets, reactions, and rapidly shifting attention. Does this indicate an inevitable cognitive shift, or can we cultivate mental disciplines that maintain contemplative space even within a world of constant stimuli? Is there a role for AI in helping us "defragment" knowledge, fostering connections between disparate ideas, and guiding us towards greater understanding amidst the flow?
If the print-era ideal is that of a disinterested observer taking stock of the world, like a pinhole camera of the mind, then the emerging ideal of our cybernetic age is that of an embedded participant 'taking flow' of the world. When we chunk labor into ever smaller units - in the gig economy, for example - we are taking flow. When we share a fleeting thought on Snapchat, we are taking flow. Taking flow brings individual mental states into focus because our attention is now trained to absorb a single thought or feeling from another person - the unit we like and share. Sustained attention might even be counterproductive to taking flow. The shift is from the detached, rational observer prized by the print era to an embedded, ever-reacting participant in an endless stream of stimuli: we take flow of snippets of information, fleeting bursts of emotion, or single compelling arguments shared by others. Our brains are rewiring, primed for quick reactions and effortless multitasking, but perhaps less adept at sustained focus on a singular idea.
At the same time, it's not as if we have lost the need for exposition or archives. We need repositories of both. That's where AI can play an essential role as connective tissue, for it's a technology that can augment our narrowing attention spans. I am not using 'narrowing' as a pejorative - we can be incredibly creative, adaptive, and insightful in the flow of the moment. That's not a skill paper culture rewards, but it clearly has value while taking flow. And AI complements our flow skills, for it records all the flow-like happenings and turns them into stocks. We are already seeing note-taking apps that automatically extract themes from your writing and connect your thoughts along those themes. You can just keep flowing while the machine does the stock-taking.

For better or worse, AI is emerging as a force against this backdrop of flow culture. Its ability to sift through, pattern-match, and contextualize the endless stream of information aligns perfectly with this new mode of cognition. We need AI because our own brains are struggling to keep up with the sheer volume of data we produce and the pace at which it assaults our senses. AI can be the stock-taker, the tireless archivist that distills, curates, and categorizes the fragments of our collective "flow," giving shape and context to the chaos.
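To make the machine-as-stock-taker idea concrete, here is a minimal illustrative sketch, not a description of how any particular note-taking app works: given a pile of short, flow-style notes, group them into rough themes with off-the-shelf text clustering. The notes, the cluster count, and the use of scikit-learn are all assumptions made for illustration.

```python
# A minimal sketch of turning "flow" into "stock": cluster short notes into themes.
# Illustrative only - not how any particular note-taking app works.
# Assumes scikit-learn is installed; the notes and cluster count are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

notes = [
    "Attention spans shrink as feeds get faster",
    "Chat threads produce ideas that no single person owns",
    "Oral traditions trained prodigious memory in disciples",
    "Writing externalized memory, as Socrates complained",
    "Streaming services replaced network TV in my media diet",
    "Short-form video is engineered for quick reactions",
]

# Represent each note as a TF-IDF vector, then group similar notes together.
vectors = TfidfVectorizer(stop_words="english").fit_transform(notes)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(vectors)

# Print the emergent "themes": clusters of related fragments of flow.
for theme in sorted(set(labels)):
    print(f"Theme {theme}:")
    for note, label in zip(notes, labels):
        if label == theme:
            print(f"  - {note}")
```

In a real tool the vectors would come from an embedding model and the themes would be labeled and linked back into your notes, but the shape of the operation is the same: the machine takes stock so you can keep flowing.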
I am sure a question has formed in your mind: what happens when we are perennially lost in the moment and can no longer use our minds to connect backwards and forwards?
One indication of where we might be going comes from a forthcoming essay by Žižek (via this essay by Nathan Gardels):
“Although capitalism is global and encompasses the whole world,” Žižek writes, “it deprives the large majority of people of any meaningful cognitive mapping. Capitalism is the first socio-economic order which de-totalizes meaning: It is not global at the level of meaning. There is, after all, no global ‘capitalist world view,’ no ‘capitalist civilization’ proper. The fundamental lesson of globalization is precisely that capitalism can accommodate itself to all civilizations, from Christian to Hindu or Buddhist, from West to East. Capitalism’s global dimension can only be formulated at the level of truth-without-meaning.”
Now replace 'capitalism' with 'AI' throughout that quote - for AI is the mechanism accelerating the destruction of all existing sources of meaning. I am not saying that metaphorically! It's the literal truth: LLMs are stochastic parrots devoid of any genuine grasp of meaning, and their widespread use will only increase the pressure on existing values and meaning.
What's the future of a meaning-deprived society?
Gardels, Žižek, and Berardi have an answer to that question:
“Korea is the ground zero of the world, a blueprint for the future of the planet. … In the emptied cultural space, the Korean experience is marked by an extreme degree of individualization and simultaneously it is headed towards the ultimate cabling of the collective mind. These lonely monads walk in the urban space in tender continuous interaction with the pictures, tweets, games coming out of their small screens, perfectly insulated and perfectly wired into the smooth interface of the flow. … South Korea has the highest suicide rate in the world. Suicide is the most common cause of death for those under 40 in South Korea.”
“What Berardi’s impressions on Seoul provide,” says Žižek, “is the image of a place deprived of history, a worldless place. This new generation mostly doesn’t care about big issues like human rights and meaningful freedoms or the threat of war. While the world still notices the aggressive pronouncements of the North Korean regime accompanied by nuclear threats, the large majority in South Korea just ignores them. Since the standard of living is relatively high, one comfortably lives in a bubble.”
I take these pronouncements with an enormous pinch of salt - Sam Altman and Elon Musk's Silicon Valley is a much more obvious example of the meaning-bleached world than Korea. These excerpts from Žižek and Berardi have more than their share of the racism that seeps out of European intellectuals who don't like inhabiting a has-been culture. Get over it: Seoul is more important than Rome. But there's a way to read these statements with generosity: instead of associating today's Korea with a monadic post-meaning culture, see it as the bleeding edge of flow culture. Flow isn't devoid of meaning; it's just that its meaning isn't commensurate with existing values.
Computing in general, and AI in particular, is going to precipitate an enormous shift in values.
I value the capacity to connect different thoughts in my head. But I also recognize that our memory capacities are nothing compared with those of people trained in oral scholarly traditions, where disciples would hear and remember every word - literally - that their master had spoken to them. Writing changed all that, as Socrates famously complained. And yet, absolutely no intellectual snob today will argue that writing made us stupid. So why do we call our narrowing attention spans a sign of stupidity, especially when there's an emerging technology that helps externalize these cognitive skills?
This relentless push towards "taking flow," driven by our evolving attention spans and the capabilities of digital technologies, has created a world where AI has become indispensable. We need AI's help to analyze, make sense of, and preserve the fragments of knowledge we generate at a dizzying pace.
But we must also be vigilant -- for in the same breath, AI has the potential to shape, manipulate, and ultimately control the very flow from which it draws its power. As we've discussed repeatedly, this power can easily be corrupted. Who decides what is preserved, what is deemed valuable, and what is consigned to digital oblivion? Can we trust AI systems to be impartial curators of our knowledge and history when they are, inevitably, built with inherent biases? These are crucial questions we must grapple with as AI becomes increasingly intertwined with our cognitive processes.
In the second essay in this section, I will present four thought experiments to illuminate the complexity of these changes.