We have become geniuses at killing time — and I don't mean the enjoyable, meandering time of a Sunday afternoon. I mean the necessary, digestion-like time that thought requires. Save for the greatest creative, literary, and scientific geniuses, most of us face an inherent boredom or inertia that precedes deep creation. And, ironically, those scientific geniuses have been the ones to invent things that make it progressively easier to kill time.
Every so often, a technology comes around that transforms a fundamental aspect of the human experience. Without question, the last invention to do so was the smartphone. The smartphone changed how we interact with one another intentionally, using the device itself, and it also changed the frequency with which we choose not to interact with one another at all, staying glued to our phone screens instead. Small talk while waiting in line has turned into scrolling until it's your turn. Why waste time and energy breaking the ice or making conversation when you have an endless, already-melted stream of parasocial entertainment in your pocket?
Those who are lucky enough to have been born in the 90s -- which I consider the 'sweet spot' for smartphone adoption -- seem to have gotten the best of the pre- and post-smartphone worlds. We were internet natives, so when smartphones came around, we quickly learned how to use them. Many of our parents didn't know how to switch the input on the TV. However, unlike subsequent generations, the internet we grew up with was largely confined to a desktop computer in the living room or the computer lab. The fact that the internet wasn't glued to our bodies gave us plenty of time to interact face-to-face, unencumbered, in virtually every other scenario. By the time smartphones became ubiquitous in the early 2010s, we had built enough of a foundation that, while many of us certainly became smartphone addicts, we at least had our baseline, pre-smartphone social instincts to fall back on if the occasion required it.
AI is beginning to change what it means to spend time doing our work. Just as social media and the smartphone transformed how we choose to fill our time socially, I believe that AI will usher in a fundamental shift in what it means to 'pay attention' when it comes to our general productivity. Whether this is a net positive remains to be seen, and, though the introduction may suggest otherwise, this article is not here to fearmonger or paint AI as the next thing to bring us closer to the destruction of civilization. In fact, I don't even think the smartphone brought us closer to that: many of the frequent criticisms lobbed at smartphones sound suspiciously similar to what was once said about newspapers preventing idle conversation on the train.
As someone who is terminally online (though he tries his best to touch grass), I believe the rapid access to information and instant communication that smartphones provide are net positives. More existentially, I believe that their use as the primary means of socialization for nascent generations is just the next step in the evolution of how we communicate. In a similar vein, while I see an immense risk of humans becoming detached from their work just as the smartphone has detached us from the world around us, I also see plenty of upside for productivity and quality of life if we are deliberate in how we use AI.
It all comes down to how we define what it means to pay attention, to learn, and to multitask. This is the story of how AI is starting to redefine those terms. And to understand where we're going, we have to stop misunderstanding how attention actually works. Full disclosure: what follows is strictly anecdotal, based on my own messy experience with focus, ADHD, and too many screens. I'm not a neuroscientist or a psychologist; I'm just a guy who works a lot and notices when his brain feels different while doing so.