It is now more important than ever to critically evaluate the way we get our news and information.
For our health, first of all. It is seriously stressful to be constantly plugged in, especially during particularly fraught times like these. And constant stress is, famously, not so healthy. The good news for those of us lucky enough not to rely on up-to-the-minute news updates for our survival? We can re-evaluate our digital habits and turn away from platforms that promote emotionally charged content over balanced reporting of the facts.
A second reason we might consider unplugging is protest: resisting the algorithmic manipulation that is at least a contributing factor in both the mental health harms and the epistemic harms of social media, harms having to do with truth and knowledge.
I discussed the epistemic harms in a past essay, using philosopher C. Thi Nguyen’s idea that the attention-extracting business models of digital platforms like Twitter hurt our ability to construct a nuanced and informed picture of the world.
Nguyen argues that Twitter's design turns communication into a sort of game where users chase likes and retweets rather than understanding. When conversation becomes a competition for engagement metrics, we lose the intellectual humility and cooperation necessary for genuine dialogue. Instead of seeking truth together, we become modern-day Sophists, optimizing for viral hot takes over nuanced understanding.
But we’re not defenseless against this withering away of our epistemic virtues.
In a recent episode of the Deep Questions podcast, Cal Newport recommends stepping away, at least temporarily, from algorithmically driven, engagement-maximizing information environments in the aftermath of this election cycle. He suggests embracing ‘slower’ media, like books and newspapers.
I’m not sure people will begin flocking to libraries across America in unprecedented droves anytime soon (though I’d love to see it!). But I am hopeful that public consciousness is beginning to be raised about the psychological harms of attention-harvesting, algorithmically driven content delivery systems like Twitter.
More and more Twitter users are moving over to rival platform Bluesky, with largely positive reports so far. Still, we have to be careful that in critiquing Twitter we don’t forget it is merely one among many platforms that incentivize divisiveness over nuance, encouraging users to point fingers rather than seek understanding.
This doesn’t mean deeming all algorithmic content ‘bad’ or rejecting digital platforms altogether. It just means having what Newport calls a philosophy of technology use: a philosophy oriented around your vision of a life well lived. This way, you’re less prone to digital distraction and better able to access your core beliefs and take action toward your goals.
That reminder is powerful: human aims precede technology. It means we get to be in control. We get to take back control.
For more thoughts on mitigating the harms of digital technology use while maximizing its benefits, check out my article “Web 3.0 is Coming. Are You Prepared?”
Thank you, April. Our society needs to hear and attend to more messages like this!
I'm curious to hear your thoughts on the long-term incentive structure of something like Bluesky, and whether you think the future state of *any* social media still tends toward an ever more incendiary mean. I hear, and want to share, your optimism for a fresh platform, but I'm skeptical that the drives of any social media can exhibit anything but a tendency toward extremes. Hearst said long ago (maybe apocryphally) that "if it bleeds, it leads," and that truism seems only to echo and amplify through our current media. I wonder whether, in a reality of algorithmic news, what "bleeds" is ultimately escapable (likely not). I wonder how much of the consent we manufacture is our own malignancy.