I Miss When Only Smart People Got to Talk

By Francesca Rompal, MA


Photo: CottonBro Studios on Pexels

There’s something deeply exhausting about being online in 2025. Not just because of the constant influx of headlines, tragedies, and half-baked thread explainers — but because we’ve now spent over a decade in the digital wilderness, listening to people we would never take seriously in real life. We’re drowning in thoughts that were never meant to be spoken aloud, let alone published, reposted, and algorithmically amplified. And it’s making all of us just a little bit dumber.

In a culture obsessed with “relatability,” we’ve devalued actual intellect. Anyone with a phone and a ring light is a thought leader now. Expertise? Elitist. Credentials? Gatekeeping. The result: the loudest voice in the room wins, not the most thoughtful. And the loudest voices, it turns out, are often saying very little of substance.

There was a time — and it wasn’t even that long ago — when the people whose thoughts we consumed had to earn that platform. Journalists were trained. Writers were edited. Academics had citations and standards. Authors had to sell a book proposal. If someone was sharing their opinion on national TV, there was at least a chance they knew what they were talking about. We didn’t agree with all of them, but there was a sense that their words had passed through some kind of filter: of quality, of credibility, of coherence.

Now? That filter is gone, with thousands of AR (Augmented Reality) face filters in its place. And the result is cultural brain rot.

The Before Times — When Writing Had a Gatekeeper (and That Wasn’t Always Bad)

Before the era of social media, most of what we read was chosen for us — not by tech platforms and the billionaire weirdos that own them, but by editors, publishers, and producers. We read newspapers, literary magazines, and published books. We watched the nightly news or tuned in to Sunday opinion panels. Sure, it wasn’t perfect — the media in general has its own baggage — but there was at least a baseline standard of thoughtfulness and accountability.

Photo: Suzy Hazelwood on Pexels

In 2004, just before Facebook launched, an estimated 82% of Americans got their news from print publications, network TV, or public radio (RIP). The dominant platforms were The New York Times, The Washington Post, TIME, The Atlantic, 60 Minutes — institutions with editorial oversight. If someone published an op-ed, one could assume it had been fact-checked, revised, and reviewed.

Twitter launched in 2006. By 2011, it had become a platform for public intellectuals, and by 2015, it was a shouting match. Today, over 80% of adults under 30 say they get their news from social media, with 58% of people of all ages getting their information via digital devices. The shift isn’t just about medium — it’s about authorship. The people telling us what’s happening in the world are no longer journalists or experts. They’re influencers, Redditors, randoms on Twitter. Friends from high school. Bots.

And most critically: no one is editing them.

Photo: Kenneth Surillo on Pexels

In a pre-Instagram world, if you wanted to be heard, you had to write well. You had to speak and carry yourself well, and you had to pitch to a publication. You had to be compelling enough for a respected editor to say, “Yes, this matters.” But now? Thoughts that once would have stayed in the group chat (or in someone’s head entirely) are published in all caps, with sound effects, multi-post threads, and self-promotional ads wedged into the captions.

When everyone is allowed to publish without filter, without friction, without the push to actually think through what they’re saying — chaos ensues. Opinions lose their weight. The volume goes up, the substance goes down, and we’re all left sifting through digital debris.

The Floodgates Opened — When Everyone Became a Thought Leader

We’re now nearly two decades into a social media landscape that actively rewards unfiltered, often uneducated opinions — not because they’re true or thoughtful, but because they’re loud. Anger gets clicks. Ignorance gets shared. And social media platforms, which thrive on engagement, have no moral compass about what’s being amplified as long as people are watching.

In this environment, the line between “audience” and “authority” has fully collapsed. Anyone — anyone — can now write a viral post or make a thread that spreads faster than any peer-reviewed article or reported piece of journalism. And the content doesn’t have to be accurate, ethical, or coherent. It just has to be emotional, controversial, or aesthetically packaged.

Photo: Mary Taylor on Pexels

A 2018 MIT study found that false news spreads significantly farther and faster on Twitter than the truth (true stories took roughly six times as long as false ones to reach 1,500 people), particularly when the content is emotionally charged. Another study from 2022 revealed that posts with strong moral language are 30% more likely to be shared, even when those posts are factually incorrect or come from completely unverified sources. And we have only moved further into this hellish landscape in the past couple of years.

So what happens when every person — regardless of education, experience, empathy, or any real grasp of the topic — believes their opinion is valid simply because it exists? We start mistaking confidence for competence. We confuse visibility with legitimacy. We reward people not for what they know, but for what they say and how many people react to it.

And it’s not just the influencers with curated aesthetics and monetized personalities. It's also the everyday posters — the Facebook aunt, the TikTok neighbor, the guy in the replies who spells “you’re” wrong every time but is somehow ready to argue about public health policy or feminism or international law with the conviction of a tenured professor.

Photo: CottonBro Studios on Pexels

We’ve normalized the idea that everyone’s voice deserves to be heard, which sounds noble — but in practice has meant we’re reading tens of thousands of low-effort, high-impact takes from people who haven’t thought critically about anything in their entire lives.

We are not supposed to consume this many people’s opinions in one day. Especially not opinions that are typed on cracked phones, full of grammatical errors and rage, from someone who read half a headline and now feels equipped to argue in the comments like they’re at a UN summit.

And because this behavior is now constant — because we expect to hear from people who don’t know what they’re talking about — we start internalizing their language, their ideas, their framing. The bar for credibility drops. And suddenly, expert consensus is just “one side of the discourse.”

Real Consequences — Cultural Brain Rot, Disinformation, and the Erosion of Critical Thinking

This is not just annoying — it’s dangerous. When every voice is given the same algorithmic weight, when falsehoods are dressed up like insight, when the collective brain trust of society is filtered through trending TikToks and barely coherent threads, the result is individual and cultural brain rot.

In 2023, Stanford University released a report showing that the majority of misinformation online is spread not by bots, but by real people — often unknowingly, often passionately. And even when it’s corrected, it doesn’t matter: a 2022 study published on ScienceDirect found that corrections rarely shift belief. Once the misinformation is out there and has been consumed, it sticks.

Photo: Antoni Shkraba Studio on Pexels

This is the real problem — not just that people are wrong, but that being wrong no longer matters.

We have created a feedback loop in which confidence is rewarded, but correction is ignored. 

The result? A generation that feels emotionally validated by “speaking their truth” regardless of whether that truth has any basis in reality.

And when we're constantly inundated by this kind of content — opinions masquerading as facts, vibes sold as values — we lose our edge. The critical thinking muscle atrophies. We stop asking “what are the facts?” and start asking “who said it and do I like them?”

Studies show this erosion clearly. The Reboot Foundation’s 2022 report on media literacy found that less than one-third of American adults regularly verify information before sharing it online. Meanwhile, younger users — raised in the storm of perpetual content — increasingly admit they don’t trust mainstream sources but do trust influencers or creators they “feel connected to.”

Photo: Pexels

This isn’t just a media problem. It’s a cognition problem. We’re overwhelmed and under-informed. We scroll instead of read. We react instead of reflect. And all of it — the misinformation, the low-effort discourse, the constant opinion-having — is exhausting. Emotionally, mentally, existentially.

We’ve traded curation for chaos. Reflection for reactivity. Knowledge and information for volume. And the average person is now living in a fog of microtakes, pseudo-wisdom, and algorithm-driven emotional spikes — not insight.

The result is a society that is both overstimulated and intellectually dulled. We’re more connected than ever, and somehow more birdbrained in the aggregate — parroting digital soundbites while dismissing data. Citing @’s over vetted sources. Feeling “informed” after watching a Reel with bold white font and lo-fi beats over it.

This isn’t discourse. It’s cognitive decay.

The Way Forward (Maybe): Discernment, Boundaries, and Digital Hygiene in the Age of Infinite Noise

So, where does that leave us?

Scrolling through the intellectual wasteland, muttering “this is insane” every ten seconds, yet still showing up to the feed like it’s church?

Photo: Mikoto on Pexels

We can’t put the genie back in the phone. That’s clear. There is no great unplugging.

But let’s be honest — some of us already have unplugged, or at least pulled the cord halfway out. Personally, I don’t engage much anymore. Not out of moral superiority, but frankly because it’s boring. It’s damaging. It makes me dumber.

In a world that’s tech-obsessed, a quiet counterculture has been growing — one that craves depth over immediacy, and presence over performance. People who use screens when necessary but not every waking second, and who are slowly realizing that our psychic, spiritual, and physical health depends on some degree of distance. We’re addicted to blue light, starved of real sunshine, and actually interacting with each other less than ever.

Americans now spend an average of 7 hours a day on screens, and that’s a conservative estimate. More than half of young adults say they feel addicted to their phones, and screen time has been directly linked to higher rates of anxiety, general malaise or downright depression, and deeply decreased attention spans. We’re overwhelmed, under-fulfilled, and somehow still picking up the fucking phone.

For those (most) of us who can’t fully disconnect — who rely on digital spaces for work, visibility, survival — maybe, just maybe, there’s still room for a little something called discernment.

Photo: Thanh Luu on Pexels

In the same way we’ve started to care about gut health and skincare routines, there’s a growing cultural awareness that we also need some form of digital hygiene. That means boundaries, intentionality, and saying no to garbage content with the same energy you’d swipe left on a bad date. We don’t have to consume every take. We don’t have to react to every opinion. We are allowed to look away.

There’s also something to be said for curating your information diet with the same care you apply to your literal meals. Less processed, more substance. More vegetables (books, longform, printed things that had editors). Less sugar (Twitter threads, TikToks, unverified infographics on a beige Canva background).

Media literacy is no longer a “nice to have.” It’s essential. It should be taught in schools, yes — but also in adult spaces. At dinner. In work trainings. In therapy. We need to re-learn how to ask:

“Where did this come from? Is it true? Do I actually know what I’m talking about, or am I just repeating something I scrolled past?”

Photo: CottonBro Studios on Pexels

Some are already doing this. There is a small but vocal return-to-print movement: people subscribing to newsletters and zines, actually reading books, buying newspapers like it’s 2004 and they’re heading to brunch with a tote bag. The nostalgia isn’t really for print itself — it’s for clarity. For credible, informed voices. For something that passed through at least one thoughtful filter before landing in your hands.

We won’t escape the mess. But we can stop drinking from the same polluted stream.

And maybe that’s enough. Not to fix everything — we’re probably too far gone for that — but to reclaim a few inches of our own intellectual space. To ask more from ourselves, to let our brains generate some new neurons. To require a little more thinking before sharing, or consuming. 

Photo: VideoHive

Because if we’re all going to be taking in this much digital static every day, maybe it’s time to ask:

“Is this making me smarter, or just more numb?”

And if the answer leans toward rot? Close the app. Log off. Go touch some grass.

Your brain deserves better — and deep down, you know it.
