Social platforms reflect people’s behaviors, but unlike life, you can uninstall them and stop visiting.
TikTok and Twitter are often described as mirrors of life: chaotic, messy, sometimes brilliant, sometimes horrifying. But here’s the thing: life didn’t come with an “uninstall” button. These platforms do, sort of (you can remove the apps or stop visiting them altogether). And that makes it a lot harder to accept their messiness as something we just have to live with.
The harm they cause is undeniable. The misinformation, the rabbit holes, the amplification of violence and hate: it’s all right there, front and center. And because these aren’t immutable forces of nature but products of human design, it feels logical to think: Why not just turn them off? If a bridge kept collapsing under people’s feet, we’d stop letting people walk on it. If a factory were spewing toxins into the air, we wouldn’t celebrate the occasional mural painted on its walls; we’d shut the thing down.
But TikTok and Twitter aren’t just digital bridges or toxic factories; they’re also marketplaces, stages, classrooms, protest grounds, and cultural archives. They’ve been instrumental in amplifying marginalized voices, organizing grassroots movements, and spreading ideas that would otherwise have been silenced. Shutting them down wouldn’t just erase the harm, it would also erase the joy, the connection, the organizing power, and the little moments of humanity they enable.
That’s the tension we’re stuck with: the pull between “this is causing so much damage” and “this is doing so much good.” And it’s not a tension we can resolve cleanly, because both are true. These platforms are not neutral: they’re shaped by design choices, incentives, and algorithms that reward outrage, escalate conflict, and keep users scrolling no matter the emotional cost. But they’re also spaces where real, meaningful things happen, sometimes in spite of those same algorithms.
It’s easier to point fingers at the platforms themselves than to reckon with the fact that their messiness isn’t an anomaly, it’s a reflection. They thrive on the same things we do: conflict, validation, novelty, and the occasional hit of collective catharsis. The darkness they expose isn’t artificially generated; it’s drawn out of people who were always capable of it. TikTok and Twitter didn’t invent bad-faith arguments, moral panic cycles, or performative empathy; they just turned them into highly optimized content formats.
That’s why it’s so tempting to reach for the “off” switch. Because these platforms don’t just show us other people’s mess, they show us our own. They force us to confront the uncomfortable reality that the world doesn’t just have ugliness, it produces it. And no matter how advanced our moderation tools get, or how many advisory panels are assembled, there’s no elegant way to algorithm our way out of human nature.
But accepting that doesn’t mean we stop holding these platforms accountable. They’re still products of human design, and every design choice, from the algorithm’s preferences to the placement of a “like” button, shapes behavior and incentives. The companies behind them can and should do better. But even if they do, the fundamental tension remains: these spaces are built on human behavior, and human behavior will always be messy.
Maybe the real discomfort isn’t just about what TikTok and Twitter are. It’s about what they reveal about us. The chaos, the harm, the brilliance, the joy, it’s all a reflection. And if we can’t figure out how to look at that reflection without flinching, no amount of platform reform is going to save us from ourselves.
P.S.: Let me just add that I’m talking about the old Twitter, not the cesspool of unhinged, miseducated, misinformed, and misguided white supremacists it has increasingly become, a.k.a. discount 4Chan. On top of that, outside the English-speaking sphere of the platform, the old Twitter still exists, largely unbothered and unaffected by what’s happening elsewhere, partly due to cultural differences, partly due to lack of relevance, partly due to language, and perhaps a handful of other reasons.