How to Change the Tone Online

These days, it often feels as if outrage spreads faster than truth and listening with an open mind—especially on our social media platforms—is a lost art. All of our scrolling seems to bring new reasons to react, choose sides, post, and comment. Is it even possible to listen thoughtfully when we’re online?
Maybe, says Imran Ahmed, founder and CEO of the Center for Countering Digital Hate. He has spent years studying how social media platforms amplify division and profit from our reactivity, and he’s on a mission to stop the spread of online hate and disinformation through research, public campaigns, and policy advocacy. It’s not our fault that many of us find it difficult to listen with an open mind and heart on these platforms, says Ahmed. But there are some concrete things we can do to be a part of the solution.
Read on for Ahmed’s insights on why the algorithms are designed to keep us divided, and what we can do both online and in real life to restore our collective humanity.
A CONVERSATION WITH IMRAN AHMED

What makes online spaces so uniquely reactive—and why do they seem to reward outrage over understanding?
There’s the cultural impact of social media. There’s also the way that these platforms are designed. And then there’s the way that algorithms are reshaping what we think is important.
Let’s start with the cultural aspects. I think there is increasing pressure on us, as a result of the ubiquity of social media, to feel that we have to express ourselves the moment things happen, and that holding an opinion is not just a sacred right but a duty to communicate it constantly. Because we know that social media rewards immediacy and being the first person to jump on trends, there is this constant pressure to bow to whatever is trending, to form an opinion on it, and to communicate it. We haven't made this choice as a democracy. It's a choice that is forced upon individuals constantly. But it is a choice. We have the option of saying no. Still, there's a general culture of expectation that everyone is posting all the time.
Then there’s the way that the platforms are designed. They aggregate hundreds of millions of people’s information and then decide who wins and who loses based on an algorithm. If you want your communications to be visible, that requires you to think really hard about serving the needs of the platform. We don't actually post for ourselves so much anymore as we post to feed an algorithm whose aim is to create as much profit as possible by keeping people on the platform for as long as possible. And controversial content does that. Conspiracy theories, hate, and polarization are also very effective at it.
What these companies have worked out, over time and across trillions of data points, is that the stuff that keeps us scrolling is not what induces satisfaction, happiness, edification, or the quiet pleasure of learning something new, but what pisses us off and gets us angry and emotional.
That, on an individual basis and at a societal level, is increasing our cortisol levels. It's making us feel more stressed and anxious. We're always being presented with stuff that upsets us. And think about what you're like when your cortisol is high: you're in a hypervigilant state. You are watching, and that keeps us on the platform, because we're trying to work out what the heck is happening and why people are behaving this way. So we get caught in this loop of being presented information through a dystopian lens. Over time, that's having multiple socializing effects on us as people and as a society. One is that it makes the world look more terrifying, polarized, angry, and hateful than it really is. Another is that it makes us hyperaware of difference and of how threatening the world is.
All of this is having a series of pernicious effects on our society, due to the design of platforms that demand we engage, post, react, and communicate at all times.
It often feels like we’re not listening to each other online because we’re caught in our own algorithms. Given that, how important do you think it is to listen more deeply outside of online spaces?
Listening is about more than reading words. The act of listening is to hear tone, to watch someone's body language, to see how they react to your reactions. It's a really active process, not simply consumption of the words that someone is saying. And so I think it's very hard to listen on social media because of the way it's designed.
I also think it’s very difficult to understand someone and their motivations when they're being presented through a distorted lens, and social media provides a distorted lens on reality. Everyone is familiar with that moment when they've been doomscrolling, constantly consuming content and getting more and more worked up, and then they look up and the sun's shining, the kids are playing, and the birds are singing. And you realize: the real world is better!
Gen Z talks a lot about “touching grass,” which just means getting off your device. For Gen Z, it’s an act of rebellion against their big tech overlords to go outside and see grass in real life, not grass in Minecraft. I think of social media as ultra-processed food: you don't really know what's in it, and if you want a balanced diet, you need to make sure you're getting lots of other sources of nutrition.
Is there anything we can do to take a stand against the distorted information we’re getting on social media platforms?
I think at a societal level, there are two things we could do. One is that we could introduce transparency on algorithms. At the moment, there's no FDA or FCC or anyone who's actually looking at social media companies in a serious way, apart from nonprofits like mine. Transparency laws should be put into place. We've worked really hard with the British and the European regulators to introduce transparency legislation. And I can tell you social media platforms do not like this at all. They do not want to tell us how they're manipulating our information. Transparency is anathema to these companies, because it would help us understand how we're being manipulated.
We also need more accountability. Every other industry in America is subject to negligence law and to checks and balances. That's the beauty of the American experiment. And I say this as someone who comes from a country with kings and queens. The American experiment is that we can rule ourselves by creating a system of checks and balances, such that no individual is treated as though everything they do is beyond accountability. And unfortunately, we don't have that when it comes to social media and AI. We have no idea what's being fed to us, to our kids, to our society—and we have no way of holding these companies accountable if it's harmful to people.
Are you hopeful about the possibility of building digital spaces rooted in listening, nuance, and dignity?
The original promise of social media was about bringing us closer. It would make Tyneside in England seem as close as Timbuktu, and Timbuktu as close as Tulsa, Oklahoma. It would reduce racism in the world because we'd all listen to each other—we'd all take part in a global conversation. We'd feel close. It would make the world tiny in our hands.
But it hasn't done that. And the thing that really changed everything is algorithms. Social media companies said, Connecting people is great, and it's making us a lot of money, but you know what could make us an absolute ton of money? Adding an addictive element. That's where the algorithms came in, and they have changed our experiences.
There are a million things you can do to bring back the original promise of social media by giving people control over the aspects of these platforms that have made them so damaging to our society. I’m hopeful that we’ll be able to introduce legislation that holds these companies liable. If you look around the world, you can find countries showing us that technology can serve us, the people, not exploit us. In six years of doing this job, I have seen the UK Online Safety Act become law. I've worked on the Online Harms Act. I have worked with bipartisan legislators in the House and in the Senate on bills to reform Section 230 of the Communications Decency Act of 1996. We have helped to introduce the Kids Online Safety Act. We have done a ton of work, and we're seeing changes happen. In the great state of California, legislators are introducing bills that actually hold social media companies accountable. What we need is for the federal government to have our backs, too.
For those who want to be part of the solution, what is one thing we can do this week to listen more deeply and maybe even shift the tone online?
I don’t want to say switch off and don’t consume social media, although that is probably the healthiest thing you can do for your own mind and soul.
I think if you're a parent or grandparent, the best thing you can do is ask your kids what they're seeing on social media this week and then have a real conversation with them—an exchange of understanding, where they're telling you what they're seeing on their platforms that you don't use. Kids are spending 120 minutes a day on average on these social platforms. So, talk to them about what they're seeing, help them to contextualize it, and have an active conversation with them about it that has no shame attached.
The other thing you can do is write to your lawmaker and ask what they are doing to have our backs as citizens. We've been running this grand experiment of privatizing our entire information ecosystem and handing control of it to a few people. That has clearly created some real problems, whether it's self-harm content, extremism, polarization, or hate and the lack of understanding we seem to have for our fellow man. Ask your legislators what they are doing about that.