Online exploitation of tragedy: The moral argument for leaving Elon Musk’s X

In the wake of a tragic murder in his neighbourhood, Dave Lee recounts a chilling phenomenon that unfolded on his social media. A shocking video of the incident went viral, capturing millions of views. What’s even more disturbing is how a new breed of online influencers, empowered by Elon Musk’s Twitter reign, seized this tragedy as a content opportunity. The resulting toxic comments and hateful discourse are emblematic of the platform’s alarming transformation under Musk’s leadership. It’s a stark reminder of the corrosive power of online incentives, pushing us to reconsider our engagement in this disturbing digital realm.

The Moral Case for No Longer Engaging With Elon Musk’s X: Dave Lee

By Dave Lee

A man was murdered in my neighborhood on Monday. Ryan Carson was waiting at a bus stop with his girlfriend just before 4 a.m. when a man stabbed him repeatedly in the chest. The couple had been at a wedding.

A video of the attack, obtained initially by the New York Post, was soon seized upon by one of X’s newest “stars” — one of those users who has thrived under the new Elon Musk regime at the former Twitter. His feed (which I will not publicize) is a stream of incendiary incidents from around the world, posted several times a day to an audience that is approaching a million followers.

I don’t follow this account, but X’s algorithm makes absolutely sure that I see what it has to say. A senseless murder is apparently a content opportunity not to be missed. The user’s post on Tuesday contained all the ingredients for success: It was timely. It was shocking. It was an innocent 32-year-old man dying on the streets of New York City. It was a chance, duly taken, to write an inflammatory comment on Carson’s work in public policy, as though it had somehow led to this moment, as though he had it coming. 

As I rode the subway home to Bedford-Stuyvesant, I watched as the video clocked 1 million views, then 2 million. Up up up. Disgusting replies flooded in by the thousands: That’s what you get for supporting woke policies; should have carried a gun; looks planned. By the time I got home, I had deleted the app from my phone.

I will have to continue to follow X, of course, because it’s part of my job. But it’s time to step back as an engaged user, one who for the past decade has posted several times a day and scrolled countless times more. My eyeballs are no longer for sale to Musk and whatever grotesque content he wants to serve up in front of them.

Social networks are molded by the incentives presented to users. In the same way we can encourage people to buy greener cars with subsidies or promote healthy living by giving out smartwatches, so, too, can levers be pulled to improve the health of online life. Online, people can’t be told what to post, but sites can try to nudge them toward behaving in a certain manner, whether through design choices or reward mechanisms. 

Under the previous management, Twitter at least paid lip service to this. In 2020, it introduced a feature that encouraged people to actually read articles before retweeting them, for instance, to promote “informed discussion.” Jack Dorsey, the co-founder and former chief executive officer, claimed to be thinking deeply about improving the quality of conversations on the platform — seeking ways to better measure and improve good discourse online. Another experiment was hiding the “likes” count in an attempt to train away our brain’s yearning for the dopamine hit we get from social engagement.

One thing the prior Twitter management didn’t do is actively make things worse. When Musk introduced creator payments in July, he splashed rocket fuel over the darkest elements of the platform. These kinds of posts always existed, in no small number, but are now the despicable main event. There’s money to be made. X’s new incentive structure has turned the site into a hive of so-called engagement farming — posts designed with the sole intent to elicit literally any kind of response: laughter, sadness, fear. Or the best one: hate. Hate is what truly juices the numbers. 

The user who shared the video of Carson’s attack wasn’t the only one to do it. But his track record on these kinds of posts, and the inflammatory language, primed it to be boosted by the algorithm. By Tuesday, the user was still at it, making jokes about Carson’s girlfriend. All content monetized by advertising, which X desperately needs. It’s no mistake, and the user’s no fringe figure. In July, he posted that the site had paid him more than $16,000. Musk interacts with him often. 

X is now an app that forcibly puts abhorrent content into users’ feeds and then rewards financially the people who were the most successful in producing it, egging them on to do it again and again and make it part of their living. Know this: As the scramble for attention increases, the content will need to become more violent, more tragic and more divisive to stand out. More car crashes, high school fights and public humiliation.

Decency left the building at X long ago, and the rot flows from the very top. When former executive Yoel Roth, whom Musk wrongly accused of being a pedophile, warned recently about hate speech on X, CEO Linda Yaccarino’s first reaction was to play down his concerns. On Monday, Musk followed up: “I have rarely seen evil in as pure a form as Yoel Roth.”

I have. It was on the corner of Malcolm X Boulevard and Lafayette Avenue that Monday night, as replayed to the hordes of baying, heartless ghouls X is happy to host and, in some cases, finance. To continue to engage at length with X is to enable this cycle of behavior. You can count me out.

© 2023 Bloomberg L.P.