Taking back control from social media – With insights from The Wall Street Journal

Social media is centre stage after the world once more witnessed its power to mobilise like-minded people. The alt-right's active social media networks facilitated the unprecedented invasion of the US Capitol, just as social media networks sparked the regime-change protests of the Arab Spring. It comes at a cost, however. In this superb assessment, Joanna Stern, personal tech guru at our partners The Wall Street Journal, shares information about a Facebook alternative and offers practical suggestions on how we can take back control from the social media giants. Partly, anyway. Alec Hogg

Social-media algorithms rule how we see the world. Good luck trying to stop them.

What you see in your feeds isn’t up to you. What’s at stake is no longer just missing a birthday. It’s your sanity—and world peace.

Jan. 17, 2021 7:00 am ET

It’s hard to pinpoint exactly when we lost control of what we see, read—and even think—to the biggest social-media companies.

I put it right around 2016. That was the year Twitter and Instagram joined Facebook and YouTube in the algorithmic future. Ruled by robots programmed to keep our attention as long as possible, they promoted stuff we’d most likely tap, share or heart—and buried everything else.

Bye-bye, feeds that showed everything and everyone we followed in an unending, chronologically ordered river. Hello, high-energy feeds that popped with must-clicks.
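
To make that shift concrete, here is a minimal, purely illustrative Python sketch of the two approaches (a toy model, not any platform's actual code; the field names and the engagement score are assumptions):

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    timestamp: float              # seconds since epoch
    predicted_engagement: float   # a model's guess at taps, shares, hearts

def chronological_feed(posts):
    # The old model: newest first, everything you follow, nothing boosted.
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def algorithmic_feed(posts):
    # The post-2016 model: whatever the system predicts you'll engage
    # with rises to the top; everything else gets buried.
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)
```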

At around the same time, Facebook—whose News Feed has been driven by algorithms since 2009—hid the setting to switch back to “Most Recent.”

No big deal, you probably thought, if you thought about it at all. Except these opaque algorithms didn’t only maximize news of T. Swift’s latest album drops. They also maximized the reach of the incendiary—the attacks, the misinformation, the conspiracy theories. They pushed us further into our own hyperpolarized filter bubbles.

“There are bad people doing bad things on the internet—QAnon, white supremacists—it’s not that Facebook, YouTube and other social-media sites allow it on their platform. It’s that they amplify it,” says Hany Farid, a computer science professor at the University of California, Berkeley.

The worst-case scenarios are no longer just hypothetical. People are shown things that appeal most to them, they click, they read, they watch, they fall into rabbit holes that reinforce their thoughts and ideas, they connect with like-minded people. They end up in their own personalized version of reality. They end up inside the U.S. Capitol.

Certainly social media isn't alone to blame. And when the blame does fall on social media, the robots aren't the only ones culpable. The silencing of President Trump's Facebook and Twitter accounts made that clear: the humans running these companies still have final say over what does and doesn't appear. (And last I checked, we can still opt out of using social media at all.)

But at the heart of it all, this is still a gigantic technology problem: Computers are in charge of what we see and they’re operating without transparency.

Usually, the goal of my column is to provide solutions to tech problems. In this case, there isn’t one—certainly not a simple one. All I can do is share some ideas.

Idea #1: No algorithms, no ads

Mark Weinstein, the founder of MeWe, a social network that markets itself as the anti-Facebook, is very clear that the solution is going back to a pure chronological feed with zero manipulation.

On MeWe you follow friends, pages or groups. The posts appear in your feed as they were posted. “No advertiser, no marketer, no political operative and no member can boost content into someone else’s feed. So we completely dismantle the idea of disruptive, outrageous amplification,” Mr. Weinstein said. “It can’t happen on MeWe.”

Specifically, he says, this limits the spread of misinformation. “You have to choose to follow misinformation. You can’t be spoon-fed misinformation by an algorithm or by a Russian government or a Chinese government,” he added.

In the past week, however, as far-right users look for alternatives to Facebook, Twitter and Parler, they've landed at MeWe, which currently reports 15.5 million users. Mr. Weinstein says the social network has invested heavily in moderation and that its terms of service are clear: you will be kicked off for inciting violence, posting unlawful material and more. In the wake of Parler's removal, MeWe has been in touch with Apple, Google and Amazon about making sure the app meets their moderation guidelines.

The issue with a non-algorithmic timeline, however, is that the feed can become flooded and hard to sort through. It’s exactly what Twitter, Facebook and Instagram told me when I asked why they did away with chronological defaults in the first place.

Mr. Weinstein says MeWe, which makes money from paid features and a premium subscription option—not from advertising—solves for that with filters and tools to manage your feed; you can sort by contacts, groups and pages you follow. And because there aren’t targeted ads, MeWe isn’t constantly collecting data about you, Mr. Weinstein says.
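
In rough code terms, that approach keeps the feed chronological and hands the filtering decision to the user rather than to a ranking model. A toy sketch, reusing the Post and chronological_feed definitions from the earlier example (the details here are assumptions, not MeWe's actual implementation):

```python
def filtered_feed(posts, allowed_sources):
    # The user, not an algorithm, narrows a chronological feed to
    # chosen contacts, groups or pages.
    return [p for p in chronological_feed(posts)
            if p.author in allowed_sources]
```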

Idea #2: Deprioritize the destructive

For the big guys that depend on advertising, however, removing the algorithm is about as likely as teaching my dog to write columns.

Instead, experts suggest that the platforms get serious about deprioritizing the outrage, anger and conspiracy, and prioritizing the trustworthy, thoughtful and reputable—even if they know it means less engagement.

“Algorithms simply have to just decide, ‘We think there is more value to the Wall Street Journals, the Wikipedias, the World Health Organizations than the Dumbdumb.coms,’” said Prof. Farid.
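
In code terms, Prof. Farid's suggestion amounts to weighting the ranking score by how reputable the source is, so raw engagement alone no longer wins. A hedged sketch building on the toy model above (the trust scores and the source list are invented for illustration):

```python
# Hypothetical per-source trust weights. A real system would need a far
# more careful, and more transparent, way of assigning these.
SOURCE_TRUST = {
    "who.int": 1.0,
    "wikipedia.org": 0.9,
    "dumbdumb.com": 0.1,   # the placeholder from Prof. Farid's example
}

def deprioritized_feed(posts, source_of):
    # Rank by engagement *times* trust, so an incendiary post from a
    # low-trust source scores below a solid post from a reputable one.
    def score(p):
        trust = SOURCE_TRUST.get(source_of(p), 0.5)  # unknown sources: neutral
        return p.predicted_engagement * trust
    return sorted(posts, key=score, reverse=True)
```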

And there’s proof that it can work. In a March 2020 study, Prof. Farid and other researchers found that YouTube had in fact decreased the promotion of conspiracy videos after the company changed its recommendation algorithms.

Facebook has done the same. In the weeks leading up to the election, Facebook and Instagram took steps to limit the spread of information their algorithms classified as potential misinformation, including debunked claims about voter and ballot fraud. The result was a boost for trusted news sources and a decline for partisan sites.

And there’s evidence that election-fraud narratives sharply decreased on Twitter following the suspension of Mr. Trump’s account, according to media intelligence platform Zignal Labs.

Regulators could also step in. Beyond the various antitrust suits, there have been specific calls to hold the companies liable for their algorithms. Reporting from my colleagues has shown that Facebook has been well aware of its recommendation system’s tendency to drive people to extremist groups. One bill, introduced by Rep. Tom Malinowski (D., N.J.) and Rep. Anna Eshoo (D., Calif.), aims to hold the platforms liable for “algorithmic promotion of extremism.”

Idea #3: Give back control

The craziest idea of all? Oh, you know, just give us back some control.

“What if people could say, ‘I want to see news from a broad political spectrum of sources,’ or, ‘I only want to see posts from friends and family,’ ” said Jesse Lehrich, the co-founder of Accountable Tech, a nonprofit dedicated to combating disinformation on social media.

Facebook does, at least, allow you to see some information about why you are seeing something. Tap the three horizontal dots on any post in your feed and then “Why Am I Seeing This Post?”

And if you think the good, old chronological feed is the answer, there are ways to get it back, at least temporarily, on some services. The settings are as hidden as Waldo, however:

Facebook: In a web browser, go to the home icon at the top of your feed, scroll through the menu on the left side. Select “See More,” then “Most Recent.” In the mobile app, go to the three horizontal lines on top or bottom right of your screen and look for “Most Recent.” Just know, you’ll be bumped out of this feed when you close the website or the app.

Twitter: Twitter is far easier. A small star in the upper right corner of the website and app allows you to “See latest Tweets” instead of “Top Tweets.” While it used to throw you back to the algorithmic feed, it now keeps you in the feed you last used. I often toggle between the two.

YouTube: You can’t turn off the entire set of algorithmic recommendations but you can switch to “Latest videos” in each category or search query. You can also turn off autoplay. In a web browser, look for the tiny toggle with a play button on the bottom of the video player. In the app look for the tiny toggle at the top of the video player.

TikTok: Next to the addictive, algorithmically driven For You feed is the Following feed, showing just the people you follow on the service. Just know: TikTok still uses algorithms here to show you videos it thinks you’ll want to watch most.

Instagram: Sorry, no can do. Algorithms only. A Facebook spokeswoman explained that with the old chronological feed, people missed 70% of the posts in their feed, including almost half of the posts from friends and family. After making the change to the algorithmic feed, the company found that on average, people saw more than 90% of their friends' posts.

If it were just about us and our friends and family, that would be one thing, but for years social media hasn’t been just about keeping up with Auntie Sue. It’s the funnel through which many now see and form their views of the world.

Will more algorithms that serve Big Tech’s bottom line continue to inform those views, or will we get some real rules and control? Unfortunately, it’s not up to us to decide.

Write to Joanna Stern at [email protected]
