‘Algorithm choice’ can fix social media — but only on decentralized platforms

Everyone seems to agree that social media has become a cesspit, with cancel-culture mobs enforcing ideological purity on one side and trolls spreading conspiracy theories on the other.

X and Facebook are accused of amplifying hatred and conflict, with riots in the United Kingdom highlighting how a handful of social media posts can bring a cauldron of simmering anger and resentment to the boil.

In response, governments around the world are cracking down on free speech. Turkey banned Instagram, Venezuela banned X and the UK government has been sending people to jail for inciting violence — and in some cases, just for having shitty opinions.

But there’s a better way to fix social media than banning accounts and censoring “misinformation.”

The root cause of the problem isn’t fake news in individual posts; it’s how social media algorithms prioritize conflict and highlight the most polarizing content in a bid for engagement and ad dollars.

“This is going to sound a little bit crazy, but I think the free speech debate is a complete distraction right now,” former Twitter boss Jack Dorsey told the Oslo Freedom Forum in June. “I think the real debate should be about free will.”

A marketplace of social media algorithms

Dorsey argues that black-box social media algorithms are impacting our agency by twisting our reality and hacking our mind space. He believes the solution is to let users choose between different algorithms, giving them greater control over the sort of content they’re served.

Lyn Alden and Jack Dorsey at the Oslo Freedom Forum. (X)

“Give people choice of what algorithm they want to use, from a party that they trust, give people choice to build their own algorithm that they can plug in on top of these networks and see what they want. And they can shift them out as well. And give people choice to have, really, a marketplace.”

It’s a simple yet compelling idea, but there’s a truckload of hurdles to overcome before a mainstream platform would willingly give users a choice of algorithm.

Why social media platforms will resist algorithmic choice

Princeton computer science professor Arvind Narayanan has extensively researched the impact of social media algorithms on society. He tells Cointelegraph that Dorsey’s idea is great but unlikely to happen on the big platforms.

“A marketplace of algorithms is an important intervention. Centralized social media platforms don’t allow users nearly enough control over their feed, and the trend is toward less and less control, as recommendation algorithms play a more central role,” he explains.

“I expect that centralized platforms won’t allow third-party algorithms for the same reasons they don’t provide user controls in the first place. That’s why decentralized social media is so important.”

There are some early experiments on decentralized platforms like Farcaster and Nostr, but Twitter spinoff Bluesky is the most advanced and already has this functionality built in. However, so far it has only been used for specialty content feeds.
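
Under the hood, a Bluesky custom feed is just a small web service that answers the app.bsky.feed.getFeedSkeleton request with an ordered list of post URIs, which the user’s client then hydrates into full posts. The sketch below shows the rough shape of such a service; the candidate posts, scoring and port are placeholder assumptions, and the real feed generator spec adds publishing, authentication and pagination details omitted here.

```typescript
// Minimal sketch of a Bluesky (AT Protocol) custom feed generator.
// A feed generator is an HTTP service that answers
// app.bsky.feed.getFeedSkeleton with an ordered list of post URIs;
// the user's client hydrates the actual post content itself.
import { createServer } from "node:http";

// Hypothetical store of candidate posts this feed knows about.
const candidatePosts: { uri: string; score: number }[] = [
  { uri: "at://did:plc:example/app.bsky.feed.post/abc123", score: 0.9 },
  { uri: "at://did:plc:example/app.bsky.feed.post/def456", score: 0.4 },
];

createServer((req, res) => {
  const url = new URL(req.url ?? "/", "http://localhost");
  if (url.pathname === "/xrpc/app.bsky.feed.getFeedSkeleton") {
    const limit = Number(url.searchParams.get("limit") ?? 30);
    // Whatever ranking logic you like goes here -- this is the "algorithm choice".
    const feed = candidatePosts
      .sort((a, b) => b.score - a.score)
      .slice(0, limit)
      .map((p) => ({ post: p.uri }));
    res.writeHead(200, { "Content-Type": "application/json" });
    res.end(JSON.stringify({ feed }));
  } else {
    res.writeHead(404).end();
  }
}).listen(3000);
```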

Bluesky to trial algorithm choice

But Northwestern University Assistant Professor William Brady tells Cointelegraph he’ll be trialing a new algorithm on Bluesky in the coming months that will be offered as an alternative to the site’s main algorithm.

Studies have shown that up to 90% of the political content we see online comes from a tiny minority of highly motivated and partisan users. “So trying to reduce some of their influence is one key feature,” he says.

The “representative diversification algorithm” aims to better represent the most common views rather than the most extreme ones, without making the feed vanilla.

Bluesky gained a lot of users after Elon Musk’s takeover of Twitter. (Bluesky)

“We’re actually not getting rid of strong moral or political opinions, because we think that’s important for democracy. But we’re getting rid of some of that most toxic content that we know is associated with the most extreme people on that distribution.”
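
Brady’s implementation isn’t published here, but the two ideas he describes (capping the influence of hyperactive accounts and trimming the most toxic tail) can be illustrated with a simple re-ranking pass. The sketch below is only an illustration under assumed inputs, not his algorithm: the Post shape, the scores and the thresholds are invented for the example.

```typescript
// Illustrative re-ranker in the spirit of "representative diversification":
// cap how many posts any single hyperactive account can contribute and drop
// the most toxic tail, while otherwise leaving strong opinions intact.
// The Post shape, scores and thresholds are assumptions, not Brady's code.
interface Post {
  id: string;
  authorId: string;
  engagementScore: number; // what a standard feed would rank by
  toxicityScore: number;   // 0..1, from any off-the-shelf toxicity classifier
}

function diversifyFeed(
  posts: Post[],
  maxPerAuthor = 2,
  toxicityCutoff = 0.8,
): Post[] {
  const perAuthor = new Map<string, number>();
  return posts
    .filter((p) => p.toxicityScore < toxicityCutoff) // trim the extreme tail
    .sort((a, b) => b.engagementScore - a.engagementScore)
    .filter((p) => {
      // Limit the share of the feed any one prolific poster can occupy.
      const count = perAuthor.get(p.authorId) ?? 0;
      if (count >= maxPerAuthor) return false;
      perAuthor.set(p.authorId, count + 1);
      return true;
    });
}
```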

Create a personalized algorithm using AI

Approaching the subject from a different direction, Groq AI researcher and engineer Rick Lamers recently developed an open-source browser extension that works on desktop and mobile. It scans and assesses posts from people you follow and auto-hides them based on content and sentiment.

Lamers tells Cointelegraph he created it so that he could follow people on X for their posts about AI, without having to read inflammatory political content.

“I needed something in-between unfollowing and following all content, which led to selectively hiding posts based on topics with a LLM/AI.”

The use of large language models (LLMs) to sort through social media content opens up the intriguing possibility of designing personalized algorithms that do not require social platforms to agree to change.
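
As a concrete illustration of that pattern, a client-side filter can ask an LLM whether each post matches topics the user wants muted and hide it locally, without the platform’s involvement. The sketch below is not the code from Lamers’ extension; the model name, the OpenAI-style chat completions endpoint and the muted-topics list are all assumptions for the example.

```typescript
// Generic sketch of client-side post filtering with an LLM. A browser
// extension's content script could run shouldHide over each post's text and
// hide the corresponding DOM node when the model flags it.
// Model, endpoint and HIDE_TOPICS are assumptions, not Lamers' implementation.
const HIDE_TOPICS = ["partisan politics", "outrage bait"];

async function shouldHide(postText: string, apiKey: string): Promise<boolean> {
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({
      model: "gpt-4o-mini",
      messages: [
        {
          role: "user",
          content:
            `Topics to mute: ${HIDE_TOPICS.join(", ")}.\n` +
            `Post: "${postText}"\n` +
            "Answer YES if the post is mainly about a muted topic, otherwise NO.",
        },
      ],
    }),
  });
  const data = await res.json();
  return data.choices[0].message.content.trim().toUpperCase().startsWith("YES");
}

// Usage inside a content script (illustrative):
// if (await shouldHide(postNode.innerText, OPENAI_KEY)) postNode.style.display = "none";
```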

But reordering content on your feed is a much bigger challenge than simply hiding posts, Lamers says, so we’re not there yet.

Facebook and Twitter began with chronological feeds. (Pexels)

How social media algorithms amplify conflict

When social media first began in the early 2000s, content was displayed in chronological order. But in 2011, Facebook’s news feed started choosing “Top Stories” to show users.

Twitter followed suit in 2015 with its “While You Were Away” feature and moved to an algorithmic timeline in 2016. The world as we knew it ended.

Although everyone claims to hate social media algorithms, they’re actually very useful in helping users wade through an ocean of content to find the most interesting and engaging posts.

Dan Romero, the founder of the decentralized platform Farcaster, points Cointelegraph to a thread he wrote on the topic. He says that every world-class consumer app uses machine learning-based feeds because that’s what users want.

“This is [the] overwhelming consumer revealed preference in terms of time spent,” he said.

Unfortunately, the algorithms quickly learned that the content people are most likely to engage with involves conflict and hatred, polarizing political views, conspiracy theories, outrage and public shaming.

“You open your feed and you are smashed with the same stuff,” says Dave Catudal, the co-founder of the SocialFi platform Lyvely.

“I don’t want to be bombarded with Yemen and Iran and Gaza and Israel and all that […] They are clearly pushing some kind of political, disruptive conflict — they want conflict.”

Studies show that algorithms consistently amplify moral, emotional and group-based content. Brady explains that our attraction to this kind of content is an evolutionary adaptation.

“We have biases to pay attention to this type of content because in small group settings, this actually gives us an advantage,” he says. “If you are paying attention to emotional content in your environment it helps you survive physical and social threats.”

Social media bubbles work differently

The old concept of the social media bubble — where users only get content they agree with — is not really accurate. 

While bubbles do exist, research shows that users are exposed to more opinions and ideas they hate than ever before. That’s because they are more likely to engage with content that enrages them, whether by getting into an argument, dunking on it in a quote tweet or joining a pile-on.

Content that you hate is like quicksand — the more you fight against it, the more the algo serves up. But the feed still reinforces people’s beliefs and darkest fears by highlighting the absolute worst takes from “the other side.”

Like cigarette companies in the 1970s, platforms are well aware of the harms the focus on engagement causes to individuals and society, but it appears that there’s too much money at stake to change direction. 

Meta made $38.32 billion in ad revenue last quarter (98% of its total revenue), with chief financial officer Susan Li attributing much of it to AI-driven ad placements. Facebook has trialed the use of “bridging algorithms,” which aim to bring people together rather than divide them, but elected not to put them into production.

Farcaster is the social media platform favored by Ethereans. (Farcaster)

Bluesky, Nostr and Farcaster: Marketplace of algorithms

Dorsey also realized he wasn’t going to be able to bring meaningful change to Twitter, so he created Bluesky in an attempt to build an open-source, decentralized alternative. But disillusioned with Bluesky making many of the same mistakes as Twitter, he’s now thrown his weight behind Bitcoin-friendly Nostr.

The decentralized network allows users to choose which clients and relays to use, potentially offering a wide choice of algorithms.

But one big issue for decentralized platforms is that building a decent algorithm is a massive undertaking that is likely beyond the community’s abilities.

A team of developers built a decentralized feed market for Farcaster for the Paradigm hackathon last October, but no one seemed interested. 

The reason, according to Romero, was that community-built feeds were “unlikely to be performant and economic enough for a modern, at-scale consumer UX. Might work as an open source, self-hosted type client.”

“Making a good machine learning feed is hard and requires significant resources to make performant and real-time,” he said in another thread.

“If you want to do a feed marketplace with good UX, you’d likely need to create a back end where developers would upload their models and the client runs the model in their [infrastructure]. This obviously has privacy concerns, but maybe possible.”

A bigger problem, however, is that it’s “TBD if consumers would be willing to pay for your algo, though.”

Andrew Fenton

Based in Melbourne, Andrew Fenton is a journalist and editor covering cryptocurrency and blockchain. He has worked as a national entertainment writer for News Corp Australia, on SA Weekend as a film journalist, and at The Melbourne Weekly.

This article first appeared at Cointelegraph.com News
