Algorithms control your online life. Here’s how to reduce their influence.


The world in 2020 has been given plenty of reasons to be wary of algorithms. Depending on the result of the U.S. presidential election, it may give us one more. Either way, it’s high time we questioned the impact of these high-tech data-driven calculations, which increasingly determine who or what we see (and what we don’t) online.

The impact of algorithms is starting to scale up to a dizzying degree, and literally billions of people are feeling the ripple effects. This is the year the Social Credit System, an ominous Black Mirror-like “behavior score” run by the Chinese government, is set to officially launch. It may not be quite as bad as you’ve heard, but it will boost or tighten financial credit and other incentives for the entire population. That’s yet another unexamined, unimpeachable algorithm hanging over more than a billion human lives.

In the UK, few will forget this year’s A-level algorithm. A-levels are key exams for 18-year-olds; they make or break college offers. COVID-19 canceled them. Teachers were asked what each pupil would have scored. But the government fed these numbers into an algorithm alongside the school’s past performance. Result: 40 percent of all teacher estimates were downgraded, which nixed college for high-achieving kids in disadvantaged areas. Boris Johnson backed down, eventually, blaming a “mutant algorithm.” Still, even a former colleague of the prime minister thinks the A-level fiasco may torpedo his reelection chances.

In the U.S., we don’t tend to think about shadowy government algorithms running or ruining our lives. Well, not unless you’re a defendant in one of the states where algorithms predict your likelihood of committing more crime (eat your heart out, Minority Report) and advise judges on sentencing. U.S. criminal justice algorithms, it probably won’t surprise you to learn, are operated by for-profit companies and stand accused of perpetuating racism. Take COMPAS, used in Florida and Wisconsin, which ProPublica found was twice as likely to label Black defendants “high risk” as white defendants, and was wrong about 40 percent of the time.

The flaws in such “mutant algorithms,” of course, reflect their all-too-human designers. Math itself isn’t racist, or classist, or authoritarian. An algorithm is just a set of instructions. Technically, the recipe book in your kitchen is full of them. As with any recipe, the quality of an algorithm depends on its ingredients — and those of us who have to eat the result really don’t think enough about what went on in the kitchen.
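To make the recipe analogy concrete, here’s a toy sketch in Python (purely illustrative, not COMPAS or any real system): the instructions themselves are neutral arithmetic, but the ingredients, meaning the signals a human designer chose to count and how they weighted them, decide what comes out of the oven.

```python
# A toy "risk score" recipe. The math is neutral; the human-chosen
# ingredients (signals and weights) are where bias can creep in.

def risk_score(person, weights):
    """Add up the weights for whichever signals the designer decided to count."""
    return sum(weights[signal] for signal in person["signals"] if signal in weights)

# Hypothetical weights. Include a proxy for race or class (say, living in a
# heavily policed neighborhood) and the same neutral recipe starts
# reproducing the bias baked into its ingredients.
weights = {"prior_arrests": 2, "heavily_policed_neighborhood": 3, "age_under_25": 1}

print(risk_score({"signals": ["prior_arrests", "heavily_policed_neighborhood"]}, weights))  # 5
```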


“All around us, algorithms provide a kind of convenient source of authority, an easy way to delegate responsibility; a short cut that we take without thinking,” writes mathematician Hannah Fry in her 2018 book Hello World: Being Human in the Age of Algorithms. “Who is really going to click through to the second page of Google every time and think critically about every result?”

Try to live without algorithms entirely, however, and you’ll soon notice their absence. Algorithms are often useful precisely because they can weigh far more possibilities, far faster, than any human mind. Anyone who’s ever spent longer on the road because they thought they could outsmart Google Maps’ directions knows the truth of this. This thought experiment imagining a day without algorithms ended in terrible gridlock, since even traffic-light systems use them.
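If you want to see the kind of calculation your phone breezes through while you’re still second-guessing the GPS, here’s a minimal sketch of Dijkstra’s shortest-path algorithm over a made-up road network (illustrative only; a real routing app also folds in live traffic, closures, and much more).

```python
import heapq

def fastest_route(roads, start, goal):
    """Dijkstra's algorithm: return the shortest travel time (in minutes) from start to goal."""
    queue = [(0, start)]
    best = {start: 0}
    while queue:
        minutes, node = heapq.heappop(queue)
        if node == goal:
            return minutes
        if minutes > best.get(node, float("inf")):
            continue  # stale entry; a faster way to this node was already found
        for neighbor, extra in roads[node].items():
            candidate = minutes + extra
            if candidate < best.get(neighbor, float("inf")):
                best[neighbor] = candidate
                heapq.heappush(queue, (candidate, neighbor))
    return None

# A made-up road network: travel times in minutes between points.
roads = {
    "home": {"highway": 10, "backstreet": 4},
    "highway": {"office": 25},
    "backstreet": {"bridge": 6},
    "bridge": {"office": 7},
    "office": {},
}

print(fastest_route(roads, "home", "office"))  # 17, via the backstreets a human might not consider
```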

Still, you would be right to be concerned about the influence algorithms have on our internet lives — particularly in the area of online content. The more scientists study the matter, the more it seems that popular search, video and social media algorithms are governing our brains. Studies have shown they can alter our mood (Facebook itself proved that one) and yes, even our 2016 votes (which explains why the Trump campaign is investing so much into Facebook ads this time around).

So before we find out the full effect of algorithms in 2020, let’s take a look at the algorithms on each of the major content services — many of which are surprisingly easy to erase from our lives.

1. Facebook

No algorithm on Earth, not even China’s Social Credit system, has the power of Mark Zuckerberg’s. Every day, nearly 2 billion people visit Facebook. Nearly all of them allow the algorithm to present posts in the order that the company has determined most likely to keep them engaged. That means you see a lot more posts from friends you’ve engaged with in the past, regardless of how close you actually are to them. It also means content that causes big back-and-forth fights is pushed to the top. And Zuckerberg knows it.

“Our algorithms exploit the human brain’s attraction to divisiveness,” warned a 2018 internal Facebook study, unearthed by the Wall Street Journal. Left unchecked, these mutant algorithms would favor “more and more divisive content in an effort to gain user attention & increase time on the platform.”

Zuckerberg, reportedly afraid that conservatives would be disproportionately affected if he tweaked the algorithm to surface more harmonious posts, shelved the study. It’s been a good four years for conservatives on Facebook, who have been working the referee ever since they petitioned Zuckerberg to stop using human editors to curate news in 2016. Now look at Facebook’s top-performing posts in 2020; on a daily basis, the list is dominated by names such as Ben Shapiro, Franklin Graham, and Sean Hannity.

But even conservatives have cause to be disquieted by the Facebook algorithm. Seeing friends’ popular posts has been shown to make us more depressed. Facebook addiction is heavily correlated with depressive disorder. So-called “super sharers” drown out less active users, according to the 2018 report; an executive who tried to reduce the super-sharer influence on the algorithm abruptly left the company.

How to fix it 

Luckily, you can reduce the algorithm’s influence yourself. Facebook still allows you to remove the sorting algorithm from your timeline and simply view all posts from your friends and follows in reverse chronological order (that is, most recently posted at the top). On Facebook.com, click the three dots next to “News Feed,” then click “Most Recent.” In the app, tap “Settings,” then “See More,” then “Most Recent.”

The result? Well, you might be surprised to catch up with old friends you’d almost forgotten about. And if you interact with their posts, you’re training the content algorithm for when you go back to your regular timeline. In my experience, reverse chronological order isn’t the most thrilling way to browse Facebook — the algorithm knows what it’s doing, locking your brain in with the most exciting posts — but it’s a nice corrective. If you’re one of the nearly 2 billion people on Facebook every day, try this version at least once a week.
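For the curious, here’s a toy sketch of the difference (illustrative only, not Facebook’s actual ranking code): the default News Feed sorts by some predicted-engagement score, while “Most Recent” simply sorts by timestamp.

```python
from datetime import datetime

# Two hypothetical posts with made-up engagement predictions.
posts = [
    {"friend": "old classmate",       "posted": datetime(2020, 10, 30, 9, 0), "predicted_engagement": 0.1},
    {"friend": "argumentative uncle", "posted": datetime(2020, 10, 29, 8, 0), "predicted_engagement": 0.9},
]

# Roughly what the default feed does: whatever is most likely to keep you engaged goes first.
ranked = sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)

# What "Most Recent" does: plain reverse chronological order.
most_recent = sorted(posts, key=lambda p: p["posted"], reverse=True)

print([p["friend"] for p in ranked])       # ['argumentative uncle', 'old classmate']
print([p["friend"] for p in most_recent])  # ['old classmate', 'argumentative uncle']
```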

2. YouTube

The YouTube “watch next” algorithm may be even more damaging to democracy than Facebook’s preference for controversial posts. Some 70 percent of the YouTube videos we consume are recommended by the service’s algorithm, which is optimized to make you watch more YouTube videos and ads no matter what (the average viewing session now tops one hour).

That means YouTube prioritizes controversial content, because whether you love it or hate it, you’ll keep watching. And once you’ve watched one piece of controversial content, the algorithm will assume that’s what you’re into, steering you to the kind of stuff viewers of that video opted to watch next. Which explains how your grandparents can start by watching one relatively innocuous Fox News video and end up going down a QAnon conspiracy theory rabbit hole.
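Here’s a toy sketch of that “watch next” logic (illustrative only, not YouTube’s actual recommender, which weighs far more signals): suggest whatever viewers of the current video most often played next, and one click is enough to start the steering.

```python
from collections import Counter

# Made-up watch histories: each list is one viewer's session, in order.
watch_histories = [
    ["cable_news_clip", "conspiracy_intro", "conspiracy_deep_dive"],
    ["cable_news_clip", "conspiracy_intro"],
    ["cable_news_clip", "cooking_show"],
]

def recommend_next(video, histories):
    """Suggest the video that viewers most often watched immediately after this one."""
    follow_ups = Counter()
    for history in histories:
        for current, following in zip(history, history[1:]):
            if current == video:
                follow_ups[following] += 1
    return follow_ups.most_common(1)[0][0] if follow_ups else None

print(recommend_next("cable_news_clip", watch_histories))   # conspiracy_intro
print(recommend_next("conspiracy_intro", watch_histories))  # conspiracy_deep_dive
```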

A former Google programmer, Guillaume Chaslot, found the YouTube algorithm may have been biased enough to swing the outcome of the 2016 election, which was decided by 77,000 votes in three states. “More than 80 percent of recommended videos were favorable to Trump, whether the initial query was ‘Trump’ or ‘Clinton’,” he wrote in the immediate aftermath. “A large proportion of these recommendations were divisive and fake news.” Similarly, Chaslot found that 90 percent of videos recommended from the search query “is the Earth flat?” said that yes, indeed it is.

This isn’t just a problem in the U.S. One of the most important case studies of the YouTube algorithm’s political impact was in Brazil, where fringe right-wing candidate Jair Bolsonaro was elected president after unexpectedly becoming a YouTube star. “YouTube’s search and recommendation system appears to have systematically diverted users to far-right and conspiracy channels in Brazil,” a 2019 New York Times investigation found. Even Bolsonaro’s allies credited YouTube for his win.

How to fix it

Keep the algorithm at bay. Disable “Up Next.” (Image: Chris Taylor)

Turning off autoplay, an option next to the “Up Next” list, will at least stop you from blindly watching whatever the YouTube algorithm recommends. You can’t turn off recommendations altogether, but you can at least warn less tech-savvy relatives that the algorithm is doing its level best to radicalize them in service of views.

Chaslot’s nonprofit algotransparency.org will show you what videos are most recommended across the site on any given day. By now, you may not be surprised to see that Fox News content tends to float to the top. Your YouTube recommendation algorithm may look normal to you if it’s had years to learn your likes and dislikes. But a brand-new user will see something else entirely.

3. Instagram

While parent company Facebook allows you to view your feed in reverse chronological order, Instagram banished that option altogether back in 2016 — leading to a variety of conspiracy theories about “shadow banning.” It will still show you every photo and story if you keep scrolling for long enough, but certain names float to the top so frequently that you’d be forgiven for feeling like a stalker. (Hello, Instagram crushes!)

How to fix it

As of a February update, Instagram will at least let you see who you’ve been inadvertently ignoring. Click on your profile icon in the bottom right corner, click on your “following” number, and you’ll see two categories: “Least Interacted With” and “Most Shown In Feed.” Click on the former, scroll through the list, and give your most ignored follows some love.

You can also sort your feed by the order in which you followed accounts, which is truly infuriating: why offer that option and not just give us a straight-up chronological feed? Instagram is also said to be testing a “Latest Posts” feature that will catch you up on recent happenings, but this hasn’t rolled out to all users yet.

4. Twitter

Just like its social media rivals, Twitter is obsessed with figuring out how it can present information in anything other than most recent order — the format that Twitter has long been known for. Founder Jack Dorsey has introduced features that let you follow topics, not just people, and a default timeline that puts the tweets that drove the most engagement first.

How to fix it

Go! See Latest Tweets! Be free of the algorithm! (Image: Chris Taylor)

All of these non-chronological tweaks fall under the “Home” heading at the top of the page. Click the star icon next to it, and you’ll have the opportunity to go back to traditional Twitter-style “Latest Tweets.” Of all the social media services, Twitter is the one that makes it easiest to ignore its recommendation algorithm.

It may take a little more scrolling to find the good stuff on Latest Tweets, and of course what you’re seeing depends on what time of day you’re dipping into the timeline. Still, Latest Tweets is your best bet for a range of opinions and information from your follows unimpeded by any mutant algorithms.

https://mashable.com/article/how-to-avoid-algorithms-facebook-youtube-twitter-instagram/