Filter Failure at the Outrage Factory

13 Feb 2019

‘Filter failure at the outrage factory’ is a term I’ve been using on Twitter[1], usually as part of a quote tweet of something describing the latest social media catalysed atrocity, but I thought it deserved a longer-form explanation, hence this post.

I fear that I buried the lede in a note when I first published this. But the core point is that Rule 34 (if it exists, there is porn of it on the Internet) is being weaponised against the general population. The people out past (what should be) the 5th standard deviation are very into their peculiar peccadilloes, and that looks like ‘engagement’ to the algorithms.

The Outrage Factory

This is a label I’m going to smear across the entire advertising-funded web, but the nexus of the issue is where traditional media and its previously trusted brands intersect with social media. The old sources of funding (subscriptions and classifieds) dried up, and that drove media companies into the arms of online advertisers, which quickly became a race to the bottom for our attention. Clickbait headlines, fake news, outrage – they all get attention. Facebook, YouTube and all the rest run on algorithms that find the stuff that grabs and holds our attention, and those algorithms have discovered that outrage == $$$. Unfortunately those algorithms don’t care whether the material they hype is based on truth, some conspiracy theory nonsense, or just an outright falsehood.

Filter Failure

Clay Shirky famously said[2], “It’s not information overload. It’s filter failure,” when describing how we could deal with the fire hoses of information the Internet can throw at us. At the time (2008) we were seeing the beginning of a shift from hand-selected filters, such as the RSS feeds one might subscribe to in Google Reader, to ‘collaborative filters’, where we started to use the links posted by friends, colleagues and trusted strangers on Facebook and Twitter. Notably, those were the days before algorithmic feeds.

The problem that developed from there is that where we thought we were handing curation over to our ‘friends’, we ended up handing it over to the attention economy[3]. JP Rangaswami laid out Seven Principles for filtering, and it seems what we have today fails on all counts – we lost control.

Why care?

We should care because our democracy is being subverted. The checks and balances that worked in the world of print, radio and TV have proven utterly ineffective, and bad actors like the Internet Research Agency (IRA) are rampantly exploiting this to undermine our society[4].

We should care because dipshit stuff like the anti-vax movement, which used to be consigned to wingnuts on the fringe of society, has been sucked into the mainstream to the extent that it’s ruining herd immunity and we’re having life-threatening and life-changing outbreaks of completely preventable diseases.

We should care because algorithmically generated garbage is polluting the minds of a whole generation of kids growing up with iPads and Kindles as their pacifiers.

We should care because our kids are finding themselves a click or two away from Nazi propaganda, the Incel subculture, and all manner of other stuff that should be found somewhere out past the 5th standard deviation in any properly functioning society rather than being pushed into the mainstream.

We should care because this is the information age equivalent of the Bhopal disaster, and the Exxon Valdez, and the Great Molasses Flood, where toxic waste is leaking out and poisoning us all.

What can we do?

We don’t have to play along and sell our attention cheaply.

A wise friend once tweeted ‘I predict that in the future, “luxury” will be defined primarily by the lack of advertisements’. So upgrade your life to a life of luxury. Leave (or at least uninstall) Facebook. Install an ad blocker. Curate your own sources of information on something like Feedly.

Update 23 Apr 2019

I’ve been tracking items relating to this with the ffof tag on Pinboard.in

Update 25 Apr 2019

Sam Harris’s ‘Making Sense’ podcast interview with ‘Zucked’ author Roger McNamee, ‘The Trouble with Facebook’, provides an outstanding tour around this topic.

Update 22 Feb 2020

Aral Balkan came up with a great contraction of ‘aggro’ and ‘algorithm’:

Aggrorithm (n): an algorithm written to incite aggression.

Aggrorithms are a dark pattern employed by businesses with bottom lines tied to confrontation and conflict (“engagement”). Profiling data is used to nudge users into conflict and increase revenue.

Update 21 Oct 2020

Another Sam Harris podcast, this time with Tristan Harris – ‘Welcome to the Cult Factory’.

Notes

[1] About not amplifying outrage, listening at scale, outrage amplification as part of the system design, and recommendation engines pushing anti-vax content onto new parents.
[2] “It’s Not Information Overload. It’s Filter Failure.” Video, Clay Shirky at Web 2.0 Expo NY, Sept. 16–19, 2008.
[3] Aka ‘surveillance capitalism’.
[4] Renée DiResta’s Information War podcast with Sam Harris is a good place to start, and then maybe move on to The Digital Maginot Line.