Everyone says I hate the algorithm
It is pretty usual nowadays to hear, read, and see people complaining about the tyranny of algorithms. The complaint often grows into a call for outright prohibition or, less radically, for strong regulation. As if a promised, pristine land of freedom of choice were just there, close at hand, reachable simply by having prescription algorithms cancelled.
Indeed, the reach of algorithms in our daily lives is breathtaking: in our social network feeds, deciding not only what content we see, watch, or listen to (and what content we do not) but also which ads fill the empty spaces within those feeds; recommendation algorithms for songs and videos; algorithms governing search engine results; and many more that are far less visible than these.
Some wariness is quite understandable. And yet, the reasons provided by the supporters of algorithm cancellation sound wrong to me. Let me explain why.
We are the top
It is hard not to notice the anthropocentric bias behind this trust in nothing but other humans. In my honest opinion, humans are precisely whom other humans should trust the least. A mere glance at the last 12 millennia of human history, since the invention of agriculture and, with it, politics and organized religion, with its long list of ruined civilizations, massacres, and domestic violence, should be enough to make us demand some other way to govern ourselves.
Were the historical record of humans governing humans not enough, just look at what we have done, and keep doing, to all the other species on Earth: we implicitly consider ourselves the rightful and only owners of the whole planet and, most importantly for its non-human inhabitants, the owners of their lives and deaths. It does not matter how many living creatures we wipe out; they will always count less than a single human life.
To all intents and purposes, we act as if we were the only relevant living beings on Earth, the rightful owners of the planet’s seas, lands, and resources, and soon the rightful exploiters of the Solar System and beyond. The Cosmos is out there for us to take.
The spirit behind our actions, past, present, and near future, is the one expressed by this quote:
All animals are equal, but some animals are more equal than others.
“Animal Farm”, George Orwell

It does not matter how many disasters this attitude may bring upon ourselves and the rest of the planet.
We are at the top, and there is no discussion about this statement, no matter how radical your vision of our relationship with the other travellers in the Spaceship Earth might be.
So being there, at the top, how could we possibly tolerate another kind of intelligence governing us?
And, again, do not fool yourselves. Algorithms can be demeaning, but that is not the issue here. Can you feel the fear? The fear of losing our privileged position in the Cosmos is what we masquerade as a claim for breaking the chains of automated prescriptions.
The prison of choice
Let me be clear: our problem is with the prescribers, not with the prescription itself. Being recommended what to think, do, or believe has never been a problem. We have all been living by prescription since the very beginning of human communities.
We do not need to go back to the dawn of humankind to realize how natural it feels for us to follow the lead. For example, the album “Thriller” by Michael Jackson has sold more than 70 million records worldwide. Is this not an example of mass prescription?
This masterpiece of pop music came out in 1982. The Golden Age of Mass Media dictating what we should be reading, listening to, or watching would last 20 years more.
Until the Internet killed it.
Not immediately, though. It happened during its second incarnation: the Web 2.0, which came into being after the first Internet bubble burst in 2001.
One of the most interesting aspects of the Web 2.0 was the concept of The Long Tail. Even though the term was invented years before, it was popularized by Chris Anderson in an article in the magazine Wired (2004), still available online at the time of this writing, and later in a full-length book (2006) with the same title.
Mesmerizing as it seemed at the time, the concept of the Long Tail is quite easy to summarize: since the cost of stocking digital assets is negligible, B2C retailers can keep a virtually infinite catalogue online, because even a few sales of any obscure item turn a profit.

The Long Tail destroyed the concept of prescriptions based on rankings. From now on, everyone would be their own prescriber. In a context of total freedom of choice, rankings would not make sense anymore.
The trouble with discovery
A printed edition of “The Long Tail” by Chris Anderson lies beside me on the table right now. I bought it in 2006, or maybe 2007, at amazon.com. I have kept it in my library, likely for sentimental reasons, for it lost its aura more than a decade ago.
On its cover, in full, aspirational white, a sentence in orange from Eric Schmidt stands out:

Why would the Long Tail be so influential to Google?
Because along with the promise of freedom of choice came the disillusion of realizing that an infinite catalogue of possibilities is useless unless you already know what you are looking for.
And that has been the underpinning pillar of Google from its foundation to the present day: can technology figure out what users might be looking for even before they know it themselves?
At the beginning of the 21st century, this was still an open question. It is not anymore.
Everything in technology spins around gathering data of any sort and then crunching it until it produces a reasonable answer to the question above. And then iterating, and iterating again, improving how well these answers match what users will accept as what they were looking for.
And they do. Algorithms do work. And hence humans are predictable.
Maybe this is it. Maybe the key feature of our species is that we are social by nature, and socialization makes us alike. Still distinguishable enough to see ourselves as different from everyone else, yet not so different as to make our choices impossible to compute.
An escape from computational horror
Because algorithms feed on the data collected from our activities, we can control them by taking control of those activities. And that is surprisingly easy to do: just make your daily life more diverse.
Follow different paths to work; discover new places to eat or go for a stroll; try not to consume always the same kind of content (videos, songs, texts, audiobooks).
Tend your algorithm as you would water a plant. Train it. Curate your algorithms, and make the promise of the Long Tail come true: become your own sole prescriber.
Take control.
Copyright: the artwork at the top comes from thenewrepublic.com, taken without permission.
“RIP pure algorithms” could have been another related essay?
Difficult to say whether I hate anything, but easy to say how much I dislike human interference in algorithms, as opposed to developing them to better typify one’s likes and dislikes and suggest content accordingly.
Meaning, censorship of, say, violence makes sense, but censorship of ideas opposed to what is considered the official truth has taken over the pure but innocent algorithms.
Yes, you are right. The trouble with implementing any kind of content curation like this is the “how”, for common sense is also a cultural thing. Should algorithms ban videos denouncing government violence against citizens? And if not, would they not be taking sides?
I would certainly support content moderation, but only if it were very restricted (e.g., I cannot see any fair use for child abuse content of any kind). The list of banned topics would be provided by a sufficiently diverse international group of people, instead of being decided internally by big companies, as happens right now.