The thought first occurred to me a couple of years ago, when I realised that, thanks to outsourcing and automation, we would struggle today to do many things that were once life skills. We also lost a little more than the skills themselves – the learning.
Sometimes directly, and sometimes through our interactions with the world, those skills facilitated a learning experience that taught us how to navigate the world and the different kinds of folks who make up its systems.
Regression Planning
It was continued with a bit more specificity in a subsequent post.
Instagram, Facebook, Tinder, Spotify, Netflix, Amazon – everything is a feed of recommendations, whether it be social interactions, music, content or shopping! Once upon a time, these were conscious choices we made. These choices, the new discoveries, their outcomes, the feedback loop, and the memories we stored of them all worked towards developing intuition.
Intelligence, intuition and instincts. The journeys in the first two are what have gotten the third hardwired into our biology and chemistry. When we cut off the pipeline to the first two, what happens to the third, and where does it leave our species?
AI: Artificial Instincts
A recent article in the New Yorker, which framed Netflix (or at least a genre of its shows) as “ambient TV”, made me think about it some more.
In the Netflix context, one could argue that wanting to outsource the decision is understandable. After all, it’s only entertainment. But I do believe there are at least three second-order consequences that we might not be thinking about enough.
Self
Outsourcing is convenient; there is no denying that. But, to quote a wonderful article from a long while ago, the victim of convenience is conscious choice. In “The Tyranny of Convenience”, Tim Wu writes about how convenience is the most underestimated and least understood force in the world today, and how it has the ability to make other options unthinkable. But as I wrote in both of my previous posts, the process of making choices, and mistakes, is part of the development of the self. As Wu puts it, “Today’s cult of convenience fails to acknowledge that difficulty is a constitutive feature of human experience. Convenience is all destination and no journey.”
You would ask, “all this from just choosing Netflix’s Shuffle Play?” All I have to offer is (part of) a quote: “Watch your actions, they become your habits; watch your habits, they become your character; watch your character, it becomes your destiny.” Is this the version of yourself that you really want? Habits lead me to the next point.
Society
To begin with, as Karthik notes in his post on the subject, before technology reached this stage, we used to ask around for recommendations. What did this achieve, in addition to the important element of interaction? Variety, serendipity, and the opportunity to debate, agree, disagree, identify biases, and agree to disagree – hopefully in a civilised manner. In the absence of that, mistrust increases, generalised reciprocity reduces, and transaction costs rocket. At a species level, the ability to create and act on a shared understanding is what got us this far.
The habit that makes us choose the Shuffle Play option – convenience – also contributes to forwarding WhatsApp messages without bothering to verify them. As I have written before, the daily non-use of rational, deliberate thinking starts showing up at higher and higher levels, until the capability is lost altogether. And worse, it begins to manifest in everything from politics to personal finance.
Another aspect of this is conformity. When an algorithm chooses for us, we believe it is individualisation. But to quote Wu again, “today’s technologies of individualization are technologies of mass individualization. Customization can be surprisingly homogenizing.” In essence, we are conforming more than being unique individuals. Everyone discussing Bridgerton is cute. But at a fundamental level, we have evolved because of variety – nature’s play with traits and survival. When I extend this in the direction of thoughts and ideas, conformity is a recipe for fragility as a species. So, is this the version of society you really want? Well, not me, but I do know some others who would prefer this, and that’s the last point.
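To see how “mass individualization” can homogenize, here is a toy sketch in Python. It is a hypothetical illustration, not any platform’s actual algorithm: a recommender that weights its suggestions by what is already popular, and feeds every accepted suggestion back into those weights, concentrates everyone’s “personalised” feed on the same handful of titles.

```python
# Toy illustration of a rich-get-richer recommendation loop
# (hypothetical names and numbers, not any real platform's algorithm).
import random
from collections import Counter

random.seed(42)

CATALOGUE = [f"show_{i}" for i in range(50)]
popularity = Counter({show: 1 for show in CATALOGUE})  # every show starts equal

def recommend():
    """Pick a show with probability proportional to its current popularity."""
    shows = list(popularity.keys())
    weights = list(popularity.values())
    return random.choices(shows, weights=weights, k=1)[0]

# 1,000 viewers each accept whatever is served; every accepted
# recommendation feeds back into the popularity counts.
for _ in range(1000):
    popularity[recommend()] += 1

top5_views = sum(count for _, count in popularity.most_common(5))
share = top5_views / sum(popularity.values())
print(f"Top 5 of 50 shows now account for {share:.0%} of all views")
# For comparison, any given 5 of 50 equally likely shows would average ~10%.
```

Nothing in that loop knows anything about individual viewers; the concentration comes purely from feeding past choices back into whatever generates the next one.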
Surveillance Capitalism
Jamie Bartlett once tweeted, “The end result will be ad targeting so effective that you may well question the notion of free will altogether”. Marketing has long been playing, and winning, by using behaviour patterns [1], but capitalism’s rogue version, exhibited by everyone from Google onwards, has weaponised extraction to a different level.
By embedding its values and goals into concrete technologies, capital seeks to assert dominion over the future — constraining what type of social change is viable. This makes techno-politics a natural battleground for staging struggles over what utopias are imagined and whose utopia is materialized.
Future Schlock
While capitalism itself exploited nature, surveillance capitalism exploits human nature. The natural corollary is that the more predictable you are, the more dollars can be made off you. And when we let an algorithm choose for us, we are walking right into it – like a frog in slowly boiling water, all in the name of convenience. Little suggestions in Gmail are nice, until the tech evolves into GPT-3, which can easily be weaponised for hate-mongering. As I wrote in “In Code we trust”, TikTok has shown that an AI can build a taste graph from scratch. It is only a matter of time before it actually starts creating culture, all the while propagating the biases of its masters – and, in future, a lifestyle whose habits are profitable to anyone willing and able to exploit them.
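To make “taste graph” a little more concrete, here is a minimal sketch in Python. The viewing logs and item names are invented, and this is plain item-item co-occurrence rather than TikTok’s actual system, but it shows how interaction data alone – no stated preferences, no survey – is enough to start mapping taste.

```python
# Hypothetical taste graph built from made-up interaction logs:
# two items are connected whenever the same person engages with both.
from collections import defaultdict
from itertools import combinations

history = {
    "user_a": ["lofi_beats", "study_vlog", "bookshelf_tour"],
    "user_b": ["lofi_beats", "study_vlog", "rain_sounds"],
    "user_c": ["prank_compilation", "rain_sounds", "lofi_beats"],
}

# Edge weight = number of users who engaged with both items.
taste_graph = defaultdict(lambda: defaultdict(int))
for items in history.values():
    for a, b in combinations(sorted(set(items)), 2):
        taste_graph[a][b] += 1
        taste_graph[b][a] += 1

def similar_to(item, k=3):
    """Return the k items most strongly co-consumed with `item`."""
    neighbours = taste_graph[item]
    return sorted(neighbours, key=neighbours.get, reverse=True)[:k]

print(similar_to("lofi_beats"))  # e.g. ['study_vlog', 'rain_sounds', 'bookshelf_tour']
```

Every new interaction adds edges, so the map of your taste gets sharper the more you scroll, without you ever being asked a question.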
If we don’t want to go from “I think, therefore I am” to “I am predictable, therefore I exist”, it will take some conscious decisions. Unfortunately, as Tim Wu points out, humans did not evolve with the capability to understand their reality, because it was not important to survival. Any illusion that keeps us alive long enough to procreate is good enough; evolution doesn’t care about humanity. Which means we need to consciously balance convenience and conformity with deliberate choices.
A question I asked myself while writing this was: when there is no agency, what happens to morality? My own first answer was worrying – maybe you just become numb to life’s deeper questions, because there is always an algorithm to give you something you didn’t know you wanted. And that’s the panacea that this age warrants. And hence, default in our stars, and an artificial existence.