Artificial Culture

It’s almost a year and a half since I wrote In Code we Trust. More recently, Tim Ferriss had Eric Schmidt on his podcast (transcript). In what I thought was a fascinating discussion based on the latter’s recent book The Age of AI: And Our Human Future (coauthored with Henry A. Kissinger and Daniel Huttenlocher), they also brought up AlphaGo. Go is a game that humans had been playing for 2,500 years, and it was thought to be incomputable until DeepMind’s AlphaGo beat world champions. As Schmidt explained, some of its moves and strategies were the kind no one had thought of before. In Kissinger’s words, this age of artificial intelligence is a new epoch, similar to the Renaissance, because humanity has never before had a competing intelligence that is similar to itself but not human. To note, a more recent version – AlphaGo Zero – taught itself without learning from human games, and surpassed its predecessor in 40 days!

The Things AI can do!

Even as GPT-3 use cases continue to rise – from copywriting to re-creating childhood imaginary friends – we’re also entering a visual era now. If you haven’t already, do check out DALL-E (what a brilliant name!), which generates images based on attributes chosen by the user. And there’s already DALL-E 2, which can edit images too. The pace!

And if we go by this post on the Google AI blog, the Pathways Language Model (PaLM) can now understand context, cause and effect, and do things like explaining a joke and guessing movie names from emojis! What’s more, we now have AI that is not specialised – MRKL, in which a single NLP system can perform as a customer service chatbot and help analyse the sentiment of CEO earnings calls.
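To make that less abstract, here is a minimal Python sketch of the routing idea behind MRKL-style systems: one language front end dispatching to specialised expert modules. The module names, the keyword router and the word-counting sentiment check are my own illustrative assumptions, not AI21’s actual implementation.

```python
# A minimal, hypothetical sketch of the MRKL idea: a single language front end
# routing requests to specialised "expert" modules. Everything here is a stand-in.

def customer_service_bot(query: str) -> str:
    # Placeholder expert: a real system would call a dialogue model here.
    return f"Support answer for: {query}"

def earnings_call_sentiment(transcript: str) -> str:
    # Placeholder expert: a real system would run a sentiment model over the call.
    positive_words = {"growth", "record", "strong", "beat"}
    score = sum(word in transcript.lower() for word in positive_words)
    return "positive" if score >= 2 else "neutral/negative"

EXPERTS = {
    "support": customer_service_bot,
    "sentiment": earnings_call_sentiment,
}

def route(query: str) -> str:
    """Pick an expert for the query, instead of using one monolithic model."""
    expert = "sentiment" if "earnings call" in query.lower() else "support"
    return EXPERTS[expert](query)

print(route("Where is my order?"))
print(route("Earnings call: record growth and a strong beat this quarter."))
```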

From self-driving cars to disease-mapping to investing, AI is everywhere.

Deep learning and deep influence

As I wrote earlier, as AI makes its way into many different parts of our lives, do we even know how we come to have many of our desires? Not that we fully do now, but at least they are not being manipulated by any potentially evil intention.

So much of our lives — from online dating, to search engines, to social-media feeds — is mediated by algorithms. And we talk about them like we actually know much about them. We complain about the Facebook algorithm and we gush (Betancourt isn’t alone) over TikTok’s. As I write this, some YouTube alpha male is out there uploading videos promising straight men advice on how to “hack” the Tinder algorithm to date like kings, and if you watch any of these videos, the site’s algorithm will use that query to offer you more unsolicited dating advice the next time you log in.

In reality, we don’t know nearly enough.

When we talk about “the algorithm” of any given platform, we’re sometimes talking about multiple algorithms that use artificial intelligence to metabolize the data that consumers (that’s us) provide through our interactions with the platform. These algorithms use that information to then curate that platform’s offering to its users (again, us). In other words: Our likes, swipes, comments, play time, and clicks provide these platforms up-to-the-minute updates on our needs and preferences and the algorithms use this information to determine what we see and when.
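As a toy illustration of that loop (and emphatically not any platform’s real algorithm), here is how likes, watch time and shares could be folded into a score that decides what a feed shows you next; the signals and weights below are hypothetical.

```python
# A toy illustration of the loop described above: interactions become signals,
# signals become scores, scores decide the feed. Weights are hand-picked here;
# real systems learn them from enormous volumes of interaction data.
from dataclasses import dataclass

@dataclass
class Interaction:
    topic: str
    liked: bool
    watch_seconds: float
    shared: bool

W_LIKE, W_WATCH, W_SHARE = 1.0, 0.05, 2.0  # hypothetical weights

def topic_scores(history: list[Interaction]) -> dict[str, float]:
    """Turn a user's past behaviour into a per-topic affinity score."""
    scores: dict[str, float] = {}
    for i in history:
        signal = W_LIKE * i.liked + W_WATCH * i.watch_seconds + W_SHARE * i.shared
        scores[i.topic] = scores.get(i.topic, 0.0) + signal
    return scores

history = [
    Interaction("dating_advice", liked=True, watch_seconds=45, shared=False),
    Interaction("cooking", liked=False, watch_seconds=5, shared=False),
]

# Rank candidate topics by affinity: watch one dating video today and the
# next session serves you more of the same.
scores = topic_scores(history)
ranked = sorted(scores, key=scores.get, reverse=True)
print(ranked)  # ['dating_advice', 'cooking']
```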

Because Your Algorithm Says So
Examining our (sometimes toxic) relationship with our AI overlords.

“We shape our buildings; thereafter they shape us.” The influence is now getting more literal and visibly present in our feeds. China’s virtual influencers were worth $961 mn in 2021, with many of them having millions of followers. And across the world, virtual influencers are now becoming verified accounts on Instagram! It’s not just in our feeds, it’s entering reality too. A growing trend in Japan is “fictosexuals” – people (unofficially) getting married to fictional characters, with tech in general and AI specifically helping create the experience.

How is that shaping our future?

Each technological revolution brings with it, not only a full revamping of the productive structure, but eventually a transformation of the institutions of governance, of society, and even of ideology and culture.

Carlota Perez

In the podcast, Ferriss and Schmidt bring up the trolley problem, and how it’s as much a philosophical or moral decision as it is a programming one. It made me think of two things. One, what is our expectation of AI? After all, in a trolley situation, would we ever blame a human, whichever choice he/she made? Two, maybe AI will get to a level of computation that helps it quantify the value of a human life. We already do that for insurance. Algorithms are, after all, a set of instructions. Maybe the AI will compute, in nanoseconds, social connections, financial dependents, health stats, criminal records etc., and choose. Maybe without even giving the human driver any agency. I am using the trolley problem as an admittedly extreme example, but a recurring theme on this blog is how we have increasingly gone beyond outsourcing our transactions to AI, and are now outsourcing our decisions too – from Netflix to dating! Culture is slowly being determined by AI. Think about it: the fastest-growing social platform is TikTok, which does not depend on your social graph – just your interests.
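Purely to make that thought experiment concrete (and not as something that should ever be built), here is what such a computation might crudely look like; every attribute and weight below is hypothetical.

```python
# A deliberately crude sketch of the thought experiment above: reducing a moral
# choice to a weighted sum over attributes. The attributes and weights are
# invented; the point is how such a computation would leave the human no agency.
def value_of_life_score(person: dict) -> float:
    weights = {"dependents": 3.0, "social_connections": 0.5, "health_index": 2.0}
    return sum(weights[k] * person.get(k, 0) for k in weights)

passenger = {"dependents": 2, "social_connections": 40, "health_index": 0.9}
pedestrian = {"dependents": 0, "social_connections": 120, "health_index": 0.7}

# "Chosen" in nanoseconds, before the driver could even react.
decision = (
    "protect passenger"
    if value_of_life_score(passenger) > value_of_life_score(pedestrian)
    else "protect pedestrian"
)
print(decision)
```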

At a broader level, as I brought up in my post a fortnight ago, culture is increasingly becoming a bigger factor than genetics in shaping our species. And as you can see above, AI is influencing culture by understanding us ever better, and shaping our behaviour and mindset. A generation or two from now, would driving even be a skill? Would we even learn from interpersonal interactions? What would that do to our species and its cognition? Recent studies have shown that spending time in nature has an effect on human cognition. I am quite certain that spending time with AI would too. To the extent that, being a Star Trek fan, I sometimes wonder whether our future is just assimilation – Borg!

Or maybe there is another option. All our basic needs – food, clothes, safety – can now actually be met by AI/robots: robot kitchens and drone delivery, robots designing fashion, drones that can actually do construction. As you read above, we are also on our way to solving for emotional needs! As Balaji Srinivasan said in The Knowledge Project podcast, maybe if land is made a utility, all status-seeking will move to the metaverse. That would take care of self-esteem, and then actualisation.
