Eugene Wei’s TikTok and the Sorting Hat is a splendid read on many counts. It offers excellent perspectives on how tech companies cross over between WEIRD (Western, Educated, Industrialised, Rich, Democratic) countries and non-WEIRD ones like China and India, complicated by cultural differences. That sets the context for a deep chronicle of TikTok’s rise in the US, juxtaposed against social networks like Facebook, Instagram, Snapchat and Twitter. (As an aside, all of these networks have found varying degrees of success in India.) He also points out that TikTok is not really a social network: instead of a social graph, it plays on an interest graph that it builds from the user’s reactions. All of this makes for excellent reading, but what really caught my attention was this –
“in some categories a machine learning algorithm significantly responsive and accurate can pierce the veil of cultural ignorance. Today, sometimes culture can be abstracted.”
A meta example of this appears towards the end of the post, when he visits NewsDog. At that point it was the top news app in India, and it was built by a Beijing-based startup: around 40 male Chinese engineers, none of whom could read Hindi!
I found the algorithm fascinating because of how it compares to us humans. We build our cultural (and taste) graphs from our surroundings over a long period of time. Sure, we are pretty good at picking up new things, and at making creative use of things and technology without understanding their component parts or even how they work, but I wondered whether this is comparable to an actor of foreign origin suddenly becoming the top superstar in a country without even living there as preparation. Even if the script and direction were top notch, how difficult would it be to grasp the nuances, subtleties, contexts and sensibilities? [I am not sure of this example either, but..] Maybe something that comes close is K-pop, but I don’t think even that has managed the kind of acceptance across cultures that the algorithm has.
This is AI building a taste graph at scale just by rapid experimentation. Infinite monkeys writing Shakespeare in a matter of years, if not months! Actually, a different author for each of 100 mn users! And doing this without what we might call understanding. This is interesting because, in The Mind is Flat, Nick Chater writes that the brain’s processing operations are not really known to us. We’re only conscious of the outputs – “the meaningful organisations of sensory information.”
In that sense, our understanding of our own preferences is pretty thin. At best we have “precedents, not principles”. So, what happens if we extrapolate what TikTok achieved to culture in general? For both definitions of culture – the arts and manifestations of intellectual achievement, as well as the ideas, customs and social behaviour of a set of people.
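To make the “taste graph by rapid experimentation” idea a bit more concrete, here is a minimal sketch in Python. It is a toy under stated assumptions, not TikTok’s actual system: I assume videos carry tags, the only signal is how much of a clip a user watched, and recommendations split epsilon-greedily between exploiting the learned profile and experimenting at random. The point is that nothing in it “understands” the content or the culture – it only tracks reactions.

```python
import random
from collections import defaultdict

# Toy sketch of an "interest graph" built purely from reactions.
# Assumed setup (not TikTok's real pipeline): tagged videos, and a single
# implicit signal per view - the fraction of the clip that was watched.

EXPLORE_RATE = 0.2   # share of recommendations spent probing new tags
LEARNING_RATE = 0.3  # how quickly one reaction shifts the user's profile

class InterestProfile:
    def __init__(self):
        # tag -> estimated affinity, learned only from behaviour
        self.affinity = defaultdict(float)

    def update(self, video_tags, watch_fraction):
        """Nudge each tag's affinity towards the observed reaction."""
        for tag in video_tags:
            current = self.affinity[tag]
            self.affinity[tag] = current + LEARNING_RATE * (watch_fraction - current)

    def score(self, video_tags):
        """Score a candidate video by the user's learned tag affinities."""
        return sum(self.affinity[tag] for tag in video_tags) / max(len(video_tags), 1)

def recommend(profile, candidates):
    """Epsilon-greedy: mostly exploit the profile, sometimes experiment."""
    if random.random() < EXPLORE_RATE:
        return random.choice(candidates)                                 # experimentation
    return max(candidates, key=lambda v: profile.score(v["tags"]))       # exploitation

# Example: reactions alone shape what gets served next.
catalog = [
    {"id": 1, "tags": ["dance", "music"]},
    {"id": 2, "tags": ["cooking"]},
    {"id": 3, "tags": ["comedy", "music"]},
]
user = InterestProfile()
user.update(["dance", "music"], watch_fraction=0.9)   # watched almost fully
user.update(["cooking"], watch_fraction=0.1)          # swiped away quickly
print(recommend(user, catalog))                       # most likely a music clip
```

Scaled up to millions of users and run continuously, this explore-and-update loop is roughly what “a different author for each user” means in practice: per-user profiles refined by nothing more than watching how we react.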
For now, humans are still creating the content and AI is just distributing it accurately at an individual level. But with an animated version of ThisPersonDoesNotExist, scripts by GPT-3, and deepfakes (Reface, anyone?) becoming common, it will be able to create too, sooner rather than later. In fact, IBM’s Watson already made a movie trailer back in 2016. We are increasingly finding things to consume (read, view) as opposed to searching for them, and in many cases we are letting a system decide for us (Netflix, Spotify, etc.). Would we even know when the shift from human to AI creators happens? Or, to use a Hitchhiker’s Guide reference, maybe it has already happened. That would give cultural appropriation a whole new meaning! It would also mean we have an answer to “shouldn’t children be taught poetry and not coding?” Coding, because it can generate poetry, which might soon become pop culture.
In my utopian version, AI will become advanced enough to know us deeply, possibly even better than we know ourselves, and will be able to lull us into narratives that answer even our most profound philosophical questions. Solving without really understanding, just like TikTok. In my dystopian version, AI will aid the survival of only those humans who can relate to its version of art, culture and philosophy. Spending time on the others would be inefficient.