It was towards the end of 2020 that I came across Roblox and wrote Metaverse: Get a second life. Since that post, Matthew Ball has written the definitive primer on the Metaverse1, and if you’re interested in the subject, it’s a must-read. The word “metaverse”, ICYMI, was coined by Neal Stephenson in Snow Crash, and the book is being referenced in many recent conversations. In fact, Stephenson has been quizzed for years, each time we seem to take a step in this direction, and his comments continue to be prescient, insightful and hugely creative. This one, from 2017, in Vanity Fair, is a favourite, and contains, among other succinct gems, this:
The purpose of VR is to take you to a completely made-up place, and the purpose of AR is to change your experience of the place that you’re in.
Neal Stephenson
But Stephenson admits to not seeing the whole social-media bubble thing coming, and to thinking that there would be only one Metaverse, which everyone logs on to. It isn’t quite turning out that way. Every tech giant wants in, to create and own the narrative. Intuitively, VR seems closer to the metaverse, but in terms of adoption, I do think AR can play a role in easing users in.
Snap is a case in point. While they have released 3D Bitmojis and 3D full-body tracking, it is the AR-driven Spectacles and the potential they offer2 that make Snap’s route to the metaverse interesting. Microsoft’s approach is more VR – populating the metaverse with “digital twins” of physical objects and using technologies like IoT, 5G and cloud computing to monitor and simulate them, and even make interactions possible without being physically near the object. (via)
Back in 2007, there was a rumour about Google turning Google Earth into a metaverse. It didn’t go anywhere, just like Google Lively. Since then we’ve had everything from Cardboard to Glass to Niantic (remember the Pokémon Go craze?), but nothing that changed the world. Until (IMO) Darlene and Chi-Chi showed us what’s possible, with Project Starline earlier this year.
But as Ben Thompson pithily pointed out in Metaverses, no one cares unless Facebook is involved. Unlike the Metaverse of Snow Crash, the one Facebook is planning is a more immersive (and invasive, I reckon) version of the internet that is absolutely connected to reality – “VR-rebranded”, as Ben Thompson calls it. They began lightly with Ray-Ban Stories, and stepped it up with the Horizon Workrooms app, which uses Oculus Quest 2 headsets to let people hold VR meetings as their avatars, collaborating on shared whiteboards or documents. More recently, and ironically, they have decided to invest $50 million to “responsibly” build the metaverse.
I think that long before we inhabit the metaverse, we will start experimenting with world building using proxies. Several examples exist across fiction and reality. In Ted Chiang’s Exhalation, the novella The Lifecycle of Software Objects has the protagonist Ana trying to train digital pets designed to have a learning capacity similar to children’s. It’s a fantastic story of what happens to their existence when platforms go defunct. It was probably inspired by Neopets, launched back in 1999, where users could buy digital pets and were responsible for taking care of them, using Neopoints and Neocash as virtual currencies. Even before that, in 1996, came Creatures, an artificial-life simulation in a video game. (it actually inspired Amazon’s game plan – via) Given how NFTs these days also play a social-status/self-esteem role using things that are purely digital, it’s a route digital pets could take: entire digital worlds built for them, with humans vicariously living through them in a metaverse. The next step could be mixed reality. It could start with AR (see this short film) and then move to VR. Remember the Holodeck in Star Trek? After all, we’re already buying virtual real estate in Decentraland! Pretty soon, we’ll be building houses. After all, we need to show off those NFTs.
But the “multiverse” in the title of this post comes from a fully loaded version of the metaverse. In $ocial Validation, I had ended with the question of how humans could cope with the “always on stage” and efficiency mandates of “the great online game”. Though I had a clue, in In Code we Trust, on what the combination of deepfakes and GPT-3 could achieve, the controversy around the use of AI to simulate Anthony Bourdain’s voice in the documentary Roadrunner really brought it to life in the real world. While we don’t know whether he would have wanted it, the more recent deepfake-led promotions of Reminiscence, which allowed users to appear in the movie trailer, and Bruce Willis agreeing to the use of his deepfake in a Russian commercial are both examples of voluntary “fake” presence. These are still media being passively consumed, but extrapolated, this means you could simulate your presence using technology – either something like Project Starline, or deepfakes with consent, in which case you could actually be doing something else. In a metaverse context, the platform could bring to life people who have passed away (heard about the guy who chats with his dead girlfriend?), and your deepfake, armed with GPT-3, could be chatting with someone while you go about your daily life in reality (meatspace) or in a different metaverse – say, listening to an artist who doesn’t really exist in a place that doesn’t actually exist. Researchers have also created a hologram system that uses jets of air, known as “aerohaptics”, to replicate the sensation of touch.
Do we then transcend meatspace reality? Does it splinter into various realities that a person inhabits? In a metaverse like the Holodeck, maybe we could even change our past. Will we have multiple metaverses where we live out our many real-life what-if scenarios even as we follow a singular path in meatspace reality? Or maybe we just slowly build out a simulated reality for ourselves. Or perhaps the future is a combination of real and simulated lives across reality and metaverses. Maybe the freedom that the metaverse can provide will, in time, take us completely out of reality and into a digital multiverse. How would it feel? Would our definitions of self and consciousness still hold? One person, many worlds, many timelines. In a way, we are already living that on digital platforms.