Meta, the parent company of Facebook and Instagram, is pouring billions into building the metaverse. The project carries new risks, and not only ones related to user numbers.
Meta has a problem. Earlier this month, the company saw $250 billion in market cap evaporate after worse-than-expected revenue and earnings figures. Competition from the popular video service TikTok, tighter privacy rules from Apple and slowing user growth are all weighing on it.
Then there are the billions being invested in the metaverse, a blend of virtual and augmented reality with ordinary online environments. “The metaverse is the successor to the mobile internet,” CEO Mark Zuckerberg has long maintained. But such a project not only costs a great deal of money; it can also do enormous damage to the company’s reputation.
Meta realizes that. Last week, the company announced additional safety measures for its own virtual world Horizon Worlds and for Horizon Venues, where VR avatars can attend events. Those avatars are now enclosed in an invisible bubble. The only virtual physical contact still possible is a high five or a fist bump, arms extended. Romantic gestures, or even a hug, are ruled out for fear of incidents.
Romantic gestures, or even a hug, are impossible in Meta’s virtual worlds for fear of incidents.
And one such incident is already known: a Horizon Worlds tester reported publicly that an avatar she did not know had groped her.
The episode exposed the tension between the desire for as little regulation as possible and the need for protection. It also showed how hard it is to make that protection truly watertight.
VR can be a toxic environment and pose an existential threat to Meta’s metaverse plans.
Fast forward to last year, when Andrew Bosworth, head of Reality Labs, where Meta develops its AR and VR applications, admitted in an internal memo seen by the Financial Times that moderation in a virtual environment is “practically impossible at any meaningful scale.” He warned that virtual reality can be a toxic environment, especially for women and minorities, and acknowledged that this could pose an “existential threat” to Meta’s plans if it scared off mainstream users. Despite the enormous challenges, Bosworth is aiming for “almost Disney-level” safety.
Bosworth is now responsible for all of Meta’s technology, and the company is investing heavily in safety. Invisible bubbles, ways to report misconduct quickly and easily, and the ability to block other users are not the only tools. Artificial intelligence is meant to play a crucial role.
The company is building an AI supercomputer, the Research SuperCluster (RSC), which should allow its AI to understand the context of a message, drawing on cues such as language, imagery and tone. “Take a pointing finger, for example. It can mean different things in different cultures, but the meaning also depends on the context. An AI model can learn to determine whether an action, sound or image is harmful or benign. This research is needed, among other things, to improve safety in the metaverse in the future,” says a Meta spokesperson.
So far, that AI has proven thoroughly incapable of smoothly moderating Meta’s existing services, such as Facebook. If the RSC markedly improves this, Meta hopes to pull off the balancing act between ensuring safety and countering the accusation that, like a Big Brother, it monitors and regulates citizens’ lives in the metaverse.
Meta’s current VR platform also hosts a range of third-party social VR services. Those apps, such as VRChat and Rec Room, have been very successful. It looks as if Meta’s portion of the metaverse will consist of the company’s own heavily protected, Disneyland-style environments alongside third-party environments that may lean more toward free expression, even at the risk of incidents.
The metaverse isn’t just for Meta.
Apps can also be installed on Meta’s headset via ‘sideloading’, bypassing the official app store entirely. For those apps, Meta can disclaim all responsibility while signaling the openness of its ecosystem. “The metaverse is not just a Meta thing,” the company emphasizes.
It is precisely this diversity that makes the concept of the ‘metaverse’ attractive: it allows for different environments, each striking its own balance between restraint, vigilance and laissez-faire. Even so, critics will keep a close eye on how Meta manages its own environments and how it handles the flood of data captured in the metaverse.
- Meta is doing everything it can to prevent incidents between users from turning the metaverse into a toxic place. A string of shocking scandals could be fatal to the company’s plans.
- Meta is deploying a panoply of tools, from sometimes draconian restrictions on virtual physical contact to AI supercomputers.
- Third-party apps should let users choose between Disneyland-like environments and virtual places that are freer but also riskier.