In a new update, Meta’s Mark Zuckerberg shared more about AI and his plans for it in the Metaverse. Those plans turn out to revolve mainly around translation. That language matters in the Metaverse is clear from the start of the presentation: for example, Meta is testing artificial intelligence that lets you build a world simply by describing what you want in it.
For example, you say, “Let’s go to the beach,” and the builder bot you’re talking to creates a sea and sand. You can then add clouds and tropical music. The AI therefore has to listen very carefully: not just to what you say, but to what you mean and how it can achieve that.
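How such a builder bot might turn spoken commands into scene edits can be sketched with a tiny rule-based parser. The presets and command grammar below are invented for illustration; Meta’s system uses far more sophisticated language understanding.

```python
# Minimal sketch of a "builder bot" command loop: map phrases like
# "go to X" and "add Y" to edits of a scene, represented as a set of
# objects. Presets and grammar are hypothetical.
SCENE_PRESETS = {"beach": {"sea", "sand"}}

def apply_command(scene: set, command: str) -> set:
    """Apply one spoken command to the current scene."""
    words = command.lower().rstrip(".!?").split()
    if "go" in words and "to" in words:
        # Start a fresh scene from a named preset.
        place = words[-1]
        return set(SCENE_PRESETS.get(place, {place}))
    if words[:1] == ["add"]:
        # Add a new object to the existing scene.
        scene.add(" ".join(words[1:]))
    return scene

scene = apply_command(set(), "Let's go to the beach")
scene = apply_command(scene, "add clouds")
scene = apply_command(scene, "add tropical music")
print(sorted(scene))  # -> ['clouds', 'sand', 'sea', 'tropical music']
```

The hard part the article alludes to is exactly what this sketch skips: resolving what the speaker *means* (which beach? how many clouds?) rather than pattern-matching keywords.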
The biggest announcement is ‘self-supervised learning’: learning under one’s own supervision. No one is there to check the answers; the AI has to learn from the data itself. Mark Zuckerberg: “With this you are not teaching AI a concept; you are giving it raw data and it has to figure out for itself what is missing. This is also how the human brain learns. A child doesn’t need to see 10,000 photos of a cat before it can recognize one.”
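Zuckerberg’s “figure out for itself what is missing” can be illustrated with a toy fill-in-the-blank objective: the model sees only raw sentences, with no labels, and learns to predict a masked word from its neighbours. This is a minimal sketch of the general principle, not Meta’s actual training setup.

```python
from collections import Counter, defaultdict

# Raw, unlabelled text is the only supervision signal (toy corpus).
corpus = [
    "the cat sat on the mat",
    "the cat chased the mouse",
    "a dog chased the cat",
]

# "Training": count which word fills the gap between each pair of
# neighbours. The data itself provides the targets.
context_counts = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for i, w in enumerate(words):
        left = words[i - 1] if i > 0 else "<s>"
        right = words[i + 1] if i < len(words) - 1 else "</s>"
        context_counts[(left, right)][w] += 1

def predict_masked(left: str, right: str):
    """Fill in the blank using co-occurrence statistics alone."""
    candidates = context_counts.get((left, right))
    return candidates.most_common(1)[0][0] if candidates else None

print(predict_masked("the", "sat"))  # -> "cat"
```

Real self-supervised models (masked language models, for instance) apply the same idea with neural networks over vast corpora, but no human labels the blanks there either.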
Meta believes that self-supervised learning outperforms many existing AI models. However, Meta seems to gloss over one thing: who selects the data that is used? It is still human-made information. So the AI is not completely free after all: it still has to make do with the data it has at its disposal (and is therefore still ‘fed’).
Teleport to other worlds
With this, Meta is competing with Google, which presented many of its own translation plans last year. Zuckerberg: “Half the world doesn’t have internet access in their own language. There are millions of Fula speakers, but that language barely exists online. Even the most advanced translation models are trained primarily on English.” For this reason, Meta is now building a model that does not need English as an intermediate step to learn other languages. Zuckerberg: “When people in the Metaverse teleport between virtual worlds, translation is important. We have the opportunity to improve the internet and create something where people from all over the world can communicate with each other.”
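Why skipping English as an intermediate step matters can be seen in a toy example: when every translation is routed through English, distinctions that English lacks get lost along the way. The lexicons below are invented for illustration and stand in for full translation models.

```python
# Toy lexicons (invented): German distinguishes the informal "du"
# from the formal "Sie"; English collapses both to "you".
de_to_en = {"du": "you", "Sie": "you"}
en_to_fr = {"you": "tu"}                 # the pivot must pick one form
de_to_fr = {"du": "tu", "Sie": "vous"}   # a direct model keeps the register

def translate_via_english(word: str) -> str:
    """Pivot translation: German -> English -> French."""
    return en_to_fr[de_to_en[word]]

def translate_direct(word: str) -> str:
    """Direct translation: German -> French, no English in between."""
    return de_to_fr[word]

print(translate_via_english("Sie"))  # -> "tu"  (formal register lost)
print(translate_direct("Sie"))       # -> "vous" (register preserved)
```

A model trained to translate directly between language pairs avoids this bottleneck, at the cost of needing data for far more pair combinations.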
To that end, Meta has launched two large-scale AI projects: No Language Left Behind and Universal Speech Translator. The first aims to be a translation system that leaves no language behind. Zuckerberg: “You can already translate hundreds of languages. Five years ago there were twelve, three years ago thirty, and now there are hundreds.” For the Metaverse, the Universal Speech Translator project is especially important, because it would make it possible to communicate with anyone in the world simply by speaking.
Break the language barrier
Universal Speech Translator should translate what you say into the language of the person you’re talking to almost instantly. While sitting in your living room in Appingedam with your VR headset on, for example, you could have a perfectly “normal” conversation with someone in Nagasaki wearing his headset, without either of you speaking the other’s language. This requires a lot of computing power, however, and the artificial intelligence behind these plans still needs much more development. That work is underway, and Zuckerberg gave an update on it.
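Conceptually, such a speech-to-speech translator chains three stages: speech recognition, text translation, and speech synthesis. The sketch below shows only that pipeline shape; the stage functions are hypothetical stand-ins, not Meta’s actual models or APIs.

```python
# Hedged sketch of a speech-to-speech translation pipeline:
# audio in -> recognise -> translate -> synthesise -> audio out.
# Every stage here is a placeholder for a real ML model.

def recognize_speech(audio: bytes, lang: str) -> str:
    # Placeholder: a real system runs a speech-recognition model here.
    return audio.decode("utf-8")  # pretend the "audio" is already text

def translate(text: str, src: str, dst: str) -> str:
    # Placeholder toy lexicon standing in for a translation model.
    toy_lexicon = {("nl", "ja"): {"hallo": "こんにちは"}}
    table = toy_lexicon.get((src, dst), {})
    return " ".join(table.get(word, word) for word in text.split())

def synthesize_speech(text: str, lang: str) -> bytes:
    # Placeholder: a real system runs a text-to-speech model here.
    return text.encode("utf-8")

def speech_to_speech(audio: bytes, src: str, dst: str) -> bytes:
    """Chain the three stages into one near-real-time call."""
    text = recognize_speech(audio, src)
    translated = translate(text, src, dst)
    return synthesize_speech(translated, dst)

print(speech_to_speech("hallo".encode(), "nl", "ja").decode())
```

The “almost instantly” requirement is the hard part: each stage adds latency, which is why the article notes how much computing power and further model development this still demands.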
In addition, Meta is busy developing technology, probably for its virtual reality glove: for example, it is investigating important contact points using a type of clay, and it has also developed Digit, a kind of hand fitted with touch sensors that help it grasp objects firmly (but not too firmly). Although these were introduced as features for robots and the glove has not reappeared, the expectation is that this technology has everything to do with that glove.
Meta vs. Google
It appeals greatly to the imagination: soon you could put on a headset and have a conversation with someone who doesn’t speak your language (or vice versa). Exactly how Meta envisions this has not yet been shown; there may be long delays, for example, but at least the company is now working on the technology. Google is doing much the same, including with Google Neural Machine Translation.
Both companies are investing heavily in artificial intelligence, each in their own way. It will be interesting to see which AI model turns out better, but above all how soon we will be able to talk almost instantly with someone where we previously had to make do with gestures. Who knows, maybe the characters in Meta’s Horizon Worlds will even get feet?