The Dangers of Deepfakes Hiding in the Metaverse

If you are in the metaverse, you are usually represented by a blocky or cartoonish avatar, or a disembodied floating torso and a pair of hands. None of them look much like you.

But what happens when things become more real?

Several companies have developed ways for you to create hyperrealistic representations of yourself for the metaverse, complete with your face, your voice and even your mannerisms. One of them is Metaphysic, a deepfake (or synthetic media) company founded by Chris Ume, creator of the Deep Tom Cruise videos that took TikTok by storm last year.

The videos appear to show the Hollywood actor doing things like eating a lollipop and playing golf. The footage is actually of a different actor, with Tom Cruise’s face so convincingly superimposed on top that it’s hard to tell it isn’t really him.

Now Metaphysic wants to put this technology in everyone’s hands, so they can use it to create their own hyperreal avatars. The startup recently raised $7.5m from investors including Winklevoss Capital and YouTuber Logan Paul to help fund the effort.

“Anyone can come in and create their own hyperreal synthetic avatar,” said Tom Graham, CEO and co-founder of Metaphysic.

But that’s not all: with Metaphysic, you can also securely store your avatar as a non-fungible token (NFT), so you continue to own your own image and, crucially, the biometric data used to create it.

Other companies are doing the same. One Romania-based firm also offers a service where people can mint NFTs of their own face or voice.

Part of it is just fun: who doesn’t want to make a mini-me of themselves? Or see what they’d look like dressed as Lady Gaga?

But there is a serious side. If we don’t find ways to secure our metaverse identities early on, as Metaphysic intends to do, the result could be a disastrous loss of control over our own images and biometric data.

A quick dive into the deepfake dilemma

No one knows this better than Henry Ajder, a researcher who has spent years studying the malicious use of synthetic media. A joint investigation he conducted with Karen Hao of the MIT Technology Review in 2019 found that 96% of all synthetic media at the time was pornographic, mostly created by tools that swapped people’s faces onto someone else’s body.

That was when deepfake technology was still in its infancy. It has since become easier to use and more realistic.

“The future is synthetic and there is no avoiding the challenges ahead”

“It used to take 150 CGI artists and $250m to create a deepfake-style set of effects for a movie. Now we can do it with a few thousand dollars, a few GPUs and one person,” Graham said.

And deepfake videos keep popping up everywhere. At the start of Russia’s invasion of Ukraine, a clumsy deepfake video of President Volodymyr Zelensky appearing to surrender showed how such media can be weaponised for political purposes. The video was fairly crude, but experts warn that the next ones may not be so easy to spot.

You can’t simply ban the technology, says Ajder: “If you ban synthetic media, you ban all Instagram filters, you ban computational photography on your smartphone camera, you ban the dinosaurs in Jurassic Park. It’s not going away. The future is synthetic and there is no avoiding the challenges ahead.”

Set a good example

The only thing you can do, Ajder argues, is try to build a large enough industry around legitimate, ethical deepfake technology to establish best practices, and perhaps help balance out the bad actors.

Ajder partnered with Ume and Graham, who were setting up Metaphysic at the time, to provide advice on how to take things in an ethical direction.

“I come at this mostly from the perspective of understanding how the technology can be used maliciously, but I also see an explosion of really interesting creative and commercial uses of the technology, and the need for a more nuanced conversation around synthetic media,” he told Sifted last year when they launched. “In the right hands, used responsibly, it could be the future of creative expression. We need to make sure we set a good example.”

Metaphysic aims to model the ways in which media companies can use synthetic media responsibly. The company helps famous actors, for example, license their images to advertising agencies for campaigns, but everything must be done with consent and within agreed limits.

“There are obvious use cases that we consider clearly bad: abusing someone’s image in the context of pornography, misleading political deepfakes, fraud and cybersecurity issues,” Ajder said.

Another synthetic media company, D-ID, which has worked with Warner Bros on various film projects, is also lobbying for the creation of an industry code of ethics.

But now it’s a problem for everyone

Now that deepfake technology has moved beyond the realm of advertising and film projects and is available to everyone, companies like Metaphysic believe they need to go further.

Not only does the team want to ensure the film and advertising industries use synthetic media ethically; they want everyone to be able to create and secure their own avatar. They are offering their service, called Every Anyone, for free; users only have to pay about $20 in NFT minting fees.

Every Anyone’s platform allows faces to be manipulated in all sorts of ways

Of course, that doesn’t stop someone from stealing your face for revenge porn if they really want to. But the team wants people to understand what can be done with their images, and to start thinking about ways to control the use of their face and voice.

“We want individual users to feel like they have more control over who they are and not have to worry about sending all their data to a dodgy company and what might happen in the future,” Graham said.

“It’s fundamentally a question of consent. We want to level the playing field a little bit and create a paradigm where that’s the norm.”

Maija Palmer is Sifted’s Innovation Editor. She covers deeptech and business innovation, and tweets from @maijapalmer