Ume is now back and on a mission: to bring deepfake videos to the planned metaverse and make them as central to digital life as tweets and memes.
He’ll take the next step Tuesday when a deepfake developed by Metaphysic, the company he formed with businessman Tom Graham, will compete in the semi-finals of NBC’s reality hit “America’s Got Talent.”
“This is a good opportunity to raise awareness and show what we can do,” said Ume.
“We think the web would be so much better if, instead of avatars, we lived in the world of the hyperreal,” added Graham, describing the ability for users to manipulate real faces with Metaphysic.
The startup’s appearance before millions on television will lay the groundwork for its new website, which aims to make it easy for ordinary people to create videos of themselves saying and doing things they never did in real life. Many other such sites are geared toward programmers and researchers.
And the act, which follows a raucous preliminary-round appearance in which the face of a young Simon Cowell was superimposed on screen over a stage singer so that the judge appeared to be serenading himself, will offer a splashy advertisement for a technology that is democratizing with surprising speed.
Some critics, however, are horrified by this celebratory moment on a widely watched television program. They say fake videos further blur a line between fiction and reality that is already barely visible. If disinformation peddlers can do so much damage with manipulated words and images, imagine what they could do with full-length fake video.
“We are rapidly entering a world where everything, even video, can be manipulated by almost anyone,” said Hany Farid, a professor at the University of California at Berkeley and an expert on deepfakes. “What could go wrong?”
The unveiling, on what has been the most-watched show on network television for most weeks this summer, comes at the end of a hectic season in the world of deepfakes, which use deep-learning artificial intelligence to create fake media (adherents prefer the terms “synthetic media” or “AI-generated”).
While many Americans happily engaged in quaint analog activities like going to the beach, a startup called Midjourney offered “AI art generation,” in which anyone with a basic graphics card could create stunningly lifelike images with just a few keystrokes. To spend even a few minutes with it (there’s Gordon Ramsay fuming in his Hell’s Kitchen; here’s Gandalf playing a guitar) is to experience a technology that makes Photoshop look like Wite-Out. Midjourney has gathered more than a million users on its Discord channel.
Three weeks ago, a startup called Stability AI launched a program called Stable Diffusion. The AI image generator is an open-source program that, unlike some rivals, places few limits on the images people can create, leading critics to say it can be used for scams, political disinformation and violations of privacy.
“We should be worried. I follow technology every day, and I am worried,” said Subbarao Kambhampati, a professor at the Arizona State University School of Computing and AI who has studied deepfakes and virtual identities. He said he expects the “AGT” moment to make platforms like these take off even more, even as the technology improves by the day.
“It’s moving so fast that soon anyone will be able to create a moon landing that looks real,” he said.
Ume and Graham say that deception is not their goal. Ume stresses the entertainment value: The company will market itself to Hollywood studios that want to feature deceased actors in movies (with permission from an estate) or have performers act opposite younger versions of themselves.
As for ordinary users, Ume says Metaphysic’s goal is to make online interactions feel more real, without the quirkiness of video games or the drudgery of Zoom. “I imagine being able to have breakfast with my grandparents in Belgium from here in Bangkok and feel like I’m really there,” Ume said from his current base.
Graham adds that synthetic media, far from harming privacy, will enhance it. “I would like to see a world where online communication is a more human experience that humans own and control,” said Graham, a Harvard-educated lawyer who founded a digital graphics company before turning to cryptocurrencies and, eventually, to deepfakes. “I don’t think that’s going to happen in today’s Web2 world.”
Farid is not convinced. “They’re only telling half the story, that you use your own image,” he said. “The other side is someone else using it to defraud, spread misinformation and disrupt society. And you have to ask yourself if it’s worth it to be able to move around a little more on Zoom.”
Deepfake technology began eight years ago with the invention of “generative adversarial networks.” Created by computer scientist Ian Goodfellow, the technique essentially pits two AIs against each other: one generates images while the other tries to spot fakes, each improving in response to the other. The results were far superior to those of basic machine-learning techniques. Goodfellow would go on to work for Google, Apple and now DeepMind, a Google subsidiary.
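The adversarial idea can be illustrated with a toy sketch (a minimal, hypothetical example for illustration, not Metaphysic’s or Goodfellow’s actual code): a one-parameter “generator” learns to shift random noise until a simple logistic-regression “discriminator” can no longer tell its samples apart from “real” data.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Real" data: samples from a normal distribution centered at 4.0.
def real_batch(n):
    return rng.normal(4.0, 0.5, n)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

g_shift = 0.0        # generator parameter: a learnable shift added to noise
d_w, d_b = 0.1, 0.0  # discriminator: D(x) = sigmoid(d_w * x + d_b)
lr = 0.05

for step in range(2000):
    real = real_batch(32)
    fake = rng.normal(0.0, 0.5, 32) + g_shift

    # Discriminator update: push D(real) toward 1 and D(fake) toward 0.
    for batch, label in ((real, 1.0), (fake, 0.0)):
        p = sigmoid(d_w * batch + d_b)
        grad = p - label                   # cross-entropy gradient w.r.t. the logit
        d_w -= lr * np.mean(grad * batch)
        d_b -= lr * np.mean(grad)

    # Generator update: adjust g_shift so the discriminator is fooled
    # into scoring fake samples as real (D(fake) -> 1).
    fake = rng.normal(0.0, 0.5, 32) + g_shift
    p = sigmoid(d_w * fake + d_b)
    g_shift -= lr * np.mean((p - 1.0) * d_w)

print(f"generator learned shift: {g_shift:.2f}")  # should settle near the real mean, 4.0
```

The two players chase each other to an equilibrium: the generator’s output drifts toward the real distribution precisely because that is the only place the discriminator can no longer find a tell. Real deepfake systems replace the single scalar parameter with deep neural networks generating entire images, but the competitive training loop is the same.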
At first, deepfakes were mainly used by savvy bad actors, who infamously grafted the faces of actresses onto pornographic videos. But as the technology has come to require fewer skills and tools, ordinary people can now deploy it for a variety of uses, a shift Metaphysic hopes to accelerate.
Earlier this year, the company attracted a $7.5 million investment from the likes of the Winklevoss twins, the social media entrepreneurs turned cryptocurrency investors, and Section 32, the venture capital fund of Google Ventures’ original founder, Bill Maris. “We think the impact will be far-reaching,” Andy Harrison, managing partner of Section 32, said of Metaphysic. Harrison, also a Google veteran, said he viewed video deepfakes not as a threat but as an exciting shift in consumption and communication.
“Frankly, I’m very excited,” he said. “I think it’s a new era in entertainment and social interaction.”
Critics, however, worry about the “liar’s dividend,” in which a web awash in fake videos muddies the waters even for legitimate videos, leading people to believe nothing at all.
“Video has been the last frontier of online verification. And now it might as well be gone,” Farid said. He cited the galvanizing power of the George Floyd video in 2020 as something that would be unlikely in a world awash in fake video.
When asked about “AGT’s” role in promoting deepfakes, a spokesperson for production company Fremantle declined to comment for this story. But a person close to the show, who spoke on the condition of anonymity because he was legally barred from commenting on an ongoing competition, said he believed there was a social utility to the Metaphysic act. “By using the innovation in a completely transparent way,” the person said, “they are showing a mainstream audience how this technology can work.”
A solution to the truth problem could come in the form of authentication. A cross-industry effort involving Adobe, Microsoft and Intel aims to verify and make transparent the creator of each video, to assure people it is real. But it is not clear how widely it would be adopted.
Kambhampati, the ASU researcher, said he fears the world will end up in one of two places: “Either nobody trusts anything they see anymore, or we need an elaborate authentication system for them to do so.”
“I hope it’s the second,” he said, then added, “not that that sounds so good, either.”