How Metaphysic’s entry into AGT will impact the entertainment industry

On June 6, 2022, Chris Ume and Thomas Graham, the founders of Metaphysic, appeared on America’s Got Talent alongside singer Daniel Emmett. Using technology developed by Metaphysic, Emmett performed “You’re the Inspiration” live on stage using the likeness of AGT judge Simon Cowell.

The performance stunned the judges and captivated the audience, earning a yes from all four judges. The group is moving on to the next round of AGT, but what does the use of deepfake technology mean for the future of entertainment?

What happened on the AGT stage?

Chris and Thomas were the first to take the stage. Introducing themselves, they said:

“We use artificial intelligence to create hyper-real content.”

They then brought Daniel Emmett onto the stage. When Simon asked how he came to join the duo, Emmett said:

“I am a fan of what they do online and they are AGT fans. And when they asked me to be a part of this wonderful, unique, original thing they were going to do, I couldn’t say no.”

Despite the mystery surrounding the trio’s act, Cowell wished them good luck, and the performance began.

As Emmett prepared to sing, a camera was brought on stage to capture his profile, blocking the audience’s (and the judges’) view. When the music started, the camera feed appeared on the stage screen, showing Simon’s hyper-real likeness superimposed onto Daniel as he sang live on stage.


The crowd went wild when they saw this, but Simon was initially confused. As the performance went on, Cowell moved from confusion to embarrassment to amusement. In the end, he (and the rest of the audience) gave the performance a standing ovation, and the group received a “yes” from all four judges.

How deepfakes affect entertainment

While deepfakes may seem new to most people, they are not, especially in Hollywood. This is likely the first time deepfake technology has been used in a live performance, but movies have been using similar techniques for years.

One popular example is the 2015 film Fast & Furious 7, whose lead actor Paul Walker died in a car accident partway through filming. Instead of killing off his character, the producers shot his remaining scenes using his brothers as body doubles and then used CGI to map Paul’s face onto them.

The technique also appeared in 2016’s Rogue One: A Star Wars Story, where it was used to recreate Grand Moff Tarkin and Princess Leia. Peter Cushing, the actor who played Tarkin, had passed away in 1994, and although Carrie Fisher was still alive during filming, her appearance had changed significantly since 1977.

This film used body doubles and deepfake-like technology in production to digitally recreate the original characters.

And while you might think this is a recent development, you might be surprised to learn that 2000’s Gladiator also used similar technology. When Oliver Reed, one of the main supporting actors, died suddenly during filming, the production digitally recreated his face and used a body double to complete his remaining scenes.

However, this technology goes beyond resurrecting or de-aging celebrities. It was even used in Captain America: The First Avenger to replace Chris Evans’ hulking physique with Leander Deeny’s smaller frame. Although the filmmakers also used various other techniques to make Steve Rogers look scrawny before Howard Stark’s experiment, deepfake-style face replacement was one of their tools.

How deepfakes could revolutionize the entertainment industry

What all the above examples have in common is that the productions spent months and millions of dollars to achieve the desired effect. On the AGT stage, however, Metaphysic showed it could recreate a celebrity’s likeness in high quality in real time.

While the likeness shown on the AGT stage wasn’t 100% realistic, it was close enough to pass if you didn’t look too closely, and crucially, it was generated in real time. As the technology develops, we may soon see real-time deepfake apps that can instantly produce convincing video.

This development could reduce studios’ reliance on motion-tracking suits and extensive post-production work, simplifying their workflows. And while creating seamless deepfake videos isn’t as easy as it sounds, you no longer need to be a multi-million-dollar production company to afford it.


Mass adoption of the technology would allow small studios and even independent filmmakers to afford it and use it in their creations. In fact, anyone with a smartphone can try this technology through face swap apps, although the results are nowhere near as convincing as what appeared on the AGT stage.
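
To give a sense of how basic the consumer-grade version of this idea is, below is a minimal face-swap sketch in Python using OpenCV: it detects the largest face in two photos, resizes one onto the other, and blends it in with seamless cloning. This is only an illustration of the crude cut-and-blend approach such apps build on, not Metaphysic’s real-time system; the file names and the choice of Haar cascades and Poisson blending are assumptions made for the example.

# A minimal, illustrative face-swap sketch (not Metaphysic's method).
# Assumes two local photos, "source.jpg" and "target.jpg", each with one clear frontal face.
import cv2
import numpy as np

# Haar cascade bundled with OpenCV for frontal face detection.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def largest_face(image):
    # Return (x, y, w, h) of the largest detected face, or None if nothing is found.
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return max(faces, key=lambda f: f[2] * f[3]) if len(faces) else None

source = cv2.imread("source.jpg")  # the face to copy
target = cv2.imread("target.jpg")  # the frame whose face gets replaced

src_box, tgt_box = largest_face(source), largest_face(target)
assert src_box is not None and tgt_box is not None, "No face detected in one of the images"
sx, sy, sw, sh = src_box
tx, ty, tw, th = tgt_box

# Resize the source face to the target face's dimensions.
face_patch = cv2.resize(source[sy:sy + sh, sx:sx + sw], (tw, th))

# Blend the patch into the target frame with Poisson (seamless) cloning.
mask = np.full(face_patch.shape[:2], 255, dtype=np.uint8)
center = (tx + tw // 2, ty + th // 2)
swapped = cv2.seamlessClone(face_patch, target, mask, center, cv2.NORMAL_CLONE)

cv2.imwrite("swapped.jpg", swapped)

The result looks nothing like a live, hyper-real deepfake, which is exactly the gap Metaphysic’s real-time models are closing.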

The future of deepfakes in film and television

As deepfake technology advances, we may soon reach a point where studios can easily use the likenesses of past celebrities to bring their newest films to life. This would help maintain continuity, especially in the long-running franchises that producers (and moviegoers) prefer, which often run into trouble as stars age or pass away.

However, this development has another side: the technology can be used to recreate a person’s likeness without their consent. The same kind of video can also be used to sow fear, tension, and confusion, especially if it depicts a political figure and is not recognized as a deepfake.

As much as we love this technology in entertainment, we also have to ensure it is used ethically and never wreaks havoc.
