‘House of David’ Creator Explains Using AI to Make Goliath Origin Sequence

The newly released episode 6 of Amazon’s “House of David” opens with a mythical origin sequence for the Goliath character, made in an inventive way by incorporating generative AI tools into the production workflow.
“The entire sequence is driven by generative AI tools as the horsepower of the scene. What we have found is that these tools work in combination with traditional tools,” series creator and co-showrunner Jon Erwin tells Variety. He says the sequence was originally much smaller in the script, but the filmmakers began to envision more ambitious visuals and to incorporate AI applications. “We had permission to use the technology, we had formed a team around it, and [thought] let’s really go for it.”
In total, season one contains 72 shots made with AI, and it was a learning process for the filmmakers. “People can certainly make cool images and use these tools as consumers, but to really use them in professional ways, it is really how you stack tools together that matters,” Erwin says. In the case of ‘House of David,’ the team assembled at his indie studio Wonder Project used, among other AI tools, Midjourney for image generation; Magnific and Topaz for upscaling and adding detail; and Runway and Kling for video generation, combined with traditional tools such as Unreal Engine, Nuke, Adobe Photoshop and After Effects.
“They all have unique strengths,” Erwin says of the AI tools, citing Runway’s image-to-video application as an example. “Because I don’t want to generate images all over again. I want to expand assets from my show that I have already made. So I can start with images of my show or things we have made – things we own.”
Speaking of the ‘House of David’ sequence, he says the combination of AI tools “enabled us to make these shots photoreal and tell the story, but on a budget and in a time frame that we could afford.” He declined to share the budget, but said that without AI the sequence would have required shooting in a desert and/or high-end VFX, both of which would have been “just outside the budget parameters of the show.”
Speed was also a consideration for the sequence. “You can dream in real time and collaborate much faster on the material. And so that enabled us to get through this scene in a few weeks.” He suggests that the sequence might otherwise have taken “four or five months in a traditional process.”
The origin story was an ambitious undertaking, though it also served as an opportunity to experiment. “The sequence in episode 6 is really a combination of all the hard things in VFX,” says Erwin, referring to photoreal digital characters and simulations such as rain, smoke and wind. “The angel’s wings have feathers on them. Just the asset build alone would have taken forever,” he adds. “There are a lot of hard things in one scene. Character consistency and consistency of the environment were really the trick, and the hard part of the sequence is how you make all these shots match, and how you make the sequence feel human and organic.”
“I was blown away that many of these AI tools do physics simulations – water, rain, atmosphere, smoke, wind – much better than any other VFX tool that I have ever used. And so I was blown away by what we could achieve.”