Powerful software and talented visual effects artists are usually needed to convincingly pull off the effect where an actor filmed against a green screen is inserted into another shot. But researchers at Netflix say they have come up with a much easier, and potentially more accurate, green screen technique that takes advantage of how digital cameras work—with a dash of AI.
Replacing the background of footage where actors have been filmed against a brightly colored backdrop is a technique that’s been used by Hollywood, and even your local news, for decades, as it’s a much faster alternative to having an artist painstakingly rotoscope an actor out of a clip by hand. Typically, either bright green or bright blue backdrops are used, as those are colors that don’t naturally appear in human skin tones, so when the keying software removes the backdrop, the actors don’t end up getting automatically removed, too.
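At its simplest, that keying step just measures how close each pixel’s color is to the backdrop color and makes matching pixels transparent. The sketch below is a minimal, generic illustration of that idea, not Netflix’s technique or any studio’s actual pipeline; the key color and threshold values are illustrative assumptions.

```python
# Minimal sketch of naive chroma keying: pixels whose color sits close to the
# backdrop green are treated as background and made transparent. The key color
# and threshold below are illustrative assumptions, not production values.
import numpy as np

def chroma_key(frame: np.ndarray, key_color=(0, 177, 64), threshold=80.0) -> np.ndarray:
    """Return an RGBA image where pixels near `key_color` are fully transparent.

    frame: (H, W, 3) uint8 RGB image filmed against a green backdrop.
    """
    rgb = frame.astype(np.float32)
    key = np.array(key_color, dtype=np.float32)

    # Distance of every pixel from the backdrop color in RGB space.
    distance = np.linalg.norm(rgb - key, axis=-1)

    # Pixels closer than the threshold count as background (alpha 0),
    # everything else is kept as foreground (alpha 255).
    alpha = np.where(distance < threshold, 0, 255).astype(np.uint8)

    return np.dstack([frame, alpha])

# Tiny synthetic example: one backdrop-green pixel next to a skin-tone pixel.
frame = np.array([[[0, 177, 64], [210, 160, 140]],
                  [[5, 180, 60], [200, 150, 130]]], dtype=np.uint8)
print(chroma_key(frame)[..., 3])  # alpha channel: 0 where the backdrop was detected
```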
The typical green screen technique does come with some challenges, however. The on-screen talent can’t wear makeup or clothing that matches the color of the backdrop, and often additional cleanup is needed in post-production to remove a brightly colored glow around the talent. Extracting fine details, like wispy hair, from green screen footage can also be a challenge, as can dealing with transparent objects like windows, which show the green backdrop through them but shouldn’t disappear from the shot entirely.
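That green glow, often called spill, is usually toned down with a dedicated cleanup pass. A common generic trick, sketched below purely for illustration and not tied to any particular studio’s tools, is to clamp the green channel so it never exceeds the average of red and blue wherever green dominates.

```python
# Small sketch of one common spill-suppression trick: wherever green dominates
# both red and blue, assume it's reflected spill and pull it back toward the
# red/blue average. A generic illustration, not Netflix's cleanup pass.
import numpy as np

def suppress_green_spill(frame: np.ndarray) -> np.ndarray:
    """frame: (H, W, 3) uint8 RGB image; returns a copy with green spill reduced."""
    rgb = frame.astype(np.float32)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]

    # Limit the green channel to the average of red and blue.
    limit = (r + b) / 2.0
    rgb[..., 1] = np.minimum(g, limit)

    return rgb.clip(0, 255).astype(np.uint8)
```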
More time spent in post-production fixing these kinds of issues also increases the cost of completing a shot, so it’s easy to see why a company like Netflix would want to find ways to make the green screen process easier and faster. But the alternative technique the company’s researchers have come up with almost seems counterintuitive when you see what the captured green screen footage actually looks like.