Volume 1: The forefront of virtual production:
A focus on large LED studios and the screen process
- From the dawn of virtual production to the present
- Virtual production systems introduced one after another in large movie studios
- The screen process that became popular before in-camera VFX
- Main types of virtual production
- High-definition 360° camera takes the screen process to the next level
- Next time: in-camera VFX, the main road of virtual production
From the dawn of virtual production to the present
PRONEWS Japan’s special feature “VPFG” was first released on March 4, 2021 in Japanese, so more than two years have passed. It’s now time for the Global Team to cover the burgeoning world of virtual production as well. Green screen virtual production has been around for ages, but a new shooting method, in-camera VFX using LED walls, has begun to attract attention.
However, it was not something that could be adopted easily, partly because of the LEDs and high-definition panels it requires. For that reason, most studios started experimenting with small systems in 2021. The article in Vol.01 was “The Next-Generation Shooting Style Seen in CyberAgent’s Virtual LED Studio”.
Starting with the opening of “Hibino VFX Studio” in June 2021, permanent studios large enough to handle regular production shoots have emerged, such as Sony PCL’s “Kiyosumi Shirakawa BASE”, which we covered previously. There are also more and more cases of LED walls being assembled in a studio temporarily for a specific shoot.
Two major virtual production studios representing Japan
Virtual production systems introduced one after another in large movie studios
In 2022, “Studio PX SEIJO” was set up at Toho’s Stage 11 jointly by five companies: Tohokushinsha, Dentsu Creative Cube, Dentsu Creative X, HIBINO, and Omnibus Japan. A virtual production studio system by Sony PCL was also installed at Kadokawa Daiei Studio’s “Stage C”.
At the beginning of 2023, the Toei Zukun Research Institute set up a studio at the Tokyo Studio in Toei Oizumi Gakuen, where a huge 30m-long LED wall is installed in a 270° arc.
Nikkatsu Chofu Studio’s Stage 4 already has a green screen virtual production facility called “Virtual Line Studio,” operated by Nikkatsu, Digital Frontier, and AOI TYO Holdings, which started operations in October 2020. Thus, by early 2023, all of Japan’s major film studios will have virtual production systems in place.
With so many studios introducing virtual production systems, more and more TV commercials, dramas, and movies are being shot this way. The goal, after all, is to make the result look as if it had been shot on location. Or almost.
The screen process that became popular before in-camera VFX
As mentioned above, large-scale LED studios have become popular over the last year, but most of the footage shot in them used not in-camera VFX, which can be called the true value of virtual production, but the screen process. Here I will explain the differences between in-camera VFX, the screen process, and match move.
Main types of virtual production
Current virtual production methods can be divided into four categories: by display technology (LED wall on top, green screen on the bottom), and by whether the background is rendered in real time using a camera tracker or prepared conventionally (left and right, respectively):

(1) LED wall × real-time rendering (in-camera VFX)
(2) LED wall × pre-shot or pre-rendered footage (screen process)
(3) Green screen × real-time rendering
(4) Green screen × conventional compositing
Strictly speaking, only the real-time methods (1) and (3) are virtual production. In practice, however, footage shot in these studios without real-time rendering is often also described as having been shot with virtual production.
The advantages common to all of these are: no restrictions from location weather or time of day; reduced travel and related costs; the ability to shoot places that would be impossible in reality; and the ability to concentrate on shooting in a comfortable environment. These can be said of VFX shooting in general. A common disadvantage is that pursuing realism requires a certain amount of experience from the filming staff.
Now, let’s start with the most familiar method, (4): green screen compositing (blue and other colors are sometimes used, but I’ll write “green” for convenience).
As you know, this method has been used since around 1940 and is still mainstream today, so I won’t go into detail, but the basic requirement is that the perspective and movement of the background plate match those of the camera footage shot against the green screen. In that sense a locked-off camera is the baseline, but camera movement can also be handled with a technique called match move.
In match move, several easy-to-erase reference points called markers are attached to the green screen; when compositing, the background image is moved according to their motion in the captured footage, and CG is produced based on the reconstructed camera movement. Many 3DCG applications can now analyze the camera’s movement from the footage and composite accordingly, which reduces the labor considerably. Nonetheless, it is undeniable that the weight of post-production remains quite high.
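The idea can be sketched in a few lines. The following is a deliberately simplified 2D toy (my own illustration, not any tracker’s actual algorithm; real match move solves a full 3D camera): one marker on the green screen is tracked per frame, and the background plate is translated by the same motion so it stays “attached” to the camera move.

```python
# Toy 2D match move: follow one tracked marker, shift the background
# plate by the same per-frame motion, then composite the keyed subject.
import numpy as np

def solve_offsets(marker_track, reference_frame=0):
    """Per-frame background offsets from one tracked 2D marker."""
    track = np.asarray(marker_track, dtype=float)   # shape (frames, 2)
    return track - track[reference_frame]           # (dx, dy) per frame

def composite_frame(foreground, background, offset):
    """Shift the background by the solved offset, then lay the keyed
    foreground on top (the alpha channel marks where the subject is)."""
    dx, dy = np.round(offset).astype(int)
    shifted = np.roll(background, shift=(dy, dx), axis=(0, 1))
    rgb, alpha = foreground[..., :3], foreground[..., 3:4]
    return (alpha * rgb + (1 - alpha) * shifted).astype(background.dtype)

# A slow camera pan: the marker drifts 2 px per frame, so the
# background must move by the same amount to stay locked to it.
track = [(100, 50), (102, 50), (104, 50)]
offsets = solve_offsets(track)
print(offsets[2])   # [4. 0.]
```

A real solver triangulates many markers into a 3D camera path, but the principle is the same: background motion is derived from on-set marker motion.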
The screen process of (2) is also an old method. Counting film-based rear projection, it has been around even longer than the green screen. It has recently been drawing attention again because high-quality LEDs increase realism and allow more freedom in lighting.
The advantages are that the image captured by the camera is the final image, which makes communication between staff on set much easier, and that the effects of light, reflection, and transmission on the subject are realistic.
For this reason, it is tremendously effective in sizzle shots and car scenes, where green screens have been avoided until now. For car scenes in particular, the payoff is striking: light transmitted through the windows, reflections on the body and in the mirrors, and the changing light on the occupants as the car drives.
High-definition 360° camera takes the screen process to the next level
However, this method has the condition that the background material must be shot at the same angle as the final shooting angle (it is safer to shoot slightly wider than the actual angle). This applies to both LEDs and green screens, but it is changing with the advent of high-resolution 360° cameras. With the Insta360 Titan (11K) and Obsidian Pro (12K), it is now possible to record high-resolution 360° spherical footage.
In the LED studio, the direction of the projected material can be changed to follow the camera angle, so the camera can be panned quite freely. The camera’s height and lateral position are fixed by the source material, so freedom is limited in those respects, but for moving scenes such as driving shots, a slight mismatch is not a problem.
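Why re-aiming is so cheap with a 360° plate: in the equirectangular (lat-long) layout that spherical cameras typically record, longitude maps linearly to the horizontal pixel axis, so a pure pan of the virtual camera is just a horizontal pixel shift of the plate. A minimal sketch, under that assumed layout (tilting or moving the camera would need real reprojection, not shown here):

```python
# Panning a virtual camera inside an equirectangular 360° plate:
# yaw in degrees maps linearly to a horizontal pixel shift.
import numpy as np

def pan_equirect(panorama, yaw_deg):
    """Re-center an equirectangular image on a new yaw angle."""
    height, width = panorama.shape[:2]
    shift = int(round(yaw_deg / 360.0 * width))  # degrees -> pixels
    return np.roll(panorama, -shift, axis=1)     # pan right = content moves left

# Tiny 360-px-wide stand-in for an 11K plate: mark longitude 0, pan 90°.
plate = np.zeros((8, 360, 3), dtype=np.uint8)
plate[:, 0] = 255                                # marker at longitude 0
panned = pan_equirect(plate, 90)
print(np.argmax(panned[0, :, 0]))                # 270
```

This is why an LED operator can swing the projected background to match the camera in real time: no re-shoot and no re-render, just a re-centering of the same spherical data.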
What remains a limiting factor is the weather at the time the background material is shot. This cannot be adjusted afterwards, so the scene has to be matched to the material as shot. Still, compared to the green screen of (4), in the case of (2) the LEDs themselves emit light: the same 360° image is shown on the ceiling panel and on a foreground LED panel for reflections, so natural light emerges from a far more realistic environment. Since it is difficult for LED panels alone to reproduce the strong light of a point source, separate lights are still prepared, but in terms of natural compositing, realism is reached faster than with a green screen.
What’s more, being able to share something close to the finished image with all the staff on set is irreplaceable. With a green screen, a rough composite is sometimes previewed in the studio, but the real work still happens in post-production, often with only the editor and director present. Seeing a near-final image on set directly leads to higher quality.
In short, the background material must be shot in advance, and in the case of CG it must be completed before the shooting stage. As with the LED in-camera VFX of (1), the more time spent up front, the less compositing work remains and the shorter post-production becomes.
There is the disadvantage that the on-set lighting is inevitably constrained to match the background material prepared in advance, but because the workflow is an application of conventional shooting methods, it is easy for crews to get used to. It makes sense that much of today’s virtual production uses the screen process.
Next time: in-camera VFX, the main road of virtual production
Now, as for the main subject, in-camera VFX with real-time rendering, it will take more space than the screen process, so I will carry it over to next time.
There, I will also touch on Unreal Engine, which is revolutionizing today’s video industry, so please look forward to the next installment.