- Introduction
- High-quality 2D data generation and real-time live performance
- Reproducing idom’s performance in VR space
- Volumetric Capture x Virtual Production
- Interview with Toru Masuda, Sony Group R&D Center
Introduction
Volumetric capture studios in Japan can be roughly divided into those that operate filming systems developed in-house and those that use systems developed by other companies. With in-house development, the development team and the studio can stay closely connected, both organizationally and physically, so frequent improvements to the performance and functionality of the capture system can be expected. Sony PCL puts this into practice at the volumetric capture studio inside Kiyosumi Shirakawa BASE.
The studio opened in July 2022, and even in the short time since then it has produced a large number of shoots; impressively, technical improvements and updates have been made with every project. About half a year after the opening, I had the opportunity to visit and speak with Mr. Yasushi Ikeda and Mr. Toru Masuda of the Sony Group R&D Center, and I would like to share what I learned.
High-quality 2D data generation and real-time live performance
In January 2020, Sony set up a studio inside its Shinagawa headquarters and has been verifying its value across a variety of use cases. To push this research further, a studio was also opened at Sony PCL’s Kiyosumi Shirakawa BASE.
The volumetric capture studio enables high-definition capture of people and uses a proprietary algorithm for 2D rendering, delivering image quality suitable for commercial use. Live streaming of volumetric video has also been achieved at Sony’s in-house studio: the live broadcast of Ikimonogakari (“Ikimonogakari Volumetric LIVE ~Ikiru~”) on August 2, 2020 was delivered at high quality despite being shot in real time. At Kiyosumi Shirakawa BASE, the virtual production and volumetric capture studios sit side by side, making it easy to use volumetric video data in combination with virtual production and other content.
In fact, in March 2020 I was also given the opportunity to visit Sony’s in-house volumetric capture studio. There, Kabuki actor Ainosuke Kataoka’s lion dance was captured volumetrically and used in “Reverse Reality ~ KABUKI Performance ‘Shakkyo’ ~”, an AR (Augmented Reality) application for smartphones and tablets from Shochiku Co., Ltd. The app let customers who could not come to the theater because of the coronavirus experience Kabuki easily.
Reproducing idom’s performance in VR space
In this roughly five-minute piece of content, users wearing a head-mounted display can visit a Minato Mirai recreated in a virtual world, look around 360 degrees in every direction, and watch idom’s performance in 3D. Because of VRChat’s system limitations there was an upper limit on the amount of data that could be used (600MB including the background world), but on the Meta Quest 2 used for the experience, the data was of sufficiently high quality.
Volumetric Capture x Virtual Production
This time, using the volumetric video data of idom, I was able to watch a video being created in collaboration with virtual production. Kiyosumi Shirakawa BASE also houses a permanent virtual production studio equipped with Sony’s high-definition Crystal LED B-series display, Sony’s digital cinema camera “VENICE 2”, and Unreal Engine as the in-camera VFX system. The challenge was to link this virtual production studio with volumetric capture, one of the latest shooting technologies.
In virtual production, the appearance of the 3D CG shown on the LED display behind the subject changes according to the position of the camera. Background CG usually depicts places and buildings, but this time the composition placed a huge volumetric video of idom in the background.
When the camera moves to the left, for example, more of idom’s right side becomes visible in the volumetric video. To link volumetric capture and virtual production in this way, Sony developed a volumetric video rendering plug-in for Unreal Engine (UE).
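To make that camera-position relationship concrete, here is a minimal geometric sketch in Python (my own illustration, not Sony’s plug-in): the subject is approximated as a cylinder of surface points, and a point counts as visible when its outward normal faces the tracked camera. Moving the camera to its left brings more of the subject’s right side into view, which is the behaviour the rendering plug-in has to reproduce on the LED display.

```python
# Minimal sketch (not Sony's plug-in): which side of a volumetric subject
# faces a tracked camera. The subject stands at the origin facing +y, so
# its right side lies at x > 0; the camera looks along -y, so moving the
# camera to its own left means moving toward +x.
import numpy as np

def visible_fraction_right_side(camera_pos, radius=0.5, n=360):
    """Fraction of the subject's right half whose surface normals face the camera."""
    angles = np.linspace(0, 2 * np.pi, n, endpoint=False)
    points = np.stack([radius * np.cos(angles), radius * np.sin(angles)], axis=1)
    normals = points / radius                         # outward normals of the cylinder
    to_camera = camera_pos - points
    to_camera /= np.linalg.norm(to_camera, axis=1, keepdims=True)
    facing = (normals * to_camera).sum(axis=1) > 0    # front-facing test
    right_half = points[:, 0] > 0                     # the subject's right side
    return (facing & right_half).sum() / right_half.sum()

print(visible_fraction_right_side(np.array([0.0, 4.0])))  # camera straight ahead: ~0.46
print(visible_fraction_right_side(np.array([2.0, 4.0])))  # camera moved to its left: ~0.61
```

In the actual studio the same idea plays out in 3D: the tracked position of the cinema camera drives the viewpoint from which the volumetric data is rendered onto the LED wall.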
“Kiyosumi Shirakawa BASE” is not just a studio operating a finished shooting system; it is a base where both creative and technology staff are stationed and where the shooting system and methods of expression are constantly updated. Adding new mechanisms and techniques each time, rather than simply mastering established methods, raises the difficulty of a shoot, but this volumetric capture × virtual production collaboration conveyed the team’s determination to create new images all the same. I look forward to seeing further evolution in the future.
Interview with Toru Masuda, Sony Group R&D Center
We asked Mr. Toru Masuda of the Sony Group R&D Center about the volumetric capture studio at Kiyosumi Shirakawa BASE.
――Please tell us about your approach to volumetrics.
Mr. Masuda:
Volumetric capture is still a developing technology, so we pursue practical application and technical development in parallel, raising the level with each project through steps such as introducing new algorithms and hardware. In 2020, we set up a dedicated volumetric capture studio in the Sony Group headquarters building that can shoot multiple people at once, and we have been shooting and producing music videos, movies, commercials, and other projects there. In 2021 it was used for the movie “Violence Action”, and in July 2022 the volumetric capture studio at “Kiyosumi Shirakawa BASE” started operation.
As for 3D content production, at the end of 2022 we worked on idom’s metaverse live “Yorunoyo [VIEWING-ROW/idom]”. We captured idom singing the song “GLOW” in the studio as 3D video and created content that people around the world could gather and enjoy together.
The most difficult part was VRChat’s limit on data size. For stable operation, VRChat specifies roughly 600MB for everything, including the world, and it was hard to fit photorealistic data into that. Together with the R&D side, we went through trial and error to find ways to keep the textures small while preserving the overall quality.
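As a rough illustration of that kind of budgeting (the frame count, compression figure, and world size below are my own assumptions, not Sony’s actual numbers), a quick back-of-the-envelope calculation shows how texture resolution dominates the total size:

```python
# Back-of-the-envelope sketch with assumed numbers (not Sony's figures):
# checking whether volumetric texture data fits a ~600MB limit that also
# includes the background world.

BUDGET_MB = 600.0

def estimate_total_mb(world_mb: float, frames: int,
                      atlas_resolution: int, bytes_per_pixel: float) -> float:
    """Estimate total size: world assets plus per-frame texture atlases."""
    texture_mb = frames * atlas_resolution ** 2 * bytes_per_pixel / 1e6
    return world_mb + texture_mb

# Assumed: ~5 min at 30fps (9000 frames), ~100MB world,
# heavily compressed textures at ~0.2 bytes per pixel.
for res in (1024, 512, 256):
    total = estimate_total_mb(world_mb=100, frames=9000,
                              atlas_resolution=res, bytes_per_pixel=0.2)
    print(f"{res}px atlas: {total:.0f} MB -> "
          f"{'fits' if total <= BUDGET_MB else 'over budget'}")
```

Under these assumptions the textures dominate the total, which matches the emphasis on keeping the texture data small.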
Now we are developing further content from that metaverse live. We are working on displaying the volumetric data on the virtual production LED panel and integrating it with virtual production.
――Could you give us an overview of the volumetric capture studio?
Mr. Masuda:
The studio is circular, about 8.7m in diameter, and the capture volume is a cylindrical area 6m in diameter and 3m high. Even by Japanese standards, it is a relatively spacious studio.
All of the cameras lined up around the studio are 4K cameras, and they can shoot at a maximum frame rate of 60fps. The lighting is kept a little dim at the moment, but we can make it brighter and raise the shutter speed. The fastest we have used so far is 1/2000 of a second. It could probably go a little higher, but there is a trade-off with image quality. At 1/2000 of a second, even a golf or baseball swing can be captured as a clean shape.
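As a rough illustration of why the shutter speed matters here (the club-head speed below is my own assumption, not a figure from the interview), the distance a fast-moving subject travels during a single exposure can be estimated directly:

```python
# Quick sketch of motion blur versus shutter speed; the club-head speed
# is an assumed typical value, not a figure from the interview.

def motion_blur_mm(subject_speed_m_per_s: float, shutter_s: float) -> float:
    """Distance the subject travels during one exposure, in millimetres."""
    return subject_speed_m_per_s * shutter_s * 1000

CLUB_HEAD_SPEED = 40.0  # m/s, roughly a typical golf swing (assumed)

for shutter in (1 / 60, 1 / 500, 1 / 2000):
    print(f"1/{round(1 / shutter)} s -> "
          f"{motion_blur_mm(CLUB_HEAD_SPEED, shutter):.1f} mm of blur")
```

Shorter exposures mean sharper reconstructed shapes but need more light, which is the trade-off described above.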
A unique feature of our studio is that performers can be suspended from the ceiling. With your feet on the ground, you cannot get that floating feeling of flying through the sky, so we made ceiling suspension possible from the design stage of Kiyosumi Shirakawa BASE.
――How large was the idom data used for virtual production?
Mr. Masuda:
Regarding data size, we have to think about how to frame the shoot given the positional relationship between the camera and the LED panel. We prepared several data patterns and ran experiments to confirm that everything works without problems even in combination with background assets. The experiments confirmed that quality can be maintained and operation remains stable even when the data is reduced to about one third or less of what we would use for normal video production.
――Finally, please tell us about future developments at Kiyosumi Shirakawa BASE.
Mr. Masuda:
Until now, 2D content production has been the main focus, but we believe demand for 3D content will grow. We will keep pursuing photoreal 3D quality, and as a Sony company we intend to push realism to a level that stands up against our competitors. Beyond that, this is a place for collaboration with creators, and we want to keep building an environment that is closer to them.
Originally Written in Japanese
Writer: Takayuki Aoki / Composition: Editorial Department