Virtual Production using AMD Adaptive Computing Technology


 

Article co-authored with Olivier Antoine, Product Manager, DELTACAST

 

Virtual production has transformed the way filmmakers, game developers, and visual effects artists create immersive content. It blends physical and digital environments in real time, offering significant benefits like cost savings, enhanced creative control, and streamlined workflows. Recent cinema and studio productions have demonstrated its potential, setting a new industry standard by seamlessly integrating live action and CGI and opening unprecedented creative possibilities. The technology is now being adopted by smaller studios and news outlets to give the impression of being on location.

 

A significant technological advancement driving transformation in virtual production is the use of Field Programmable Gate Arrays (FPGAs) and adaptive system-on-chip (SoC) devices. AMD Kintex™ UltraScale+™ and Virtex™ UltraScale+ FPGAs are integrated circuits that can be programmed after manufacturing to perform a wide range of functions, offering a customizable hardware framework built on programmable logic. This adaptability allows developers to tailor the FPGA’s configuration to specific tasks, optimizing performance and efficiency.

 


 

AMD Zynq™ UltraScale+ adaptive SoCs and Versal™ adaptive SoCs combine a multiprocessor with FPGA programmable logic and other functions, offering a combination of hardware and software reconfigurability. The programmable nature of AMD FPGAs and adaptive SoCs makes them ideal for applications and systems requiring real-time video processing and low-latency performance. These versatile and powerful devices enhance virtual production processes, seamlessly integrating complex visual effects in a wide range of equipment, including cinema cameras, camera trackers, LED walls, content generation systems, and monitoring solutions.

 

Benefits of FPGAs and Adaptive SoCs in Virtual Production Equipment

The use of adaptive computing technology in virtual production equipment addresses several challenges that must be overcome to make these systems work:

  • Enhanced Performance and Efficiency: FPGAs and adaptive SoCs excel at managing the high data throughput required for real-time video processing and transfer at HD, 4K, 8K and beyond.
  • Low Latency: Latency is a critical factor in virtual production, especially when integrating live-action footage with digital elements. Any delay can disrupt the seamless blending of these components, leading to a disjointed visual experience. Adaptive computing technology can process data with minimal latency, ensuring virtual production systems operate in real time whilst maintaining synchronization between different elements. This low-latency performance is crucial for applications like virtual cinematography, where directors need to see immediate results to make creative decisions on the fly.
  • Scalability and Adaptability: Adaptive computing technology allows for easy updates and adjustments, ensuring that the equipment can evolve with technological advancements during research and development, and even on-set, as well as scaling to support larger systems and pixel canvas sizes.
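As a rough illustration of the latency constraints above, the sketch below computes the frame period at common production frame rates; any camera-to-display processing chain must fit within a fraction of one such period to keep the background in step with camera movement. The frame rates are generic examples, not figures from specific productions.

```python
# Frame-period budgets at common virtual production frame rates.
# A processing pipeline adding even one full frame of delay can make
# the LED wall background visibly lag the physical camera move.

def frame_period_ms(fps: float) -> float:
    """Duration of one frame, in milliseconds."""
    return 1000.0 / fps

for fps in (24, 30, 60):
    print(f"{fps} fps -> {frame_period_ms(fps):.2f} ms per frame")
    # 24 fps -> 41.67 ms, 30 fps -> 33.33 ms, 60 fps -> 16.67 ms
```

At 60 fps the entire capture-track-render-display loop has under 17 ms per frame, which is why hardware pipelines with deterministic latency are favored over general-purpose software processing.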

 

Enhancing Cinema Cameras and Camera Trackers

In cinema cameras, FPGAs and adaptive SoCs enable advanced image processing capabilities. They handle high-resolution image processing and sensor data management with minimal latency, and support HDR processing, ensuring that the captured footage maintains high quality under varying lighting conditions. They also provide real-time video encoding and decoding, giving filmmakers the ability to capture and monitor images remotely with exceptional clarity and speed.

Adaptive computing also plays a part in camera trackers, enabling precise and fast data processing, essential for integrating live-action footage with virtual elements seamlessly. These devices provide the computational power required for real-time tracking and stabilization, ensuring smooth and accurate motion capture. When integrated with AI, FPGAs and adaptive SoCs enhance object recognition and scene analysis, allowing for more precise tracking and automated adjustments during filming.

 

Powering LED Walls

LED walls have become a staple in virtual production, providing dynamic, interactive backgrounds for live-action shooting and forming the core of immersive virtual environments. FPGAs excel at driving these high-resolution displays, from LED wall controllers and pixel processors through to pixel control in the LED tiles, offering real-time pixel processing, color correction, and synchronization. Their parallel processing capabilities ensure smooth performance, even with the demanding graphical content required for immersive environments. Adaptive computing technology also enables dynamic adjustment of the display to match the virtual scene’s lighting and perspective, which is crucial for maintaining the illusion of depth and realism in virtual sets.
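To put the pixel-throughput demands of an LED volume in perspective, here is a back-of-envelope estimate. All of the figures (wall dimensions, pixel pitch, bit depth, refresh rate) are hypothetical, chosen only to illustrate the scale involved.

```python
# Uncompressed data rate needed to drive a hypothetical LED volume:
# a 20 m x 6 m wall at 2.6 mm pixel pitch, 10-bit RGB (30 bpp), 60 Hz.

def wall_data_rate_gbps(width_px: int, height_px: int,
                        bits_per_pixel: int, refresh_hz: int) -> float:
    """Raw pixel data rate in gigabits per second."""
    return width_px * height_px * bits_per_pixel * refresh_hz / 1e9

width_px = round(20.0 / 0.0026)   # ~7692 pixels across
height_px = round(6.0 / 0.0026)   # ~2308 pixels high
rate = wall_data_rate_gbps(width_px, height_px, 30, 60)
print(f"{width_px} x {height_px} px -> ~{rate:.0f} Gb/s uncompressed")
```

The result, roughly 32 Gb/s of raw pixel data, is far beyond a single video link, which is why controller-to-tile distribution is segmented across many parallel paths and why FPGA pixel pipelines suit the task.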

 

Monitoring Solutions with Low Latency

In monitoring solutions, FPGAs provide the high-speed data processing needed to handle multiple video streams with minimal delay. This low-latency performance is crucial for maintaining synchronization between different elements in a virtual production setup. Equipment makers can utilize FPGAs to develop monitoring systems that offer real-time feedback, enabling directors, production teams, and technical crews to make immediate adjustments and ensure the quality of the production.

 

The Importance of AV Interface Cards in Virtual Production

AV interface cards are a vital ingredient in virtual production processing systems. To create magic with content, you first need to get it in and out of your system! These cards, based on standards like ST 2110 and IPMX for AV over IP, as well as SDI and HDMI™, require high-speed data handling and low-latency processing. FPGAs provide the performance needed to handle real-time data transfer and synchronization efficiently, ensuring that high-quality audio and video signals are seamlessly integrated into the virtual production workflow. They also help ensure compatibility and interoperability across various devices and formats, maintaining the integrity and quality of the content being produced and facilitating a smooth, efficient production process.

 


DELTACAST is a leader in the design, development and manufacture of live video transport and processing solutions for OEMs and developers. Its solutions deliver high quality and the low latency needed to serve demanding applications in TV broadcasting, pro AV, medical, aerospace and many other markets.

 

For live video applications, the company offers a line of SDI, HDMI, DisplayPort™, and IP ST 2110 video cards. The line is now fully compatible with Unreal Engine 5, allowing media server users to seamlessly merge Unreal Engine's real-time rendering capabilities with live video inputs captured by DELTACAST cards, and to send out the live 3D rendering as an SDI, HDMI, DP or IP video signal. This capability is particularly useful in LED wall-based virtual production studios.

 

To design virtual sets, the industry widely uses game engines for the live rendering of stunning photorealistic content projected onto the LED wall. The 3D environment must be calculated in real time based on the on-set cameras’ tracking information and projected onto the LED display with the lowest latency possible, to keep the background consistent with the camera movements.

 

Moving video in and out of game engines like Unreal Engine with minimal latency is the specialty of DELTACAST I/O cards and the FLEX solution. All SDI, HDMI and DisplayPort card models can be used in Unreal Engine thanks to the free DELTACAST Media Plugin.

 


 

ST 2110 (and IPMX, which is based on the same protocols) is growing in popularity in virtual production, and especially in LED wall controllers, because it offers the scalability to support large-format pixel canvases as a backdrop and interoperability between different AV and broadcast technologies. ST 2110 is a family of open standards defined by SMPTE to provide an IP streaming alternative to the historical SDI (Serial Digital Interface) connectivity used in broadcast infrastructures. Using IP infrastructure for content production, where optimal picture quality is essential, is now a technical reality thanks to increasing network bandwidths and computer processing power. Nowadays, 10, 25, 100, and 400Gb/s Ethernet connections effortlessly carry several uncompressed feeds in 1080p, 4K and even 8K. By using visually lossless codecs such as JPEG XS and High-Throughput JPEG 2000, network link usage can be optimized even further.
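The bandwidth figures above can be sanity-checked with simple arithmetic. The sketch below estimates the active-video payload of an uncompressed ST 2110-20 stream, assuming 10-bit 4:2:2 sampling (about 20 bits per pixel on average) and ignoring packetization overhead; the resolutions and frame rate are common broadcast values used for illustration.

```python
# Approximate active-video payload of uncompressed ST 2110-20 streams,
# assuming 10-bit 4:2:2 (~20 bits per pixel) and ignoring RTP/packet
# overhead and blanking intervals.

def st2110_payload_gbps(width: int, height: int,
                        bits_per_pixel: float, fps: float) -> float:
    """Active-video payload rate in gigabits per second."""
    return width * height * bits_per_pixel * fps / 1e9

for name, w, h in (("1080p", 1920, 1080),
                   ("4K UHD", 3840, 2160),
                   ("8K UHD", 7680, 4320)):
    print(f"{name}: ~{st2110_payload_gbps(w, h, 20, 60):.1f} Gb/s at 60 fps")
```

At roughly 2.5 Gb/s per uncompressed 1080p60 stream, a 25GbE link has headroom for several HD feeds, while 8K60 at around 40 Gb/s is what motivates the 100/400Gb/s links and visually lossless codecs discussed above.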

 

DELTACAST has introduced models dedicated to IP video streaming. The DELTA-ip-ST2110 10 and DELTA-ip-ST2110 01 are dual 10GbE interface cards that support SMPTE ST 2110 video, audio and ancillary data streams in reception and transmission. Recognizing their interoperability, DELTACAST ST2110 cards successfully obtained the “Self-Tested in Accordance with JT-NM Test Plan for SMPTE ST 2110” badge. The DELTACAST portfolio offers a wide variety of I/O cards with multiple combinations of inputs and outputs, so you can choose the right model depending on your setup.

 

DELTACAST has been using AMD FPGAs since the company was founded in 1986. Today, the company uses a wide array of AMD FPGAs and adaptive SoCs in its video I/O PCIe boards and camera control boards. AMD technology enables DELTACAST video cards to deliver rapid video transport and processing for demanding workloads. This includes media servers merging Unreal Engine’s real-time rendering capabilities with live video inputs captured by DELTACAST cards.

 

AI-powered FPGAs and Adaptive SoCs in Virtual Production

The use of AMD adaptive computing technology in virtual production is continuously evolving, with ongoing research and development driving new innovations. One exciting area is the integration of artificial intelligence (AI) and machine learning algorithms into FPGA and adaptive SoC architectures, capturing and processing data right at the edge, on set, to minimize latency. This combination can revolutionize virtual production by enabling intelligent processing of visual data in real time. Scene analysis and object detection can speed up workflows, automating setup and getting shots done faster. FPGAs could be used to deploy AI that automatically adjusts lighting and textures based on scene context, further streamlining the production process and enhancing visual quality.

 

Pushing the Boundaries of Creativity

For equipment makers in the professional AV and broadcast industry, understanding the value of FPGAs and adaptive SoCs in virtual production is essential as they offer customizable, high-performance, and low-latency solutions. As the demand for sophisticated and immersive digital content continues to grow, leveraging adaptable hardware platforms will be key to pushing the boundaries of creativity and technical excellence in virtual production. By integrating FPGAs and adaptive SoCs into their equipment, manufacturers can provide production teams with the tools they need to create visually stunning and seamless virtual experiences.

The ability to enhance performance, reduce costs, and adapt to new demands makes FPGAs and adaptive SoCs an asset in advancing virtual production capabilities. As the industry continues to innovate, their role will only grow, driving the future of storytelling and content creation. The journey into the future of virtual production is just beginning, and with adaptive computing, the possibilities are limitless.

 

 


About DELTACAST 

Born from the TV broadcast industry, DELTACAST is a leader in the design, development and manufacturing of live video transport and processing solutions for OEMs and developers. Our solutions deliver the highest quality and the lowest latency to serve the most demanding applications in TV broadcasting, ProAV industry, medtech, aerospace and many other markets.

Website: https://www.deltacast.tv