Full video: https://youtu.be/MoDOZd4HXXA
This summary is based on a transcript of a presentation from Unreal Fest 2024, titled "State of the Union Virtual Production," delivered by Ryan Mayeda. The presentation focuses on updates and advancements in Unreal Engine, particularly in relation to virtual production (VP). Summarised by ChatGPT
Movie Render Queue (MRQ)
Transitioning from Stack-Based Presets to a Graph-Based Paradigm: Unreal Engine 5.4 saw the shift from MRQ's stack-based preset system to a graph-based paradigm. This transition brought features like Collections and modifiers, which give users finer control over isolating and rendering elements within a scene, making it easier to prepare outputs for post-production compositing.
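The relationship between collections and modifiers can be illustrated with a small sketch. This is a conceptual model only, not the actual Movie Render Graph API; all class and method names here are invented for illustration.

```python
# Conceptual sketch of MRQ-style collections and modifiers.
# Names are illustrative only -- not the real Movie Render Graph API.

class Actor:
    def __init__(self, name, tags):
        self.name = name
        self.tags = set(tags)
        self.visible = True

class Collection:
    """Selects actors by tag, analogous to an MRG collection query."""
    def __init__(self, tag):
        self.tag = tag

    def select(self, scene):
        return [a for a in scene if self.tag in a.tags]

class VisibilityModifier:
    """Hides everything outside a collection, isolating a render layer."""
    def __init__(self, collection):
        self.collection = collection

    def apply(self, scene):
        kept = {id(a) for a in self.collection.select(scene)}
        for a in scene:
            a.visible = id(a) in kept

scene = [Actor("Hero", ["characters"]),
         Actor("Car", ["props"]),
         Actor("Sky", ["environment"])]

# Isolate the character layer for a separate render pass:
VisibilityModifier(Collection("characters")).apply(scene)
print([a.name for a in scene if a.visible])  # prints ['Hero']
```

The useful property this models is that the selection (collection) and the action taken on it (modifier) are decoupled, so the same collection can drive visibility, material overrides, or other per-layer changes.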
Support for Various Asset Types in Render Layers: Unreal Engine 5.5 has expanded support for rendering different asset types in separate layers. This includes heterogeneous volumes (e.g. VDB-based simulations), translucent objects (e.g. car windshields), Niagara effects, landscapes, and sky atmospheres.
Enhanced Media Output Options: The integration of ProRes and Avid DNxHR options within the graph, along with improvements to output quality and timecode support, makes Unreal Engine more compatible with post-production workflows. Support for all standard quality options for ProRes and Avid DNxHR provides editors with more choices for their projects.
Simplified Metadata Management: The addition of a metadata node for EXR files simplifies the process of adding custom attributes to rendered images, streamlining pipeline integration and eliminating the need for complex Python scripts.
Improved Console Variable Management: The presentation highlights improvements in console variable management, aiming to make it easier for users to find, set, and understand console variables related to rendering settings.
Full Parity and Production Readiness: The goal for Unreal Engine 5.6 is to achieve full parity with the legacy preset system of MRQ, making the Movie Render Graph officially production-ready. This will involve the integration of high-resolution and panoramic options, as well as the introduction of a "quick render" concept for streamlined rendering workflows.
In-Camera Visual Effects (ICVFX)
Depth of Field Compensation: Unreal Engine 5.4 introduced depth of field compensation, a feature that significantly improves the realism of depth of field effects in virtual production. This feature addresses issues like double blurring and unnatural focus transitions that occurred in previous versions.
SMPTE ST 2110 Inner Frustum Split: Introduced in Unreal Engine 5.4, the ST 2110 inner frustum split enables the rendered inner frustum to be divided into smaller sections, allowing the rendering work to be distributed across multiple machines and potentially boosting performance for large volumes and wide shots.
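The idea behind the split can be sketched as simple region arithmetic. This is a minimal illustration of distributing a frame across render nodes; in practice nDisplay configures this declaratively rather than in code, and the strip layout here is an assumption for the example.

```python
# Minimal sketch of splitting an inner-frustum render across machines
# (illustrative only -- nDisplay handles this via configuration).

def split_region(width, height, num_nodes):
    """Divide a frame into vertical strips, one per render node."""
    base, extra = divmod(width, num_nodes)
    regions, x = [], 0
    for node in range(num_nodes):
        w = base + (1 if node < extra else 0)  # spread any remainder pixels
        regions.append({"node": node, "x": x, "width": w, "height": height})
        x += w
    return regions

# A 4096-wide inner frustum shared across 4 render nodes:
for r in split_region(4096, 2160, 4):
    print(r)
```

Each node renders only its strip, which is why wide shots on large volumes benefit most: the per-node pixel count drops roughly linearly with the number of machines.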
Production-Ready Nvidia Rivermax Support: In Unreal Engine 5.5, the Nvidia Rivermax setup for 2110 has been deemed production-ready, offering users a reliable and efficient solution for their virtual production pipelines. The focus has also shifted towards supporting other hardware partners like Matrox, expanding the range of options for users.
User Experience Refinements: Unreal Engine 5.5 introduces several refinements to enhance the ICVFX user experience. These include a level visibility column in the levels panel for easier management of sub-levels, and per-light card control for positioning light cards over or under the inner frustum.
Content Plugins
The presentation advocates for the use of content plugins to manage reusable assets and workflows across different virtual production projects. By packaging common elements, such as stage setups, camera tracks, and blueprints, into plugins, studios can streamline their workflows and ensure consistency across projects.
iOS Unreal Stage
Unreal Engine 5.5 brings some improvements to the iOS Unreal Stage app, such as a "show only selected" option to manage clutter when dealing with multiple light cards, and the addition of transform controls for colour correction regions.
Plate Playback
A new feature called "Holdout Composite" has been added to improve plate playback in Unreal Engine. This feature uses the holdout system developed for MRQ to "punch a hole" through the virtual scene and play back the plate as is, minimising potential artefacts caused by engine features.
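The compositing operation behind a holdout can be shown per pixel. This is a single-channel sketch of the standard matte equation, not Unreal's implementation; real compositing operates on full images with premultiplied alpha.

```python
# Sketch of a holdout composite: the matte "punches a hole" in the CG
# render so the plate shows through untouched (single channel, per pixel).

def holdout_composite(plate, cg, matte):
    """matte = 1.0 where the plate should show through unmodified."""
    return [p * m + c * (1.0 - m) for p, c, m in zip(plate, cg, matte)]

plate = [0.8, 0.8, 0.8]   # filmed plate pixels
cg    = [0.2, 0.2, 0.2]   # virtual scene pixels
matte = [1.0, 0.5, 0.0]   # holdout coverage
print(holdout_composite(plate, cg, matte))  # [0.8, 0.5, 0.2]
```

Wherever the matte is 1.0 the plate passes through "as is", which is what minimises artefacts: engine-side effects such as tone mapping or temporal accumulation never touch those pixels.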
Future Directions
Live Compositing: Unreal Engine is focusing on improvements in live compositing workflows, particularly in relation to simultaneous camera tools and a modern implementation of compositing features that cater to both basic and augmented reality-style setups.
Enhanced Colour Grading: The colour grading tools introduced in the in-camera VFX editor have been promoted and are now available across the engine. This provides artists with a more intuitive and dedicated interface for colour grading, with improved visibility and management of colour correction regions.
Virtual Scouting Advancements: The XR Creative Framework, which includes virtual scouting tools, has reached production-ready status. Improvements to the Gizmo system in VR provide users with more precise and comfortable control over object manipulation within virtual environments.
OpenVR for Pucks: The presentation acknowledges the continued relevance of pucks for camera tracking, particularly for those new to the technology. The new Live Link OpenVR plugin offers better support for these devices, allowing headless operation and easier setup than the previous Live Link XR plugin.
Third-Party Plugin Support in Live Link Hub: Live Link Hub now allows for the externalisation of third-party plugins, simplifying the integration of performance capture solutions like Motive, Shogun, and Rokoko. This highlights the growing importance of Live Link Hub as a foundational component for performance capture workflows within Unreal Engine.
Elevated Focus on Performance Capture: Unreal Engine is emphasising the importance of performance capture as a critical element in real-time animation. This will be reflected in future developments, including a dedicated map manager for performance capture operators, improved integration with in-engine animation tools, and a significant overhaul of the Take Recorder to enhance its robustness and reliability.
Virtual Camera (VCAM)
Multi-user Actor Replication: Enhancements to actor replication in Unreal Engine enable the real-time sharing of actor properties within multi-user sessions. This is particularly beneficial for collaborative virtual camera workflows and controller input sharing, simplifying collaborative editing and control within virtual production environments.
Overscan Integration: Overscan is now officially supported in all camera actors, providing users with greater control over framing and composition. The addition of optional resolution scaling and crop functionality, as well as the ability to distort images using lens files with overscan, enhances the accuracy and flexibility of virtual camera workflows.
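The resolution-scaling and crop-back arithmetic that overscan implies can be sketched briefly. The percentages and helper names below are illustrative; UE exposes overscan as a camera property rather than through functions like these.

```python
# Sketch of overscan resolution scaling and centre crop-back
# (numbers and helper names are illustrative, not a UE API).

def overscan_resolution(width, height, overscan_pct):
    """Render resolution needed to cover a given overscan percentage."""
    scale = 1.0 + overscan_pct / 100.0
    return round(width * scale), round(height * scale)

def crop_back(ov_width, ov_height, width, height):
    """Centre crop that removes the overscan border after distortion."""
    x = (ov_width - width) // 2
    y = (ov_height - height) // 2
    return {"x": x, "y": y, "width": width, "height": height}

ow, oh = overscan_resolution(1920, 1080, 10)   # (2112, 1188)
print(crop_back(ow, oh, 1920, 1080))
```

The extra border is what lets a lens-file distortion pull pixels inward without exposing empty edges; the crop then returns the frame to its delivery resolution.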
Lens Calibration
The lens calibration solver in Unreal Engine has reached production-ready status, thanks to improvements in accuracy and workflow. The solver leverages more tracking information, reducing the need for precise mounting of tracking devices. Rendering improvements, specifically the application of lens distortion as part of the Temporal Super Resolution (TSR) process, contribute to better visual results. The ability to perform lens calibration and nodal offset calculations simultaneously further streamlines the calibration process.
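For context on what a lens file encodes, a single-coefficient radial model can be sketched. This shows only the simplest Brown-Conrady term (k1); UE's lens files hold considerably richer calibration data, and the function below is purely illustrative.

```python
# Sketch of one-term radial (Brown-Conrady k1) lens distortion,
# the simplest form of what a calibrated lens file encodes.

def distort(x, y, k1):
    """Apply radial distortion to a normalised image coordinate."""
    r2 = x * x + y * y          # squared distance from the optical centre
    f = 1.0 + k1 * r2           # radial scale factor
    return x * f, y * f

# Negative k1 pulls points toward the centre (barrel distortion):
print(distort(0.5, 0.5, -0.1))  # approximately (0.475, 0.475)
```

Applying this warp late in the render, as part of TSR, means the anti-aliased image is distorted once rather than resampled twice, which is where the visual improvement comes from.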
VCAM Visualization Enhancements
Unreal Engine has introduced features that enhance the visualisation and management of recorded VCAM data. The addition of a bookmarks browser with thumbnails simplifies the navigation and retrieval of specific bookmarks. A take carousel with thumbnails offers a visual overview of recorded takes, making it easier for users to browse and select previous recordings.
Future Developments: Shots in Unreal Engine
Unreal Engine is embarking on a research project to define and implement a more robust system for managing shots within the engine. This initiative aims to address the challenges of scaling Unreal Engine for larger film and TV projects that involve hundreds or thousands of shots. The project will involve defining the concept of a shot within the context of Unreal Engine, exploring ways to organise and manage shot data, and investigating seamless integration with editorial workflows.