DHR: Distributed Hybrid Rendering for Metaverse Experiences
Real-time rendering has classically relied on rasterization to meet the constraint of interactive frame rates, but rasterization alone cannot match the realism of ray tracing. Hybrid rendering has therefore emerged, augmenting the graphics fidelity of rasterization with ray-traced effects in real time. We explore a distributed rendering approach to bring real-time hybrid rendering into metaverse experiences for immersive graphics. On standalone extended reality (XR) devices, ray tracing-enabled graphics has so far been feasible only through pure cloud-based remote rendering systems, which rely on low-latency networks to stream ray-traced data in response to interactive user input. Under high network latency, such remote rendering may fail to sustain interactive frame rates on the client, adversely affecting the user experience. Our distributed hybrid rendering approach instead combines ray tracing on powerful remote hardware with raster-based rendering on the user's access device. With this split, standalone XR devices can achieve ray tracing-incorporated graphics while maintaining interactive frame rates even under high-latency conditions.
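The core idea of the split described above can be sketched as a per-frame compositing decision on the client: use the remote ray-traced contribution when it arrives within the frame's latency budget, and fall back to the locally rasterized result otherwise. This is a minimal illustrative sketch, not the paper's implementation; all names, the additive compositing model, and the latency-budget check are assumptions.

```python
# Hypothetical sketch of latency-aware hybrid compositing on the client.
# The function names, the additive blend, and the budget threshold are
# illustrative assumptions, not the paper's actual pipeline.

def compose_frame(raster_color, rt_lighting, rt_latency_ms, budget_ms=16.7):
    """Blend the remote ray-traced lighting into the local raster image
    only if it arrived within the per-frame latency budget; otherwise
    keep the raster-only image so interactive frame rates are preserved."""
    if rt_lighting is not None and rt_latency_ms <= budget_ms:
        # Simple additive composite of raster base color and RT lighting,
        # clamped to the displayable range.
        return [min(1.0, c + l) for c, l in zip(raster_color, rt_lighting)]
    # High latency: fall back to the locally rasterized result.
    return raster_color

# Low latency: the ray-traced contribution is composited in.
lit = compose_frame([0.5, 0.5, 0.5], [0.25, 0.125, 0.0], rt_latency_ms=10.0)
# High latency: the client keeps interactive rates with raster-only output.
fallback = compose_frame([0.5, 0.5, 0.5], [0.25, 0.125, 0.0], rt_latency_ms=120.0)
```

In a real system the remote contribution would be a full lighting buffer reprojected to the client's current camera pose rather than a single color, but the same budget-driven fallback keeps the client responsive when the network degrades.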