
- Meta has made it possible for people to upload high dynamic range (HDR) videos from their phone's camera roll to Reels on Facebook and Instagram.
- To show standard dynamic range (SDR) UI elements and overlays legibly on top of HDR video, we render them at a brightness level comparable to the video itself.
- We solved various technical challenges to ensure a smooth transition to HDR video across the diverse range of old and new devices that people use to interact with our services every day.
Over the past year, the Video Infrastructure teams at Facebook and Instagram have seen a significant increase in the amount of HDR content being uploaded to our apps, with millions of HDR videos uploaded every day. As a result, we have been working to bring true HDR video support to our family of apps, starting with Reels. Today, people have the ability to upload HDR videos from their phone's camera roll to Reels and have that video play back in full HDR. To make this possible, we needed to overcome a few technical challenges to ensure a smooth transition to HDR video across the diverse range of old and new devices that people use to interact with our services every day.
The journey to HDR can be better understood if we look at how the introduction of color television, for example, was a game changer, allowing viewers to watch programs in full color, a vastly different experience from the black-and-white broadcasting of the past. This marked a step change in bringing video closer to reality, setting the stage for future advancements, such as high-definition (HD) content, that continue raising the bar for the viewer experience.
Now we're at the dawn of the next generation of video with the transition to HDR. Unlike SDR video, HDR video has a wider range of luminosity and color, which results in brighter whites, darker blacks, and a larger potential number of visible colors for more vivid and true-to-life images. The growing adoption of HDR cameras and displays, especially on mobile devices, allows a wider audience to experience its benefits.
Challenge: Differing device support for HDR
The rollout of HDR cameras and displays has resulted in a wide range of devices with disparate capabilities, which makes implementing end-to-end HDR video creation and delivery challenging. It's more complicated than simply enabling HDR video creation and upload for supported devices. From a product perspective, we need to make sure that HDR is correctly preserved through our entire media pipeline to provide a quality user experience all the way from the creator's device through our server-side video processing and storage and, ultimately, view-side delivery and playback.
One of the main challenges with having a wide range of devices with various capabilities is ensuring compatibility across them. HDR video standards have specific hardware requirements for both creation and consumption, and not all devices support these new standards. This means that some users may not be able to view HDR content. Even worse, they might see a degraded version of the video with incorrect colors, resulting in a less satisfying viewing experience.
HDR video also comes with the challenge of different HDR formats, such as HLG and PQ, either of which may contain HDR10+ or Dolby Vision dynamic metadata. Different device manufacturers may choose to implement their own standards, each with unique capabilities in terms of the range of luminosity and color gamut they support. Different sensors also produce different colors. These distinctions all result in additional characteristics we must account for to ensure that a given video looks consistent and correct across all devices. For example, if a video is encoded in HDR10 and played on a device with an 8-bit display, in some cases the video may look worse than if it had been an SDR video, with washed-out or unnatural-looking colors. If we didn't address this, it would be difficult for creators to accurately showcase their work and guarantee the viewing experience of their audiences.
Challenge: Accurate tone mapping
Since not all devices support HDR displays, we need to provide backward compatibility by offering an SDR representation of the HDR video. Tone mapping is the process of scaling down the dynamic range and color space of an image or video while aiming to preserve its original look. Improper tone mapping can cause color shifts that look unnatural, such as trees that appear blue. Tone-mapped videos may also come out too bright or too dark. In a professional studio, manual tone mapping or color grading involves adjusting parameters until the results look pleasing. However, we need a solution that can run automatically on billions of videos without human review, while still producing a tone-mapped SDR encoding that closely resembles the original HDR video.
Sophisticated tone mapping algorithms can be used to preserve quality and minimize any artifacts that were introduced. Objective visual quality metrics for tone mapping remain an open research problem in the industry, but based on our internal testing, we were able to tune the popular Hable tone mapping operator to produce results that reasonably represent the creator's original intent.
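For readers unfamiliar with the Hable operator, the curve itself fits in a few lines of Python. This is a generic sketch using the operator's published default constants; the exposure and white-point values below are illustrative, not the tuned parameters used in production:

```python
# Hable ("Uncharted 2") filmic tone mapping curve.
# A..F are the operator's commonly published shoulder/toe constants; the
# exposure and white point below are illustrative defaults, not our tuning.
A, B, C, D, E, F = 0.15, 0.50, 0.10, 0.20, 0.02, 0.30

def hable_partial(x: float) -> float:
    return (x * (A * x + C * B) + D * E) / (x * (A * x + B) + D * F) - E / F

def hable_tonemap(x: float, exposure: float = 2.0, white_point: float = 11.2) -> float:
    """Map a linear HDR luminance value x (>= 0) into [0, 1] SDR luminance."""
    # Normalize so that `white_point` maps exactly to 1.0 (SDR white).
    return hable_partial(x * exposure) / hable_partial(white_point)
```

The appeal of a filmic curve like this is that it compresses highlights gradually along a "shoulder" rather than clipping them, which is why it can be tuned to preserve the creator's intent better than a simple linear scale.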
Tone mapping on the client
When creators first began uploading HDR video content, our media pipeline was not prepared to handle 10-bit color, so HDR videos had overexposed colors, resulting in a dissatisfying experience. iOS devices were the first to enable HDR by default and, as such, were where we began to see HDR uploads coming from our users. To mitigate this, we converted all compositions containing HDR content to SDR on the device prior to upload, using client-side tone mapping.
We used Apple's native tone mapping APIs to quickly launch a fix, mapping HDR videos to SDR color space prior to uploading to our servers. This way all uploads were guaranteed to look good across all devices, even when they were coming from newer phones that captured HDR video.
On Android the story was a bit different. With a more diverse device landscape that lacked standardization for HDR in its early stages, we weren't able to rely on OS-level tone mapping to maintain a consistent look. Some devices were creating HDR videos with the PQ transfer function, while others used HLG. These different standards resulted in different representations of color and, consequently, there was no one-size-fits-all solution for tone mapping all Android HDR uploads into accurate SDR representations.
Bhumi Sabarwal, a software engineer at Meta, discusses the challenges around enabling tone mapping on Android. (from: Video @Scale 2022)
For Android, we needed to implement a deeper solution. The early experience for Android HDR Reels resulted in washed-out colors for users with newer devices. So we built custom tone mapping shaders to accurately convert both PQ and HLG HDR into accurate SDR. First, we extracted the video metadata while decoding the frames to determine which transfer function was used (e.g., PQ or HLG). Next, once we had each frame in YUV colorspace, we could apply the appropriate transformation matrices to convert into the target SDR colorspace (BT.709). Once we had the SDR rendition, the rest of the creation process, including creative effects, AR filters, and complex media composition, was able to function correctly.
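To give a flavor of the per-pixel math involved, here is a simplified Python sketch of the PQ branch of that conversion. The constants come from the published SMPTE ST 2084 and ITU-R specifications, but this scalar version is only an approximation of the pipeline described above, which runs as GPU shaders on full YUV frames:

```python
# Sketch of the PQ decode and gamut-conversion steps a tone-mapping shader
# performs. Constants are from SMPTE ST 2084 (PQ) and ITU-R publications;
# the overall structure is a simplification, not production shader code.

# SMPTE ST 2084 (PQ) EOTF constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32
PQ_PEAK_NITS = 10000.0

def pq_eotf(signal: float) -> float:
    """Decode a non-linear PQ signal in [0, 1] to linear light in nits."""
    p = signal ** (1 / M2)
    num = max(p - C1, 0.0)
    den = C2 - C3 * p
    return PQ_PEAK_NITS * (num / den) ** (1 / M1)

# Linear-light BT.2020 -> BT.709 gamut conversion matrix (ITU-R coefficients)
BT2020_TO_BT709 = [
    [ 1.6605, -0.5876, -0.0728],
    [-0.1246,  1.1329, -0.0083],
    [-0.0182, -0.1006,  1.1187],
]

def convert_gamut(rgb):
    """Convert a linear BT.2020 RGB triple to linear BT.709 RGB."""
    return tuple(
        sum(BT2020_TO_BT709[row][col] * rgb[col] for col in range(3))
        for row in range(3)
    )
```

After these two steps, a tone mapping curve compresses the linear-light values into SDR range, and the result is re-encoded with the BT.709 transfer function. Note that out-of-gamut colors can go negative after the matrix multiply and must be clamped or soft-clipped, which is one source of the color shifts that naive tone mapping produces.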
With client-side tone mapping in place, we had alleviated the issue with washed-out colors, but we were still not delivering a true HDR experience for those creators who had HDR content.
Tone mapping on the server
With client-side tone mapping, we were able to mitigate the visual quality degradation associated with HDR video processed in a media pipeline that only supported SDR. This fixed the issues we were observing, but our ultimate goal was still to unlock the power of newer devices with HDR displays and deliver a full HDR experience to them. This meant building robust end-to-end HDR support while also providing a satisfying user experience on older devices that may not support HDR.
As part of the creation process, we already transcode all uploaded videos into different resolutions and bitrates to provide a smooth playback experience for all devices and network conditions with adaptive bitrate streaming (ABR). With HDR videos, however, this gets a bit more complicated, since they require 10 bits per color component per pixel and are typically encoded with newer codecs, such as HEVC, VP9, or AV1. These characteristics increase decode complexity and thus require higher-performance devices to support smooth decoding and playback. If we delivered HDR content to all devices, including those without sufficient support, we could introduce degraded performance, as the higher requisite bitrates result in wasted bandwidth, which leads to increased buffering, more frequent in-play stalls, and lower battery life.
Therefore, to build an experience optimized for all users, we need a way to deliver SDR encodings to devices that can't take advantage of the benefits of HDR. To address the challenge of delivering HDR video across a diverse device ecosystem, we built tone mapping into our server-side processing.
With server-side tone mapping, we upload content to our servers with the original HDR color information intact, and generate both HDR and SDR representations for delivery. Doing both HDR and SDR encodes doubles our compute for HDR videos. By leveraging our Meta Scalable Video Processor for HDR processing, we're able to handle this load without increased energy requirements.
If a device doesn't support HDR, only the SDR representation will be delivered for playback. Additionally, the tone-mapped SDR variants of HDR videos are useful for mixed scenarios, like the Instagram Explore page or the Facebook Feed, where the user experience is best with a uniform brightness across all thumbnails and previews.
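Conceptually, the delivery-side decision reduces to a capability check like the one below. The field names and rendition labels here are hypothetical illustrations, not Meta's actual playback-selection API:

```python
# Illustrative sketch of choosing between server-generated renditions.
# Device fields and rendition keys are hypothetical, for explanation only.
from dataclasses import dataclass

@dataclass
class Device:
    supports_hdr_display: bool   # can the panel render HDR (e.g., PQ/HLG)?
    supports_10bit_decode: bool  # can it smoothly decode 10-bit HEVC/VP9/AV1?

def pick_rendition(device: Device, renditions: dict) -> str:
    """Serve HDR only when the device can both decode and display it;
    everyone else gets the server-side tone-mapped SDR encoding."""
    if device.supports_hdr_display and device.supports_10bit_decode:
        return renditions["hdr"]
    return renditions["sdr"]
```

Gating on decode capability as well as display capability matters: a device that can technically display HDR but struggles with 10-bit decode would still see the buffering and battery costs described above.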
Challenge: Managing brightness
Within our apps, we also faced challenges around maintaining a consistent app experience when introducing HDR brightness and colors into user interfaces designed around SDR video. This often resulted in inconsistent or unsatisfying experiences in our testing.
As mentioned above, HDR expands not only the range of colors but also the contrast, by enabling higher levels of brightness. This raises the question of how an already bright display can become even brighter to accommodate the dynamic range needed for HDR, while also ensuring that SDR content remains accurately and relatively represented. On iOS devices, the default system behavior is quite interesting: the brightest levels of the display become reserved for HDR content, and SDR content actually becomes dimmer.
Inconsistent brightness when displaying HDR and SDR in a mixed scenario. The leftmost video in the third row is an HDR video and is much brighter than the other videos.
This may not be noticeable in an app where the entirety of the screen is occupied by a single video. However, in an app like Instagram, where a video is displayed alongside others, the effect can be quite dissatisfying, and it challenges us to define a new standard for an optimal user experience.
One case is the Explore tab, which presents a mix of photos and videos displayed in a grid. If we simply enabled HDR in this setting, the HDR videos would draw extra attention, leading to an unbalanced experience. In this scenario, we use the same client-side tone mapping that we used during video creation to tone map videos on the fly.
Another case is Reels, where we display a single full-screen video at a time, overlaid by UI indicating the author, caption, comments, and other metadata. These interface elements are SDR and, as such, are subject to the same dimming behavior when displayed alongside HDR content. This led to issues in our early experiments where white text appeared gray when rendered alongside HDR video. We needed to make sure that our overlay text would always be rendered in true white, because the gray, being dimmer and completely illegible in some cases, posed a usability concern.
Because our objective is to truly show HDR videos, simply tone mapping back to SDR was not the best option. Instead, we worked the problem from the other side, rendering the overlays in HDR when they accompany an HDR video. To do this, we apply a brightness multiplier to the overlay colors, extending them into the HDR parts of the spectrum and thus rendering them at the same brightness levels as the video.
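A minimal sketch of that idea follows. The headroom factor is hypothetical here; in practice it would come from platform APIs that report how much brighter the display's HDR range currently is relative to SDR white:

```python
# Minimal sketch of boosting SDR overlay colors into HDR headroom so that
# white UI text matches the video's brightness instead of being dimmed.
# The headroom value of 2.0 below is a made-up example, not a platform value.

def boost_overlay_color(rgb, hdr_headroom: float):
    """Scale a linear-light SDR color (components in [0, 1]) by the display's
    HDR headroom so it renders at HDR brightness rather than dimmed SDR."""
    return tuple(channel * hdr_headroom for channel in rgb)

# If the display dims SDR content by a factor of 2 to reserve headroom for
# HDR, boosting overlay white by that same factor restores "true white":
sdr_white = (1.0, 1.0, 1.0)
hdr_white = boost_overlay_color(sdr_white, 2.0)  # -> (2.0, 2.0, 2.0)
```

Values above 1.0 are meaningful here only because the render target is an extended-range (HDR) surface; on an SDR surface they would simply clip.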
Chris Ellsworth, a software engineer at Meta, discusses our work supporting HDR on iOS. (from: Video @Scale 2022)
Acknowledgements
This work is the result of a collaborative effort across the entire Video Infrastructure and Instagram Media Platform teams at Meta. The authors would like to extend special thanks to the following people: Anthony Dito, Rex Jin, Ioannis Katsavounidis, Ryan Lei, Wen Li, Richard Liu, Denise Noyes, Ryan Peterman, David Ronca, Bhumi Sabarwal, Moisés Ferrer Serra, Ravi Shah, Zafar Shahid, Haixia Shi, Nidhi Singh, and Kyle Yoon.