It shouldn’t surprise anyone that the monitors movie and TV-making professionals use to judge and evaluate the footage they capture, edit, and manipulate are far different from the typical TV, phone, and tablet displays we all watch their work on. Many may not realize, though, that it’s not because the screens pros rely on are massive.
Most are smaller than the typical flat-screen TVs in plenty of living rooms. Instead, these professional-grade displays are extremely expensive, costing tens of thousands of dollars, because they’re designed to be exceptionally accurate in all areas that impact video image quality.
Sony has long been a leader in producing these advanced professional reference monitors, and the company’s BVM-HX310, originally launched in 2018, is now widely viewed as the industry standard for mastering high-dynamic-range (HDR) content.
One of its biggest distinctions from a typical TV is its brightness capability. As the monitor’s own marketing materials state, “the BVM-HX310 features 1,000 nits full-screen brightness with 1,000,000:1 contrast ratio, making it ideal for displaying High Dynamic Range (HDR) content with rich, deep black areas and accurate reproduction of bright peak highlights.”
For comparison’s sake, the brightness of consumer TVs is usually measured in two common ways. So-called peak brightness – often the gaudiest stat cited prominently by TV makers – looks at the maximum brightness a small area of a TV’s display can achieve. The exact area measured varies widely depending on the testing outlet, but it usually falls between 2% and 10% of the screen’s total area. Full-screen brightness, on the other hand, is essentially exactly what it sounds like, measuring how bright the entire screen can get.
Nearly six years after the launch of the first BVM-HX310 Professional Master Monitor, newer TVs are now finally boasting peak brightness figures capable of hitting the 1,000-nit threshold at which a growing share of modern HDR content is mastered. But plenty of TVs, especially high-end OLEDs produced as recently as a few years back, were still incapable of matching that standard.
Practically speaking, this means that even the best TVs most of us have at home can’t precisely reproduce content mastered at 1,000 nits the way its creators would like. Instead, our home TVs engage in a process known as tone mapping, which remaps the brightness range of HDR video content to fit within the maximum brightness of the TV it’s being shown on.
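To make the idea concrete, here is a minimal sketch of one common tone-mapping approach: brightness values below a "knee point" pass through unchanged, while highlights above it are compressed so they roll off smoothly toward the display's peak instead of clipping to flat white. The 600-nit display peak, the knee position, and the Reinhard-style rolloff curve are all illustrative assumptions, not any TV maker's actual algorithm.

```python
def tone_map_nits(pixel_nits: float, display_peak: float = 600.0,
                  knee_fraction: float = 0.75) -> float:
    """Map a pixel's mastered brightness (in nits) onto a dimmer display.

    Illustrative sketch only: real TVs use proprietary curves, often
    guided by metadata in formats like HDR10 or Dolby Vision.
    """
    # Below the knee point, brightness is reproduced faithfully.
    knee_point = display_peak * knee_fraction
    if pixel_nits <= knee_point:
        return pixel_nits

    # Above the knee, excess brightness is compressed so it approaches
    # (but never exceeds) the display's peak, preserving highlight detail.
    excess = pixel_nits - knee_point
    headroom = display_peak - knee_point
    return knee_point + headroom * (excess / (excess + headroom))
```

On this hypothetical 600-nit TV, a 1,000-nit mastered highlight lands somewhere below 600 nits rather than clipping, and brighter mastered values still come out brighter than dimmer ones, so highlight gradations survive even though absolute brightness does not.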