The high energy cost of HDR

[Image: power lines]

A piece I wrote for Viaccess-Orca covering some of the latest news from the TV industry while trying to avoid all the ‘OTT is destroying linear TV/Linear TV is alive and well’ surveys: Hot TV News: Game of Thrones Season 6 Triggers Piracy Spike; NAB 2016 – Turning VR in to Actual R; The Environmental Costs of 4K HDR.

But while it’s interesting, though unsurprising, that all the anti-piracy measures in the world haven’t put so much as a dent in the Game of Thrones illegal download juggernaut (and perhaps even more germane that 720p and 1080p torrent requests are edging closer to 50% of traffic), it’s the energy consumption story that fascinates.

Consumer testing data from the US indicates that 4K TVs consume an average of 30% more power than HD sets, primarily because of larger screens and stronger backlights. Then there’s the impact of HDR to measure, which is still a bit of an unknown, but in the one set tested it increased power consumption by a massive 47%.

Extrapolate that out and US residential energy bills could rise by more than $1 billion per year if all televisions larger than 36in transition to 4K at today’s average efficiency. And while efficiency will improve over time, scale that figure up to the whole planet and you have a significant cost and a significant drain on already stretched resources.
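To get a feel for the arithmetic behind those figures, here is a rough back-of-envelope sketch. The 30% (4K) and 47% (HDR) increases come from the testing described above; the baseline wattage, viewing hours, and electricity price are illustrative assumptions, not figures from the original data:

```python
# Back-of-envelope estimate of the extra running cost of 4K/HDR TVs.
# Only the percentage increases (30% for 4K, 47% for HDR on the one
# set tested) come from the article; everything else is assumed.

HD_WATTS = 100.0       # assumed draw of a typical large HD set (W)
UHD_FACTOR = 1.30      # 4K: ~30% more power than HD
HDR_FACTOR = 1.47      # HDR: +47% on the one set tested
HOURS_PER_DAY = 5      # assumed daily viewing time
PRICE_PER_KWH = 0.13   # assumed US residential electricity price ($/kWh)

def annual_cost(watts):
    """Annual electricity cost in dollars for a set drawing `watts`."""
    kwh_per_year = watts * HOURS_PER_DAY * 365 / 1000
    return kwh_per_year * PRICE_PER_KWH

hd = annual_cost(HD_WATTS)
uhd = annual_cost(HD_WATTS * UHD_FACTOR)
uhd_hdr = annual_cost(HD_WATTS * UHD_FACTOR * HDR_FACTOR)

print(f"HD:       ${hd:.2f}/yr")
print(f"4K:       ${uhd:.2f}/yr (+${uhd - hd:.2f})")
print(f"4K + HDR: ${uhd_hdr:.2f}/yr (+${uhd_hdr - hd:.2f})")
```

Even a modest per-set increase of a few dollars a year, multiplied across the tens of millions of large US sets, gets you into billion-dollar territory quickly, which is the point the extrapolation makes.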

Photo credit: Ian Muttoo via Foter.com / CC BY-SA