The rest of the industry is finally catching up to what smartphone and tablet owners already knew: people want high-resolution, high-pixel-density displays. It’s ridiculous that even in 2014, the majority of mainstream laptops being sold feature a display resolution that’s not only lower than that of mid-to-high-end smartphones with screens a fraction of the size, but is paired with vastly inferior panel technology to boot. Thus, the great white hope is 4K.
The term “4K” is something of a misnomer: it rounds up the horizontal resolution, and the actual display resolution is typically 3840x2160. It maintains the 16:9 aspect ratio that a lot of us still chafe under, but it essentially quadruples the pixel count of a 1080p display, doubling it in each dimension.
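As a quick back-of-the-envelope check of that arithmetic (a minimal sketch; the 15.6-inch panel size used for the pixel-density figures is just an assumed example, not something from the article):

```python
import math

def pixel_count(width, height):
    """Total number of pixels for a given resolution."""
    return width * height

def ppi(width, height, diagonal_inches):
    """Pixels per inch for a panel of the given diagonal size."""
    diagonal_pixels = math.sqrt(width ** 2 + height ** 2)
    return diagonal_pixels / diagonal_inches

uhd = (3840, 2160)  # "4K" UHD, as discussed above
fhd = (1920, 1080)  # conventional 1080p

# Four times the pixels, double in each dimension
print(pixel_count(*uhd) / pixel_count(*fhd))  # 4.0

# Pixel density on a hypothetical 15.6-inch laptop panel
print(round(ppi(*uhd, 15.6)))  # ~282 PPI
print(round(ppi(*fhd, 15.6)))  # ~141 PPI
```

On the same assumed panel size, the 4K resolution roughly doubles the pixel density of 1080p, which is what drives the sharpness gains (and the scaling and GPU demands) discussed below.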
For now we can ignore manufacturing costs for these new, higher-resolution panels, since those will eventually come down along with the overall price to the end user as economies of scale kick in. Yet there are still a few complicating factors that surface when we talk about making the move to a 4K display, especially if you’re a gamer, and these are the kinds of things that keep me using conventional 1080p/1200p displays at home.
New technology often materializes before the infrastructure is fully in place. Intel launched its Ultrabook initiative with Sandy Bridge processors that weren’t really designed to hit those low voltages, and each transition to a new memory technology has been very gradual. 4K displays have a similar chicken-and-egg problem, and there are three key areas where we’re just not quite ready.
Read: Op-ed: Are we ready for 4K?
This article is brought to you in partnership with TechSpot