1080p? 4K? UHD? HDR? - An overview in the display jungle

by Mark Zeman
on July 09, 2020
time to read: 6 minutes

Keypoints

How to deal with new screen resolutions

  • Ask yourself which resolutions you actually want to look out for
  • Refresh rates are not that important
  • HDR, on the other hand, provides serious advantages
  • And it remains important to confirm component compatibility

In this time of staying home and finding entertainment that is socially distanced, I’m probably not the only one who decided to upgrade their monitor(s). In my case, I decided that after almost 10 years, I would replace my mismatched set of 24-inch Full HD screens with two identical 4k UHD HDR screens. That’s a whole bunch of abbreviations though, so let me explain what it means, what you might want to look for in a screen and what my experience with this all has been and talk about something a little lighter than most things on the news.

The driving factor in wanting to upgrade my monitors was that modern monitors support higher resolutions than the screens from 10 years ago. This is also quite simply the most basic area in which they have improved. The highest resolution my old monitors supported was 1920×1080 pixels, also known as Full HD. Full HD is a marketing term that was introduced in 2005 to differentiate itself from SD (standard definition) and HD Ready (1280×720). For bonus confusion, the body certifying devices with those labels? Also called HD Ready.

These terms mostly came out of the world of television, where resolution had standardized and stagnated for a long time with various attempts at higher definitions failing to capture enough interest to stay and spread throughout the market. However, by the late 2000s, it was clear that TV would switch to digital broadcasts and as such, it was the right moment to move to higher resolutions with that.

These days, the same marketing brings us 4K UHD, Ultra High Definition. Since adding ever more adjectives gets silly eventually, resolutions are increasingly referred to by their "K" class, with 8K as another reference point. This refers to the rough horizontal pixel count, with 4K typically representing 3840×2160 pixels and 8K representing 7680×4320 pixels. In this lingo, the 1920×1080 resolution of my old screens would be called a 2K resolution.
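The jump in pixel count between these classes is easy to underestimate: each step quadruples the number of pixels. A quick calculation with the resolution values given above makes this concrete:

```python
# Common resolution names and their typical pixel dimensions
resolutions = {
    "Full HD (1080p)": (1920, 1080),
    "4K UHD (2160p)": (3840, 2160),
    "8K UHD (4320p)": (7680, 4320),
}

full_hd_pixels = 1920 * 1080
for name, (w, h) in resolutions.items():
    total = w * h
    print(f"{name}: {w}x{h} = {total:,} pixels "
          f"({total // full_hd_pixels}x Full HD)")
```

So a 4K screen pushes four times as many pixels as Full HD, and 8K sixteen times as many, which also hints at why the hardware driving these screens matters.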

However, as this is marketing, all of these terms are rather fuzzy and can apply to several different resolutions. In particular, wider formats may well have around 4,000 pixels horizontally but a significantly lower vertical resolution than expected, so watch out when a screen is described as ultra-wide.

There is one more relatively common way to refer to resolutions, where resolutions are called 1080p or 2160p. This refers to the vertical resolution, so an ultra-wide screen might fairly be described as both 1080p and 4K if it sports a resolution of 3840×1080 pixels. The p in this terminology doesn't stand for pixels but for progressive, as opposed to interlaced. Interlaced formats are no longer relevant for typical consumer use, and no modern TV or computer monitor still advertises them separately, even if it supports them. Interlaced essentially stopped at 1080i, as compression of video signals obsoleted the bandwidth savings that interlaced formats offered.

Okay, so now we've established that I wanted to upgrade my computer display from 1920×1080 to 3840×2160 pixels, i.e. from 1080p to 2160p, since higher resolutions are, at this point, still very expensive. That got me sharper images and more space on my screen for things that stay the same size. What else has changed?

High Dynamic Range and Refresh Rates

Well, apart from monitors having gotten better in many small ways, from wider viewing angles to better blacks and faster response times, two major improvements are potentially relevant for entertainment.

One that I'll only touch on briefly is refresh rates. Sometimes included in the resolution designation, as in 1080p60, but more frequently stated separately as a rate in Hertz (Hz), this simply describes how many times per second the screen can display a new image. 60 Hz is common, but higher rates can be interesting, particularly for gaming, and so screens with rates up to 144 Hz (and beyond, for a ton of money) are available. If you aren't planning to play shooter games competitively, however, this is generally not worth it, not least because your PC needs to be powerful enough to run things at that refresh rate, too.
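To see why the PC matters here: the refresh rate sets a fixed time budget within which each frame has to be rendered. A back-of-the-envelope calculation:

```python
# Frame-time budget: how long the PC has to render each frame
# before the screen wants to display the next one
for hz in (60, 120, 144):
    frame_time_ms = 1000 / hz
    print(f"{hz} Hz -> {frame_time_ms:.1f} ms per frame")
```

At 144 Hz the machine has less than half the time per frame that it has at 60 Hz, on top of the extra pixels a higher resolution brings.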

More relevant is the ability to play HDR or high dynamic range content. Dynamic range is about the difference between the brightest and the darkest parts of the screen and is summed up nicely by the following statement:

Bright things can be really bright, dark things really dark and details can be seen in both.

Since more and more movies are produced and streamed in HDR via Netflix et al, an HDR screen makes sense and really does improve dark scenes in movies in particular. Under Windows, however, it comes with a potentially annoying drawback: not all applications work in HDR mode, so Windows switches to SDR mode for those, which can take a second or two. Quickly switching to and from such an application becomes very cumbersome. It also requires some adjustment in the settings, as Windows HDR sets the brightness for non-HDR content oddly by default, messing up the colors. This is relatively easy to fix if you're not relying on perfect accuracy, and a massive effort if you are.
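Dynamic range can be quantified in photographic stops, where each stop doubles the brightness. The luminance values below are illustrative assumptions for typical SDR and HDR displays, not measurements of any particular screen:

```python
import math

# Illustrative (assumed) luminance values in nits (cd/m^2):
# (black level, peak brightness) per display type
displays = {
    "typical SDR": (0.1, 100),
    "typical HDR": (0.005, 1000),
}

for name, (black, peak) in displays.items():
    contrast = peak / black
    stops = math.log2(contrast)  # doublings between darkest and brightest
    print(f"{name}: {contrast:,.0f}:1 contrast, about {stops:.1f} stops")
```

With these assumed values, HDR spans several more stops than SDR, which is exactly the "details visible in both bright and dark areas" promise from the statement above.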

Other pitfalls and overall impression

In my case, replacing old devices came with one more unfortunate surprise. I thought I could just reuse the HDMI cable I was already using. However, with that, I ended up with a black screen until I went into low-resolution mode and from there tested which resolutions worked. Turns out, I was using an HDMI cable too old to support resolutions higher than 1080p! That took… a while to figure out, so I definitely recommend checking that first and making sure the cables included with the screens are long enough and of the right type.
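A rough calculation shows why an older cable fails. The estimate below counts only active pixels at 8 bits per color channel and ignores blanking intervals and encoding overhead, so real-world requirements are somewhat higher still:

```python
# Rough uncompressed bandwidth estimate for a video signal
# (active pixels only; real HDMI signals add blanking and
# encoding overhead on top of this)
def bandwidth_gbit(width, height, hz, bits_per_pixel=24):
    return width * height * hz * bits_per_pixel / 1e9

print(f"1080p60: {bandwidth_gbit(1920, 1080, 60):.1f} Gbit/s")
print(f"4K60:    {bandwidth_gbit(3840, 2160, 60):.1f} Gbit/s")
# HDMI 1.4 tops out at about 10.2 Gbit/s on the wire, which is
# why 4K at 60 Hz needs an HDMI 2.0 (18 Gbit/s) capable cable.
```

Even this optimistic estimate for 4K at 60 Hz already exceeds what an HDMI 1.4 cable can carry, while 1080p fits comfortably.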

On a larger screen, the higher resolution is a bigger benefit, as the image stays sharp even at a larger size, whereas on a small screen the extra pixels are harder to appreciate. However, even at the 28-inch size I have, there is noticeably more detail.

Beyond that, the higher resolution is great for viewing multiple windows or documents on the same screen, and 4K HDR video in particular is noticeably nicer than non-HDR video, so if you think you might get stuck at home more, it's definitely a nice upgrade. Especially when you start watching series like Netflix' Dark, where being able to see details in scenes with high contrast is a big plus.

So for me, after the initial cabling woes, the higher resolution is nice, but HDR is what really improves the screens.

About the Author

Mark Zeman

Mark Zeman has a Master of Science in Engineering with a focus on Information and Communication Technologies from the FHNW. Information security has been his passion and professional focus since 2017. During his bachelor studies, he worked for an email security company. (ORCID 0000-0003-0085-2097)
