Game On!

Revolutionary screens for gamers

This will be the biggest change in the image quality of computer screens since CRTs were replaced by LCDs. 2017 will go down in the history of video games as the beginning of the era of HDR screens.

Would you like your monitor to show you lightning cutting the sky in two, so bright it is nearly blinding, not just a pale imitation as is currently the case? A full range of colors and rich detail in a dark backstreet after dusk? More vibrant, vivid colors across the entire spectrum visible to the human eye? In short, would you like the image on your screen to more closely resemble what you see in the world with your own eyes?

If so, better start saving up for a screen equipped with HDR (High Dynamic Range) technology. When you see the difference it makes in a direct comparison, you’ll be astonished. One glimpse of an HDR image and you’ll feel this is what the future of gaming looks like. Incredibly colorful.

A huge leap in image quality

What is HDR, and why does it make such a big difference? In the simplest terms, compared to the technology used in classic LCD screens, HDR ensures a substantially greater dynamic range: the scale bounded on one side by the darkest point and on the other by the brightest point displayed on the screen. Thanks to this large potential difference in brightness, the screen can show more shades, which in turn results in more natural images, with richer color and more visible detail. The brightness range of HDR is similar to that perceived by the human eye.
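To put a number on that scale: dynamic range is often quantified as the ratio between the brightest and darkest luminance a display can produce, expressed in photographic "stops" (powers of two). A minimal sketch follows; the luminance values used are illustrative assumptions, not measurements of any particular monitor.

```python
import math

def dynamic_range_stops(peak_nits: float, black_nits: float) -> float:
    """Dynamic range in photographic stops: log2 of the contrast ratio."""
    return math.log2(peak_nits / black_nits)

# Illustrative (assumed) values: a typical SDR monitor
# vs. an HDR panel with locally dimmed backlighting.
sdr = dynamic_range_stops(peak_nits=300.0, black_nits=0.30)
hdr = dynamic_range_stops(peak_nits=1000.0, black_nits=0.05)

print(f"SDR: {sdr:.1f} stops, HDR: {hdr:.1f} stops")
```

With these assumed figures the HDR panel covers roughly four extra stops, i.e. its contrast ratio is more than an order of magnitude larger, which is where the extra shades come from.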

With traditional monitors the brightness scale is substantially smaller; it could even be called "flattened." They cannot show us the blinding sun rising above the horizon, nor the web of wrinkles on the face of an old man hiding in the deep shadow of a tree. Nor can we appreciate all the intense colors and diverse shades that such a scene offers in reality. Depending on the settings, images on traditional monitors will be either too bright in the dark areas or too dark in the bright ones. In practice, most of the screen will suffer from one of these two defects caused by an inadequate brightness range.

If you can connect your computer to an HDR 4K TV set, we encourage you to watch the following comparison of images with HDR technology and without it (SDR: Standard Dynamic Range). The video comes from the remastered version of "The Last of Us," released for PlayStation 4. If you display the clip on a monitor that does not support this technology and resolution, you will not be able to fully appreciate the differences.

HDR + 4K = a feast for your eyes

HDR technology has only recently arrived in TV sets, with the first models equipped with the feature going on sale last year. This year, however, we can expect a true market premiere of the technology in computer monitors designed specifically with gamers in mind. It is most definitely worth the wait. The improvement in image quality will be more spectacular and appealing to the viewer than the resolution boost from the popular Full HD (1920 x 1080) to WQHD (2560 x 1440), panoramic UWQHD (3440 x 1440), or even 4K (3840 x 2160).
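The raw pixel counts behind those resolutions show how much extra work each step up demands from the graphics card. A quick calculation, using only the resolutions named above:

```python
# Resolutions mentioned in the article, as (width, height) in pixels.
resolutions = {
    "Full HD": (1920, 1080),
    "WQHD":    (2560, 1440),
    "UWQHD":   (3440, 1440),
    "4K":      (3840, 2160),
}

base = 1920 * 1080  # Full HD pixel count as the reference
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.1f}x Full HD)")
```

4K works out to exactly four times as many pixels as Full HD, which is why the jump strains even strong graphics cards, while HDR mainly changes how each pixel is lit rather than how many there are.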

Does this mean we're encouraging you to choose a new monitor with HDR over WQHD or 4K? No. In fact, you do not have to face this dilemma. The manufacturers who announced the first gaming monitors with HDR decided to combine the technology with high image resolution. HDR UWQHD models, as well as HDR 4K models, are already on the horizon. They should even come with an extensive range of technical bonuses that gamers will greatly appreciate, such as very high refresh rates and synchronization with the graphics card (G-Sync, FreeSync) to ensure smooth image display without tearing.

If at this point you're thinking that monitors with a resolution higher than Full HD are not for you because a graphics card that can run "The Witcher 3" at such settings costs a small fortune, we can reassure you: there is light at the end of the tunnel, and it is going to be OK!

Mid-range computers can handle this as well!

Computers will benefit from the achievements of developers who are squeezing every last drop of performance out of the electronics in the latest generation of game consoles, the PlayStation 4 Pro and the recently announced Xbox One X. The producers of both devices (Sony and Microsoft) declare that they designed them with generating 4K images in mind. Obviously, the path from declaration to fulfilled promise is sometimes long, winding, and often leads astray. We know this all too well from the history of both consoles and PC graphics cards. In fact, we already know that these expectations will not be met by every game created for the PS4 Pro and Xbox One X. The Rubicon, however, has been crossed: 4K resolution is the new standard in video games, and there is no turning back.

You may ask what any of this has to do with the fact that today's PCs, even those with a decent (or even upper-mid-range) graphics card, give up when faced with a new game once we boost the resolution to 4K.

Quite a lot, actually. What matters is that the graphics systems in both of the aforementioned consoles, the PlayStation 4 Pro and Xbox One X, have relatively moderate computing power, substantially lower than the cutting-edge graphics cards installed in computers. Even allowing for the fact that console architecture is optimized for games, which makes it possible to get more out of the electronics than from similar PC hardware, that is not enough power to ensure smooth 4K images in the latest games, filled as they are with detail and advanced visual effects.

It all comes down to technological tricks. Using sophisticated development tools, it is possible to artificially upscale the image to 4K quality and "trick" the human eye effectively enough that it does not notice any significant difference between a 4K image generated natively, through pure computing power, and its impressive substitute.
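The real console techniques (such as checkerboard rendering on the PlayStation 4 Pro) are sophisticated reconstruction methods, but the core idea of rendering fewer pixels than you display can be illustrated with the crudest possible upscaler. The toy sketch below uses nearest-neighbor scaling, which simply repeats each source pixel; it is an illustration of the principle, not how any console actually does it.

```python
def upscale_nearest(frame, factor):
    """Upscale a 2-D grid of pixel values by an integer factor,
    repeating each source pixel (nearest-neighbor scaling)."""
    return [
        [frame[y // factor][x // factor]
         for x in range(len(frame[0]) * factor)]
        for y in range(len(frame) * factor)
    ]

# A tiny 2x2 "frame" blown up to 4x4: each pixel becomes a 2x2 block,
# so the GPU only had to render a quarter of the displayed pixels.
small = [[1, 2],
         [3, 4]]
print(upscale_nearest(small, 2))
```

Real reconstruction methods interpolate and reuse data from previous frames instead of duplicating pixels, which is what makes the substitute hard to tell apart from native 4K.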

We therefore have every right to expect that, with time, similar solutions will be applied in computers designed specifically for games. Thanks to this, we will be able to enjoy HDR and, hmm… "4K" images without taking out a loan for a new graphics card.

And what about the rest of the electronics?

Easy: the Intel processors currently used in gaming computers have enough computing power not to create a bottleneck when processing high-resolution HDR images, whether WQHD, UWQHD, or even 4K. Is your computer driven by a powerful Intel Core i7? Excellent! Or perhaps by a not much less capable Intel Core i5? It should also be up to the task without any problems; please accept our congratulations on your excellent choice! You may need to buy some additional RAM, but those who already have 16 GB in their computers do not need to worry. It will be more than enough.

Impressive premieres ahead of us

The first monitors with HDR have already been released, but these are mainly aimed at professional computer graphics designers; one example is the 31.5″ BenQ SW320, priced at over PLN 6000. The most interesting models designed specifically for gaming are only now being prepared for sale, but we could already admire the most impressive of them at this year's Computex exhibition. Monitors with HDR and G-Sync have been announced for this year by key manufacturers such as Acer and Asus. Let's look at what the latter has prepared:

ROG Swift PG35VQ

Do you like panoramic, curved screens? The ROG Swift PG35VQ should take your fancy! It is a 35″ monitor with a curved screen and UWQHD (3440 x 1440) resolution, corresponding to a 21:9 aspect ratio, and it is of course equipped with High Dynamic Range (HDR) technology. Its other advantages include a refresh rate of up to 200 Hz and peak luminance of up to 1000 cd/m² (recall that high maximum luminance is indispensable in HDR monitors). The LED backlighting is dynamically controlled in 512 zones in order to, as the manufacturer declares, further highlight subtle differences between light and dark shades. The monitor uses quantum dot technology, which ensures a broader spectrum of colors than traditional LCD displays and supports the cinema-standard DCI-P3 color gamut. Owners of computers with Nvidia graphics cards will be pleased with the G-Sync support.

ROG Swift PG27UQ

We are not going to build up the tension, we will just say this: ROG Swift PG27UQ is the first announced gaming monitor with HDR and 4K. Period.

This 27″ beast, which received the Computex 2017 Best Choice Golden Award, can refresh 3840 x 2160 images at a frequency of 144 Hz. Its LED backlighting is dynamically controlled in 384 zones. Quantum dot technology is also used here, with support for the DCI-P3 color gamut. The monitor, built around an IPS panel, displays a color gamut 25% larger than sRGB. The ROG Swift PG27UQ is also a treat for users of Nvidia cards thanks to its G-Sync support.

Quite a promising start, some might say. Now we are waiting for HDR support to become standard practice among game developers. It is hard to imagine major, high-budget video games not taking advantage of the possibilities provided by state-of-the-art TVs and monitors in the future. High Dynamic Range may become an accepted part of the world of games faster than we expect today. We can hardly wait!

Find more gaming resources from Intel here.
