Why Do PC Games Look Bad on TV?
There are many differences between PC games and console games. Technological developments, number of games, equipment… The list is very long. But in practice, there’s one fundamental difference that directly affects the gaming habits of the two groups, and that’s the screens used.
PC gamers usually prefer gaming monitors, while console gamers use TVs. When these two screens are used interchangeably, it doesn’t always lead to good results.
It's normal for PC games to look bad on some televisions: a TV's default settings rarely match what PC gaming demands in terms of refresh rate, brightness, or resolution.
Why does my PC look so bad on my TV?
There are many possible reasons why PC games look bad on a TV. If a game that looks great on your gaming monitor loses quality the moment you connect it to a TV, start with these suspects:
- Refresh Rate: Almost all modern gaming monitors refresh at 120 Hz or more, while the industry standard for televisions is still 60 Hz. If your PC renders a game at 120+ FPS and you connect it to a 60 Hz TV, half of those frames never reach the screen, so motion looks noticeably less smooth and can judder (see the sketch after this list).
- Resolution: Feeding a game you render at 1080p to a 4K TV isn't a great idea. If your computer isn't powerful enough to render at 4K, the 1080p image has to be upscaled to 4K, and unless that's done as clean integer scaling, the result is a soft, smeared-looking picture.
- Color Settings: The default color settings on most televisions are bad, so a game that looks great on your PC may look off on the TV. To fix this, adjust the contrast, brightness, black level, tint, and backlight.
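To make the first two bullets concrete, here is a minimal back-of-the-envelope sketch in Python. The numbers and helper names are illustrative only, not taken from any driver or display API:

```python
# Illustrative arithmetic only: frame pacing on a fixed-refresh panel and
# the pixel scale factor when a lower resolution is shown on a 4K screen.

def frame_pacing(fps_rendered: float, hz_panel: float) -> None:
    """How rendered frames map onto panel refreshes (no VRR assumed)."""
    frame_ms = 1000 / fps_rendered
    refresh_ms = 1000 / hz_panel
    shown = min(fps_rendered, hz_panel)   # surplus frames are dropped or tear
    print(f"{fps_rendered:.0f} FPS on a {hz_panel:.0f} Hz panel: "
          f"a frame every {frame_ms:.2f} ms vs a refresh every {refresh_ms:.2f} ms; "
          f"at most {shown:.0f} distinct frames/s reach the screen")

def upscale_factor(src: tuple[int, int], dst: tuple[int, int]) -> None:
    """Is an upscale a clean integer multiple or a blur-prone resample?"""
    sx, sy = dst[0] / src[0], dst[1] / src[1]
    clean = sx == sy and sx.is_integer()
    print(f"{src[0]}x{src[1]} -> {dst[0]}x{dst[1]}: scale {sx:.3f} x {sy:.3f}, "
          f"{'integer (can stay sharp)' if clean else 'non-integer (soft, smeared)'}")

frame_pacing(120, 60)                        # half the rendered frames are wasted
upscale_factor((1920, 1080), (3840, 2160))   # clean 2x2 mapping
upscale_factor((1920, 1080), (2560, 1440))   # messy 1.333x resample
```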
Be clear about why you want to play PC games on a TV in the first place; a bigger screen rarely means better performance. The main reason console games look smooth on TVs is that they target frame rates the TV can actually display, typically 30 or 60 FPS on a 60 Hz panel.
Why is a TV bad for PC gaming?
If you can rule out the problems above, there's no reason a TV has to be bad for PC gaming. The important thing is to have a computer powerful enough to drive a high-resolution TV.
To your GPU, a television is just another display, no different from a gaming monitor, and it won't adapt to it automatically. If you connect your computer to a 4K TV, select the TV's native resolution in the display settings; otherwise, your PC games will appear blurry on the TV.
Of course, this only applies to single-player games. There’s no logical reason (except perhaps for racing games) to play a competitive game on a large television screen. The industry standard for games with esports leagues is still 24 inches. In a few years, it’ll be 27 inches at most.
Although the PlayStation 5 and Xbox Series X are great consoles, PC technology will always be a step ahead of them. And if your system can handle the highest graphics settings, there's nothing stopping you from experimenting on a TV.
The video below shows you the God of War comparison for PS5 and PC.
God of War PS5 vs PC Ultra Settings Graphics Comparison: https://www.youtube.com/watch?v=8zZ0C91rsn4
HDMI from PC to TV looks horrible
#1
Hey all,

I have a new PC that uses the Intel H55 chipset, which features HDMI video on the motherboard plus support for HD audio. The TV is a Samsung LNR268W, a 720p set. I intend to use Media Center on the PC with the TV. I have run into a problem, however.

When I plug the PC into the TV via HDMI, the picture quality is very bad. The text is blurry, the images look like they aren't being dithered correctly, edges on graphics aren't smooth, and the color is off. The desktop was also being clipped, but I fixed that using the "aspect ratio" settings in the Intel drivers to adjust for overscan. The funny thing is, when I connect to the TV via VGA, everything looks perfect. What could be going on with the HDMI? It works fine when I plug in a consumer electronics device like a TiVo, DVD player, or standalone Blu-ray player.

The PC sends 1280x720 to the TV via HDMI, and 1360x768 when connected via VGA. Switching to the latter resolution over HDMI causes pretty significant underscan, and the picture quality doesn't improve.

Help?

AMSR
#2
Can you get the TV to display 'Native' or dot-by-dot instead of whatever stretch mode it may be in?

Sounds like you also want to get the PC to output 1360x768 through the HDMI.

It can get pretty messy getting all this to work, but as an overview: you don't want either the graphics card or the TV to be doing any scaling.

Also, if you can't get the desired resolution, try PowerStrip, as it can give you more options with most graphics cards.

Good luck.
#3
Does the TV have a PC mode for the picture? This may bypass any image processing or under/overscan in the TV and get it to display the pixels 1:1.
#4
On the Samsung 1080p models there's a picture size/zoom mode called "Just Scan" that puts the panel into 1:1 mode. I don't know if the 720p displays have the same thing.

You also need to figure out what the actual resolution of your panel is.

And lastly, once you get the pixels 1:1, to clean up text you may need to go into the display menu and adjust the noise reduction, edge enhancement, and other settings that mess with the image.
#5
I had my PC hooked up to a Samsung LNR series. HDMI just seemed horrible out of the box, but a few things alleviated the display issues.

It is a 1360 (I think it's actually 1366) x 768 display, and you should set up a custom resolution, if you can, for use over HDMI.

For the love of a decent image, disable all the dynamic contrast and brightness options in the TV menu. Using the dynamic contrast option made everything, especially reds, look dithered and damn near like an 8-bit display. It also flickered (again, especially reds) when dealing with solid color blocks over HDMI.

According to Samsung, they don't officially support hooking this television to a PC via HDMI, as they claim that's what the VGA connector is for. Their how-to guide notes at the bottom that LN-R series sets do not support computer connections via DVI/HDMI.
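A quick sketch of the arithmetic behind this thread's symptoms, assuming the panel really is 1366x768 as suggested above. This is illustrative Python with a hypothetical helper name; the "thin borders" case assumes the set centers a near-native mode instead of rescaling it, which would explain why VGA looks fine:

```python
# Scale factors at play in this thread, assuming a native 1366x768 panel.
# describe_scaling is a hypothetical helper, not any real display API.

def describe_scaling(signal: tuple[int, int], panel: tuple[int, int]) -> None:
    sx, sy = panel[0] / signal[0], panel[1] / signal[1]
    if signal == panel:
        verdict = "1:1, pixel-perfect"
    elif abs(panel[0] - signal[0]) <= 8 and panel[1] == signal[1]:
        verdict = "near-native; the set can center it with thin borders"
    else:
        verdict = "non-integer resample; single-pixel text strokes smear"
    print(f"{signal[0]}x{signal[1]} on {panel[0]}x{panel[1]}: "
          f"scale {sx:.4f} x {sy:.4f} -> {verdict}")

panel = (1366, 768)
describe_scaling((1280, 720), panel)   # what HDMI carried: blurry text
describe_scaling((1360, 768), panel)   # what VGA carried: only 6 px narrower
describe_scaling((1366, 768), panel)   # the custom resolution worth chasing
```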
#6
> According to Samsung, they don't officially support hooking this television to a PC via HDMI, as they claim that's what the VGA connector is for. Their how-to guide notes at the bottom that LN-R series sets do not support computer connections via DVI/HDMI.

Yeah, I am starting to think that's because they don't have a way to set it to do 1:1 over HDMI. There is no "Just Scan" option like on the newer sets. It appears to do that automatically over VGA, which is why it probably looks fine. I also wonder about the color mapping.

The thing is, I'm not sure "PC mode" on the display should even matter in this case. Since the graphics card has HDMI and is in an HTPC, shouldn't it be smart enough to behave like a consumer electronics device when hooked to a display identifying itself as a digital TV? CE devices all seemed to work fine (BD player, AppleTV, WDTV, S3 TiVo, etc.). Any PC that advertises itself as an HTPC should send the same signal over HDMI as a standalone CE device, right?
#7
> Originally posted by AMSR: Any PC that advertises itself as an HTPC should send the same signal over HDMI as a standalone CE device, right?

Well, uh... there is "should" and there is "would".
Optimal distance to a working monitor vs a TV screen (multimedia monitor)

The technical parameters of monitor screens and TV screens have become significantly closer: both often have the same 4K resolution, the same 16:9 aspect ratio, and the same HDMI video interfaces; monitor diagonals keep growing, catching up with televisions (as a result, some even share the same panels); and many TVs, in turn, have "learned" to display an image pixel-for-pixel at 4:4:4 chroma resolution.

But there is still one parameter that fundamentally separates these two classes of devices, preventing the appearance of an "ideal display device" that is universal in all respects (or at least demanding serious compromises): the optimal distance to the screen, which differs in each case.
Photo: SpaceX mission and launch control centers
That is what we will talk about now.
What is good and what is bad 👍👎
Many people have probably noticed that it is more comfortable to work (namely work, not watch video or play) sitting relatively close to the screen, while a movie is best watched "on the big screen", at a sufficient distance from it.

And yet the "visual" (angular) size of the image is almost the same on a 12" tablet held 25 cm from the eyes, a 32" monitor at 65 cm (arm's length), a 100" TV at 2 meters, and a 200" cinema screen at 4 meters, while the feeling of watching a movie, the "depth of immersion", is completely different (leaving the quality of any particular film out of the brackets).
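The angular-size claim is easy to verify with a little trigonometry. A minimal Python sketch, assuming 16:9 screens:

```python
# Verify the claim: all four screen/distance pairs subtend nearly the same
# visual angle. Pure geometry; 16:9 screens assumed.
import math

def visual_angle_deg(diag_in: float, distance_cm: float) -> float:
    """Horizontal visual angle of a 16:9 screen at a given distance."""
    width_cm = diag_in * 2.54 * 16 / math.hypot(16, 9)  # diagonal -> width
    return math.degrees(2 * math.atan(width_cm / (2 * distance_cm)))

for diag_in, dist_cm in [(12, 25), (32, 65), (100, 200), (200, 400)]:
    print(f'{diag_in}" at {dist_cm} cm: {visual_angle_deg(diag_in, dist_cm):.1f} deg')
# All four come out at roughly 56-58 degrees: "visually identical",
# yet the comfort and immersion differ completely.
```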
In turn, working (writing an article, coding, drafting, drawing, calculating, etc.) on a 32" screen at arm's length is more comfortable than on a 100"-200" screen several meters away (some people adapt, but IMHO that is the exception rather than the rule).

Of course, since our vision is binocular, the human brain can estimate the distance to the screen, so we can tell that "visually identical" screens are actually different ones at different distances.

But why does the human brain "demand" that the screen be at the "correct" distance from the eyes, and become uncomfortable otherwise?

And why does what is good in one case become bad in another (and vice versa)?
Darwin is to blame for everything 🐒
I liked an explanation that I came across a long time ago, back at the end of the last millennium, in the pre-internet era (when ~~the trees were big~~ computer monitors were small and the TV was considered huge), in some paper popular-science Soviet magazine. Unfortunately, I don't remember which magazine it was, and I can't name the article either (although I later tried many times to find it on the Internet).

I can no longer reproduce the article literally (as far as I remember, it concerned not only work at a monitor but work in general, reading books, watching video, and so on), but its main point was this: over millions of years of evolution, human eyes and brain have adapted so that ~~passively watching movies~~ looking for "gifts of nature" on the ground and in the trees is done at a distance of several meters; tracking prey (while trying not to become prey yourself) preferably happens further away; active hunting (striking with a stone or throwing a spear) also happens at a fairly large distance; but ~~working at a computer~~ butchering prey, cooking mammoth shish kebab, gathering fruit, whittling a shaft for a new spear, drawing a rock painting, and so on is most convenient at arm's length (550-750 mm, in full compliance with GOST / ISO / SanPiN). Finer work, in which the object is held with one hand and worked with the other (holding a bone spearhead while shaping it with a flint tool and then attaching it to the shaft with mammoth sinew; peeling an orange; pitting a fruit; ~~holding a tablet, smartphone, or book in one hand while flipping pages with the other~~) is more ergonomic at an even shorter distance of 250-400 mm. That is why reading a book held in your hands at close range is comfortable, while reading the same book lying on the table at the same distance is not; you want to move back.
Strictly speaking, the article could end here: the main conclusions are clear from the crossed-out phrases. What follows are only small elaborations on this key paragraph.
So.
Work Monitor 🖥️
At a working monitor we essentially "work with our hands": we type the text of an article, search for information on the Internet, read, write code, operate with numbers in calculations, draw lines, arcs, and splines in AutoCAD, paint in a graphics editor, and so on. It is more convenient for us when the result of this "manual work" is displayed at a working arm's-length distance (550-750 mm), even if we never actually touch the screen with our hands (for me, touching it is generally taboo). The mouse cursor is essentially a "representative" of our hand on the screen; in a sense it is our hand. I think many people have tried to shoo a fly off the screen with the cursor, and some (the author included) have tried to nudge a fly off the monitor's top edge with it.
At the same time, while working we can lean in or pull back from the monitor by 50-100 mm practically without changing the position of the body. For example, if our main working position is 600-650 mm from the screen (the top edge of the monitor at the distance of bent fingers), we can lean in to 500-550 mm (the top edge under the wrist) to examine some small detail of the image, or lean back to 700-750 mm (the top edge at the fingertips) to see the entire image at once.

That is, with a small movement, without shifting the chair or lifting our hands off the table, we can "zoom" the image by almost a factor of one and a half (700-750 mm vs 500-550 mm)!

At a greater distance from the screen this would be impossible.

And if the screen is wide, then, besides turning the head, we can quickly lean slightly left or right by the same 50-100 mm to focus on the desired side of the screen; this is especially relevant if a large monitor's screen is not curved and the gaze deviates noticeably from the normal at its edges.
Working at a computer is more convenient when the main working area of the monitor is located below eye level, which is also logical in general — we “work with our hands”, and the “working area” is usually located below eye level.
Moreover, all of the above holds regardless of the size and proportions of the monitor screen, its resolution and pixel density, and even the number of monitors, "but that's a completely different story" ©, deserving a separate discussion.
And by the way, the same working distance is required of us by

SanPiN 2.2.2/2.4.1340-03, "Hygienic requirements for personal electronic computers and organization of work":

...9. General requirements for the organization of workplaces for PC users...

9.4. The video monitor screen should be at a distance of 600-700 mm from the user's eyes, but not closer than 500 mm, taking into account the size of alphanumeric characters and symbols.
and
GOST R ISO 9241-5-2009, "Ergonomic requirements for office work using video display terminals (VDTs). Part 5: Workstation positioning and operator posture requirements":

...A.2.12 Observation distance and deviations

The optimal distance between the video display and the user's eyes depends on various factors. The design viewing distance, i.e. the distance specified by the display manufacturer, should be greater than 400 mm. The optimal viewing distance for seated office work is 600 mm, although individual users prefer distances from 450 mm to 750 mm...
Multimedia monitor, home theater, TV 📺
On it, the user, a.k.a. the viewer, passively watches events, admires the scenery together with the film's characters, looks for "gifts of nature", "stalks prey", "surveys the surroundings", "hides from enemies", and so on.
As we have already noted, it is more comfortable to do all this at a distance of at least a few meters.
And so that the visual size of the image does not suffer, the screen should be as large as possible given the expected resolution of the video content.*

* This refers to the real resolution of the video content, not an upscale from lower-resolution video and not low-bitrate "re-encodes" that formally match the required resolution but in fact do not.

At the same time, it is desirable to have some margin of distance, or the opportunity to move back a little, because unfortunately we cannot always choose video of the resolution we would like.
I once came across a simple visual check of whether the diagonal and viewing distance are matched correctly:

- If, when viewing widescreen video on a 16:9 screen, there is no urge to "stretch" the image to full screen, then the screen size and viewing distance are chosen correctly.
- If such an urge subconsciously arises, then the viewing distance is too large (or the screen is too small).
Unlike working at a monitor, when watching video (passively observing events on the screen) it is desirable that the center of the screen sit slightly above eye level, so that the viewer, though a passive observer, feels in the thick of events, a participant in them, rather than looking down on them.
Information monitors, gaming monitors, laptops, tablets, and other mobile devices

Initially I planned separate sections on these (preliminary sketches survive in my drafts), but then decided to limit myself to the working and multimedia monitors named in the title, so as not to clutter the article. If there is interest, we can talk about them too, "but that is a completely different story..." (© ABS)
So.
Some conclusions 🤔
They turned out to be rather contradictory…
On the one hand, a TV or multimedia monitor needs a sufficiently large viewing distance, at least 1.5-2 meters: from closer up it is simply not as comfortable to passively observe events, admire the scenery together with the film's heroes, look for "gifts of nature", "survey the surroundings", "hunt down prey", and "hide from enemies".
Based on this, the optimal diagonal of a multimedia monitor for 4K content is about 55"-75" or more (here and below a 16:9 screen is assumed; for other aspect ratios the numbers differ, but the logic is the same).
On the other hand, as we have already noted, it is uncomfortable to work (namely work, not "occasionally do a little work") at such a screen at arm's length.
Theoretically, of course, one could create a "virtual screen" of the required size on such a video wall, switching off the unused area or filling it with a neutral picture, as in Verhoeven's film "Total Recall",
Still from the movie «Total Recall» (1990)
but keep in mind that the pixel density of a 65" screen is half that of a 32" monitor with the same resolution; that is, to host a 32" 4K "virtual monitor", a 65" video wall would itself need 8K resolution. "What a squiggle..." ©
However, if you don’t chase the ideal, you can try to find a more or less acceptable compromise.
For example, a 40" monitor, IMHO, can be considered a good "desktop" multimedia screen for ~~selfish~~ individual video viewing from about a meter and a half (for two viewers that distance is already not very comfortable). At the same time, the pixel density of a 40" 4K screen is still quite high, the same as a 27" QHD monitor (again, we are talking 16:9), and such a monitor can still be placed on an office desk and used for work at arm's length.

At larger diagonals the pixel density drops too far: a 50" 4K screen has the same ppi as a 25" Full HD one, which is probably not enough for a working monitor by today's standards. (The sketch below checks these ppi figures.)
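A minimal ppi check for the figures quoted above (plain Python; ppi() is a hypothetical helper, and all resolutions are 16:9):

```python
# Pixel-density arithmetic for 16:9 screens.
import math

def ppi(h_px: int, v_px: int, diag_in: float) -> float:
    """Pixels per inch of a panel from its resolution and diagonal."""
    return math.hypot(h_px, v_px) / diag_in

print(f'32" 4K : {ppi(3840, 2160, 32):.0f} ppi')  # ~138
print(f'65" 4K : {ppi(3840, 2160, 65):.0f} ppi')  # ~68, half of the 32" figure
print(f'65" 8K : {ppi(7680, 4320, 65):.0f} ppi')  # ~136, back to 32"-4K territory
print(f'40" 4K : {ppi(3840, 2160, 40):.0f} ppi')  # ~110
print(f'27" QHD: {ppi(2560, 1440, 27):.0f} ppi')  # ~109, nearly identical
print(f'50" 4K : {ppi(3840, 2160, 50):.0f} ppi')  # ~88
print(f'25" FHD: {ppi(1920, 1080, 25):.0f} ppi')  # ~88, the same
```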
On the other hand, 32" is probably the absolute minimum for a TV screen, with a viewing distance of just over a meter.
Of course, with a viewing distance of a meter and a half or less, there is hardly room to place a proper sofa or armchair for viewing, since the work chair must occupy the same space; there is simply no room to spare. So we will have to watch movies from the same office chair we work in. Fortunately, modern office chairs have wheels, and rolling back to the desired distance is easy. The main thing is to have enough room behind you for such a maneuver.

But here a hidden rake lies in wait. Two rakes, in fact...
- Firstly, an office chair holds you too upright, and in that position it is uncomfortable to watch a movie for several hours; you want to be able to lean back and sprawl "as in an armchair" (and, if the movie is boring, take a nap).
- Secondly, as we just found out, the working area of a monitor should sit below eye level, while for watching video the center of the screen should sit slightly above eye level.

Most modern office chairs are adjustable in height and in the tilt of the seat and/or back, but undoing your carefully tuned working adjustments for the sake of a movie, only to restore everything an hour and a half later, bouncing on the chair for several minutes hunting for the right position, is too much bother. In any case, it bothered me.

Of course, there are chairs with memory for several positions, but their prices are thoroughly inhumane (at least by my standards). In that case, though, it would be convenient to set up two positions, "working" and "multimedia", and switch between them as needed.
But I chose a more budget-friendly way to solve this problem.
I have a perfectly ordinary chair with no memory adjustments. Its back is set to the "upper" (working) position; the horizontal position of the seat is likewise pre-adjusted and held by a latch that blocks the backward tilt (the recline spring is set to its minimum, but thanks to this lock the chair does not recline in the working position); and to fix the required seat height, an easily removable 12 cm plastic spacer is installed on the chair's gas-lift rod.

When the seat needs to be lowered, the spacer comes off as easily as it goes on, the tilt lock is released, and the seat can be lowered and reclined.
As a result, the chair has two «quasi-fixed» positions:
- "Upper": the seat is at a height of about 50-52 cm and the back is raised. It is comfortable to sit at the desk and work at the computer 55-65 cm from the screen ("outstretched arm"), with the top edge of the screen slightly above eye level, the gaze above the center of the screen, the left hand on an additional pad in front of the mouse (see the article "How to protect yourself from carpal tunnel syndrome?"), and the right hand on a small pillow lying on a "shelf" fixed to the open desk door.
- "Lower": the seat is lowered to a height of about 38-40 cm and tilted back, and the backrest is reclined as far as it will go. It is comfortable ~~to doze off~~ to watch a movie, lounging in the computer chair about 120-130 cm from the screen and looking slightly below its center. Additionally, under the desk there is a leg "sling" made from a car seat belt.
About the 32" monitor-TV Samsung LT32E310EX that I now use as a "working-multimedia monitor", I have already written in the articles "A simple way to get "Flicker-Off": "disabling" the PWM flicker of LCD monitor and TV backlights" and "Workplace illumination, backlight, screen brightness vs eye fatigue".
- The monitor stands on a "school" desk on a small podium 18 mm high. To its right is a full-size desktop PC with a 5th-gen NUC on top (not visible in this photo) and an Android TV box (with a clock) below it; behind the monitor are a router in Wi-Fi bridge mode and a NAS (not visible in the photo).
- For lighting, one lamp is placed above the monitor and aimed back at the wall to create a bias light (powered from a USB-controlled extension cord that switches together with the monitor); a second lamp, almost overhead (visible in the upper right corner of the frame), lights the desk and keyboard; general lighting comes from a chandelier in the middle of the room. For details, see the article "Workplace illumination, backlight, screen brightness vs eye fatigue".
- Screen brightness with the computer inputs connected (desktop and NUC): 125 cd/m². Screen brightness with the "multimedia" input (TV box) and TV connected: 300 cd/m².
For more on setting this up, see the article "A simple way to get "Flicker-Off": "disabling" the PWM flicker of LCD monitor and TV backlights".
I will supplement this description with photographs of how the monitor looks from the "working" and "multimedia" positions, from the eyes' point of view. The photographs below were taken with a flash so that the position of the eyes relative to the monitor screen can be judged from its reflection.
- The monitor is connected to a “working” computer input and displays the Windows 10 desktop.
- The chair is in the upper, "working" position, eyes 60-65 cm from the monitor screen.

The reflection of the flash in the photo shows that the eyes are positioned at roughly 1/5 of the screen height from its upper edge, shifted slightly to the left of the screen's vertical axis (the monitor screen is not curved; the main working area is closer to the left side, while help information and auxiliary menus usually sit on the right).

The Gamma Shift test, also illustrating the position of the eyes relative to the monitor * (clickable picture; view at 100% scale).

* The test's site no longer works owing to the death of its author, Alexander Sokolov. Most likely the site died with him, but I will leave the link to it as a "memorial".
"Multimedia" position of the eyes relative to the monitor

- Shot with the camera of a Samsung Galaxy J7 SM-J730F smartphone (27 mm equivalent focal length) held as close to the face as possible, almost next to the eyes; the picture was deliberately taken with a flash so that the position of the eyes relative to the monitor screen can be judged from its reflection.
- The monitor is connected to the "multimedia" input, and the Android TV box launcher is displayed on its screen.
- The chair is in the lower, "multimedia" position, eyes 120-130 cm from the screen.

The reflection of the flash in the photo shows that the eyes are positioned at roughly 1/3 of the screen height from the monitor's bottom edge, almost on the screen's vertical axis.
Conclusion
And in conclusion, a question that has probably occurred to some readers:

- What is all this for, actually? Why try to cross a hedgehog with a snake? Wouldn't it be better to have a separate working monitor and a separate TV, multimedia monitor, or projector, and not make any compromises?
So, who might need such a combo:

Firstly, ~~egoists~~ single people living in a small one-room apartment, studio, or room, where there is simply no space for a second large screen (and a second small one makes little sense, since it would have no particular advantage over the "combo" considered in this article). And with only one user, there is practically no need for a second device anyway.

Secondly, a family living in a house or multi-room apartment, with a large TV in the living room (for example, as in the picture in the middle of the article) and a separate study for each family member. There it may be desirable not only to work in your study but also to relax: to ~~have fun~~ watch a movie to your own taste alone 🤠 while "Luntik" plays in the living room 👾 (or something to the taste of your significant other 💕).
Well, for family viewing, of course, you can gather in the living room and watch a family movie on the Big Screen.
However, I hope that the article will be of some interest to other readers.
My other publications from this cycle:
🖥️ FullHD vs 4k and integer scaling: is 2 x 2 always = 4?
🖥️ Workplace illumination, backlight, screen brightness vs eye fatigue
🖥️ An easy way to get ”Flicker-Off”: “turn off” the PWM flicker of the backlight of LCD monitors and TVs
🖥️ Test to check the color resolution of a monitor or TV when connected to a computer via a digital video interface
🖥️ The method of self-determination of the response time of the LCD screen of a monitor or TV
What you need to know about HDR (Hardware on DTF)
Dynamic range is more important than 4K.
It is impossible to show HDR on an SDR display, so in comparisons like this the left half is usually degraded on purpose to get the idea across.
4K as a standard has finally reached the masses. 2017 can safely be called the turning point when this resolution became truly consumer-grade. Panel prices dropped below $500 even for large diagonals, and the content itself, albeit with caveats, began to meet the necessary requirements.

Consoles have provided invaluable support here. The advent of the PS4 Pro and Xbox One X significantly accelerated the adoption of 4K screens. Gradually (though not without a pile of reservations), the extensive Hollywood film library is being transferred to 4K. Already, almost all original Netflix shows can be watched in decent Ultra HD.
Difference between 4K and Full HD
But do you really need 4K resolution in a TV? On a monitor, which usually sits a few dozen centimeters from the eyes, a Full HD picture begins to "crumble" into pixels at 27 inches already; this is why Retina/5K and even plain 1440p are gaining popularity.

Among TV viewers, adoption is slower: from three meters away, any difference at all becomes visible only from about 43 inches, and ideally, to justify buying a 4K TV, you need a diagonal larger than 49 inches (the sketch below shows the geometry behind such estimates). Obviously, such a luxury is hard to house in most Russian apartments; even with the money, a huge TV sometimes simply has nowhere to go.
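A rough way to sanity-check such distance/diagonal figures is to compare the angle one pixel subtends with the classic one-arcminute resolving limit of 20/20 vision. The Python sketch below is a simplified model (square pixels, the 1-arcminute rule of thumb); in practice differences show up somewhat earlier on high-contrast edges, so treat the threshold as conservative:

```python
# Angle subtended by one pixel vs the ~1 arcminute acuity rule of thumb.
import math

ACUITY_ARCMIN = 1.0  # rule-of-thumb resolving limit of 20/20 vision

def pixel_arcmin(h_px: int, v_px: int, diag_in: float, dist_m: float) -> float:
    """Angle (in arcminutes) subtended by one pixel at a given distance."""
    ppi = math.hypot(h_px, v_px) / diag_in
    pixel_mm = 25.4 / ppi
    return math.degrees(math.atan(pixel_mm / (dist_m * 1000))) * 60

for diag in (43, 49, 55, 65, 75):
    fhd = pixel_arcmin(1920, 1080, diag, 3.0)
    print(f"{diag}\" at 3 m: a Full HD pixel spans {fhd:.2f} arcmin "
          f"(4K starts to pay off once this exceeds ~{ACUITY_ARCMIN} arcmin)")
```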
Exaggerated view of HDR
The author of this article tormented himself with the same thoughts before buying a new TV: why not take a good Full HD set and forget about this whole 4K business? The thought is reasonable only at first glance, because on decent panels HDR comes attached to the resolution. And this seemingly under-promoted technology really feels like a genuine leap in image quality, even if there is not as much content for these standards as we would like.

The purpose of this material is to explain simply and clearly what HDR is and how to use it. DTF already has a piece on what the process of buying and installing a 4K HDR TV looks like; here we examine the terminology in more detail.
What is HDR and why is it needed
Visual difference
Let's start with the fact that the very term HDR looks like a clumsy marketer's joke: it creates unnecessary confusion from the start. The HDR found on TVs has nothing to do with the similarly named technology in smartphones and cameras. In photography, HDR is the combination of several images at different exposures, with the goal of producing the most even result in terms of shadows and highlights.

With video it is different: there, the point is the total amount of information. Before HDR, the picture standard, including on Blu-ray, was 8-bit. That is a decent color range, but modern panels, especially OLED, can display far more shades, gradients, and colors than 8-bit sources allow.

The new 10-bit standard is designed to solve this problem: it carries significantly more information about the brightness, saturation, depth, and color of the displayed scene. The HDMI 2.0 protocol was developed to transport it.
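What the extra bits buy is easy to quantify; a minimal sketch in Python (plain arithmetic, also covering the 12-bit Dolby Vision discussed below):

```python
# Levels per channel and total representable colors for each bit depth.

for bits, name in [(8, "SDR / Blu-ray"), (10, "HDR10"), (12, "Dolby Vision")]:
    levels = 2 ** bits        # gradations per color channel
    colors = levels ** 3      # R x G x B combinations
    print(f"{bits:>2}-bit ({name}): {levels} levels per channel, "
          f"{colors:,} colors in total")

# 8-bit : 256 levels,  ~16.8 million colors
# 10-bit: 1024 levels, ~1.07 billion colors (4x finer gradients per channel)
# 12-bit: 4096 levels, ~68.7 billion colors
```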
But do not rush to replace all your cables along with the TV! Older cables are compatible with HDMI 2.0a, no matter what marketers hang on your ears; the main thing is that they be marked "High Speed with Ethernet". Only bandwidth matters here, and the connectors themselves have not changed.

At the time of this writing, TV HDR is pegged at exactly 10 bits, although the digital filming standard is 14-bit RAW (and film goes even higher), so even modern panels are still far from displaying all the information that directors and editors work with on many pictures.
Okay, but what does it do in practice?
HDR vs SDR example based on Uncharted 4
In HDR scenes the sun begins to shine noticeably brighter, and it becomes clear that a single frame can contain several light sources of different brightness and saturation. The problem of pixelated midtones in dark scenes disappears, and gradients and complex color blends gain volume. The sky is no longer overexposed and no longer merges with the ground at the horizon. In short, you see the picture more as your own eye would see it, rather than through the limited eyepiece of a camera.

At the moment, video games benefit most from the technology, with real-time lighting of all kinds finally displayed more or less correctly. In cinema, much depends on the quality of the HDR mastering of the source, and there is plenty of outright hackwork, but there are already reference films for the format, such as Guardians of the Galaxy 2 or John Wick 2.
Television manufacturers are your enemies
So far HDR, like any advanced technology, is the Wild West. All new TVs are labeled HDR Ready whether or not they can adequately display 10-bit content. There is a big difference between a true HDR TV and a panel that merely accepts HDR content, downsampling it to 8-bit and simply cranking up colors and contrast.

It is sometimes hard to pin the manufacturers down. Look for whether the panel is advertised as 10-bit and whether it meets the open HDR10 standard. A sure sign of a true HDR panel is support for Wide Color Gamut (WCG); without it, HDR loses all practical meaning.

A notable number of LCD TVs use 8-bit panels that "think up" the extra colors using special algorithms. The HDR picture on such panels is somewhat worse than on OLED, but they are also much cheaper.
The return of the bit wars

At every stage in the evolution of the picture, a war of standards inevitably arises. But while the battle between Blu-ray and HD DVD ended in failure for the latter, including for the consumers who bought its hardware, the battle of HDR10 against Dolby Vision HDR is likely to end in a bloodless draw.

The HDR10 standard carries fewer colors and supports only 10 bits, but it is fully open. The Dolby standard preserves shades better and extends to 12 bits, but it is harder to comply with and implement. In any case, support for either technology can be added with a simple software patch, and games already work de facto with HDR10, since both Sony and Microsoft chose it for their consoles.
TVs themselves often allow you to use several standards at once, so it’s not worth bothering with this now.
What should you watch it on?

If we are talking about the screen, it is better, of course, to get an OLED or one of its analogues: you get deep blacks and full support for all the HDR goodies. If your wallet will not stretch to roughly 80 thousand rubles for a top-end TV, do not despair. The 2017 LCD models have finally outgrown their teething problems, and you will immediately notice the difference between HDR and SDR, even if you lose something in black levels and brightness. The author of this article has an LCD panel with HDR support and, I assure you, the difference from standard-color content is visible from the first seconds.
As far as the source is concerned, all modern game consoles output HDR in one way or another (except for the Switch and the «thick» Xbox One). The PS4 just outputs HDR (no 4K), while the Xbox One S/X allows you to both play UHD discs and stream native 4K HDR directly to your TV. Of the online services, Netflix and Amazon already support the standard, and Netflix includes both a library for HDR10 and content in Dolby Vision.
What to watch?
All original Netflix content since 2016, plus all 4K movies released by the studios on disc and digitally. Very soon a collection of Christopher Nolan films will go on sale, the remastering of which was supervised by the director himself. As with The Dark Knight on Blu-ray, it is sure to set the standard for both UHD masters and HDR for years to come.
What to play?
A noticeable number of games support HDR even on the "base" consoles. Games such as Horizon Zero Dawn, Uncharted 4, and Gran Turismo Sport lean especially brightly (forgive the pun) on the technology's possibilities.

The latter was built from the ground up for HDR, and so it shows off all the advantages of an extended range of brightness and color. Especially for GT Sport, Polyphony Digital developed HDR cameras to capture real-world footage and then calibrate the in-game picture against it. The number of colors the game can render still exceeds the capabilities of even the most expensive panels: a benchmark "with room to grow", as they say.
However, not all games are adapted to HDR equally well, so read reviews on the Internet and watch Digital Foundry reviews. There is no need to worry, though, as developers are getting better and better at understanding the possibilities of an extended range, and, consequently, the quality of console content will only grow.
Things are not so smooth on PC at the moment. There are few games with HDR, and image output is plagued by system-level color-rendering problems (buggy drivers, Windows quirks, and so on). The author's experience with HDR on Windows 10 has been mixed. On top of that, popular video players still do not handle HDR well, so we will have to wait. Judging by how quickly 3D was adopted on PC, a sane implementation of HDR on computers should be expected in about six months, and the library will catch up.
So is everything rosy, or are there drawbacks?
Usually calibration tables in games look simpler
In its current iteration HDR also has plenty of drawbacks, but I will single out a couple of critical ones.

- Your equipment is most likely not ready for HDMI 2.0a and HDCP 2.2, so you will almost certainly have to replace the receiver along with the TV. I ran into this myself (first-revision PS VR), and Vadim ran into it too (a receiver with HDMI 1.4).
- If HDR seems to spoil the picture or wash out colors, the screen needs calibrating. Some games offer convenient calibration tools (CoD: WWII, GT Sport), but for the most part you will have to rely on your own judgment.