Forums

Topic: Anybody else find that adjusting your TV settings is better than HDR?

Posts 1 to 17 of 17

nomither6

The HDR feature on the PS4 Pro, for me, doesn't look as good as just tweaking my TV settings. I tried to give the HDR on my TV the benefit of the doubt, but games just don't look as good with it. It was just another case of me falling for the hype. At least it was, until I started playing with my TV's color & contrast settings, sharpness, and a bunch of other things I have no idea what they mean, lol.

Just thought I'd put that out there if anyone ever wants to try it. It did wonders for me, and I even like to switch up my TV settings sometimes - I make some games look quasi-realistic, and some games, like COD, I make look cartoonish by messing with the vibrancy and saturation.

Edited on by nomither6

nomither6

Zeldafan79

I was driving myself nuts trying to get the perfect picture on my PS4 and Switch. I'm like, should the RGB setting be on limited or high range? Is factory default good enough? Where should my black level be? Etc., etc. Then I just said, ah, screw it, I could be playing my game instead of fighting with the picture options all day.

Ever notice sometimes the in-game picture settings don't even seem right? You put it on the suggested settings and it still seems off. For example, I was playing Dark Souls and I was constantly having to adjust the brightness because in certain areas I couldn't see worth a damn. I had the brightness bar where it said to put it, and I might as well have been playing blindfolded.

"Freedom is the right of all sentient beings" Optimus Prime

nomither6

@Zeldafan79 "Ever notice sometimes the in-game picture settings don't even seem right? You put it on the suggested settings and it still seems off."

Yep, I realized that a while ago, actually. I used to presume that default on everything was the best way to go and that I shouldn't mess with anything, but boy, was I wrong. It's all about customizing your experience to how you like it. What someone else thinks looks good may look like trash to you. The way I have my custom settings now, compared to when I switch over to the "default" picture, is like night and day. My custom settings are so much better, even in games where I adjust the brightness/graphics my way.

You should give it another try - you'd be surprised just how much better you can make a game look on your own. It'll enhance the immersive experience too.

nomither6

JohnnyShoulder

Yeah, I messed around with my TV settings like crazy when I got my 4K TV. Apart from one or two games, I was never really satisfied with the results for HDR on my OG PS4. The only game I've had a problem with is Cyberpunk on PS5, but that had a weird implementation of HDR.

Everything else has been fine, so either I have just got used to it or all that messing around with the TV settings has made a difference.

Life is more fun when you help people succeed, instead of wishing them to fail.

Better to remain silent and be thought a fool than to speak and remove all doubt.

PSN: JohnnyShoulder

johncalmc

I don't know what HDR does and I don't know how to set up my TV. My TV turns on and I can play games on it. That's my level.

johncalmc


nomither6

@nitram2k11 "Same here, it's amazing how much difference decent TV settings can make when compared with the out-of-the-box settings."

Yeah, it's better than the HDR on my TV; I have that turned off. It's practically useless compared to the TV settings and how much you can adjust on your own.

nomither6

BAMozzy

Adjusting your TV is NOT the same as HDR. SDR is 8-bit colour and doesn't have as 'wide' a colour range. Not only that, SDR is mastered to around 120 nits. Yes, you can turn the 'brightness' up, but you are not getting the highlight detail either. Just being 'bright' doesn't make it 'correct' or better.

HDR isn't about making 'EVERYTHING' brighter - it's about getting more accurate and more realistic colours and lighting. Take a lightsabre or a neon sign, for example. In SDR, they often use 'white' to simulate the look of it 'glowing', with a gradient around the outside to give the impression of 'colour'. In HDR, they could just use a very 'bright' red. It's a similar trick to artists who paint 'fire', for example, and use white to simulate intense brightness.

HDR is about colour volume, and HDR has significantly greater colour volume:

[Image: colour range triangles - Rec.709 (SDR) vs Rec.2020 (HDR)]
[Image: colour volume - colour range across the base, brightness in nits on the vertical axis]

The top image is the 'colour range', often shown as a triangle. 709 is SDR and 2020 is HDR - as you can see, the 2020 triangle is much 'bigger', so it covers a wider colour range. SDR is mastered to about 120 nits (or thereabouts). HDR is mastered to at least 1,000 nits, some content at 4,000 nits, and some games up to 10,000 nits. In the second image, that colour range is along the bottom and the 'height' is brightness in nits. An SDR red (Red 255, Green 0, Blue 0) can only go from 0 to 100-120 nits, but an HDR red (R 1023, G 0, B 0 - HDR is 10-bit, so it has more 'colours' too: over 1 billion vs 16 million) can go from 0 to 10,000 nits. That difference in colour volume is what HDR is about - not just 'brightness', but getting more accurate to real life. The SDR spec was based on the limitations of CRT TVs, but our eyes and 'real life' are much brighter, and HDR gets 'closer' to the range of colours and brightness our eyes can actually see...
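
As a quick back-of-the-envelope check of those colour counts (just a rough sketch assuming plain 8-bit vs 10-bit RGB, ignoring chroma subsampling and the like):

# 8-bit SDR vs 10-bit HDR: number of addressable RGB combinations.
sdr_colours = (2 ** 8) ** 3    # 256 levels per channel  -> 16,777,216 (~16.7 million)
hdr_colours = (2 ** 10) ** 3   # 1024 levels per channel -> 1,073,741,824 (~1.07 billion)
print(f"SDR (8-bit):  {sdr_colours:,}")
print(f"HDR (10-bit): {hdr_colours:,}")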

If you don't have a 'decent' HDR TV, then there is a chance the HDR image will be 'tone mapped' down to the limitations of your TV. It's like downscaling a 4K image to a 1080p screen - it has to lose 75% of the pixels to 'fit' on the screen. When the 'bulk' of the image is under 200 nits (for example) and the peak brightness may hit 1,000 nits, but your TV is only 400 nits max, it's condensing 0-1,000 nits into 0-400 nits. That often looks 'darker' and 'duller', because either the 0-200 nit range has to be reduced, or all the highlight detail - everything brighter than your TV can display - gets 'clipped'. Whether you get clipping or compression depends on your TV's tone-mapping algorithm.
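
A rough sketch of those two options (illustrative only - real TVs use far more sophisticated roll-off curves than either of these, and the numbers are just examples):

# Two crude ways to fit 0-1,000 nit content onto a 400 nit panel.

DISPLAY_PEAK = 400.0   # example peak brightness of the TV, in nits
CONTENT_PEAK = 1000.0  # example mastering peak of the content, in nits

def clip(nits):
    # Hard clipping: everything above the panel's peak is lost (blown-out highlights).
    return min(nits, DISPLAY_PEAK)

def compress(nits):
    # Naive linear compression: highlight detail is kept, but every level is
    # scaled down, so the whole image ends up darker and duller.
    return nits * (DISPLAY_PEAK / CONTENT_PEAK)

for scene_nits in (50, 200, 500, 1000):
    print(f"{scene_nits:>5} nits -> clipped: {clip(scene_nits):6.1f}  compressed: {compress(scene_nits):6.1f}")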

Not all HDR TVs are created 'equal'. A LOT are more 'HDR compatible', in that they will take an HDR image and 'tone map' it down to fit the limitations of the TV rather than say 'format not supported/recognised', but that doesn't mean you are 'watching' an HDR image as it was mastered and meant to be 'viewed'. It's like saying a 4K image downscaled to a 1080p screen is a '4K' image - it's still a 1080p image, because your TV only has 1920x1080 pixels and can only display that many pixels regardless. If your TV isn't at least 1,000 nits, then it's going to be tone mapping (downscaling the HDR quality) to fit the capability of the display, so it's not the HDR image as it was meant to be viewed.

A lot of recommended SDR TV settings offer a daytime and a night-time option - brighter during the day to offset the 'brighter' ambient lighting - but with HDR, they are ALL tone mapping down at least 'some' content. No domestic display can reach 4,000 nits (let alone 10,000), so some content will have to be 'darker' than it was intended to be. You can't exactly 'compensate' for watching HDR during the day either, as it's likely to be 'darker' than it should be due to tone mapping - even at night. So something in SDR at, say, 100 nits can be turned up to 200 nits during the daytime, but in HDR, that 100 nits could be just 70 nits (for example) because it's tone mapped down to fit the 'highlights' in. You can't turn it up any more because it's already at max brightness, so it looks 'darker' and 'worse' - but that's more because of the display's limitations than the HDR format itself.

Edited on by BAMozzy

A pessimist is just an optimist with experience!

Why can't life be like gaming? Why can't I restart from an earlier checkpoint??

Feel free to add me but please send a message so I know where you know me from...

PSN: TaimeDowne

nomither6

@johncalmc Here's a look at what adjusting my TV settings did:

https://imgur.com/wLNDM1Z

The top row, I guess, is 'default', and the bottom pics are after I tweaked the settings. The quality is more noticeable in person, rather than through my crappy phone.

nomither6

nomither6

@BAMozzy "HDR is about colour volume, and HDR has significantly greater colour volume"

Seems like I did that on my own, and better than HDR, since HDR on my TV is garbage. I posted a picture in my last post of how it looks - the bottom row is just me adjusting the settings without HDR, and it looks better than what HDR on my TV does. I also double-checked before I responded, by trying HDR again to make sure, and it still doesn't look as good for me with this TV. It looks a lil' washed out and too soft - the picture doesn't 'pop' with detail, contrast & color like my custom settings do.

Edit: Btw, as I mentioned in my main post, I tweaked a lot of settings - things I don't even know the meaning of, like gain, color space, color calibration, black detail, offset, brightness, saturation, gamma, color temperature, tint, contrast, etc.

Edited on by nomither6

nomither6

BAMozzy

@nomither6 'Better' is arguable - on the one hand, you have personal preference, which can't really be quantified as to what makes it 'better' in your opinion over what someone else may 'prefer'. In that situation, it's pointless to argue because it's a 'personal' preference...

However, there is an alternative that 'experts' and the industry look at when it comes to reviews etc., and that is how 'accurately' a TV can reproduce the image and colours. Most reviews will give the colour accuracy out of the box and after 'professional' calibration - some will even give the measured readings for both. Better, in this case, is quantifiable: the more 'accurately' the TV reproduces colours across the spectrum, the 'better'. Cinema projectors are calibrated frequently to ensure they display the colours and brightness as the director intended.

I can make my TV look gaudy and bright, over-saturate the colours to make them 'pop' more, but that's not 'realistic' or the way the content was intended to be 'viewed'. You can also lose 'detail' in highlights and shadows, so as well as less accurate colours you get less detail, because you crush shadow detail or blow out/over-saturate highlight detail. That being said, if you 'prefer' that look, that's your choice, but in terms of 'better', I'd argue that it's 'significantly' worse because it's much less accurate - the fact that you have played around with the settings without really understanding what they do means it's probably less accurate than it was out of the box! At the end of the day, it's 'your' TV and it's you, not me, that will be viewing it. My post was to explain WHY High Dynamic Range may look 'worse' than SDR.

If it's washed out, chances are you have a mismatch between your console's and TV's settings - Limited RGB on one and Full on the other - when you should have BOTH set to the same: either both Full or both Limited. Full RGB goes from 0-255, Limited from 16-235. If you send Limited to a display expecting Full, the blackest black arrives as 16,16,16 (not 0,0,0), so it gets shown as dark grey and the picture looks washed out. If you send Full to a Limited display, it crushes the blacks (everything below 16,16,16 gets crushed to black). Some TVs, especially those from the first few years of HDR, were really 'SDR' TVs that would accept an HDR image and literally tone map it down to SDR, because the TV doesn't have the colour volume or the dynamic range to display the image as it was intended to be viewed. If you are judging HDR on a TV that's not great at displaying HDR, especially one that's also not properly calibrated, then you are not really seeing HDR well enough to say it's better or worse. What I am saying is that HDR on a decent HDR display will be 'better' (by better, I mean much more accurate to the director's/developer's intended vision) than messing around with settings and an SDR image.
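
As a rough illustration (just a sketch of the standard 0-255 vs 16-235 ranges, not anything a console or TV actually runs), here's what those two mismatches do to black levels:

# Illustrative only: converting a single channel value between full-range
# RGB (0-255) and limited/video-range RGB (16-235).

def full_to_limited(v):
    # Full 0-255 squeezed into the 16-235 'video' range.
    return round(16 + v * (235 - 16) / 255)

def limited_to_full(v):
    # Limited 16-235 expanded back out to 0-255 (clamped at the ends).
    return max(0, min(255, round((v - 16) * 255 / (235 - 16))))

# In Limited range, black is encoded as 16 and white as 235:
print(full_to_limited(0), full_to_limited(255))                       # -> 16 235
# A TV expecting Full range shows that 16 as dark grey, so blacks look washed out.

# Going the other way: a TV expecting Limited range expands the signal,
# so Full-range values below 16 all end up at 0 and shadow detail is crushed.
print(limited_to_full(0), limited_to_full(8), limited_to_full(16))    # -> 0 0 0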

The problem with HDR is that you NEED a decent HDR display to actually see the benefits of it and actually compare and contrast. A red LED light in a movie/game, in SDR, has to 'simulate' glowing by being 'white' in the middle and gradually fading to red at the outside edge - simulating the 'brightness' burning out the 'film' in a camera. In HDR, that is just a bright red that 'glows', because it's the luminosity that makes it glow, as it would in real life. If I put two pics side by side now, though, the SDR one would look 'brighter' because it's simulating brightness, and the HDR version would look 'dull' because it's being tone mapped down to SDR - that 'bright' red is just red; it's not glowing at 1,000+ nits because the display can't deliver that...

A pessimist is just an optimist with experience!

Why can't life be like gaming? Why can't I restart from an earlier checkpoint??

Feel free to add me but please send a message so I know where you know me from...

PSN: TaimeDowne

nomither6

@BAMozzy "If it's washed out, chances are you have a mismatch between your console's and TV's settings - Limited RGB on one and Full on the other - when you should have BOTH set to the same: either both Full or both Limited."

I do have both set the same, and I even double-checked and adjusted HDR. Like I said, I gave HDR on my TV the benefit of the doubt, and I've tried it plenty of times. It doesn't look as good on my TV, and I don't think it's just an opinion that my custom settings look better when the HDR on my TV - as I've said - is trash. Trash as in: it legitimately doesn't look good, objectively. The bad HDR on my TV is the sole reason I found out about adjusting my TV settings myself.

Edited on by nomither6

nomither6

nomither6

@BAMozzy This also isn't to diss HDR, if that's what it seems like; your explanations and thorough responses aren't going unnoticed. I'm specifically only talking about the HDR experience with my TV.

Edited on by nomither6

nomither6

BAMozzy

@nomither6 Again, personal opinion - and I can totally understand why, on certain TVs, HDR may indeed look worse in its tone-mapped presentation than messing with the settings to try and make it pop more. I also said that personal preference, and the fact that it's your TV and you who are watching it, are to be respected regardless - even if you understand it's less accurate, it's still your choice.

The title of the thread misled me a bit, and I wanted to explain why HDR is 'superior', how it can suffer the lower down the spec you go because of tone mapping, and why it matters. Real life, and our eyes, perceive a much greater dynamic range than even today's displays target - in that first image I posted, the big arc is the range of colours our eyes can see, the little triangle is what SDR offers, and no domestic TV covers 100% of the bigger (2020) triangle either (about 80% is considered 'good' now). The issue isn't the format, it's the display. I also wanted anyone else who reads this to understand that HDR isn't the problem; it's most likely the way your specific display chooses to tone map HDR down to its limitations. The thing about forums is that they are read by many others, not just the OP, so hopefully that information gets shared - I think it's better to be informed. I wouldn't want people to think HDR is 'bad', or that they can do a better job by simply messing with the settings without understanding what they do...

I also wanted to clarify what I mean by 'better', because better to me is different to better for you - I prefer to view things as accurately as possible to the way they were 'graded' to look. If HDR looks 'bad' because of my display, I would just use the SDR-graded version - turn HDR off - so to me that would be 'better' than messing with the settings to create a 'look' you would have preferred. But that's 'individual' preference, which is totally fine.

I own a 'decent' (HDR Premium certified) HDR TV, so I know how much HDR can transform a scene. I want people to understand that it's not the format, it's the display - and personally, I also prefer colour accuracy over anything else. If HDR is 'poor' on a display, I'd rather play it graded for SDR and view it in SDR.

Therefore, I haven't messed with the 'calibrated' settings to try to do a better job on 'my' TV, as its HDR is 'decent' even by the very latest standards.

A pessimist is just an optimist with experience!

Why can't life be like gaming? Why can't I restart from an earlier checkpoint??

Feel free to add me but please send a message so I know where you know me from...

PSN: TaimeDowne

nomither6

@BAMozzy Understandable, & you're right; I gave it another try for the umpteenth time - changed the settings again (mostly back to default) and changed the RGB range to full, since I do believe it wasn't displaying properly, and I can see it being an opinion now. It doesn't look bad; it's just that my preference for colors clouds my judgement. I'll stick with HDR again for a while - maybe it'll grow on me this time. I apologize for my arrogance.

& btw, if I ever became a game developer, my game would need health warnings and would probably score low because of all the eye strain I'd cause & my taste in colors - too sweet, lol.

Edited on by nomither6

nomither6

BAMozzy

@nomither6 I wasn't criticising you or your choice to play with the settings. As I said before, it's your choice, your TV, and you who are viewing it. I was just trying to say that HDR is going to be dependent on the quality of the display. Currently, there isn't a 'domestic' display that doesn't have to 'tone map' HDR content - there are some that don't need to with 1,000 nit mastered content, but some films are mastered to 4,000 nits, and we don't have 4,000 nit TVs with 100% of the 'colour range' either.

It's a bit like trying to go from HD to 4K: whilst content may be native 4K, imagine the displays only went up to 1600p (with many around the 1200-1440p range), so they'd have to downscale the content to fit the image in - you would not be seeing the 'full' benefits of 4K. Of course, resolution isn't actually the issue here - we have 8K TVs - but the principle is similar. No current TV can display ALL HDR as it was 'intended' to be viewed, so you are seeing a 'scaled down' version to fit the limitations of your display - the better the specs, the less scaling down is needed. A 1600p TV would retain more detail than a 1200p TV showing the same 4K image, because it isn't having to 'lose' as much information to fit into a 'smaller' size. Some TVs have much greater colour volume, so they don't need to scale down the colours and dynamic range as much.

The settings on a TV are there to 'correct' minor variations in the consistency of colours and colour 'balance' - to calibrate the TV. One set may have too much magenta in the shadows and too much cyan in the highlights, maybe some minor variations in greyscale balance etc., and another set the opposite, so you can't really 'copy' settings from another TV and expect a more accurate picture. Calibrators use greyscale to ensure the colours are 'balanced': 50% grey should have 50% red, 50% green and 50% blue, but invariably out of the box it may be 48%, 49%, 52%, and after calibrating it's 49.97%, 50.05%, 49.98%. The point is, those settings are there to 'calibrate' a TV so that all the greys are 'balanced' as closely as possible and the colours are as accurate as possible. I would never recommend anyone play with the settings without understanding what they do. TVs are often too 'bright' and gaudy in at least one 'mode', because manufacturers want them to make the other TVs in a bright showroom look 'dull' and so 'appear' better - but obviously that's not 'accurate' to the way content was intended to be viewed. They have a 'Cinema' mode too, for the most accurate out-of-the-box picture, but that's not great for really bright showrooms...
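
A rough illustration of the kind of greyscale error a calibrator is chasing (just a sketch using the example percentages above, not real measurements):

# On a 50% grey test pattern, each channel should contribute exactly 50%.
target = (50.0, 50.0, 50.0)
out_of_box = (48.0, 49.0, 52.0)     # example pre-calibration readings
calibrated = (49.97, 50.05, 49.98)  # example post-calibration readings

for label, measured in (("out of the box", out_of_box), ("calibrated", calibrated)):
    errors = [m - t for m, t in zip(measured, target)]
    print(label, "RGB error:", ", ".join(f"{e:+.2f}%" for e in errors))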

At the end of the day, it's still up to the user - but with 'certain' TVs, you risk damaging the panel or causing it to wear out unevenly and/or more quickly. If you are watching things much brighter than they were intended to be, the TV is using more 'electricity', getting 'hotter' and wearing out 'faster'.

Edited on by BAMozzy

A pessimist is just an optimist with experience!

Why can't life be like gaming? Why can't I restart from an earlier checkpoint??

Feel free to add me but please send a message so I know where you know me from...

PSN: TaimeDowne

  • Page 1 of 1

This topic has been archived, no further posts can be added.