I find it's made easier by having an SDR monitor next to my HDR, and I use its top white brightness for the "paper white" value, whenever it's configurable that way, and then set the top end to the highest value where I no longer notice a change in the picture if I push it higher.
For real and at this point there’s too many things to mess with. I almost want a pro to come and do it themselves but then it would be how THEY think it should look and that’s not going to match what I like haha.
I'm gonna piggyback. Volume settings on TV. It's like every streaming service and show has wildly different audio balancing. I watch one show, then my kids turn on theirs and get blasted with sound. Not to mention the fact that ads are ALWAYS triple the volume of actual content
I used to be a sound designer, and the worst part about the above is that there are genuinely good reasons for each streaming service's choices. It's just in combination that it falls apart.
I think one way would be for TVs/sound systems to take a minimum and maximum volume and scale the audio to fit within that range. Something like that, maybe with an option to quickly open the range back up for normal/unrestricted volume.
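Very rough toy sketch of what I mean (the function names and thresholds are made up purely for illustration, and real TV DSP does this with far more sophisticated dynamic range compression):

```python
# Toy "volume range" limiter: squeeze each chunk's level into a user-chosen
# min/max window. Illustrative only; not how any real TV firmware works.
import math

def rms_db(samples):
    """Short-term level of a chunk of samples, in dBFS."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(max(rms, 1e-9))

def fit_to_range(chunk, min_db=-40.0, max_db=-20.0):
    """Scale one chunk so its level lands inside [min_db, max_db]."""
    level = rms_db(chunk)
    if level > max_db:
        gain_db = max_db - level      # turn blaring ads down
    elif level < min_db:
        gain_db = min_db - level      # bring whispered dialogue up
    else:
        gain_db = 0.0
    gain = 10 ** (gain_db / 20)
    return [s * gain for s in chunk]

# Example: a quiet chunk and a loud chunk both end up inside the window.
quiet = [0.001 * math.sin(i / 5) for i in range(1024)]
loud = [0.9 * math.sin(i / 5) for i in range(1024)]
print(round(rms_db(fit_to_range(quiet)), 1))  # ~ -40.0
print(round(rms_db(fit_to_range(loud)), 1))   # ~ -20.0
```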
Most actually KINDA have this already. Problem is it sounds like trash. It'll be called night mode or dynamic range compression. I swear to God they let developers who don't know a thing about audio write the code for those algorithms though lol
That's super interesting actually! Is it more to do with the video that's being played + the hardware its attached to? Or do they try to optimize for specific scenarios?
YouTube uses a -14 LUFS target (the exact number has been hotly debated lol). They do this because the vast majority of content on their platform is made to be played on desktop speakers and mobile phones, which don't sound great with a wider dynamic range.
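For the curious, the normalization itself boils down to a gain applied toward that target; as far as I know YouTube only turns loud content down rather than boosting quiet content. A sketch of the arithmetic (not their actual pipeline):

```python
# Gain needed to move a program from its measured integrated loudness toward
# a -14 LUFS target. Illustrative only.
TARGET_LUFS = -14.0

def normalization_gain_db(measured_lufs, target_lufs=TARGET_LUFS, allow_boost=False):
    gain = target_lufs - measured_lufs
    if gain > 0 and not allow_boost:
        return 0.0  # quiet material is left alone rather than boosted
    return gain

print(normalization_gain_db(-8.0))   # loud master at -8 LUFS -> -6.0 dB of gain
print(normalization_gain_db(-20.0))  # quiet podcast at -20 LUFS -> 0.0 (untouched)
```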
I've pretty much given up on it. I've gone through so many attempts to calibrate images and it's miserable every time. Turned it off and immediately my games looked less washed out.
As a quick tip, turn the gamma down in Windows 10/11 to almost zero. It's omega poorly configured in HDR.
(Games usually override this anyway, but it's still set really badly; fixing it can help with the base HDR calibration, which you should then redo, and that may in turn improve how games look.)
This is good to know in the future, I plan on replacing my broken pc. But I'm only playing on consoles right now and still find a lot of games look really dulled out with HDR on.
I have no idea why this isn't standardized by now, doubly so if your TV has a filmmaker mode where everything should be standardized. No reason a console/client shouldn't be able to detect that.
I've gone through the trouble of calibrating both my TV's and console's HDR settings to community recommendations, and there are still games where I can never get the black and white calibration images to both be visible at the same time, just one or the other. And that's saying nothing of the other kinds of calibration instructions.
Dude this. Is this game using the system calibration? Is it using its own? This game I set to 800 but this game needs to be set to 14.00 and that one needs to be 1,500? It’s insanity.
Try doing the Cyberpunk one... took me 45 minutes with tutorials. At that moment I was happy the Cyberpunk sequel is going to be made with Unreal Engine; never thought I'd feel that way.
It's just meant to help offset the ambient light in your room that's hitting the screen. Basically, brighten it until the details that get washed out by room light hitting the screen become visible again.
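Rough numbers for why room light matters so much: reflected light adds to every pixel, so it buries the darkest steps first. All figures below are made up, purely for illustration:

```python
# Reflected ambient light raises the effective black level, which crushes
# shadow detail; raising brightness restores separation near black.
def effective_contrast(peak_nits, black_nits, reflected_nits):
    return (peak_nits + reflected_nits) / (black_nits + reflected_nits)

dark_room = effective_contrast(700, 0.0005, reflected_nits=0.0)
lit_room = effective_contrast(700, 0.0005, reflected_nits=5.0)
print(f"dark room: {dark_room:,.0f}:1")  # ~1,400,000:1
print(f"lit room : {lit_room:,.0f}:1")   # ~141:1 -- hence brightening the image
```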
At least it's not like the old days of CRTs where they just showed a static image and expected you to change the settings on your monitor/TV instead of in the game
|--0-------| Brightness
|-------0--| I am Darkness
|-----0----| Paper White
|---0------| Brown Paper Bag
|--------0-| Lights
|-0--------| Pepsi Lights
|--0-------| Northern Lights
I wonder if it would help if, when you have the tutorial on, they showed a pop-up asking "is your brightness good?" at the first loading screen, after about 10 or 15 minutes.
I’ve also learned that some developers aren’t great at implementing it either, take Square-Enix—the lighting in their output is always super off, like, the default illuminance seemingly glazes the environment with the intensity of a refrigerator light, & really fucks up indoor-outdoor transitions.
Poor HDR implementation is the reason I won't be upgrading to a new monitor anytime soon. I keep thinking about how good OLED black levels could look, but I don't want to shell out $800-$1,000 for a crappy HDR experience because of inconsistency from developers and Microsoft.
Make sure you bareeely see it, ah but can you tell if you can still see it or if it’s just your brain thinking it’s there based on memory? Better just dial it up just in case!
More than that, I hate that fake HDR displays exist. They make people think HDR is garbage. Anything under 1500 nits is not HDR, and if the display actually dims all the other values to hit a 700-nit peak... it's a lie and it makes HDR look bad.
That depends on the display. There are no real regulations for HDR. 700 nits can mean sustained brightness or peak brightness. Some displays even lower the base brightness to increase peak brightness, so instead of getting a brighter, cleaner image you get a dark, dull one.
When it comes to well graded HDR content, the bright highlights are such a small portion of the image that ABL (automatic brightness limiting) doesn't play a factor, so full-screen brightness doesn't really matter. Plus, I wouldn't want 700 nits across the full screen anyway.
I have both a 700 nit C2 and a 1500 nit G4. Both look fantastic. The 1500 nit model obviously looks better with some content, but the difference isn't as big as you'd think and a lot of content doesn't make use of enough dynamic range for it to matter.
Well, all I know is I've used cheap HDR displays and have a portable display that claims to be HDR and guess what...My LG OLED is REAL HDR and amazing. Colors are beautiful and rich and brightness can be blinding. Cheap HDR is trash and I switch to SDR mode.
It doesn’t help that some of these games seem to just have it even though it doesn’t seem designed for it. So it just ends up being a bad filter that no amount of calibration can make look good.
I have started just ignoring HDR altogether. Most of the time, it's a pain to get the settings right, and the rest of the time, it just looks like crap because it's a poor implementation of it.
/Rant
LUFS is the unit of loudness we use to report the "median volume" of a program, as it were.
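If anyone wants to see that number for their own audio, something like this should do it with the pyloudnorm library, which implements the ITU-R BS.1770 meter (the filename is just a placeholder, and it's worth double-checking the API against its docs):

```python
# Measuring a program's integrated loudness (LUFS) with pyloudnorm.
# Assumes `pip install soundfile pyloudnorm`.
import soundfile as sf
import pyloudnorm as pyln

data, rate = sf.read("program_mix.wav")   # float samples, mono or multichannel
meter = pyln.Meter(rate)                  # K-weighted BS.1770 meter
loudness = meter.integrated_loudness(data)
print(f"Integrated loudness: {loudness:.1f} LUFS")
```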
One game's base game is fine, and a mod I use all the time is fine too, but I tried another and it messed up my OS and the game, and I'm still fixing things it broke. This tech is, like, all crap.
I've seen digital artists use those, googling tells me they're called ColorChecker charts or Macbeth charts.
But yes, the brighter you can go, the better; just don't sacrifice good black levels for a bad LED panel, or you're not getting any extra dynamic range anyway.
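To put rough numbers on the black-level point (illustrative figures, not measurements of any real panel):

```python
# Dynamic range as the ratio of peak white to black level, expressed in stops
# (doublings). OLED black is effectively zero, so a tiny floor keeps the
# ratio finite.
import math

def stops_of_range(peak_nits, black_nits):
    return math.log2(peak_nits / black_nits)

oled = stops_of_range(peak_nits=700, black_nits=0.0005)
led = stops_of_range(peak_nits=1500, black_nits=0.1)
print(f"700-nit OLED : {oled:.1f} stops")   # ~20.4
print(f"1500-nit LED : {led:.1f} stops")    # ~13.9
```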
Tonemapping has also become much better now.
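For anyone wondering what tonemapping even does here: it's the curve that rolls highlights off toward the display's peak instead of clipping them. A minimal sketch using the classic extended Reinhard operator (actual TVs and games use fancier curves, e.g. BT.2390-style roll-offs):

```python
# Extended Reinhard tone map: compresses luminance so the content's brightest
# value lands exactly at the display's peak instead of hard-clipping.
def reinhard_extended(lum, max_white):
    """lum and max_white are relative luminance; output is 0..1 of display peak."""
    return lum * (1.0 + lum / (max_white * max_white)) / (1.0 + lum)

# The brightest scene value (lum == max_white) maps to exactly 1.0;
# everything below rolls off smoothly rather than clipping.
for lum in (0.25, 1.0, 2.0, 4.0):
    print(lum, round(reinhard_extended(lum, max_white=4.0), 3))
```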
The "not visible", "barely visible", and "visible" charts are the true heroes.
How tf am I supposed to know what the devs intended the game to look like?