
Thread 106832194

59 posts 20 images /g/
Anonymous No.106832194 [Report] >>106832786 >>106833769 >>106834989 >>106836657 >>106836911 >>106837953 >>106839649 >>106845745 >>106847968 >>106853377
SDR:
>Color primaries based on the real-world performance of CRTs at the time
>max signal is defined as comfortable white, which all displays can hit
>every half-competent display can achieve these targets and reproduce an image to the artist's intent, and even displays that aren't properly calibrated can get close due to all of the data being displayed without modification
HDR:
>spec is designed around a hypothetical display that can achieve Rec2020 color primaries and 10,000 nits
>the best consumer displays top out at around a 10th of the brightness and only extend to P3 color primaries
>the solution is for the display itself to dynamically tonemap HDR data that exceeds the display's capabilities
>how this occurs varies from display to display

I love HDR but working in/creating for this space is a pain in the ass. I can make HDR that looks great at 1000nits but looks like ass in SDR. How do we solve this problem? Games have HGIG, which overrides the tonemapping of the display in favor of the artist's tailored tonemapping solution, but movies and images don't have this luxury.
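For reference, here's a minimal sketch of the PQ curve (SMPTE ST 2084) the whole thing is built on; the constants are from the published standard, everything else is just illustration, but it shows how absolute the encoding is:

# Minimal sketch of the PQ (SMPTE ST 2084) transfer function HDR10 is built on.
# Constants are from the published standard; the rest is purely illustrative.
# The signal is absolute: 1.0 always means 10,000 nits, regardless of what the display can do.

M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def nits_to_pq(nits: float) -> float:
    """Encode absolute luminance (0..10000 nits) as a PQ signal value (0..1)."""
    y = max(nits, 0.0) / 10000.0
    ym = y ** M1
    return ((C1 + C2 * ym) / (1.0 + C3 * ym)) ** M2

def pq_to_nits(signal: float) -> float:
    """Decode a PQ signal value (0..1) back to absolute luminance in nits."""
    e = max(signal, 0.0) ** (1.0 / M2)
    return 10000.0 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1.0 / M1)

# Reference-white-ish levels sit barely past the middle of the code range;
# everything above is highlight headroom most panels can't actually reproduce.
print(round(nits_to_pq(100), 3))     # ~0.508
print(round(nits_to_pq(203), 3))     # ~0.58
print(round(nits_to_pq(1000), 3))    # ~0.75
print(round(nits_to_pq(10000), 3))   # 1.0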
Anonymous No.106832264 [Report] >>106833485
10K nits was a mistake. Even if we had ideal screens, it would look way too bright to be comfortable.
Anonymous No.106832652 [Report]
I completely agree even though I love watching HDR movies on my LG C2. Why this technology, more than most, has the most diehard consoomers, who will just bark "poor" at you the second you critique it, is fucking wild.
Anonymous No.106832786 [Report]
>>106832194 (OP)
>>the best consumer displays top out at around a 10th of the brightness and only extend to P3 color primaries
If by >best you mean cheap shit from Amazon, sure.
Anonymous No.106833485 [Report]
>>106832264
Obviously I don't want a fullscreen window at 10k nits, but I think it's fine for small highlights like emissive lightbulbs or specular highlights.
Anonymous No.106833769 [Report] >>106834012 >>106838521 >>106838588
>>106832194 (OP)
The standardization of brightness is the stupidest thing ever and the source of most of the issues.
1. The perceived brightness depends on the viewing environment, to which your eyes adapt
2. Real HDR content is not realistically mastered anyway (muh realism argument)
3. Real displays will never achieve realistic outdoor brightness.
4. Realistic outdoor brightness should not even be the goal for viewing content on a small square which lacks the 360 degrees of ambient outdoor lighting and will just blow out your retinas.

Any serious standard would keep the same relative brightness between pixel values, not these stupid display-specific compromises that all lead to different results.

There is no universal solution for this because hardware has different capabilities, so for example, 10000 nit mastered content would just be too dim on a real display with a standard gamma curve, or severely blown out, because a 1000 nit monitor could never display a 2500 nit outdoor scene.

Realistically, the artist HAS to design the content with a specific level of display technology in mind, and supporting screens of different capability levels is going to create extra work and design considerations. For example, on a brighter screen you could use 10% pixel value for normal lighting and use very thin, realistically bright sparks for visual effects, but on a dimmer screen you might have 50% pixel value for normal lighting and then use a bloom effect around the sparks to reach the same brightness, since realistically thin sparks at 100% pixel level wouldn't be bright enough on the bad screen.

In games, a brightness slider is an okay solution to support screens of different capabilities, and the developer has control over how the slider affects the visuals.
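A sketch of what I mean by relative (toy model, not any real spec): the data only encodes ratios between pixels, and the viewer's slider decides what 1.0 means in nits:

# Toy relative-brightness model (not any real standard): the signal only encodes
# ratios between pixels; the user's brightness slider decides what 1.0 means in nits.

GAMMA = 2.4  # Rec.1886-style display gamma, purely illustrative

def to_display_nits(signal: float, user_peak_nits: float) -> float:
    """Map a relative signal value (0..1) to light output, scaled by the user's chosen peak."""
    s = min(max(signal, 0.0), 1.0)
    return user_peak_nits * (s ** GAMMA)

# Same encoded frame on two screens: absolute output differs by a constant factor,
# but the relative contrast between pixel values is identical everywhere.
for peak in (200.0, 1000.0):
    print(peak, [round(to_display_nits(s, peak), 2) for s in (0.1, 0.5, 1.0)])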
Anonymous No.106834012 [Report]
>>106833769
>Realistically, the artist HAS to design the content for a specific level of display technology in mind
I don't even understand the point of having a "futureproof" spec if you're just going to have to do stuff like this anyway. Might as well develop "HDR P3-1000" as its own specific format.
Honestly, I wish the HDR format had data blocks set aside for tone mapping instructions beyond the nebulous MaxFALL and MaxCLL. It would make much more sense to just let the author specify how content should be tonemapped instead of relying on whatever flavor of tone mapping may or may not be included by the manufacturer. Also, switch to relative contrast instead of absolute brightness (it would make the tone-mapping formulas easier anyway, since you could just set the user's SDR brightness as 1.00 in the equations).
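Something like this, i.e. hypothetical and not how HDR10 actually works, with everything expressed relative to the user's own SDR white instead of absolute nits:

import math

# Hypothetical relative encoding: luminance expressed as a multiple of the
# viewer's own SDR white (1.00) instead of absolute nits.

def nits_to_relative(nits: float, sdr_white_nits: float) -> float:
    """1.00 = whatever the user set SDR white to; 2.0 = one stop above it, etc."""
    return nits / sdr_white_nits

def relative_to_stops(rel: float) -> float:
    """Same value on a log2 scale: stops of headroom above SDR white."""
    return math.log2(rel)

# An 800 nit highlight is 3 stops over a 100 nit SDR white, but only 2 stops
# over a 200 nit SDR white; a tone mapper working in these units follows the
# user's brightness preference for free.
print(relative_to_stops(nits_to_relative(800, 100)))  # 3.0
print(relative_to_stops(nits_to_relative(800, 200)))  # 2.0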
Anonymous No.106834989 [Report]
>>106832194 (OP)
most good screens cover a decent chunk of Rec.2020 now, and in any case using gamut compression rather than clipping to tonemap it makes it a non-issue
I agree that using an absolute scale of nits for luminance is retarded, but HLG and/or Dolby Vision solves this. Shame the standard is so fucked up now though
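the dumbest possible version of gamut compression, just to show the idea (not what any real display or ACES actually does):

# Naive gamut compression sketch. After converting a wide-gamut colour into the
# display's narrower RGB space, some channels can go negative (out of gamut).
# Instead of clipping them (which shifts hue and flattens detail), desaturate the
# pixel toward its own luminance just enough to bring it back inside the gamut.
# Real implementations (e.g. ACES reference gamut compression) are smarter, same idea.

LUMA = (0.2627, 0.6780, 0.0593)  # Rec.2020 luma weights

def clip_into_gamut(rgb):
    return [max(c, 0.0) for c in rgb]

def compress_into_gamut(rgb):
    y = sum(c * w for c, w in zip(rgb, LUMA))
    m = min(rgb)
    if m >= 0.0 or y <= 0.0:
        return list(rgb)            # already inside (or black): leave it alone
    t = y / (y - m)                 # blend factor that puts the lowest channel exactly at 0
    return [y + t * (c - y) for c in rgb]

out_of_gamut = [1.1, 0.4, -0.2]     # a saturated colour the display can't reproduce
print(clip_into_gamut(out_of_gamut))      # hard clip: hue shifts, detail flattens
print(compress_into_gamut(out_of_gamut))  # same luminance, just slightly less saturated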
Anonymous No.106836657 [Report]
>>106832194 (OP)
>he fell for the meme
Anonymous No.106836724 [Report] >>106836793
hdr in windows makes the colors look nicer but it's bad for performance in games.
Anonymous No.106836793 [Report]
>>106836724
Why would it be bad for performance? As far as I know it literally costs nothing.
Anonymous No.106836911 [Report]
>>106832194 (OP)
Useless. Just use existing formats with higher bit depth and higher peak brightness. "HDR" is the ipv6 of color specs.
Anonymous No.106837468 [Report] >>106841968 >>106845882 >>106846469 >>106846553
Here's the classic video on the subject
https://yedlin.net/DebunkingHDR/index.html
Anonymous No.106837953 [Report] >>106837980 >>106838185
>>106832194 (OP)
just say youre poor
Anonymous No.106837980 [Report]
>>106837953
>poor this poor that
Obsessed and rent free.
Just say that you are insecure and have more self esteem issues than money.
Anonymous No.106838088 [Report] >>106838237 >>106838289 >>106855162
>get monitor labeled as HDR capable
>try to get HDR working on both linux and wangblows
>hdr videos all look washed out on either of them using mpv+vkdhdrlayer or madvr
>look up more info, turns out displayport and hdmi versions on it are technically capable of HDR but not quite
And don't get me started on TVs, the whole standard is a mess.
Anonymous No.106838185 [Report]
>>106837953
Unfortunately I don't have millions to lobby ITU and SMPTE to make their standards suck less
Anonymous No.106838237 [Report]
>>106838088
>not making sure the monitor is VESA-certified
Anonymous No.106838289 [Report] >>106838416
>>106838088
>24000/1001
Pretty much everything in (literal) kino is a mess, because it's just patch upon patch since the 1920s, where e.g. 24fps first got defined, because sound started to be used in film and that required a standard frame rate.

Fast forward thirty years
>In December 1953, the FCC unanimously approved what is now called the NTSC color television standard (later defined as RS-170a).
>The compatible color standard retained full backward compatibility with then-existing black-and-white television sets.
>Color information was added to the black-and-white image by introducing a color subcarrier of precisely 315/88 MHz (usually described as 3.579545 MHz±10 Hz).[19]
>The precise frequency was chosen so that horizontal line-rate modulation components of the chrominance signal fall exactly in between the horizontal line-rate modulation components of the luminance signal, such that the chrominance signal could easily be filtered out of the luminance signal on new television sets, and that it would be minimally visible in existing televisions.
which thusly led to picrel
>Due to limitations of frequency divider circuits at the time the color standard was promulgated, the color subcarrier frequency was constructed as composite frequency assembled from small integers, in this case 5×7×9/(8×11) MHz.[20]
>The horizontal line rate was reduced to approximately 15,734 lines per second (3.579545 × 2/455 MHz = 9/572 MHz) from 15,750 lines per second, and the frame rate was reduced to 30/1.001 ≈ 29.970 frames per second (the horizontal line rate divided by 525 lines/frame) from 30 frames per second.
>These changes amounted to 0.1 percent and were readily tolerated by then-existing television receivers.[21][22]
https://en.wikipedia.org/wiki/NTSC#History
...nice, eh?
(If you can, get EU DVD versions for the better picture, PAL 576p vs NTSC 480p; but pay attention to fps shenanigans and the original recording, i.e. telecine and interlacing shenanigans)
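You can sanity-check the whole chain of numbers from the article yourself; it really is all small integer ratios:

from fractions import Fraction

# Sanity-checking the NTSC numbers quoted above with exact fractions.
subcarrier_hz = Fraction(5 * 7 * 9, 8 * 11) * 1_000_000   # 315/88 MHz = 3.579545... MHz
line_rate_hz = subcarrier_hz * Fraction(2, 455)            # horizontal line rate
frame_rate_hz = line_rate_hz / 525                          # 525 lines per frame

print(float(subcarrier_hz))                      # 3579545.4545... Hz
print(float(line_rate_hz))                       # 15734.2657... lines/s (down from 15750)
print(float(frame_rate_hz))                      # 29.97002997... fps
print(frame_rate_hz == Fraction(30_000, 1_001))  # True: exactly 30000/1001, i.e. 30/1.001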
Anonymous No.106838416 [Report] >>106838767
>>106838289
*Also, all those long-dead sons-of-bitches did all that shit on ANALOG technology.
Fucking legends.
>https://en.wikipedia.org/wiki/NTSC
Still though, the more you read this article, the more you're simultaneously impressed by all the nifty work, but also facepalming at the sheer.... disarray of it all.
Anonymous No.106838521 [Report] >>106842080
>>106833769
SDR also had a brightness standard.
We just chose to ignore it because while it made sense for CRTs at 100 nits, most LCDs blew past that almost immediately and wanting a brighter screen took precedence over using the standard.

It seems odd to complain about there being a brightness standard because anything can change how you perceive that brightness, but the truth is it's been like that forever.
Anonymous No.106838588 [Report] >>106843149
>>106833769
You're misunderstanding HDR.
HDR doesn't use regular gamma, and it's actually up to the display to tone map.
If you have a movie with a 10k nit peak but most real scenes are at 500 nits, using a display that is only capable of 1000 nits doesn't mean the whole movie is dim.
The scenes get tone mapped to the capabilities of the display. So your 500 nit scenes will be displayed at 500 nits, since the display is capable of that, but any highlights above 1000 nits will just be limited in brightness.
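Rough sketch of what that display-side rolloff looks like (a Reinhard-style knee; every manufacturer uses their own curve, which is exactly the complaint):

# Rough sketch of a display-side tone map: pass everything below a knee through 1:1,
# then squeeze the rest of the mastering range into whatever headroom is left.
# The actual curve is manufacturer-specific; this is just one plausible shape.

def display_tonemap(nits: float, display_peak: float = 1000.0, knee_frac: float = 0.75) -> float:
    knee = knee_frac * display_peak
    if nits <= knee:
        return nits                 # a 500 nit scene on a 1000 nit panel passes through untouched
    headroom = display_peak - knee
    excess = nits - knee
    # Reinhard-style rolloff: approaches display_peak instead of clipping to a flat white blob
    return knee + headroom * excess / (excess + headroom)

for n in (200, 500, 1500, 4000, 10000):
    print(n, round(display_tonemap(n), 1))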
Anonymous No.106838767 [Report] >>106843287
>>106838416
>ANALOG technology
In the pre-transistor era, to emphasize.
Anonymous No.106839649 [Report]
>>106832194 (OP)
>spec is designed around a hypothetical display that can achieve Rec2020 color primaries and 10,000 nits
it's not. 10000 is just the limit
Anonymous No.106840325 [Report]
any "hdr" monitor that allows you to change the brightness is not following the spec
Anonymous No.106840383 [Report] >>106843208
How do I change brightness then on an HDR screen? Having a fixed brightness seems retarded
Anonymous No.106841968 [Report]
>>106837468
The only good points he makes were already detailed in the OP. The entire rest of the video is crap.
Anonymous No.106842080 [Report] >>106842457
>>106838521
>SDR also had a brightness standard.
It actually does not. SDR was always relative. The brightness of "comfortable white" (1.00 on the signal range) is up to the preference of the viewer.
Now, there's sRGB, whose 2.2 gamma curve brightens colors up more quickly for viewing in casual settings, and there's Rec.1886, which uses the same color primaries and the same relative model but has a 2.4 gamma. But an actual "specified max brightness" was never a thing.
Movie studios would impose an in-house standard for the use in their grading rooms, as it makes sense to have the same image across different monitors, and that was commonly 100nits for dim grading rooms, but even in different areas, monitors may be calibrated to different brightness.
Anonymous No.106842457 [Report]
>>106842080

Rec.1886 is horrible and might be better to forget, as there are not many CRT displays with phosphor glow decay left anymore.

The black linear toe varies between sRGB, Rec.709, and the earlier vs. newer Rec. specs, but the rest of the curve is near identical, and an edited image or color graded video should still look like what it's supposed to.
Anonymous No.106843149 [Report]
>>106838588
>any highlights above 1000 nits will just be limited in brightness.
In other words, it brickwalls highlight detail into a blob of white, like an amateur cranking up contrast/brightness in photoshop. The fact that this is built into the standard is ridiculous. Now, a smarter display will roll off the top end more smoothly, but there is no standard method for this, so the artist has no control over the final result. Normally, you can view a piece of art on a computer screen at any brightness level, and it's still going to be recognizable as the same image with all the same details. Brightness-mandated HDR with eternally ill-equipped hardware completely destroys this.

You reduce your phone screen's brightness when in a dark environment and it might still feel bright, and you crank it to the max when out on a sunny day and it's still not bright enough. Everybody understands this, and it's why standardizing brightness is clearly a stupid idea and complete hogwash.

A more sensible way would be letting go of the mandated brightness, make the average dimmer, and watch it in a sufficiently dark room. This gives you more headroom to correctly display the relative brightness of highlights in a way that the artist intended, which is more important than absolute brightness.

A 500 nit display can perfectly display all the detail of 2000 nit content with the correct relative brightness if you just configure it to, it's just that it might look too dim in a lit up environment. Assuming ideal contrast (oled), it's purely a battle of maximizing perceived brightness in a set viewing environment. On an LCD, you are limited by contrast regardless of viewing environment, and the limit of HDR becomes a matter of how much raised blacks you are willing to tolerate.

There is some value in defining what the "average brightness" is at different nit levels of HDR, but considering the effect of the viewing environment, it's nothing more than a rough suggestion.
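To put numbers on the 500 nit / 2000 nit point above, a rough sketch of the "just let it be dimmer" option:

# A single global scale preserves every brightness ratio in the content;
# the only cost is absolute output level.

def rescale(nits: float, content_peak: float = 2000.0, display_peak: float = 500.0) -> float:
    return nits * (display_peak / content_peak)

# 2000 nit content on a 500 nit panel: nothing clips, every ratio survives,
# the whole image is simply 2 stops dimmer (hence the dark room).
for n in (10, 100, 500, 2000):
    print(n, rescale(n))   # 2.5, 25.0, 125.0, 500.0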
Anonymous No.106843208 [Report]
>>106840383
>Having a fixed brightness seems retarded
Indeed it is.

I mostly watch HDR at night in a dark room.
I don't want it to be too bright.
They say OLED TVs lack brightness but I lowered the brightness setting on my OLED TV for comfort.
Anonymous No.106843250 [Report] >>106844690 >>106846469
>hypothetical display that can achieve Rec2020 color primaries and 10,000 nits
>the best consumer displays top out at around a 10th of the brightness and only extend to P3 color primaries
Lol
Lmao
>Didn't take a look at 2026 lineup before posting...
https://cecritic.com/news/tvs/tcl-ces-2026-tvs-announced-x11l-q10m-ultra-q9m
Anonymous No.106843287 [Report]
>>106838767
>Not even mentioning adjusting CRT convergence rings (and talking about pil, not even delta)...
Anonymous No.106843712 [Report] >>106844863
I wish everyone would just settle on a standard instead of having a dozen different proprietary implementations
Anonymous No.106844690 [Report] >>106848933
>>106843250
>10k nits (boosted)
>100% Rec.2020 with just white backlight LEDs
>not even out yet
I'll believe it when I see it
Anonymous No.106844863 [Report]
>>106843712
Nearly every display can at least parse HDR10 media, so the problem is not that there are a billion different standards. The problem is that the standard of the one format commonly in use sucks. If your display receives data that's brighter than what it can handle, the specified way to deal with it is "lmao figure it out yourself". Even the same content on the same screen will look different in different applications, something that would have been unacceptable in SDR. It makes me wish I could specify, within the file itself, my own preferred tone mapping instructions (kind of like what games can do with HGIG). At least then I could somewhat predict the look if someone runs my HDR content through their screen, whatever it may be, rather than hoping that it somehow looks good when subjected to whatever the fuck the display manufacturer does. As far as I'm aware, none of the formats do what I want, and the only one that comes close is the Ultra HDR gain-mapped JPG format for still images.
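For reference, the core of the gain map idea, heavily simplified (the real Ultra HDR spec adds per-channel maps, offsets and min/max metadata; this is just the shape of it):

# Heavily simplified sketch of a gain-mapped image (Ultra HDR style): ship an
# ordinary SDR image plus a small map of author-chosen log2 gains. The viewer
# applies only as much of that gain as its own headroom allows, so the author,
# not the display vendor, decides how the image degrades.

def reconstruct(sdr: float, gain_stops: float, display_headroom_stops: float,
                content_headroom_stops: float = 3.0) -> float:
    """sdr: base pixel in linear light, 1.0 = SDR white.
    gain_stops: authored boost for this pixel, in stops over its SDR value.
    display_headroom_stops: how far above SDR white this particular screen can go."""
    weight = min(max(display_headroom_stops / content_headroom_stops, 0.0), 1.0)
    return sdr * (2.0 ** (gain_stops * weight))

pixel_sdr, pixel_gain = 0.8, 2.5   # a highlight the author wants 2.5 stops over its SDR value
print(reconstruct(pixel_sdr, pixel_gain, 0.0))   # SDR screen: plain SDR image (0.8)
print(reconstruct(pixel_sdr, pixel_gain, 1.5))   # partial headroom: partial, author-shaped boost
print(reconstruct(pixel_sdr, pixel_gain, 3.0))   # full headroom: 0.8 * 2**2.5 ≈ 4.53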
Anonymous No.106845023 [Report] >>106845501 >>106845570
i must confess that I don't understand what the fuck is HDR
is it like a dynamic colorspace that adjusts to keep relative contrast?
for example like adjusting the red color of a flame to appear brighter relative to the darkest pixels in the frame?
Anonymous No.106845501 [Report] >>106845570 >>106850948
>>106845023
It's essentially a way to store and display images that can get "brighter than white" without blinding you every time you want to open a Microsoft word document. It's kind of like Pic Related if Pic Related was brightened to where the background was #FFFFFF white.
Anonymous No.106845570 [Report] >>106845680 >>106847851
>>106845501
This is bullshit marketing.
>>106845023
It's just a storage and transmission format, it doesn't make your screen any brighter or more vivid like in the marketing materials.
It has two changes from "SDR":
- Higher bit depth (10 bit), which is good as it means there won't be artifacts like banding in extreme scenes
- Includes brightness data, to tell your screen what brightness to run at

The latter is the cause of all the problems: with "SDR" the user decides the brightness, with HDR the content producer decides your screen's brightness (but then your display doesn't have infinite brightness/color, so it doesn't do what they intended in the end anyway).
Anonymous No.106845680 [Report]
>>106845570
>This is bullshit marketing.
I prefer the term "vast oversimplification".
The thing that separates HDR from SDR is where the "white point" is located. For SDR, it's at the very max signal value. For HDR, it's closer to the beginning. You have content that's displayed at the traditional Paper White, then you have headroom to store and display content that gets brighter than that. Both HDR and SDR can have as many or as few bits as you like. I have 8-bit HDR PNGs, for instance. They're banded as all get-out, but they're still HDR.
Anonymous No.106845745 [Report] >>106846469 >>106846553
>>106832194 (OP)
>I'M GONNA DEBOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOONK
https://www.yedlin.net/DebunkingHDR/
Anonymous No.106845882 [Report] >>106846469 >>106846553
>>106837468
>2 hours and a dozen incredibly overpaid experts talking about this shit
Reminds me of the thread about how ridiculous keyboard fags are and all the threads over the years shitting on audiophiles. Just put the cute jpegs on the pixel matrix jeez.
Anonymous No.106846469 [Report] >>106846553 >>106849376
>>106837468
>>106845745
>>106845882
You know, I actually took the liberty of watching this video (multiple times, even), and for one-hundred and twenty minutes of my time, he presents:

Two valid criticisms about the HDR format itself:
- The choice to encode absolute values for luminosity rather than a relative contrast delta like SDR (wholly impractical when viewing conditions can vary wildly, especially amongst smartphones)
- The complete nonexistence of reproducibility due to no standardized tonemapping between displays

A valid critique surrounding the marketing of HDR:
- HDR is a CONTAINER, it is NOT an "enhancement". SDR content should look IDENTICAL in SDR formats and HDR formats. Wide Gamut/Luminosity is available for content that explicitly takes advantage of it
- HDR as an acronym has been thoroughly abused through the years, meaning everything from exposure bracketing to eye adaptation to a scene-linear rendering pipeline (but not actual HDR)

The "debunking" is mostly just boomer-ranting:
- this is garish
- that is garish
- garish garish garish
- "bust through the ceiling"

And if it's not just personal preference, it's actual fallacy:
- The dipshit likely isn't actually looking at real HDR content; he says at the end of his video that all the "HDR" he's been showing is actually just SDR with a post-process gain on the highlights, which he calls Sizzle. So he doesn't get the nuanced differences of an HDR tonemap of scene-data compared to an SDR tonemap.
- One of his scopeviews shows a scene with color converging exactly from an sRGB primary point, which suggests it's sRGB-tonemapped instead of HDR or scene-data
- "10,000 nits/Rec.2020 is wasteful" which is true at the moment in time, but as is the case here >>106843250 it may not be soon.
- "HDR is wasteful" as if the format needs to use all 10,000 nits instead of just using what's appropriate.

(continued...)
Anonymous No.106846553 [Report]
>>106837468
>>106845745
>>106845882

>>106846469 (continued)
And my absolute favorite part of the video is his "HDR is inefficient" bit, which begins at 1:07:45 and is the epitome of a fallacious argument.

>HDR does not use the bits as efficiently as SDR
>so here's an sRGB image, we're going to put it in an HDR container
>and now we're going to reduce the bits of the HDR container to 8
>wooooooooooow look how posterized and bandy it is
>[proceeds to never mention the 'H' portion of EICH DEE ARRR]
>this proves HDR is less efficient

It would be like if I gave Kill Bill shit for using 3x the amount of data during the black and white segment because "you only need 8 bits for BW", then proceeded to shove the BW section into a 3-3-2 RGB container and complain how blocky and bad the Black and White looks, and use that as proof that color is overrated and stupid.
It is beyond retarded.

----

Point is, it's a shit video and you should feel bad for blindly posting it. I have an HDR display, I have HDR content. It looks nice. Nicer than SDR. It is, by no means, a perfect format, but it is the future of display.
Anonymous No.106847851 [Report]
>>106845570
oh nice explanation
Anonymous No.106847968 [Report]
>>106832194 (OP)
Fake hdr is best of both worlds.
Anonymous No.106848933 [Report] >>106850365
>>106844690
>He thinks he'll be able to stand 10k nits brightness...
Yeah, right; unless you watch tv outside in bright daylight, I absolutely doubt it...
>Bitches about (not so future) proof standards
Ok, zoomer
Anonymous No.106849376 [Report]
>>106846469
> it may not soon
cope, or incredible amounts of blind faith that this shit format will ever be fixed and that the fix will be adopted.
Anonymous No.106850365 [Report] >>106850602
>>106848933
I wouldn't want to subject myself to a fullscreen flashbang of 10,000 nits. I've done that with 1,000 and it was enough. But for the odd specular highlight here and there, it's fine.
Anonymous No.106850602 [Report]
>>106850365
>Odd highlights
Yeah, knowing what a full white screen looks like under low light even from a paltry 300-400 nit panel, I would hope so.
I wouldn't be too confident everyone involved will be so sensible, though.
Anonymous No.106850948 [Report] >>106852050
>>106845501
not misleading at all
Anonymous No.106852050 [Report]
>>106850948
It's an SDR approximation of HDR. How else do I demonstrate it? It's not like HDR is some 4th dimensional future tech; it's literally just "brighter images".
By the way, this is the website these were taken from: https://gregbenzphotography.com/hdr-gain-map-gallery/
Anonymous No.106853377 [Report]
>>106832194 (OP)
ive seen good HDR movies with an OLED TV. it can be done. however, i noticed lately movie graders seem to be making everything muted and dim to the point its almost SDR anyways. such a shame.
Anonymous No.106854512 [Report]
Maybe I'm a retard, but I've got an OLED TV and the only time I've been impressed with HDR is in the various demos on YouTube.
Regular pirated films either look too muted or the highlights are overblown and clipping, losing detail, same with games on PC/PS5.
I understand the idea behind it, but at this point it seems like more of a hassle than a benefit.
Anonymous No.106854692 [Report] >>106854814
HDR movies can look really great on my LG G3. It feels like you've cleaned your windows for the first time after 4 years of dirt accumulating. Or like watching your first DVD after you only knew VHS.
But it also can look really bad and dark. And it depends on how the film was mixed. And there is no standard. And studios will do whatever the fuck they want. And now your HDR movie will be dark as shit, and you can do dynamic tone mapping, but now it's inaccurate. And OLED will look good with darker movies, reaching 1000-1400 nits in highlights. But desu if you have a very bright scene, and it's only 400-500 nits. And you notice. So you can get a QLED or something that can get brighter, but now your blacks aren't as good, so highlights in dark scenes don't pop as much as they could.

tl;dr: HDR implementation is a horrible trainwreck. HDR *can* be great for movies. And overall it's worth it if you got a capable TV. For gaming it's just a hassle, even if some games can have nice effects, they often look "off" when gaming in HDR since no developer designs them with HDR in mind.
Anonymous No.106854814 [Report] >>106855311
>>106854692
Give me a couple of examples of a well mastered film so I can finally see what I'm supposed to be seeing.
Anonymous No.106855112 [Report]
HDR -is- a hassle. It's not the toggle everyone expected it to be. It's like 7 sliders instead of a toggle. There's no standard, no right implementation, and you have to have the right display for it, but it's not always right.
I love HDR when it works. There's games where I enable HDR on my OLED display (sometimes I use RenoDX mods) and even at a measly 400 nits they look amazing to me. But it's a headache everywhere the fuck else.
-On linux support is primitive and it narrows your options by a fucking lot.
-SDR to HDR is not great, on Windows I have to switch between AutoHDR, RTX HDR, ReShade HDR, it's a whole fucking mess
-Videos are another whole can of worms
-I'm completely confused about this technology in regards to photo editing (am I supposed to work with HDR enabled? do these programs even support it? when I upload my work is it gonna look like shit on SDR displays?)
Anonymous No.106855162 [Report]
>>106838088
>get monitor labeled as HDR capable
$300 IPS monitors are "HDR capable". It's more like unusable. You get a toggle that completely misrepresents this technology to you, and you end up coming here and telling people that HDR isn't needed, it's garbage, it makes shit look worse.
Marketers want HDR not to sell. They actively do everything they can to avoid selling this technology. They make all sorts of claims that make no sense about it, and they sell shit that doesn't even work that's going to introduce the idea that HDR is not worth anything at all. It's an absolute fucking shame. I remember high resolution displays being all the rage, and you could just see it. High refresh rate displays maybe not to such an extent (because for some fucking reason, some people are unable to tell the difference even between 60 and 120hz), but it could still be shown clearly. But with HDR? Some people think HDR is just "oversaturated shit". It makes no sense. There is no standard for this, it's pure chaos.
Anonymous No.106855311 [Report]
>>106854814
Alien was really good, would recommend it to anyone who wants to experience HDR. I also recommend 2001 and Back to the Future.