Anonymous
10/9/2025, 4:17:02 AM
No.106833769
>>106832194
The standardization of brightness is the stupidest thing ever and the source of most of the issues.
1. The perceived brightness depends on the viewing environment, to which your eyes adapt.
2. Real HDR content is not realistically mastered anyway (muh realism argument).
3. Real displays will never achieve realistic outdoor brightness.
4. Realistic outdoor brightness should not even be the goal for viewing content on a small square which lacks the 360 degrees of ambient outdoor lighting and will just blow out your retinas.
Any serious standard would keep the same relative brightness between pixel values, not these stupid display-specific compromises that all lead to different results.
There is no universal solution for this because hardware has different capabilities. For example, content mastered for a 10000 nit peak would either come out far too dim on a real display if the signal is just rescaled through a standard gamma curve, or severely blown out, because a 1000 nit monitor can never reproduce a 2500 nit outdoor scene.
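To put rough numbers on that dilemma (Python sketch; the 100 nit indoor midtone is an assumed figure, the rest are the numbers from this post):
[code]
# Two naive ways to show content mastered against a 10000 nit peak
# on a 1000 nit display.
MASTER_PEAK = 10000   # nits the content was graded for
DISPLAY_PEAK = 1000   # nits the actual panel can do

def relative_scale(master_nits):
    # treat the signal as relative: 100% of master peak -> 100% of display peak
    return master_nits * DISPLAY_PEAK / MASTER_PEAK

def absolute_clip(master_nits):
    # treat the signal as absolute luminance and clip at the panel's limit
    return min(master_nits, DISPLAY_PEAK)

midtone, highlight = 100, 2500   # assumed indoor scene vs the 2500 nit outdoor scene

print(relative_scale(midtone), relative_scale(highlight))  # 10.0 250.0 -> everything too dim
print(absolute_clip(midtone), absolute_clip(highlight))    # 100 1000 -> midtones fine, highlight clipped flat
[/code]
Either the whole image gets squashed by the peak ratio, or the top of the range gets flattened. Pick your poison.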
Realistically, the artist HAS to design the content with a specific level of display technology in mind, and supporting screens of different capability levels creates extra work and design considerations. For example, on a brighter screen you could put normal lighting at a 10% pixel value and use very thin, realistically bright sparks for visual effects; on a dimmer screen you might put normal lighting at 50% and add a bloom effect around the sparks to reach a comparable brightness, because thin sparks at 100% pixel value just aren't bright enough on the bad screen.
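Back-of-envelope version of the spark example, treating the "punch" of a thin highlight as luminance times width in pixels. The panel peaks, the gamma 2.2 assumption and the bloom profile are all made up for illustration:
[code]
GAMMA = 2.2

def nits(pixel_value, panel_peak):
    # relative pixel value [0,1] -> nits on a plain gamma 2.2 panel
    return panel_peak * pixel_value ** GAMMA

# Bright panel: a 1-pixel-wide spark at 100% is already convincing.
bright_spark_energy = nits(1.0, 2000) * 1            # 2000 nit-pixels

# Dim panel: the same 1-pixel spark at 100% only manages 400 nit-pixels...
dim_spark_energy = nits(1.0, 400) * 1                # 400 nit-pixels

# ...so the renderer adds bloom: a 5-pixel glow that spends screen area
# to make up for the missing peak brightness.
bloom_profile = [0.95, 1.0, 1.0, 1.0, 0.95]          # pixel values across the glow
dim_bloomed_energy = sum(nits(v, 400) for v in bloom_profile)

print(bright_spark_energy, dim_spark_energy, dim_bloomed_energy)
# 2000.0  400.0  ~1915
[/code]
The bloom roughly buys back the lost impact, at the cost of the spark no longer looking thin, which is exactly the kind of per-tier art decision the content has to be designed around.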
In games, a brightness slider is an okay way to support screens of different capabilities, and the developer keeps control over how the slider affects the visuals.
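Something like this, where the slider is just exposure in stops applied before the tone map (the Reinhard curve and the slider range are placeholders, not any particular engine's code):
[code]
def tonemap_reinhard(x):
    # simple tone map: scene-linear value -> [0, 1) display value
    return x / (1.0 + x)

def apply_brightness(scene_linear, slider_stops):
    # slider_stops: e.g. -2..+2; each step doubles or halves exposure
    exposure = 2.0 ** slider_stops
    return tonemap_reinhard(scene_linear * exposure)

# Same scene values (shadows, mid-grey, white, spark) at two slider settings:
scene = [0.05, 0.18, 1.0, 8.0]
print([round(apply_brightness(v, 0.0), 3) for v in scene])   # neutral
print([round(apply_brightness(v, 1.5), 3) for v in scene])   # boosted for a dim screen
[/code]
With the boost, shadows and mid-tones come up a lot while the spark compresses instead of clipping, so the developer decides how the trade-off looks instead of the display guessing.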