Search Results
3/4/2025, 4:30:37 PM
>>1355244
Their target file sizes (and therefore bitrate, and therefore quality / clarity of the picture) are extremely inadequate.
In the past (in the 00s and before), many people more or less "had to settle" for lower size / lower quality releases, because of slow internet connections, limited hard drive space etc, but they actually understood that this is inferior quality and that no magic can make a ~1.5 GB rip of a movie look good.
These days, Silicon Valley tech giants have managed to convince the new generation of normies that abysmally low quality content (photos, videos, audio) is fine and is how things should be. An entire generation, more or less, has grown up with Instagram photo quality, with YouTube video bitrates, with SoundCloud audio quality. They don't know any better.
Not only that, but the same companies pushing these changes also introduced "amateur tech bro" attitudes into the general public, creating annoying people who know juuuuuust a little bit about things, but not enough to know what they don't know. These people will tell you "but muh codecs bruh! have you heard about these cool codecs they are the shit!" and will eagerly accept genuinely bad quality served to them, as long as it comes wrapped in buzzwords like "1080p", "4K", "opus", "20 kHz cutoff" etc. In the past, only geeky autists cared about this stuff, and they were the kind of people who actually knew the topic in depth. Now, more people care about it, but a lot of that care (and their superficial knowledge) is misdirected.
Bottom line is, even with a modern codec, you can't squeeze a movie into ~1.5 GB and make it look decent. At best, you can make it barely passable on small inferior screens (smartphones), but that's it.
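You can sanity-check this with back-of-the-envelope math. The numbers below (2-hour runtime, ~160 kbps audio track) are illustrative assumptions, not from any specific release:

```python
# Back-of-the-envelope: what average video bitrate does a ~1.5 GB rip
# actually leave after the audio track? (All figures are assumptions.)

def video_bitrate_kbps(file_size_gb: float, runtime_min: float,
                       audio_kbps: float = 160.0) -> float:
    """Average video bitrate in kbps after subtracting the audio stream."""
    total_bits = file_size_gb * 1e9 * 8            # decimal GB -> bits
    total_kbps = total_bits / (runtime_min * 60) / 1000
    return total_kbps - audio_kbps

# A typical ~2 hour feature film at 1.5 GB:
print(f"{video_bitrate_kbps(1.5, 120):.0f} kbps")  # -> 1507 kbps
```

Roughly 1.5 Mbps for the video stream, versus the ~20-35 Mbps that 1080p Blu-ray video typically carries. No codec closes a gap that wide.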
Also, >>1355245 is right: at this small file size, a deliberate resize into 480p will genuinely look better. Spatially consistent loss of smallest details in the image is much better than visible major encoding artifacts.