>>8268975
>sorry for trying to say that Waifu is any better or the same as ESRGAN
No worries lol.
>That's so interesting and paradox.
I can't say for sure, but what I think happens is that all the artifacts and noise get smaller, making them easier to work with, but so do the lines, which often makes them cleaner.
>Is that >>8193359 how you usually do it? The double upscale/resize?
Not necessarily double, just generally down to the nearest "common" resolution like 480p, 720p, 1080p and such.
With videos I always take anything above 720p down to 720p, and if it's between 480p and 720p, it depends on exactly where it falls. If it's 576p, I often leave it as is, but if it's below that and above 480p, I generally downscale to 480p.
It's very rare that I touch anything below 480p, but when I do, I tend not to downscale at all. Who knows, maybe there's a benefit even then, but I can't be bothered testing.
>I remember using some sort of Topaz software to upscale images back in 2010?
I didn't try it that long ago, but until fairly recently Topaz was quite shit, and in some aspects it still is, at least for videos, unless like I said, it's for detailed 3D stuff.
Plus Topaz is significantly slower for video than ESRGAN, which is another reason to only use it for those scenarios. Shit runs at like 5 fps.
Back on the topic of downscaling, this is some code I have for my python ESRGAN setup, so you can see when and what I downscale to, even if it's for videos:
def get_target_resolution(width, height):
    aspect_ratio = width / height
    if aspect_ratio > 1.4:  # wide, treat as 16:9
        if 300 <= height <= 530:
            return "853:480"
        elif 531 <= height <= 650:
            return "1024:576"
        elif 651 <= height <= 3000:
            return "1280:720"
    else:  # treat as 4:3
        if 200 <= height <= 300:
            return "320:240"
        elif 301 <= height <= 550:
            return "640:480"
        elif 551 <= height <= 650:
            return "800:600"
        elif 651 <= height <= 2000:
            return "960:720"
    return None  # anything else stays at its native size