>>715798917
Yeah, VRR can be turned off, and through the Nvidia control panel you can also choose whether or not to have VRR in windowed mode.
Tearing isn't perceived the same in all games. Even at a high refresh rate it's far easier to notice tearing in something like a 2D game than in a first-person shooter. That's why you'll find many people don't mind playing a shooter at 200fps without any kind of sync at all, and they'd prefer it that way.
When you play with a controller shit's more obvious. Even at something like 130fps you'll notice at least a bit of tearing. If you turn vsync on, it actually gets worse. Not because of input lag or anything like that (that's far less noticeable at higher framerates), but because if you pan the camera at a steady speed you'll notice small "jumps", a sort of annoying judder, because the framerate doesn't divide evenly into the refresh rate, so frames end up on screen for uneven lengths of time. It really fucks it over.
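If you want to see why that judder happens, here's a quick sketch (my own numbers, not from the post: it assumes a hypothetical 144Hz fixed-refresh display, a perfectly steady 130fps render rate, and vsync holding each finished frame until the next refresh tick):

```python
import math

REFRESH_HZ = 144      # hypothetical fixed-refresh monitor
FPS = 130             # steady render rate, like the 130fps example above

refresh = 1.0 / REFRESH_HZ   # time between display refreshes
frame = 1.0 / FPS            # time between finished game frames

def presented_ticks(n_frames):
    """Vsync tick index on which each frame first appears on screen."""
    ticks = []
    for i in range(n_frames):
        done = i * frame                          # render finishes here
        ticks.append(math.ceil(done / refresh - 1e-9))  # wait for next refresh
    return ticks

ticks = presented_ticks(60)
# On-screen duration of each frame, measured in refresh intervals:
durations = [b - a for a, b in zip(ticks, ticks[1:])]
print(sorted(set(durations)))   # prints [1, 2]
```

Because 130 doesn't divide 144, most frames sit on screen for one refresh but some sit for two, so a camera panning at constant speed takes uneven visual steps: that's the judder. VRR sidesteps this by letting the display refresh exactly when a frame is ready.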
I have a 360Hz OLED monitor myself, and there have been several times where for whatever reason (maybe reinstalling drivers or whatever) VRR got disabled and I could tell right away without having to check anything. "Why does it look like shit when I move around even though I'm getting a lot of FPS?" Then I check the monitor OSD, and it tells me VRR is disabled.
Of course I also notice when it's enabled, because some games have a tendency to change framerate very rapidly on loading screens and you get brightness flickering, like the webm in the OP. But if you look at the bottom of that webm, it's happening because the framerate is fluctuating very stupidly, and you normally wouldn't encounter that while actually playing unless the game was horribly unoptimized.