>>723165010 (OP)
The concept of AI-generated frames isn't intrinsically bad - if they're blurry or whatever, the tech just needs to get better. The real issue is that, unless the entire game is AI generated from your inputs (which isn't coming any time soon), the generated frames can't take changes in game state into account.
So in a theoretically perfect scenario, you'd have a model that predicts the next frame (extrapolation). You'd still see the real frames with no added delay, and the AI would slot extra predicted frames in between; but the game still couldn't visibly react to events (player input, or even internal state like an enemy doing something, a rocket exploding, etc.) any faster than the real framerate allows. So they still wouldn't be "real" frames.
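Roughly what that hypothetical extrapolation loop would look like - just a sketch, all the function names here are made up, not any real API:

# hypothetical extrapolation-style framegen: real frames go out the moment
# they're rendered, and a predicted frame is slotted in between them.
# render_frame / predict_next_frame / present are made-up placeholders.
def run_extrapolated(render_frame, predict_next_frame, present):
    prev = None
    while True:
        frame = render_frame()        # real frame, newest game state
        present(frame)                # shown immediately, no added delay
        if prev is not None:
            # guessed in-between frame: it can only extend past motion,
            # it can't show a reaction to input that just happened
            present(predict_next_frame(prev, frame))
        prev = frame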
But even that doesn't exist. What actually ships is interpolation: every frame you see is delayed, because the AI takes the latest rendered frame (which you don't get to see yet) and the previous one, and interpolates between them. So not only does framegen not let the game react faster, but because the newest frame has to be held back, what's on your screen is actually further behind than if you didn't have frame generation at all.
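Contrast that with the sketch above - again made-up names, not NVIDIA's actual pipeline, just the shape of it:

# interpolation-style framegen: the newest real frame is held back, the
# in-between frame is built from the previous and the newest one, and only
# then does the newest real frame reach the screen.
# render_frame / interpolate / present are made-up placeholders.
def run_interpolated(render_frame, interpolate, present):
    prev = render_frame()
    present(prev)
    while True:
        latest = render_frame()             # newest real frame, NOT shown yet
        present(interpolate(prev, latest))  # generated frame goes out first
        present(latest)                     # newest frame finally appears, late
        prev = latest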
This isn't useless; there are cases where it's fine. For example, single-player games that already run fast enough that one extra frame of lag doesn't matter - ideally 120fps native, maybe 60fps at a stretch. Then framegen takes you to 240 "fps" (or 120 respectively), at the cost of one frame of lag.
But then you get shit like borderlands 4, which has lows below 60fps at 1080p on a 5090 - and again, 60fps is already far from the ideal case. Even in single-player games, where input lag isn't critical, more of it still makes everything feel less responsive, and 16.7ms (one frame at 60fps) sounds small but is definitely not negligible.
Running a game at 30fps and having framegen "fill it in" to 60fps is one of the advertised use cases, but it's absolutely awful: it adds 33ms+ of lag at minimum (one frame at 30fps) on top of the latency a 30fps game already has, which is well into noticeable territory.
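Back-of-envelope numbers for that added delay, treating it as at least one frame at the base rate (a lower bound - the real penalty is worse once you include the time spent actually generating the frame):

# minimum extra lag from interpolation framegen = one frame at the base rate
def added_lag_ms(base_fps):
    return 1000.0 / base_fps

for fps in (120, 60, 30):
    print(f"{fps}fps base -> framegen adds >= {added_lag_ms(fps):.1f} ms of extra lag")
# 120fps base -> ~8.3 ms, 60fps base -> ~16.7 ms, 30fps base -> ~33.3 ms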