>>715399910 (OP)
Hardware was more of a limiting factor in graphics back then. At a certain point, it became trivial to get 95% of the way to absolute photorealism. If you're looking at the leap from, say, 10% to 40% (inb4 people disagree with the numbers I chose as if this is truly quantifiable in the first place), well, there just isn't another 30% leap available once you're at 95%. As hardware keeps improving, you can use that extra power to skip optimization, or you can do stuff like ray tracing that uses 50000% more computing power for a 1% gain in realism over other techniques, but where the hell would you expect to go from current graphics in the next 10 years that would be comparable to the leap from GoldenEye 007 (1997) to Crysis (2007)?

I'm not saying graphics won't continue to improve, but as long as we're talking about video games displayed on a screen and not some ridiculous paradigm shift like true virtual reality (the brain-implant type, not just strapping two little screens to your face), we're pretty close to the ceiling. There's only so much you can do by sending colors to a 2D array of pixels.