>>106182175 (OP)
It has to do with the difference between rasterization and raytracing. In raster you only sample at the center of each pixel, while in raytracing you sample stochastically at random positions inside the pixel to fire the ray. When reading research you also see two different worlds: the film side is memory-intensive for fidelity, happily filling 256 MB with a density function for one small glint. In real time that fidelity doesn't exist, everything is an edge case. Kinda makes it fun. Even with a fancy-ass RTX 4090 we throw one ray per 4 pixels, use it to approximate shadows, denoise that result, and slap a blur on it. I wanna give it 15 years; the distance between the two is getting blurry. Although rich tech companies are now obsessed with throwing money at anything AI instead, so it has genuinely slowed down a bit.
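The sampling difference is easy to show in a few lines. A minimal sketch (function names are made up for illustration): the rasterizer always evaluates at the pixel center, while the raytracer jitters the sample point uniformly inside the pixel footprint, which is what lets many rays per pixel average out to the true integral over the pixel area.

```python
import random

def raster_sample(px, py):
    # Rasterizer: always sample the exact pixel center.
    return (px + 0.5, py + 0.5)

def raytrace_sample(px, py, rng=random.random):
    # Raytracer: jitter the sample position stochastically
    # inside the pixel footprint (uniform jitter here; real
    # renderers often use stratified or blue-noise patterns).
    return (px + rng(), py + rng())

# Averaging many jittered rays per pixel is Monte Carlo
# integration over the pixel area; more samples = less noise.
samples = [raytrace_sample(3, 7) for _ in range(4)]
```

Real-time renderers can't afford many samples, which is why the one-ray-per-several-pixels-plus-denoiser approach mentioned above exists.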