>>718741035
You still have to set lighting values for that sun and those lights, and you can still do fancy things like making lights illuminate only certain objects or only from certain angles, ignore shadows, or draw only in reflections. The lights still have a lot of options.
The real point is that RT is a holistic solution to lighting. You can have an artist spend time setting falloffs, deciding which lights cast shadows and which don't, toggling shadows based on distance to the player, distance to the camera, angle, etc., and placing extra fake lights to approximate bounce lighting in a scene, or you can just use ray tracing and have all of those things occur naturally. It's a great time saver for artists and can achieve a much better end image.
The problem is that Nvidia wanted to push RT multiple generations before it was really viable. Each generation roughly doubles the ray/triangle intersection rate of an RT core, so in theory an RTX 50 series RT core can do about 8x the ray/triangle intersections per cycle of an RTX 20 series one. That's an oversimplification, though: you still need to build the BVH and store a separate RT representation of the scene in memory, which takes lots of cache and fast VRAM, and the GPU itself still has to build and maintain that structure before it can trace against it. So even though it's 8x on paper, it's not really 8x, but there's still a large jump in RT performance from an RTX 20 series GPU to an RTX 50 series GPU.
There's also the issue that AMD half-assed it and their RT solution was trash: their ray accelerators aren't fully dedicated, so parts of the work (like BVH traversal) run on the general-purpose shader cores, and they rely on larger BVH structures, so more traversal steps have to occur and those large structures require more resources to maintain.
There was a serious lack of standardization and RT still isn't what it could be, but it's a lot like early 3D acceleration anyway: rough at first, but it will become the standard.