Is it diminishing returns? Yes, of course.
Is it taxing on your GPU? Absolutely.
But, consider Control.
Control is a game made by the people who made Alan Wake. It’s a fun AAA title that is better than it has any right to be. Packed with content. No microtransactions. It has it all. The only reason it’s as good as it is? Nvidia paid them a shitload of money to put raytracing in their game to advertise the then-new RTX 20-series cards. Control made money before it even released thanks to GPU manufacturers.
Would the game be as good if it didn’t have raytracing? Technically yes. You can turn raytracing off and the game plays the same. But it wouldn’t have been as good if Nvidia hadn’t paid them, and the condition of that money was that raytracing be included.
A lot of these big-budget AAA “photorealism” games for PC are funded, at least partially, by Nvidia or AMD. They’re the games you’ll get for free if you buy their new GPU that month. Consoles work the same way. Did Bloodborne need shiny blood effects? Did Spider-Man need to look better than real-life New York? No, but these games are made to sell hardware, and the tradeoff is that the games don’t have to make piles of money on their own (even if some choose to include microtransactions anyway).
Until GPU manufacturers can find something else to strive for, I think we’ll be seeing these incremental increases in graphical fidelity, to our benefit.
Lara Croft has been around since the triangle boobs. There aren’t many other characters that have been in 3D as long as Lara Croft (Mario 64 was released the same year, but Mario hasn’t come as far as Lara Croft in terms of photorealism). Plus, she’s instantly recognizable. Personally, I don’t think there’s any deeper reason than that.