Should put this whole issue to rest (for a while, at least 😉).
I don’t care about split screen, but this is more evidence that the Series S was a mistake. At the very least, Microsoft is going to have to ease up on the requirements.
Edit: It has come to my attention that I need to improve my reading comprehension. This only affects the S. 🤦‍♂️
The console itself wasn’t a mistake. Their promise of feature parity was the mistake.
Not giving it the same amount of RAM was also a mistake; it could have just had a weaker GPU, which would have caused fewer issues.
Whoever made that decision obviously never worked in gamedev.
I don’t think anyone in these comments has worked in gamedev.
I was talking about the person(s) at Microsoft, who decided that it’s a good idea to have less RAM on the Series S than on the Series X…
(And for context: I work in gamedev, and in my experience making games stay within the memory budget is one of the toughest parts of porting games to consoles.)
who decided that it’s a good idea to have less RAM on the Series S than on the Series X…
Supply chains are complicated, and MS probably did their due diligence to minimize blockages. From seeing the memory configurations of newer video cards, I’m pretty sure there are supply constraints on memory to think of.
Honestly I think gamedevs leaning on memory this hard instead of compute is a mistake. You can have intelligently tiled, procedurally generated textures and have a lot more of them, but instead everyone is leaning on authored content on disc. This goes against industry trends in non-game rendering where procedural generation is the norm. If Doom Eternal can look that good with forward rendering, there are no excuses.
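For what it’s worth, here’s roughly what “intelligently tiled, procedurally generated textures” means in the simplest case — a tiny value-noise sketch in Python (purely illustrative; a real engine would evaluate something like this on the GPU rather than store the texture on disc):

```python
import random

def tileable_value_noise(size=64, cells=8, seed=1):
    """Generate a tileable grayscale texture procedurally: a small
    lattice of random values, bilinearly interpolated with wrap-around
    indexing so the result tiles seamlessly. Nothing is authored on disc;
    the whole texture is reproducible from (size, cells, seed)."""
    rng = random.Random(seed)
    lattice = [[rng.random() for _ in range(cells)] for _ in range(cells)]

    def smooth(t):
        # smoothstep easing for less blocky interpolation
        return t * t * (3 - 2 * t)

    img = []
    for y in range(size):
        row = []
        for x in range(size):
            gx, gy = x * cells / size, y * cells / size
            x0, y0 = int(gx), int(gy)
            x1, y1 = (x0 + 1) % cells, (y0 + 1) % cells  # wrap => tileable
            tx, ty = smooth(gx - x0), smooth(gy - y0)
            top = lattice[y0][x0] * (1 - tx) + lattice[y0][x1] * tx
            bot = lattice[y1][x0] * (1 - tx) + lattice[y1][x1] * tx
            row.append(top * (1 - ty) + bot * ty)
        img.append(row)
    return img
```

Layer a few octaves of this and you get clouds, grunge, rust masks, etc., all for the cost of a seed and a few parameters instead of megabytes of authored pixels.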
My main beef with the hate on the Series S is that both times it’s been a big deal (BG3 and Halo Infinite), it was split screen that held back shipping. The community would be as justified going after split screen as they are going after the Series S.
Tell that to our artists 😉. As a coder I’m all for procedurally generated content. I did replace several heavy textures in our games with procedural materials, to squeeze out a couple of extra MB. However, that’s not the way artists traditionally work. They often don’t have the programming knowledge needed to develop procedural materials on their own, and would need to rely on technical artists or programmers to do so. Drawing a texture, however, is very much part of their skill set…
But yeah, the mention of “squeezing out a couple of MB” brings me to another topic, namely that (at least in our games) the on-disk textures are only part of the RAM usage, and a relatively small one in comparison. In the games I worked on, meshes made up a significantly larger share of RAM usage. We have several unique assets that need to fulfill a certain quality standard due to licensing terms, such that in the end we had several dozen meshes, each over 100 MB, that the player can freely place… Of course there would still be optimization potential in those assets, but as always, there’s a point where further optimization hits diminishing returns… In the end we had to resort to brute-force solutions, like unloading high quality LODs for meshes even if they are relatively close to the player… Not the most beautiful solution, but luckily not often needed during normal gameplay (that is: if the player doesn’t intentionally try to make the game go out of memory).
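To make that brute-force fallback concrete, here’s a minimal sketch of the idea in Python (the field names and numbers are made up, not from any real engine): when the sum of the high-quality LODs blows the budget, demote the farthest meshes to their low LOD first — which means some meshes can end up demoted while still fairly close to the player.

```python
def fit_lods_in_budget(meshes, budget_mb):
    """Pick a LOD ("hi" or "lo") per mesh so total memory fits the budget.
    Starts everything at "hi", then demotes farthest-first until we fit.
    `meshes` is a list of dicts with hypothetical fields:
    name, distance (to the player), hi_mb, lo_mb."""
    chosen = {m["name"]: "hi" for m in meshes}
    used = sum(m["hi_mb"] for m in meshes)
    for m in sorted(meshes, key=lambda m: m["distance"], reverse=True):
        if used <= budget_mb:
            break  # already within budget, stop demoting
        used -= m["hi_mb"] - m["lo_mb"]  # swap hi LOD for lo LOD
        chosen[m["name"]] = "lo"
    return chosen, used
```

A real streaming system would of course hysteresis this, amortize it over frames, and so on — the point is just that distance becomes the tiebreaker when the budget, not visual quality, is the hard constraint.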
But I’m rambling. The tl;dr is: The memory constraints would not be a big deal if there was enough time/money for optimization. If there is one thing that’s never enough in game dev, it’s time/money.
OK, so this is now off-topic for the conversation, but…
However, that’s not the way artists traditionally work.
To some extent, it’s the authoring tools that shape how they work. A procedural materials pipeline can help them compose on top of already-procedural content. In a way, you could see PBR as part of that pipeline, because PBR materials are physically modelled. Having said that, I do take your point: even building out that pipeline takes time. Creating a PBR materials library is not super easy, and obviously organic stuff is very hard to model as a material.
meshes made up a significantly larger amount of RAM usage
From watching Blender modelling, I thought the pattern was to have minimal rigging on the base mesh and then get tessellation detail via normal maps + subdivision (apparently this is very doable even with sculpting). Obviously for animation you need a certain mesh quality, but beyond that I thought everything would be normal maps, reflection maps, etc.