I don’t mean BETTER. That’s a different conversation. I mean cooler.
An old CRT display was literally a small-scale particle accelerator: it fired angry electron beams towards the viewers at a sizable fraction of light speed, bent them with deflection coils alternating at ultra-high frequency, and stopped them with a rounded rectangle of glowing phosphors.
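Quick back-of-envelope on the speed claim (assuming a typical ~25 kV color-tube anode voltage; that exact figure is my assumption and varies by tube):

$$
\gamma = 1 + \frac{eV}{m_e c^2} = 1 + \frac{25\ \mathrm{keV}}{511\ \mathrm{keV}} \approx 1.049,
\qquad
\beta = \sqrt{1 - \gamma^{-2}} \approx 0.30
$$

So the beam hits the phosphor at roughly 0.3c, on the order of 90,000 km/s. Not light speed, but genuinely relativistic.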
If a CRT goes bad it can actually make people sick: a failing high-voltage regulator can push the tube into emitting soft X-rays, which is part of why TV radiation got federally regulated in the US.
That’s just, conceptually, a lot COOLER than a modern LED panel, which really is just a bajillion very tiny lightbulbs.
And yet for most of the past two years, the best choice for a gaming PC has been a Ryzen with 3D V-Cache paired with an Nvidia GPU. Is there something particular you have in mind that supposedly doesn’t work with an AMD chipset and an Nvidia GPU?
PCI-Express is not an open standard, but AMD, Nvidia, and Intel are all members of the PCI-SIG that maintains it; it’s what both AMD and Nvidia use for their GPUs, and what AMD (like Intel) uses for its chipsets. It’s certainly not a secret cabal.
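If you want to see that vendor neutrality for yourself, here’s a minimal sketch (my own illustration, nothing official; Linux only, and it assumes the standard sysfs PCI attributes, which not every device exposes):

```python
#!/usr/bin/env python3
"""Print each PCI device's vendor and negotiated PCIe link, via Linux sysfs."""
from pathlib import Path

# Well-known vendor IDs from the public PCI-SIG registry.
VENDORS = {
    "0x1002": "AMD/ATI",
    "0x1022": "AMD",
    "0x10de": "NVIDIA",
    "0x8086": "Intel",
}

def read(attr: Path) -> str:
    """Read a sysfs attribute; some devices don't expose link info."""
    try:
        return attr.read_text().strip()
    except OSError:
        return "n/a"

for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
    vendor = read(dev / "vendor")
    name = VENDORS.get(vendor, vendor)
    speed = read(dev / "current_link_speed")  # e.g. "16.0 GT/s PCIe"
    width = read(dev / "current_link_width")  # e.g. "16"
    print(f"{dev.name}  {name:8}  link: {speed} x{width}")
```

Run on an AMD board with an Nvidia card, it should show the GPU negotiating a full-width PCIe link; the protocol doesn’t care who made the other end.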
It’s all in the same family, literally…
https://www.cnn.com/2023/11/05/tech/nvidia-amd-ceos-taiwan-intl-hnk/index.html
This supports your claim that AMD and Nvidia don’t work optimally together... how?