Hello all, first time poster.
I’ve been gaming on Ubuntu for a few months now with no issues on my 2080 Ti.
It’s time to upgrade my card before Starfield comes out, but I haven’t seen many positive reports about Radeon on Linux.
Does anyone have any experience with this? I’d like to get the 7900 XTX over a 4090 for the price difference alone.
My AMD CPU has been great on Ubuntu.
AMD on Linux is pretty much hassle-free.
AMD GPUs are fantastic on Linux. I’m running an MSI 6800 XT, and the only flaw is that the RGB lighting isn’t properly exposed, so I can’t turn it off. Everything else just works, and I’ve never had to give it a single thought since buying it. I just put it in and started playing my games.
It’s been working flawlessly for me. Brand-new cards can be a little buggy at launch, since they release before the drivers are quite ready, but my Vega 64 and RX 570 have been absolutely flawless for the last 4–5 years. AMD’s fine-wine reputation is especially true on Linux.
I’ve always had issues with NVIDIA’s drivers and basically never with AMD. I can’t recall the last time I had driver-related graphical issues on AMD, but I sure can with NVIDIA. The only trouble I’ve had with my AMD cards was when I bought the Vega 64 at launch and had to run a dev-branch kernel and Mesa for it to work properly, but once that support landed in mainline and release kernels, I never had to do that again.
For how great AMD usually is on Linux, it’s not without its issues. RDNA2 (the entire RX 6000 series) still suffers to this day from a two-year-old issue that can cause stutter in games as the GPU aggressively downclocks itself. I still prefer it over Nvidia (having owned one and now using AMD), but just be aware: it’s not all as perfect as some Linux users would have you believe.
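If the stutter bites you, the usual stopgap people post is to pin the GPU’s DPM performance level so it can’t downclock mid-game. A minimal sketch, assuming your Radeon is card0 (check /sys/class/drm/ for yours) and that you run it as root:

```python
#!/usr/bin/env python3
# Pin amdgpu's power management to "high" so the card stops aggressively
# downclocking. Writing "auto" back restores the default behaviour.
# Assumes the Radeon is card0; adjust if you have more than one GPU.
from pathlib import Path

node = Path("/sys/class/drm/card0/device/power_dpm_force_performance_level")

print("current level:", node.read_text().strip())  # "auto" by default
node.write_text("high")                            # needs root
```

It resets on reboot, so it’s a band-aid rather than a fix, but it reportedly makes the downclocking stutter go away for a lot of people.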
One warning: if you use a display over HDMI, you might have a bad time with Radeon on Linux. I use an LG C2 TV as my monitor, and there is a bug in the driver that forces it into a crappy YCbCr mode that ruins text clarity. I tried all manner of workarounds, like hacking up the EDID profile (sketched below), but I gave up and went back to Windows for now.
It’s a very specific issue, but a showstopper if it affects you.
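For anyone who wants to try the EDID route before giving up like I did, this is roughly the edit I attempted: strip the YCbCr support bits the panel advertises and repair the checksums, so RGB is the driver’s only option. Filenames are placeholders, and it’s a sketch of the idea rather than a guaranteed fix:

```python
#!/usr/bin/env python3
# Sketch of the EDID edit: clear the YCbCr support bits so the driver's
# only option is RGB, then repair the block checksums.
# "edid.bin" / "edid-rgb.bin" are placeholder filenames.

edid = bytearray(open("edid.bin", "rb").read())

# EDID 1.4 base block, byte 24: bits 3 and 4 advertise YCbCr 4:4:4 / 4:2:2
# support on digital displays. Clear both so only RGB remains.
edid[24] &= ~0x18

# Every 128-byte EDID block must sum to 0 mod 256; recompute byte 127.
edid[127] = (-sum(edid[:127])) & 0xFF

# The CTA-861 extension block (bytes 128+) advertises YCbCr again in its
# byte 3 (bits 4 and 5); clear those and fix that block's checksum too.
if len(edid) >= 256 and edid[128] == 0x02:  # 0x02 = CTA extension tag
    edid[128 + 3] &= ~0x30
    edid[255] = (-sum(edid[128:255])) & 0xFF

open("edid-rgb.bin", "wb").write(bytes(edid))
```

You then put the edited blob under /usr/lib/firmware/edid/ and boot with drm.edid_firmware=HDMI-A-1:edid/edid-rgb.bin (your connector name may differ). It didn’t stick for me, hence the giving up.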
That’s the display advertising a specific function set, not the card, and it has to do with the large panel size and pixel density. If you used a proper monitor, I’m pretty sure you wouldn’t have that issue.
OK, then why does the same display work perfectly in Windows? The display supports full RGB at both 8 and 10 bit, uncompressed. There is an open issue for this on the driver’s GitLab repo.
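You don’t have to take my word for what the display advertises, either; the kernel exposes the raw EDID, and the colour-format bits are trivial to check. A quick sketch, assuming the connector is card0-HDMI-A-1 (yours may differ):

```python
#!/usr/bin/env python3
# Read the raw EDID the kernel received from the display and print which
# colour encodings it advertises. Connector path is an assumption.
edid = open("/sys/class/drm/card0-HDMI-A-1/edid", "rb").read()

feat = edid[24]  # EDID 1.4 feature-support byte (digital displays)
print("YCbCr 4:4:4 advertised:", bool(feat & 0x08))
print("YCbCr 4:2:2 advertised:", bool(feat & 0x10))
# RGB 4:4:4 is always supported on digital displays, whatever these bits say.
```

Advertising YCbCr support on top of RGB shouldn’t mean the driver has to pick it, which is exactly the bug.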
Don’t monitor-shame me, please. Also, the pixel density is within 5% of my 27" 1440p monitors.