• ramble81@lemm.ee · 8 months ago

    I must be in the minority, but I’ve been digging Intel’s Arc GPUs. For their price point, and given that I don’t plan on playing bleeding-edge AAA games, they’ve actually done pretty well. Additionally, I’m tired of Nvidia’s price gouging and AMD following suit; I want to support a disruptive third party. Their driver support gets better with every release, and I can’t wait to see their next generation of cards.

    • DarkThoughts@fedia.io · 8 months ago

      Too many edge-case issues, especially for someone who plays a lot of indie titles and uses Linux. Also, they kinda just went for the low-performance market. If they launched something in the upper midrange I’d be more interested (assuming they improved on a lot of fronts, of course).

      • smoothbrain coldtakes@lemmy.ca · 8 months ago

        The new dedicated cards are actually very good. They sell them at a competitive price because they are not powerhouses, but they get the job done. If you’re targeting 1080p at your top end, it’s almost a no-brainer to go with an Arc card. If you’re pushing a higher resolution, it’s probably better to go with another manufacturer, unless you’re fine with higher resolutions and lower framerates.

    • smoothbrain coldtakes@lemmy.ca · 8 months ago

      I agree with the Arc cards.

      They are good, they are cheap, and they’re targeting the midrange to low-end hardware segment which is not covered by any other manufacturer.

      I have a 3090 in my desktop but I have an Arc card on my server for Moonlight/Sunshine streaming, as well as Plex transcoding. It’s the cheapest card to have AV1 encoding built in.

      I also keep seeing them increase performance significantly with every driver update, which is pretty cool.

      • hollyberries@programming.dev · 8 months ago

        I’m interested in your use of the Arc card for media transcoding. Which one did you get, and how would you say it compares to a GTX 960? The one in my server died, and I stuck a spare 2060 in there a while back; I’m looking to downgrade to something more sensible.

        Most of my media is 1080p x264 with some 4k HEVC (and growing) if that helps.

        • smoothbrain coldtakes@lemmy.ca · 8 months ago

          I had a 1050ti in the machine and I bought an A770. It’s overpowered for transcoding but I do remotely stream games at 1080p, which is a good workout for the card.

          For simple transcoding I would buy the A310, since it’s the cheapest card with AV1 encoding. I’m running an old 6th-gen i7-6600k, and I had to mess with the UEFI settings to enable Resizable BAR (ReBAR), but I used this tool to do it.
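          (For the curious, the resulting ffmpeg invocation looks roughly like the sketch below. It’s built in Python so the arguments are easy to inspect; the filenames and quality value are placeholders, and the av1_qsv encoder is only present in ffmpeg builds with Intel’s VPL/MFX support.)

```python
# Build (but don't run) an ffmpeg command line for an AV1 hardware
# transcode via Intel Quick Sync, which is what Arc cards expose.
def qsv_av1_cmd(src: str, dst: str, quality: int = 28) -> list[str]:
    return [
        "ffmpeg",
        "-hwaccel", "qsv",                # decode on the GPU where possible
        "-i", src,
        "-c:v", "av1_qsv",                # Arc's hardware AV1 encoder
        "-global_quality", str(quality),  # lower = better quality, bigger file
        "-c:a", "copy",                   # pass audio through untouched
        dst,
    ]

print(" ".join(qsv_av1_cmd("input_1080p.mkv", "output_av1.mkv")))
```

          Passing the list to subprocess.run() would execute it; run `ffmpeg -encoders` first to confirm your build actually includes av1_qsv.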

    • al4s@feddit.de · 8 months ago

      Supposedly Nvidia has gotten a lot better on Linux lately. They finally dropped their weird framebuffer API or whatever (the one that was the reason for the horrible Wayland compatibility and also caused a heated Linus Torvalds moment), and I think they even made their Linux drivers open source.

      • alessandro@lemmy.ca · 8 months ago

        Historically speaking, Nvidia was always the best for Linux. Nvidia’s success with Linux traces back to 2004, with state-of-the-art 3D capabilities (albeit for arcade machines). At that time, ATI Radeon’s 3D capabilities on Linux were sub-par.

        The problem with Linux+Nvidia is that it was never “the Linux way”… but always the “Nvidia way”.

        The Linux way is flexibility: it means you can use whatever kind of Linux you want and the drivers work straight out of the box (which basically requires open source drivers). Nvidia, instead, always pushed a fixed binary blob that required a specific kernel and a rigid environment.

        AMD’s modern Linux support is mostly “the Linux way”; that’s why the Linux community loves AMD more than Nvidia.

        Given hardware parity between Nvidia and AMD, the Linux crowd will always prefer AMD, because AMD means you can use any kind of Linux distro and have an uncompromising gaming experience.

      • AProfessional@lemmy.world · 8 months ago

        They do support their driver, yes, but it will never be as good as long as it’s proprietary. The open Nvidia kernel module isn’t ready yet, and it’s still backed by proprietary blobs.

    • baconisaveg@lemmy.ca · 8 months ago

      I’ve used a 3090 on Ubuntu and Arch without any issues for things like 3D rendering (Blender, Daz) and most of the Steam games I play. I was also able to run most AI models and tools.

      AMD? Well, it works OK for games, I guess, but it’s a huge pile of shit for everything else. Linux zealots who pretend to care about “proprietary software!!!” on one hand and then brag about Proton/gaming performance on the other are nothing but hypocrites.

  • edinbruh@feddit.it · 8 months ago

    On Windows, Nvidia without thinking twice. On Linux it depends on RDNA 4 and the next release of the Nvidia drivers, but probably still Nvidia.

    Unfortunately, despite how much I would rather buy from someone else, AMD’s products are just inferior, especially the software.

    • Vik@lemmy.world · 8 months ago

      For your points against:

      The OpenGL UMD was completely re-engineered. This premiered with the 22.7.1 release, so nearly two years ago. AMD now have the most performant, highest quality OpenGL UMD in the industry, which is particularly relevant for workstation use cases (where OpenGL remains the backbone of WS graphics).

      PhysX is proprietary, I don’t know what can be done about that, but your point is valid here.

      AMD’s approach to ray acceleration has always favoured die-area efficiency up until now, though I can totally understand your disappointment with the performance in that area. That said, the moment I’ll really care about real-time ray tracing in gaming is when it’s no longer contingent on the raster model. Reflections, shadows, and GI are nice and all, but we’re still not really there yet.

      I don’t know how GCN was such a terrible arch, given that it was the basis of an entire console generation. An argument could be made that its GPGPU-oriented design hindered it at gaming on desktops, but it matured extremely well over time with driver updates, despite its price and performance targets at release. Aside from that (and related to point 1), RDNA UMDs are all PAL-based; I’m not sure what you’re alluding to here. Could you please elaborate?

      Your final remark is untrue (FMF, AL+, gfx feature interop, mic ANS, a plethora of GPUOpen technologies), but I’ll forgive you for not keeping up with a vendor’s tech if you don’t actively use their products.

      • edinbruh@feddit.it · 8 months ago

        I’m literally using a full AMD PC right now. I dislike Nvidia as much as the next person; I think they use terrible monopolistic practices, and if the competition were on par I would not buy Nvidia. But they aren’t.

        • Woozythebear@lemmy.world · 8 months ago

          The guy asked what’s better for gaming, and you went on a rant about Nvidia being better because of AI workloads and other software.

          AMD makes the better cards for gaming. Nvidia may have better ray tracing, but most games don’t even use ray tracing, so you’d be spending an extra 30% to get the same gaming performance you’d get from an AMD card that actually has enough VRAM to play games at ultra settings and higher resolutions.

          • edinbruh@feddit.it · 8 months ago

            Well, if you are not gonna use Nvidia’s extra stuff, buy an AMD, by all means.

            But what you say is disingenuous. “AI and other software” is not entirely unrelated to gaming. Things like HairWorks, PhysX, and most GameWorks features in general run on CUDA. And on the AI side (which I don’t care about that much) there is DLSS, and they are working on AI-enhanced rendering.

            Most games don’t use those technologies, but some do, and you will miss out on those.

    • DarkThoughts@fedia.io · 8 months ago

      but if AMD from the past had proposed an alternative we would have a standardized physics extension in DirectX by now, like with dlss

      Why the fuck put this on AMD when it was Nvidia who did their usual proprietary bullshit? “AMD is worse than Nvidia because they didn’t provide us with a better alternative!” ???

  • Blxter@lemmy.zip · 8 months ago

    I had a 1060 and upgraded to a 3080 a while ago. For my next upgrade I’ll most likely go with AMD, unless Nvidia can convince me to choose them again.

  • BombOmOm@lemmy.world · 8 months ago

    Here is my process for new cards: pick a price point, head over to Video Card Benchmark, and scroll down until you find the first (i.e. fastest) video card that meets that price point. Also double-check prices at PC Part Picker. For used cards the chart is still pretty useful; it’s just a bit more manual (and money-saving!) to pull used prices from eBay/Mercari.
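    (That process is mechanical enough to sketch in a few lines of Python; the card names, scores, and prices below are made-up placeholders, not real benchmark data.)

```python
# Sketch of "fastest card at or under a budget": filter by price,
# then take the highest benchmark score. Illustrative numbers only.
cards = [
    {"name": "Card A", "score": 19_000, "price": 299},
    {"name": "Card B", "score": 26_000, "price": 449},
    {"name": "Card C", "score": 23_000, "price": 389},
]

def best_under(budget: int) -> dict:
    affordable = [c for c in cards if c["price"] <= budget]
    return max(affordable, key=lambda c: c["score"])

print(best_under(400)["name"])  # Card C
```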

    I personally have an AMD bias, though, since they have pulled way less shit than Nvidia. But that is your decision to make.

  • aluminium@lemmy.world · 8 months ago

    Nothing at the moment; I’d wait for the Nvidia RTX 5000 and AMD RX 8000 cards. They should release later this year.

  • hperrin@lemmy.world · 8 months ago

    They’re all pretty good. Even the Intel cards are pretty good now. I guess it comes down to what’s most important to you. If you want maximum compatibility with games, go for Nvidia. If you want better price-to-performance, go with AMD or Intel. Although, if I were you, I’d wait until AMD’s and Intel’s next gen. Both are coming (relatively) soon (probably before the end of the year), and will probably be a lot better than what’s out now.

    One caveat, if you use or plan to use Linux, Nvidia can present some difficulties, so avoid them.

    Actually, two caveats: if you plan to use hardware encoding, like streaming on Twitch while you play games, avoid AMD. Their hardware encoding is pretty trash; both Nvidia and Intel are much better.

    My current lineup (I know I have a lot of machines, but my wife and I both play games, and I do AI workloads as well):

    • RTX 3090 (mostly for AI)
    • Radeon RX 6700 XT (great card)
    • Arc A380 (for transcoding, but I’ve gamed on it, and it’s great)
    • Radeon RX 6600 (my main card, just because it’s in my living room HTPC, running ChimeraOS)

    • reversedposterior@lemmy.world · 8 months ago

      On the hardware encoding side, that used to be true before OBS introduced better AMD encoder support. I have a 6800 XT and it works just fine for casual streaming, though I agree that if you stream professionally, Nvidia is the better option.

    • baconisaveg@lemmy.ca · 8 months ago

      The number of self-hosted AI integrations is only going to grow as well. I have a 3090 in a closet PC and use it for everything from image generation to VSCode/Neovim code completion and code chat. One thing I’d really like to see in the next few years is a wide variety of local, AI-driven, self-hosted Alexa replacements.

      • hperrin@lemmy.world · 8 months ago

        Oh, I would love that. A self-hosted voice assistant is like the holy grail. Mycroft was awesome at first, but it never really panned out.

    • JackGreenEarth@lemm.ee · 8 months ago

      How much VRAM does your AI card have? The one I have only has 6GB, and I’ve found that quite limiting.

      • hperrin@lemmy.world · 8 months ago

        The 3090 has 24GB. Yeah, 6GB is too small for a lot of things. Even 24GB is too small for some of the models I’ve tried.
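        (A back-of-the-envelope way to see why: a model’s weights alone need parameter count times bytes per parameter, before activations or KV cache are counted. A quick sketch:)

```python
# Rough VRAM needed just to hold a model's weights, ignoring
# activations and KV cache (which add more on top).
def weights_gb(params_billions: float, bytes_per_param: float) -> float:
    return params_billions * 1e9 * bytes_per_param / 1024**3

print(round(weights_gb(7, 2), 1))    # 7B model at fp16: ~13 GB, too big for a 6 GB card
print(round(weights_gb(7, 0.5), 1))  # same model 4-bit quantized: ~3.3 GB, fits
```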