• linkhidalgogato@lemmy.ml · 7 months ago

    I'm a fan of no corporation, especially not fucking AMD, but they've been so much better than Intel recently that I'm struggling to understand why anyone still buys Intel.

  • arefx@lemmy.ml · 7 months ago

    Ryzen gang

    My 7800x3d is incredible, I won’t be going back to Intel any time soon.

    • Rakonat@lemmy.world · 6 months ago

      Me, who bought an AMD CPU and GPU last year for my new rig, because fuck the massive markup for marginal improvement over last-gen stats.

      • SuperIce@lemmy.world · 7 months ago

        Not sure how much longer I'll be using the 5950X, tbh. We've reached a point where mobile processors (like the Ryzen AI 370) have faster multicore performance than the 5950X without gulping down boatloads of power.

      • felsiq@lemmy.zip · 7 months ago

        To put this into context, the Zen 5 X3D chips aren't out yet, so this isn't really an apples-to-apples comparison between generations. Also, Zen 5 was heavily optimized for efficiency rather than speed: they're only like 5% faster than Zen 4 (X series, not X3D, of course) last I saw, but they do that at Zen 3 TDPs, which is crazy impressive. I'm not disagreeing with you about the 7800X3D, I love that chip, it's def a good one. I just don't want people to get the wrong idea about Zen 5.

    • r00ty@kbin.life · 7 months ago

      Also on the 7800X3D. I think I switched at just the right time. I’ve been on Intel since the Athlon XP. The next buy would have been 13/14th gen.

      • arefx@lemmy.ml · 6 months ago

        I'm not that worried about it affecting me lol, I would be more concerned about my Intel CPU dying, especially since it's been around for decades.

      • LeadersAtWork@lemmy.world · 6 months ago

        tldr: Flaw can give a hacker access to your computer only if they have already bypassed most of the computer’s security.

        This means continue not going to sketchy sites.

        Continue not downloading that obviously malicious attachment.

        Continue not being a dumbass.

        Proceed as normal.

        Because if a hacker got that deep your system is already fucked.

        • Blue_Morpho@lemmy.world · 6 months ago

          It’s more serious than normal because if your PC ever gets owned, a wipe and reinstall will not remove the exploit.

          Nissim sums up that worst-case scenario in more practical terms: “You basically have to throw your computer away.”

      • sparkle@lemm.ee · 6 months ago

        Are you just posting this under every comment? This isn’t even a fraction as bad as the Intel CPU issue. Something tells me you have Intel hardware…

    • Scrubbles@poptalk.scrubbles.tech · 7 months ago

      Honestly, even with GPUs now too. I was forced to team green for a few years because AMD were so far behind. Now, though, unless you absolutely need a 4090 for some reason, you can get basically the same performance from AMD for 70% of the cost.

      • Cyborganism@lemmy.ca · 7 months ago

        I haven’t really been paying much attention to the latest GPU news, but can AMD cards do ray tracing and dlss and all that jazz that comes with RTX cards?

        • Scrubbles@poptalk.scrubbles.tech · 7 months ago

          Yes, but by different names. They use FSR, which is basically the same thing; I haven't noticed a difference in quality. Ray tracing too, just not branded as RTX.

        • natebluehooves@pawb.social · 7 months ago

          DLSS is off the table, but you CAN ray trace. That being said, I don't see the value of RT myself. It has the greatest performance impact of any graphical setting and often looks only marginally better than baked-in lighting.

          • linkhidalgogato@lemmy.ml · 7 months ago

            DLSS is a brand name; both AMD and Intel have their own version of the same thing, and they're only a little worse, if at all.

          • Cyborganism@lemmy.ca · 7 months ago

            It depends greatly on the game. I've seen a huge difference in games like Control, which was built to showcase that… well… feature! You can see it in the quality of the lighting and the reflections. You also get better illumination in darker areas thanks to radiated lighting. It's much more natural looking.

      • anivia@lemmy.ml · 6 months ago

        I disagree. Processing power may be similar, but Nvidia still outperforms with ray tracing and, more importantly, DLSS.

        What's the point of having the same processing power when Nvidia still gets more than double the FPS in any game that supports DLSS?

        • reliv3@lemmy.world · 6 months ago

          FSR exists, and FSR 3 actually looks very good when compared with DLSS. These arguments about raytracing and DLSS are getting weaker and weaker.

          There are still strong arguments for nvidia GPUs in the prosumer market due to the usage of its CUDA cores with some software suites, but for gaming, Nvidia is just overcharging because they still hold the mindshare.

    • scrion@lemmy.world · 7 months ago

      For years, Intel's compiler, its math library (MKL), and its profiler (VTune) really only worked well with Intel's own CPUs. There was in fact code that decreased performance if it detected a non-Intel CPU:

      https://www.agner.org/optimize/blog/read.php?i=49&v=f

      That later became part of a larger lawsuit, but since Intel was not discriminating against AMD directly, but rather against all non-Intel CPUs, the result of the lawsuit was underwhelming. In fact, it's still a problem today:

      https://www.extremetech.com/computing/302650-how-to-bypass-matlab-cripple-amd-ryzen-threadripper-cpus

      https://medium.com/codex/fixing-intel-compilers-unfair-cpu-dispatcher-part-1-2-4a4a367c8919

      Given that MKL is a widely used library, people also suffer from this indirectly if they buy an AMD CPU and use software that links against it.

      As someone working in low-level optimization, that was/is a shitty situation. I still bought an AMD CPU after the latest fiasco a couple of weeks ago.

  • gmtom@lemmy.world · 7 months ago

    Can we talk about how utterly useless that default cooler is? For a relatively high-end gaming CPU, it really shouldn't be legal to ship with something so useless.

  • Lizardking27@lemmy.world · 7 months ago

    Ugh. Can I just say how much I fucking HATE how every single fucking product on the market today is a cheap, broken, barely functional piece of shit.

    I swear to God the number of times I have to FIX something BRAND NEW that I JUST PAID FOR is absolutely ridiculous.

    I knew I should’ve been an engineer, how easy must it be to sit around and make shit that doesn’t work?

    Fucking despicable. Do better or die, manufacturers.

      • Allonzee@lemmy.world · 6 months ago

        Capitalism: “Growth or die!”

        Earth: I mean… If that’s how it’s gotta be, you little assholes🤷👋🔥

        It's kind of gallows hilarious that, for all the world's religions worshipping ridiculous campfire ghost stories, we have an actual creator: a remarkable macro-organism mother consisting of millions of species, her story of hosting life going back 3.8 billion years, most of it living in homeostasis with its ecosystem.

        But our actual creator, Earth, no ridiculous work of lazy fiction, we literally choose to treat like property to loot, rape, and pillage thoughtlessly, continuing to act as a cancer upon her with our eyes wide open. We as a species are so fucking weird, and not the good kind.

    • Doombot1@lemmy.one · 6 months ago

      Most of the time, the product itself comes out of engineering just fine and then it gets torn up and/or ruined by the business side of the company. That said, sometimes people do make mistakes - in my mind, it’s more of how they’re handled by the company (oftentimes poorly). One of the products my team worked on a few years ago was one that required us to spin up our own ASIC. We spun one up (in the neighborhood of ~20-30 million dollars USD), and a few months later, found a critical flaw in it. So we spun up a second ASIC, again spending $20-30M, and when we were nearly going to release the product, we discovered a bad flaw in the new ASIC. The products worked for the most part, but of course not always, as the bug would sometimes get hit. My company did the right thing and never released the product, though.

      • /home/pineapplelover@lemm.ee · 6 months ago

        It's almost never the engineers' fault. That whole NASA spacecraft that exploded was due to bureaucracy and pushing the mission forward.

    • 31337@sh.itjust.works · 6 months ago

      I’ve put together 2 computers the last couple years, one Intel (12th gen, fortunately) and one AMD. Both had stability issues, and I had to mess with the BIOS settings to get them stable. I actually had to under-clock the RAM on the AMD (probably had something to do with maxing-out the RAM capacity, but I still shouldn’t need to under-clock, IMO). I think I’m going to get workstation-grade components the next time I need to build a computer.

    • Buddahriffic@lemmy.world · 7 months ago

      It's not easy to make shit that doesn't work if you care about what you're doing. I bet there are angry debates between engineers and business majors behind many of these enshittifications.

      Though, for these Intel ones, they might have been less angry and more “are you sure these risks are worth taking?” because they probably felt like they had to push them to the extreme to compete. The angry conversations probably happened 5-10 years ago before AMD brought the pressure when Intel was happy to assume they had no competition and didn’t have to improve things that much to keep making a killing. At this point, it’s just a scramble to make up for those decisions and catch up. Which their recent massive layoffs won’t help with.

    • InputZero@lemmy.ml · 6 months ago

      So this doesn't apply to the Intel situation, but a good lesson to learn is that the bleeding edge cuts both ways: anyone buying the absolute latest technology is going to hit some friction with usability at first. It should never amount to broken hardware like the Intel CPUs, but buggy drivers for a few weeks or months are kinda normal. There's no way of knowing what's going to happen when a brand-new product is released. The producer must do their due diligence and test for anything catastrophic, but weird things happen in the wild that no one can predict. Like I said at the top, this doesn't apply to Intel's situation because that was a catastrophic failure, but if you're ever on the bleeding edge, assume eventually you're going to get cut.

  • littletranspunk@lemmus.org · 7 months ago

    Glad my first self-built PC is full AMD (built about a year ago).

    Screw Intel and Nvidia

    7700X is what it was built with

    • black0ut@pawb.social · 6 months ago

      Afaik it wasn’t a temperature problem, it was voltage related. Obviously cooler temps help, but you would probably still be vulnerable to this.

  • kamen@lemmy.world · 6 months ago

    Don’t be a fan of one or the other, just get what’s more appropriate at the time of buying.

  • angrystego@lemmy.world · 6 months ago

    I thought the point would be a depressed and self deprecating “I’m something of an Intel CPU myself”.

    • zaphodb2002@sh.itjust.works · 7 months ago

      I loved my FX CPU, but I lived in a desert, and in the summer the heat coming off that thing would make my room 100°F or more. It was the first machine I built a custom water loop for. It didn't help with the heat in the room, but it did stop the machine from shutting down randomly, so I could continue to sit in the sweltering heat in my underpants and play video games until dawn. Better times.

      • helpmepickaname@lemmy.world · 7 months ago

        Of course it didn't help the heat in the room; the heat from the CPU still has to go somewhere. Better coolers aren't for the room, they're for the CPU. In fact, a better cooler can make the room hotter, because it removes heat from the CPU at a higher rate and dumps it into the room.

      • rotopenguin@infosec.pub · 6 months ago

        You might want to go through the trouble of extending that radiator loop all the way out through a window.

      • Bytemeister@lemmy.world · 6 months ago

        I had the FX-8350 Black Edition, and that thing would keep my room at 70°F… in the winter… with a window open.

        Summer gaming was BSOD city. I miss it so much.