im a fan of no corporation, especially not fucking amd, but they have been so much better than intel recently that im struggling to understand why anyone still buys intel
Researchers discover potentially catastrophic exploit present in AMD chips for decades
They’re both very flawed
Despite being potentially catastrophic, this issue is unlikely to impact regular people.
Doesn’t seem very similar to me.
Sounds like some precious and sweet intelboi feels bad holding the bag…
What’s so bad about AMD? They seem a lot better than Intel imo.
They are bad at writing software, and firmware support is sketchy. That second point is technically the motherboard vendors’ fault, but it could be due to confusing design and documentation on the AMD side. Hardware-wise they are great AFAIK.
Most of the shopping I’ve been helping people with lately has been for laptops. And while there are slightly more AMD options than before, laptops are still dominated by Intel for the most part. Especially if you’re trying to help someone pick something while on a tighter budget.
Of all the CPU and GPU manufacturers out there, AMD is the most consistently pro-consumer with the least corporate fuckery, so I take mighty exception at your ‘especially not fucking amd’ comment.
Can we talk about how utterly useless that default cooler is? Like for a relatively high-end gaming CPU, it really shouldn’t be legal for it to ship with something so useless.
Intel has not halted sales or clawed back any inventory. It will not do a recall, period. The company is not currently commenting on whether or how it might extend its warranty.
They may be greedy but they are not stupid. Clearly they calculated that by just ignoring the issue and eating the lawsuits, they save money compared to trying to make an actual solution (whatever that would even look like in the first place)
Gotta love fucking over the consumer twice! They’re gonna get, what, $5 out of a class action? $5 and a burned out cpu, yay!
Ryzen gang
My 7800x3d is incredible, I won’t be going back to Intel any time soon.
I’m still staying with my 5950X. So many cores!
Not sure how much longer I’ll be using the 5950x tbh. We’ve reached a point where mobile processors (the Ryzen AI 370, for one) have faster multicore than the 5950X without gulping down boatloads of power.
7800X3D is so damn good, it also outperforms the newer Zen5 processors 💀
To put this into context, the zen5 X3D chips aren’t out yet so this isn’t really an apples to apples comparison between generations. Also, zen5 was heavily optimized for efficiency rather than speed - they’re only like 5% faster than zen4 (X series, not X3D ofc) last I saw but they do that at the zen3 TDPs, which is crazy impressive. I’m not disagreeing with you about the 7800X3D - I love that chip, it’s def a good one - just don’t want people to get the wrong idea about zen5.
Also on the 7800X3D. I think I switched at just the right time. I’d been on Intel ever since the Athlon XP days; the next buy would have been 13th/14th gen.
Me who bought AMD cpu and gpu last year for my new rig cause fuck the massive mark up for marginal improvement on last gen stats.
I’m not that worried about it affecting me lol, I would be more concerned about my intel cpu dying, especially since this flaw has already been around for decades.
tldr: Flaw can give a hacker access to your computer only if they have already bypassed most of the computer’s security.
This means continue not going to sketchy sites.
Continue not downloading that obviously malicious attachment.
Continue not being a dumbass.
Proceed as normal.
Because if a hacker got that deep your system is already fucked.
It’s more serious than normal because if your PC ever gets owned, a wipe and reinstall will not remove the exploit.
Nissim sums up that worst-case scenario in more practical terms: “You basically have to throw your computer away.”
Okay, so what gets permanently “owned”? The BIOS on the motherboard, the CPU? Is the GPU also hosed?
The CPU is hosed. I assume the throw away is in reference to laptops or mini PCs where the CPU isn’t socketed.
Yeah the Intel issue is definitely a bigger problem, imo.
For CPUs nothing beats AMD
Are you just posting this under every comment? This isn’t even a fraction as bad as the Intel CPU issue. Something tells me you have Intel hardware…
For years, Intel’s compiler, its math library MKL, and its profiler VTune really only worked well with Intel’s own CPUs. There was in fact code that decreased performance if it detected a non-Intel CPU in the system:
https://www.agner.org/optimize/blog/read.php?i=49&v=f
That later became part of a larger lawsuit, but since Intel was not discriminating against AMD directly, but rather against all non-Intel CPUs, the result of the lawsuit was underwhelming. In fact, it’s still a problem today:
https://medium.com/codex/fixing-intel-compilers-unfair-cpu-dispatcher-part-1-2-4a4a367c8919
Given that the MKL is a widely used library, people also indirectly suffer from this if they buy an AMD CPU and utilize software that links against that library.
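For anyone curious what that kind of dispatching looks like in practice, here’s a minimal sketch in C (my own illustration, not Intel’s actual code) of the difference between branching on the CPUID vendor string and branching on the features the CPU actually supports:

```c
/* Minimal sketch, NOT Intel's code: a dispatcher that keys off the CPUID
 * vendor string instead of the features the CPU actually supports. */
#include <cpuid.h>   /* GCC/Clang helpers for the CPUID instruction */
#include <stdio.h>
#include <string.h>

int main(void) {
    unsigned int eax, ebx, ecx, edx;
    char vendor[13] = {0};

    /* CPUID leaf 0: the vendor string comes back in EBX, EDX, ECX (in that order). */
    if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
        return 1;
    memcpy(vendor + 0, &ebx, 4);
    memcpy(vendor + 4, &edx, 4);
    memcpy(vendor + 8, &ecx, 4);

    /* A fair dispatcher would only check feature bits (SSE4.2, AVX2, ...) via
     * __builtin_cpu_supports() and pick the fastest path the CPU can run.
     * A vendor-gated dispatcher does this instead: */
    if (strcmp(vendor, "GenuineIntel") == 0 && __builtin_cpu_supports("avx2"))
        puts("optimized AVX2 path");
    else
        puts("generic baseline path"); /* what an AMD chip with AVX2 would get */

    return 0;
}
```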
As someone working in low-level optimization, that was/is a shitty situation. I still bought an AMD CPU after the latest fiasco a couple of weeks ago.
Honestly even with gpus now too. I was forced onto team green for a few years because AMD were so far behind. Now though, unless you absolutely need a 4090 for some reason, you can get basically the same performance from AMD for 70% of the cost.
I haven’t really been paying much attention to the latest GPU news, but can AMD cards do ray tracing and dlss and all that jazz that comes with RTX cards?
Yes, but by different names. They use FSR, which is basically the same thing; I haven’t noticed a difference in quality. Ray tracing too, just not branded as RTX.
DLSS is off the table, but you CAN raytrace. That being said, I do not see the value of RT myself. It has the greatest performance impact of any graphical setting and often looks only marginally better than baked-in lighting.
dlss is a brand name; both amd and intel have their own version of the same thing, and they are only a little worse, if at all.
It depends greatly on the game. I’ve seen a huge difference in games like Control, where the game itself was built to show off that… well… feature! You can see it in the quality of the lighting and the reflections. You also get better illumination in darker areas thanks to radiated lighting. It’s much more natural looking.
I disagree. Processing power may be similar, but Nvidia still outperforms with raytracing, and more importantly DLSS.
What’s the point of having the same processing power when Nvidia still gets more than double the FPS in any game that supports DLSS?
FSR exists, and FSR 3 actually looks very good when compared with DLSS. These arguments about raytracing and DLSS are getting weaker and weaker.
There are still strong arguments for nvidia GPUs in the prosumer market due to the usage of its CUDA cores with some software suites, but for gaming, Nvidia is just overcharging because they still hold the mindshare.
I thought the point would be a depressed and self-deprecating “I’m something of an Intel CPU myself”.
bro think he amd gpu
Ugh. Can I just say how much I fucking HATE how every single fucking product on the market today is a cheap, broken, barely functional piece of shit.
I swear to God the number of times I have to FIX something BRAND NEW that I JUST PAID FOR is absolutely ridiculous.
I knew I should’ve been an engineer, how easy must it be to sit around and make shit that doesn’t work?
Fucking despicable. Do better or die, manufacturers.
Capitalism: “Make as much as possible as fast as possible”
Capitalism: “Growth or die!”
Earth: I mean… If that’s how it’s gotta be, you little assholes🤷👋🔥
It’s kind of gallows-hilarious that, for all the world’s religions worshipping ridiculous campfire ghost stories, we do have a creator: a remarkable macro-organism mother consisting of millions of species, with a story of hosting life going back 3.8 billion years, most of it lived in homeostasis with its ecosystems.
But to our actual creator, Earth, who is not some fucking ridiculous work of lazy fiction, we literally choose to treat her like our property to loot, rape, and pillage thoughtlessly, and continue to act as a cancer upon her, eyes wide open. We as a species are so fucking weird, and not the good kind.
So this doesn’t apply to the Intel situation, but a good lesson to learn is that the bleeding edge cuts both ways. Meaning that for anyone buying the absolute latest technology, there’s going to be some friction with usability at first. It should never amount to broken hardware like the Intel CPUs, but buggy drivers for a few weeks/months are kinda normal. There’s no way of knowing what’s going to happen when a brand new product is released. The producer must do their due diligence and test for anything catastrophic, but weird things happen in the wild that no one can predict. Like I said at the top, this doesn’t apply to Intel’s situation because it was a catastrophic failure, but if you’re ever on the bleeding edge, assume eventually you’re going to get cut.
I’ve put together 2 computers the last couple years, one Intel (12th gen, fortunately) and one AMD. Both had stability issues, and I had to mess with the BIOS settings to get them stable. I actually had to under-clock the RAM on the AMD (probably had something to do with maxing-out the RAM capacity, but I still shouldn’t need to under-clock, IMO). I think I’m going to get workstation-grade components the next time I need to build a computer.
It’s not easy to make shit that doesn’t work if you care about what you’re doing. I bet there are angry debates between engineers and business majors behind many of these enshittifications.
Though, for these Intel ones, they might have been less angry and more “are you sure these risks are worth taking?” because they probably felt like they had to push them to the extreme to compete. The angry conversations probably happened 5-10 years ago before AMD brought the pressure when Intel was happy to assume they had no competition and didn’t have to improve things that much to keep making a killing. At this point, it’s just a scramble to make up for those decisions and catch up. Which their recent massive layoffs won’t help with.
Most of the time, the product itself comes out of engineering just fine and then it gets torn up and/or ruined by the business side of the company. That said, sometimes people do make mistakes - in my mind, it’s more about how they’re handled by the company (oftentimes poorly). One of the products my team worked on a few years ago required us to spin up our own ASIC. We spun one up (in the neighborhood of $20-30 million USD), and a few months later, found a critical flaw in it. So we spun up a second ASIC, again spending $20-30M, and when we were nearly ready to release the product, we discovered a bad flaw in the new ASIC. The products worked for the most part, but of course not always, as the bug would sometimes get hit. My company did the right thing and never released the product, though.
It’s almost never the engineers’ fault. That whole NASA spacecraft that exploded was down to bureaucracy and pushing the mission forward.
Glad my first self-built PC is full AMD (built about a year ago).
Screw Intel and Nvidia
7700X is what it was built with
I’d rather have an exploit than a hardware failure
How many times are you going to post the same, unrelated link ??
This. Full AMD on my last build as well.
I don’t care about any corp, I was looking at best bang for buck at the time. I was shocked how everyone I knew was like you should get this intel or that Nvidia, and when I asked why not <comparable performance AMD at 2/3 the price>, all I was getting back was marketing blabber.
Don’t be a fan of one or the other, just get what’s more appropriate at the time of buying.
AMD fans be like:
FANcy
Just got this card as an upgrade to my 5700xt. It is so good, and REALLY pretty.
Hey, I also have 5700xt. What card is this? And how much of an upgrade is it?
Sapphire 7900xtx Nitro+, and 3x performance boost PLUS far more stable frametimes at the same framerates
you know, im kinda a pc fan myself…
this is why I paid extra for a better cooling system on my rig.
You can never go overboard with a cooling system
Afaik it wasn’t a temperature problem, it was voltage related. Obviously cooler temps help, but you would probably still be vulnerable to this.
This keeps getting slightly misrepresented.
There is no fix for CPUs that are already damaged.
There is a fix now to prevent it from happening to a good CPU.
Not out yet. But you can manually set your clocks and disable boost.
Not out yet.
Actually the 0x129 microcode was released yesterday, now it depends on which motherboard you have and how quickly they release a bios that packages it. According to Anandtech Asus and MSI did already release before Intel made the announcement. I see some for Gigabyte and Asrock too.
So, not out yet. At least not fully.
If you prefer being right, rather than just accepting the extra information, then sure let’s go with that.
But isn’t the fix basically underclocking those CPUs?
Meaning the “solution” (not even out yet) is crippling those units before the flaw cripples them?
They said the cause was a bug in the microcode making the CPU request unsafe voltages:
Our analysis of returned processors confirms that the elevated operating voltage is stemming from a microcode algorithm resulting in incorrect voltage requests to the processor.
If the buggy behaviour of the voltage contributed to higher boosts, then the fix will cost some performance. But if the clocks were steered separately from the voltage, and the boost clock is still achieved without the overly high voltage, then it might be performance neutral.
I think we will know for sure soon, multiple reviewers announced they were planning to test the impact.
Thanks for the clarification
That was the first fix, the “Intel Baseline Profile” they rolled out to mobo manufacturers earlier in the year. They’ve rolled out a new fix now.
Remember Spectre? When they recommended disabling hyperthreading?
I wanna switch to amd some day
r/ayymd