Much of this isn’t unique to PC gaming. And if there ever was a dark age for PC hardware, we’ve recently crawled out of it, thankfully.
What bugs me the most right now (and doesn’t quite get addressed in this article) is low performance standards. Everyone’s pushing 4K and ray tracing, which makes it hard out here for us framerate nerds. It’s starting to feel like every major release that comes out is Crysis, something for my hardware to grow into. Only with blurry anti-aliasing/supersampling techniques now.
One new, big positive I’m not seeing talked about much is that a growing number of Japanese publishers are taking PC seriously now, which hasn’t happened in over thirty years. I’m including Sony in this, even with their recent missteps in the space, and Square Enix’s recently announced restructuring suggests simultaneous PC releases for their games in the future. That will inject some competition into PC gaming, although be aware that Japan has its own share of publishers that release broken ports.
It’s not raw framerates that are bad now. Developers pushing tech is not a bad thing; that’s how gaming has worked since its invention, aside from the “dark ages” of X360 ports, when PC just meant 360 graphics at crazy resolutions and framerates.
The problem nowadays is that games are straight-up unplayable, even at lowered settings or on extreme hardware, due to shader compilation or streaming stutters. That’s just bad programming with no fix short of an engine rework, and most devs don’t have engine programmers anymore since they just ship UE4/5.
With high-end GPUs costing what they still cost, I don’t think we’re out of it just yet.
My recent realization is that the very high end of the GPU market is totally unnecessary. A 4070 can play practically any game at 4K with decent framerates, and if you’re fine with “high” settings instead of maxed out, at very good framerates too.
In 2020 I bought a 2080 Super for more than 800€. I considered it crazy expensive, but I had the money and my 1070 had died. 23 months later it died and Amazon refunded the money, so I went to buy another GPU. That money was only enough for a 3070 Ti, and I still had to pay 30-40€ more. I’m happy with my 3070 Ti for now, and I hope it lasts long enough; I don’t think I can afford that kind of money anymore.
A 4070 is the high end, though. Any laptop with a 4070 in it will cost over £1.5k.
Pretty sure nobody cares, but man, as a third-world-country guy it feels weird as fuck to see these reasonable-looking numbers, only to multiply by 50 and get a heart attack.
I don’t disagree, but gaming laptops are always overpriced. You’re paying a premium for the small form factor. (And I assume they also have the much less powerful RTX 4070 Mobile, which makes the value proposition even worse for laptops.)
I’d add low control standards. Since everything is a console port now, everything needs to be dumbed down to be playable with a controller. That’s why we don’t see certain genres much anymore (sims or RTS), and why shooters come with built-in aim “aids”: cross-play wouldn’t be possible otherwise.