
The indies I like don't need a serious graphics card. A good bunch of them work just fine on a 2018 i3 Mac mini with the integrated graphics decelerator...

Don't you need the 1000+ EUR cards to play AAAs at the same performance as a console? That is, twice the price of a console only for the video card.

Edit: and I need the PCs for work anyway.



No, even considering the optimization involved, consoles are about the same performance as an RX 6700 non-XT. They lean heavily on upscalers, and the upscalers are usually of inferior quality, so a 3060 Ti / 3070 is hitting in the same general ballpark of performance too.

The Xbox Series X also only has 10GB of VRAM and the Series S has 8GB, which people usually don't realize. Microsoft used a fast-partition/slow-partition strategy (like the GTX 970), so in practice the slow segment is your system memory, and you can't really cross over between them because it kills performance.
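A rough sketch of the split pools being described, using the commonly published figures (treat the exact capacities and GB/s numbers as approximate, they aren't from this thread):

    # Split memory pools on the consoles (and the GTX 970 precedent).
    # Figures are the commonly published ones and only approximate.
    POOLS = {
        "Xbox Series X": {"fast_gb": 10.0, "fast_gbps": 560, "slow_gb": 6.0, "slow_gbps": 336},
        "Xbox Series S": {"fast_gb": 8.0,  "fast_gbps": 224, "slow_gb": 2.0, "slow_gbps": 56},
        "GTX 970":       {"fast_gb": 3.5,  "fast_gbps": 196, "slow_gb": 0.5, "slow_gbps": 28},
    }

    for name, p in POOLS.items():
        # The fast pool is what a game can realistically treat as VRAM;
        # spilling into the slow pool tanks effective bandwidth.
        print(f"{name}: {p['fast_gb']} GB fast ({p['fast_gbps']} GB/s), "
              f"{p['slow_gb']} GB slow ({p['slow_gbps']} GB/s)")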

You can get a 3060 Ti for $275 now or a 6700 XT for $330. NVIDIA has DLSS, which is generally higher quality for a given level of upscaling (FSR2 Quality is closer to DLSS Balanced/Performance level), which offsets the raw performance difference a bit. Or AMD has more raw raster and VRAM. But that's kinda your ballpark price comparison, not a 4090. The consoles aren't 4090s either; they're rendering games in 720p or 640p and upscaling.

But it gets into this weird space where people refuse to turn down a single setting or use even the highest-quality upscalers on PC, but PC is too expensive, so they'll buy a console where the settings are pre-turned-down for them and they'll be upscaled silently from even lower resolutions with even worse-quality upscalers, with no choice in the matter. Consoles are like the Apple products of the world, they take away the choices and that makes people happier because having too much choice is burdensome.


> But it gets into this weird space where people refuse to turn down a single setting or use even the highest-quality upscalers on PC, but PC is too expensive, so they'll buy a console where the settings are pre-turned-down for them and they'll be upscaled silently from even lower resolutions with even worse-quality upscalers, with no choice in the matter. Consoles are like the Apple products of the world, they take away the choices and that makes people happier because having too much choice is burdensome.

Yeah, I'd rather play the -ing game instead of counting the fps?


> Yeah, I'd rather play the -ing game instead of counting the fps?

Yes, but you can do that on PC too - just punch in medium settings, turn on DLSS Quality mode, and away you go. You can get a $300 GPU that does the same thing as the console; you don't need to spend $700+ on a GPU to get console-tier graphics.

The problem is that people insist on making comparisons with the PC builds at max settings, native-resolution/no upscaling, while they don't have a problem with doing those things on the consoles. And when you insist on maxing out a bunch of exponentially-more-expensive settings, you need a 4090 to keep up, and gosh, that makes PC building so much more expensive than just buying a console!

But again, the console is running at 640p-960p internal resolution and upscaling it to 4K, which is like DLSS Performance or Ultra Performance mode. And if you enable those settings on PC, you can get the same thing for a pretty reasonable price. Not quite as good, but you're getting a full PC out of the deal, not a gaming appliance.
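To put rough numbers on that (a sketch assuming the commonly cited per-axis scale factors for the DLSS presets; actual values vary by game and implementation):

    # Approximate internal render resolution for each upscaling preset.
    SCALE = {
        "Quality": 0.667,
        "Balanced": 0.58,
        "Performance": 0.50,
        "Ultra Performance": 0.33,
    }

    def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
        """Resolution actually rendered before the upscaler runs."""
        s = SCALE[mode]
        return round(out_w * s), round(out_h * s)

    for mode in SCALE:
        w, h = internal_resolution(3840, 2160, mode)
        print(f"4K output, {mode}: renders at roughly {w}x{h}")
    # Performance lands around 1080p and Ultra Performance around 713p,
    # which brackets the 640p-960p internal resolutions mentioned above.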

It's always been about consoles having an Apple-style model where they lock you into a couple reasonably-optimized presets, while PC gamers hyperventilate if you take a single setting off Ultra or benchmark with DLSS turned on. And obviously in that case you're going to need a lot more horsepower than consoles offer. Which is more expensive.

Also, GeForce Experience has a settings auto-optimizer which does this with one click, or you can use settings from the PCMR Wiki or DigitalFoundry etc. It does tend to target lower framerates than I'd prefer (as a 144 hz-haver) but there's a slider and you just move it a couple notches to the left.


> The indies I like don't need a serious graphics card. A good bunch of them work just fine on a 2018 i3 Mac mini with the integrated graphics decelerator...

Yes, but your console doesn't run them, so you need a (low-end) PC plus a console if you want to play both.

> Don't you need the 1000+ EUR cards to play AAAs at the same performance as a console?

No, you don't. If you want to pay 1000+, that's because you ~~like to waste money~~ want to play at 4K, 144 fps, with the highest possible settings for at least the next 5 years. You can play AAA titles with the same kind of settings you have on a console with a 300-500 euro graphics card. So for the price of the console you get an equivalent upgrade for your PC, don't have to pay a premium for your games, and can play all your games on the same device.


Aren't console graphics usually not as good as the PC versions? Even if they have the same resolution, I was under the impression that effects and whatnot were lower. If that's in fact the case, then a 1000 EUR card wouldn't just give you the same experience as a console. Hell, the mid-range AMD card I bought new for 300 EUR has better graphics than my sister's PS4 Pro.


Do you really mean better graphics instead of higher numbers?

Is your mid-range AMD card the same generation as the PS4 Pro, or should you be comparing it with a PS5?

Are you comparing the same title?


I actually mean better graphics, yes. I'm not sure what you mean by "numbers" (fps?), but I'm talking draw distance, shadows, grass details, etc. I was comparing Red Dead Redemption 2. We don't have other games in common.

My AMD card is a 5600 XT, bought in 2020 IIRC, right before prices exploded due to mining. I don't think the PS5 was out at the time. Anyway, I've never seen one, so I'd be hard-pressed to make any comparison with it.

I've also not tested this in person, but I seem to remember watching a recording of someone playing GTA V on a PS4, and the graphics didn't look as good as on my PC. But then, I don't know how the compression and whatnot affected the quality.


You may be right for Rockstar games; the last one I played on PC was San Andreas. I mean, in theory, as a game developer you can add larger textures and whatnot to a PC game, if you want to spend money on it.

My impression generally is that, especially for the PS5 generation, there is a negligible difference unless you want those 180 fps and 8k and 16x fake frames or whatever DLSS is.


I'm not a hardcore gamer, nor do I follow these things too closely, so I may be off here, but if I compare the specs of a 5600 XT and the PS4's GPU, the former seems quite a bit faster. On the other hand, the PS4 has unified memory, whereas my PC is running PCIe 3 and DDR3 (quad channel, but still).
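As a rough sanity check on that spec comparison (a sketch using commonly cited FP32 throughput figures, which ignore memory, architecture, and optimization differences):

    # Raw FP32 throughput as commonly published; only a crude proxy for
    # real-world game performance.
    TFLOPS = {
        "PS4 (2013)": 1.84,
        "PS4 Pro (2016)": 4.2,
        "RX 5600 XT (2020)": 7.2,
        "PS5 (2020)": 10.3,
    }

    baseline = TFLOPS["PS4 (2013)"]
    for gpu, tf in TFLOPS.items():
        print(f"{gpu}: {tf} TFLOPS (~{tf / baseline:.1f}x a base PS4)")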

> My impression generally is that, especially for the PS5 generation, there is a negligible difference unless you want those 180 fps and 8k and 16x fake frames or whatever DLSS is.

As someone not particularly interested in the field (I own a PC because I need to do actual PC stuff; the "gaming" GPU I bought to kill time during COVID lockdowns), my impression is that when a new-generation console comes out, it's quite competitive with non-absurd PC builds. But PC GPUs tend to get noticeably better during the lifetime of the console. The PS4 came out 6-7 years before AMD released my GPU.


It's not what the GPUs can do. It's what the game devs find easy to support.


For 1080p and 1440p certainly not. Something like an RX 6750 XT will carry you all the way.

(The PS5 GPU was comparable to an RX 5700 XT / RTX 2070 Super at the time, although it's apples and oranges as the PS5 is an integrated system target whereas the PC is an open platform)



