
The upgrade to 90Hz is really good for a non-obvious reason: among Steam Deck users, the "Golden 40" (running games at 40fps with the screen at 40Hz) is a well-liked trick. It gives you a frame time of 25ms, right between 30fps and 60fps, while "only" needing the power to render 10 more frames per second than 30fps, making for a much better experience than 30.

The only problem with this is that if a frame is slightly late at 40Hz, you're waiting the full 25ms for the next refresh instead of 16.6ms at 60Hz. Being able to run the screen at 80Hz for 40fps games cuts that stutter on a missed frame in half, to 12.5ms, which will make a huge difference!
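
Rough Python sketch of the arithmetic above (just my own restatement of those numbers, nothing official):

    # Frame time (and refresh interval) in milliseconds for a given rate in Hz.
    def frame_time_ms(hz):
        return 1000.0 / hz

    print(frame_time_ms(30))  # ~33.3 ms
    print(frame_time_ms(40))  # 25.0 ms
    print(frame_time_ms(60))  # ~16.7 ms

    # On a fixed-refresh panel (no VRR), a frame that misses its slot waits
    # for the next refresh, so the worst-case hitch is one refresh interval.
    print(frame_time_ms(40))  # 25.0 ms stall at 40 Hz
    print(frame_time_ms(80))  # 12.5 ms stall at 80 Hz, half the hitch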



So does the Steam Deck not support VRR for the onboard display? I see articles saying support was added for external displays, but it’s not clear whether the onboard one has it. If it does, then it seems like it shouldn't be a problem for a frame to be slightly late.

Edit: I read some other comments that explain the situation. It sounds like there is no VRR for the internal display unfortunately.


No, it sadly doesn't do VRR, and neither does this new one. According to the LTT video[0], it comes down to the connector the internal display is attached with, since external VRR screens do work. They speculate that Valve was simply limited by what's available on the market, because they aren't yet shipping enough units to warrant fully custom designs/orders. Apparently there are hints that this is the same supplier that also supplies the Switch OLED's screen.

[0]https://www.youtube.com/watch?v=uCVXqoVi6RE


Bloomberg says the supplier is Samsung.[0] Not entirely surprising: Sony is somewhat of a competitor, and that only leaves LG, which isn't nearly as good at mobile OLED, one of Samsung's largest markets. That said, I know e.g. Apple uses multiple display suppliers for the iPhone 14/15 base model, so it could be the same here.

[0]https://www.bloomberg.com/news/articles/2021-03-04/nintendo-...


Sony isn't a supplier of cheap OLEDs; LG, BOE, and other Chinese competitors are. Random games tend to have static HUDs, which are a burn-in risk. I imagine a Samsung OLED is a safe choice for a game console thanks to Nintendo's verification.


Which other Chinese suppliers besides BOE can achieve 1000 nits HDR? Even BOE can barely do that, given they have been dropped from the iPhone 15 supply chain due to reliability issues. They were having problems with the iPhone 14 as well, which was only

>800 nits max brightness (typical); 1200 nits peak brightness (HDR)[0]

[0]https://www.apple.com/iphone-14/specs/


VRR has a very sad story on Linux too.


The display uses a MIPI interface instead of eDP. This is because the screen isn't completely custom; Valve likely used one from a manufacturer similar to the Switch OLED's, which also uses a MIPI display.

If Valve sold more units it might justify a completely custom solution, but this is still way better than what people had to do to get a better screen before.


Isn't this just LTT speculation and nothing is confirmed yet?


You can just open the Deck and look. It's not a 40-pin eDP connector, it's a flat ribbon, exactly like a tablet display. Also the natural orientation is portrait, which you can see if you turn off the boot splash.


Well, the part about it being the same display as the Switch etc.


It's not the same display, but probably comes from the same mother glass. Same subpixel layout, same density.


People consider 40fps the sweet spot?


Yeah, for games that don't quite run at 60fps. You set the screen to 40Hz; the difference between 30 and 40 is HUGE in terms of latency and general smoothness.


It doesn't sound like a big difference, but in terms of frame time 40fps is actually halfway between 30fps and 60fps: 30fps is one frame every 33ms, 60fps is one every 16.7ms, and 40fps gives you 25ms.


40fps only requires producing 33% more frames than 30fps; compare that to 60fps, which is 100% more frames. So you're only sacrificing a small amount of performance to get 40fps, but a significant amount to reach 60fps.
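
A quick back-of-the-envelope version of that, assuming (simplistically) that GPU cost scales roughly linearly with frames rendered:

    # Extra frames per second relative to a 30fps baseline.
    baseline = 30
    for target in (40, 60):
        extra = (target - baseline) / baseline * 100
        print(f"{target}fps needs {extra:.0f}% more frames than {baseline}fps")
    # 40fps needs 33% more frames than 30fps
    # 60fps needs 100% more frames than 30fps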

In practice, most people can't tell the difference between 40fps and 60fps, or at least won't notice it in normal gameplay.

So you get nearly the same experience as 60fps but for much less processing power and therefore also better battery life, less fan noise/cooling, and you can crank graphics a little higher.

Most people will notice the difference between "Medium" and "High", or "High" and "Very High", far more easily than the difference between 40fps and 60fps. That's why 40fps is generally a sweet spot between performance and experience.


> In practice, most people can't tell the difference between 40fps and 60fps, or at least won't notice it in normal gameplay.

Oh, good, we're back to justifying technical limitations with fabricated myths from early 2000s console marketing teams.


Seriously. I can tell the difference between 120Hz and 175Hz even just using my desktop…


The bigger the display, the easier it is to visually detect a difference in refresh rates


Can't tell the difference between 40 and 60 but you can tell 30 and 40?


Not unlikely. 15ms is about the threshold at which humans can detect latency. This takes you from 33ms to 25ms, which is much closer to that boundary.

It also depends a lot on context: on a 7" screen it might be hard to notice the difference, but you can easily tell 120Hz apart on a large monitor or in VR.


> 15ms is about the threshold humans can detect latency

I think this is an oversimplification. We can detect latency, meaning a difference between input and screen response, at around 13ms.

But that doesn't mean a new frame every 13ms is perfect, latency-wise. First, it ignores all the other sources of latency, such as input devices. Assume 6ms of latency on the controller; that means you have to get a new frame out within 7ms of receiving the controller input.
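
To put rough numbers on that budget (the 13ms and 6ms figures are just the assumed examples from above):

    # Hypothetical input-to-photon latency budget, in milliseconds.
    perceptible = 13  # roughly where latency starts to become detectable
    controller = 6    # assumed controller/input latency (example value)

    frame_budget = perceptible - controller
    print(frame_budget)         # 7 ms left to render and present a frame
    print(1000 / frame_budget)  # ~143 fps needed to average one frame per 7 ms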

That aside, there's a lot more than latency involved in creating beautiful, smooth images on screen. For example, humans can detect visual change unrelated to input at about 500Hz or even higher, especially in grayscale. I've heard that about 0.5ms, or 2000Hz, gray-to-gray is about the limit at which further improvements would be meaningless.

Of course the law of diminishing returns applies here. The fastest screen I've tested is 360Hz and I can't say that I see any benefits over a 144Hz screen, although I can see the difference in artificial benchmarks on Blur Busters.


That actually checks out, because the color-detecting cones in the center of your eye are much less responsive to changes (framerate) than the rods in your peripheral vision.


Can easily detect the difference between 60 and 144hz as well.


There's a weird cult-like attitude around 40fps that I've noticed. Personally I run everything at the normal 60hz and almost everything I play runs fine, even at the higher resolution (1920x1200) of my external monitor. Counter-Strike 2 runs horribly and Baldur's Gate 3 was sub-60, but mostly I don't play anything that demanding (mostly indie, lots of 2D). It might matter more if you play all the new AAA titles that come out.


> There's a weird cult-like attitude around 40fps that I've noticed.

As someone who learnt about this public 40fps love just a few minutes ago, I'd like to add that this was something I had noticed myself years ago: 40fps feels "smooth" and closer to 60fps than 25/30fps for some reason. Unfortunately my graphics card struggled, but that's a different story.


I think I'm missing something with graphics cards. I have a 1060 on an i7-3770 and it can put out 120fps in most games. My M1 MacBook can play those same games through Rosetta 2 and Wine + GPTK at a similar FPS.

What are people doing that they have like a 3080 on a 12700 and can barely hit 60fps??


Anecdotally, I play recent 3D games on an ultrawide (not quite 4K but significantly higher res than 1080p): 3440x1440 pixels with a 144Hz refresh. On a 4080 at ultra settings, most games can't get there. Thankfully DLSS makes it a lot better.

(Using GeForce Now on a MacBook hooked up to an external monitor.)

BG3 is very smooth. Cities Skylines 2 stutters. Starfield was okay. Talos Principle 2 is laggy. Cyberpunk is laggy.

Normally I don't care about graphics all that much, so I'll usually turn the details down to medium or even low. But these days DLSS is so good that's less necessary.


Well, you haven't mentioned your screen resolution. The people struggling with their 4080s are running max settings at 4K, possibly with some RTX effects, because they paid for the whole GPU.


In my specific case with the graphics card, back in 2016/17 I was using an Intel UHD5500. Now, in 2023, I'm using… an Intel HD520.

Yes I need a new laptop *frustration*


It's not a cult; 40 is just a sweet spot between performance, battery life, and comfort for a large number of demanding 3D games. If you often play in docked mode, the battery life doesn't matter, so you can probably achieve 50 or 60 in some modern games.


There's a lot of stuff that can't reach 60fps but has headroom over 30fps; that's where the 40 comes from. It's a completely arbitrary point that just happens to be halfway between 1/60s and 1/30s, and (an added benefit for stuff like Spider-Man 2 on PS5) it divides evenly into 120Hz displays without needing variable refresh.
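
A quick way to check which frame rates pace evenly on a fixed-refresh panel without VRR (the 60/90/120Hz and frame rate examples are my own):

    # A frame rate paces evenly when the refresh rate is an integer multiple
    # of it, i.e. every frame is held for the same number of refreshes.
    for refresh in (60, 90, 120):
        even = [fps for fps in (30, 40, 45, 60) if refresh % fps == 0]
        print(refresh, even)
    # 60  -> [30, 60]
    # 90  -> [30, 45]
    # 120 -> [30, 40, 60]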


I used 40fps on my steam deck for playing Death Stranding. It's not a fast paced game, so the smoothness reduction wasn't huge. But it is demanding and visuals are a big part of the experience, so being able to turn up the graphics a bit was great.


I guess I play faster games than most because I always put the graphics settings to lowest detail, native resolution, and aim for 144fps (my monitor's native refresh rate). Anything less than 90fps/Hz is pretty bad to me.


That's pretty reasonable. My phone screen is 120hz but my main computer monitors are only 60hz, so I more-or-less don't know what I'm missing. I tend to care more about color and viewing angles than refresh rate on a monitor and steer more toward the professional side than the gaming side, so I haven't had much exposure to very high refresh rates on a regular basis yet. My monitors are also pretty old now, but they still work fine. A couple HP ZR24w monitors I got refurbished from Newegg. I think of them as being like the monitor equivalent to a used-ThinkPad/EliteBook.


That's at least partially because of the mismatch between display refresh rate and frame rate. One of the things that makes the Deck's 40fps mode on the internal display nice is that it also changes the display refresh rate to 40Hz, making it look consistent and synchronised.


I limit Elden Ring to 35fps on the Xbox Series S to avoid stuttering, and it basically feels as smooth as 60fps. As long as you get at least ~30fps, framerate stability matters more than the framerate itself.


It's a bit better than 30 while saving battery compared to 60.


Maybe for battery.


On the Deck, yes.



