Some of us strongly prefer things to be presented as they are, without artificial enhancements.
This means that if a movie is shot at 24FPS (as nearly all of them are), and is shown to theater audiences at 24FPS, then it should also be displayed at 24FPS in the living room.
(But if you prefer to view the world through rose-tinted glasses, then you do you.)
I understand that, but everyone here seems to be saying that the stuttering version is inherently better and the smooth version is horrible? To my eyes it's the opposite.
It definitely varies from person to person. I strongly prefer it disabled, but not because it looks terrible most of the time - I could get used to it if the result looked exactly as it would have had the film been produced at that higher framerate. The issue is that the interpolation breaks fairly often, for example by making the acceleration of visible motion unnatural or by violating the visual language the movie has already established. That's where it breaks my immersion - but that's not the case for everybody, and it's absolutely legitimate to say that you prefer either, or don't care at all!
Maybe a good analogy to understand the "it's objectively wrong" perspective (even if I disagree) is AI upscaling, for example of historical photos. Just like autosmoothing it adds details in a mostly plausible way, and some people prefer it, but it adds fake detail (which understandably annoys purists), and sometimes it actually breaks and produces visual artifacts.
To me, the "smooth" version is artificial and alien in ways I can't quite articulate, just as it is hard to articulate why a long-winded LLM response, while having good grammar, might be both stupid and wrong.
Sure, it's smoother; anyone can see that. It's also weirdly smeary or something.
The (presumably) 24FPS version has a regular amount of judder - the same amount of judder I've experienced watching films my entire life - and each of those frames is a distinct photograph. There is zero smearing betwixt them; none is even possible.
We don't want "low frame rates". A lower frame rate is not the goal.
If films were commonly shot and released at 120FPS, then we'd see videophiles clamoring to get the hardware in place in their homes to support that framerate.
But we're not there. Films are 24FPS. That's what the content is. That's what the filmmakers worked with through the entirety of filming, editing, post, and distribution.
And the process of generating an extra 96 frames every second to fill in the gaps in the actual content is simply not always very good. Sometimes it's even pretty awful.
It seems obvious to say, but artificially multiplying a framerate by a factor of 5 inside a TV frequently has issues.
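For a sense of what's actually being asked of the TV, here's a toy sketch of the arithmetic (naive linear cross-fading in Python - real sets do motion-compensated interpolation, which is fancier and fails in different ways): four guessed frames get inserted between every pair of real ones.

    # Toy illustration only: real TVs estimate motion vectors rather than
    # cross-fading, but the frame arithmetic (24 x 5 = 120) is the same.
    import numpy as np

    def naive_smooth(frames, factor=5):
        # frames: list of HxWx3 float arrays in [0, 1], captured at 24FPS
        out = []
        for a, b in zip(frames, frames[1:]):
            out.append(a)                        # real frame
            for i in range(1, factor):
                t = i / factor
                out.append((1 - t) * a + t * b)  # guessed in-between frame
        out.append(frames[-1])                   # last real frame, nothing to blend toward
        return out

    src = [np.random.rand(4, 4, 3) for _ in range(24)]  # one second of "footage"
    print(len(src), "->", len(naive_smooth(src)))       # 24 -> 116

The cross-fade version is literally smearing adjacent photographs together, which is roughly the "weirdly smeary" look described above; motion estimation avoids that, but introduces its own artifacts whenever it guesses wrong.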
>If films were commonly shot and released at 120FPS, then we'd see videophiles clamoring to get the hardware in place in their homes to support that framerate.
I'm not sure that's actually the consensus opinion. Some of the complaints about frame interpolation are about specific kinds of artifacting, but many are about "the soap opera effect", and those same complaints were leveled against The Hobbit, which was actually filmed at a higher frame rate.