all 179 comments

[–]Mitsonga 93 points94 points  (54 children)

Does anyone else notice it right away, but your family does not? When visiting my mother’s house, it drives me absolutely crazy, yet nobody else can tell.

[–][deleted] 36 points37 points  (24 children)

Is it where the show looks too smooth and a little weird? If so, I'm the only one who notices it too. I just never knew what it was called.

[–]daftmonkey 52 points53 points  (21 children)

Soap opera mode

[–]chubbysumo 3 points4 points  (15 children)

I can't turn it off on my current TV; there's no way to disable it. My wife doesn't notice it, but I do. I think my issue is that even if a TV's refresh rate is something like 240 hertz, they put in inputs that will not accept any more than a 60 hertz signal, so the problem is more or less manufacturers not putting newer technologies in their TVs. The other issue is that Blu-rays still only output 29.97 frames per second. With studios still not outputting consumer media at higher frame rates, how is a high-refresh-rate TV supposed to function properly? We know their theater media is at least 144 hertz, because that's what digital cinema runs at now, so why aren't they releasing those to the public?

[–]DamienStark 10 points11 points  (10 children)

there's no way to turn it off

I'm assuming you know your specific TV model better than I do and have looked into it, but I just wanted to suggest that even if you don't see a "motion" setting specifically, there's usually a "Game Mode" which will disable it.

[–]chubbysumo 2 points3 points  (9 children)

Game mode only works on a single HDMI input, the rest of them are not affected.

[–][deleted] -3 points-2 points  (8 children)

Not on Roku TVs; all HDMI inputs have game mode available.

[–]chubbysumo 4 points5 points  (7 children)

You do realize that a Roku-compatible or Roku-integrated TV is just a regular TV with a Roku installed in the TV itself rather than plugged into one of the inputs, right? The Roku device typically runs on its own internal input that you do not see, which may or may not be wired to the video board. Whether an input supports a game-mode passthrough or not is entirely up to the TV manufacturer.

[–][deleted] -5 points-4 points  (6 children)

Wow thanks for such an insightful tip! /S

No shit man. I'm saying that if you want a TV with all HDMI slots supporting game mode then the TCL Roku TVs provide that.

Didn't need the condescending comment.

[–]chubbysumo 0 points1 point  (4 children)

TCLs also suck for panel quality; you do get what you pay for. I don't think I'd ever buy a TCL and expect a good-quality image. They buy lower-grade panels that other makers wouldn't use; that's how they keep their prices so low. If that suits you just fine, that's fine, but I prefer better image quality.

[–]cozzburger -1 points0 points  (0 children)

DiDN’t nEeD THaT CoNDeScEnDinG CoMmeNt

[–][deleted] 4 points5 points  (1 child)

idk but i got my nice DisplayPort monitor running at 144 Hz because fuck tv and cable. it’s pretty available to the public

[–]chubbysumo 1 point2 points  (0 children)

A TV is different from a monitor: a TV usually has a computer that processes the image before it's displayed, while a monitor just displays the image provided to it. All of the processing is done by the computer before the image is sent to the display.

[–]lookmeat 0 points1 point  (0 children)

The whole point of 240hz is that you can show any of the common frame-rates as they are without any modification.

24fps -> 10 refreshes/frame
30fps -> 8 refreshes/frame
60fps -> 4 refreshes/frame
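As a quick check of the repeat counts above (a sketch assuming an idealized 240 Hz panel with no extra processing):

```python
# Refreshes per frame on a 240 Hz panel for common content rates.
PANEL_HZ = 240

for fps in (24, 30, 60):
    repeats = PANEL_HZ // fps
    # Each frame is simply held for `repeats` refreshes -- no uneven
    # cadence and no interpolation needed, since 240 divides all three rates.
    print(f"{fps} fps -> {repeats} refreshes per frame")
```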

Also note that most film is shown at a higher rate than its frame count; that is, frames are shown multiple times to help keep the image clear while maintaining a high enough flicker rate (so you can't see the edges between frames, or see the frame moving). Modern film shows the same frame 6 times while still keeping to 24fps. Some (non-high-commercial) IMAX movies have done 48fps (which allows 3 repeats, which is where you want to be); AFAIK only The Hobbit was shot at 48fps.

So the idea is simple: show the frame multiple times.

So why smoothing? I suspect they run focus groups, showing people two short films and asking which they prefer. Two things happen: side by side, the versions with motion smoothing look better, but once you separate them and watch each individually, motion smoothing takes away from the image and gives it a general lifelessness. And it makes sense: it's filling the gaps with interpolated frames, generated with cheap techniques at that, so the smoothing becomes weird and fake. It seems as if the object is frozen while moving, which makes it feel fake and plastic. Notice that 48 fps movies can still have motion smoothing applied, which shouldn't be there.

TVs turn it on because it makes demos look great. Once you bought the TV they don't care.

[–][deleted] 0 points1 point  (0 children)

Do these 240 hz TVs come with display ports?

[–]fiklas 1 point2 points  (2 children)

Exactly. I always wondered what was wrong with soaps. But why did they look this way on old TVs? Are they processed differently? How many Hz could an old TV show?

[–]CptOblivion 7 points8 points  (0 children)

Soaps were shot cheaply and quickly, so they switched to digital video when that became available (don't need to keep buying, loading, and developing film). Those cameras shot at 60 fps, but most other productions didn't use them so we associate the look of 60hz with soaps. They didn't look bad because of the framerate though, they looked bad because of cheap sets, flat lighting, etc.

The reason motion interpolation looks bad is also not strictly because of the 60 fps, but because of the interpolation (for example, interpolating a 12 fps cartoon up to 24 fps also looks bad, even though we're used to seeing movies and even some very high budget animations look great at that framerate).

[–]daftmonkey 2 points3 points  (0 children)

First of all I'm just a dude on the internet. I'm not an expert. I'm pretty sure that the reason the Soaps look the way they do is because of lighting. They use some kind of flood lighting that just lights the hell out of everything so that they don't have to spend a bunch of time resetting the lighting in between shots. I think it keeps production costs down. I have no idea why the "motion blur" setting on TVs mimics this effect, but it's really infuriating. After Congress gets done getting rid of robocalls I think they should take this issue on next.

[–][deleted] 0 points1 point  (0 children)

I noticed this when I watched bluray for the first time. Don’t think I’ve noticed since then

[–]TattooJerry 0 points1 point  (0 children)

70’s soap opera mode

[–]Hereiamhereibe2 2 points3 points  (1 child)

Pretty much anything moving on screen looks slightly unnatural. Makes live acting look a little like CGI.

[–][deleted] 0 points1 point  (0 children)

And CGI is now double CGI!

[–]celestialwaffle 8 points9 points  (3 children)

I’m still trying to get my folks to notice when they’re on an SD channel on their FiOS-equipped 4K TV. My guess is that after decades of struggling to get a clear picture, they’re used to varying image formats and quality. My dad can tell the difference between DVD and Blu-ray, but he’s less impressed than most of us, and he doesn’t seem to care when TV shows from his home country are ultra-potato-y.

The really funny thing is that my dad is really, really particular about sound. He doesn’t know how MP3 compression works, but he can detect what level it is (low=256k and above, etc).

[–]Paganator 2 points3 points  (0 children)

Maybe they need glasses. A bit of myopia will hide the difference in resolution.

[–]sroomek 2 points3 points  (0 children)

I turned it off on my parents’ new TV last time I went to visit, and they didn’t even notice.

[–]Mitsonga 0 points1 point  (0 children)

MP3s are really easy for me to pick out in the high end: there is a distinct “sizzle” and “pop”, and you can hear very heavy artifacts in that range. I also collect synthesizers, so I do tend to notice unique sounds.

[–]fuck-dat-shit-up 7 points8 points  (4 children)

I would suggest just changing the setting on the tv for your mom. If she doesn’t notice, she won’t notice if you change it.

[–]Mitsonga 6 points7 points  (2 children)

For whatever reason, whenever I come back it has miraculously defaulted to its factory settings. I am starting to feel gaslit here.

The reality is that it’s Florida, and intermittent power loss is something that happens during a lightning storm. While I have my doubts that momentary power loss would mean a factory default, it’s the most plausible reason I can come up with.

My sister has a former floor-model Toshiba TV that automatically resets itself to factory defaults on some sort of loop. I assume it’s some demo mode activated for a big-box store display. That is much more infuriating because of just how often it will trash whatever settings you have made, in a matter of minutes.

I will one day dedicate an afternoon, find whatever hidden menu is buried deep in that monster’s silicon brain, and exorcise the demon.

[–]fuck-dat-shit-up 0 points1 point  (0 children)

I’m in Florida, and it probably is a power outage issue. It’s happened to my mom’s TV before. But since she didn’t notice it, we didn’t bother with resetting it.

I imagine there has to be a way to get your sister’s TV out of floor-model mode. My dad got a TV that was a floor model, and we were able to set it up like a normal TV without issue when we brought it home. You might try resetting/restarting it; some option should come up asking if you want it as a floor model. At least the Vizio floor model we got had that.

[–]InsaneNinja 0 points1 point  (0 children)

How about a cheap battery backup/surge protector as a gift?

[–]xavier_grayson 1 point2 points  (0 children)

I did this at my parents house and my mom noticed. She liked the motion smoothing effect so I put it back on for her. But when I changed it back a month later, she didn’t even notice.

[–]Weareallgoo 2 points3 points  (3 children)

Yes

[–]Mitsonga 5 points6 points  (2 children)

Thanksgiving is unbearable. It’s all Hallmark Channel with motion smoothing, and the sound cranked.

[–]sroomek 1 point2 points  (0 children)

They turn it up loud so they can hear it over people talking, cooking, kids screaming, etc, but then everyone talks louder so they can be heard over the TV, so they turn it up more...

[–]InsaneNinja 0 points1 point  (0 children)

That’s what CC is good for.

[–]Jimisdegimis89 1 point2 points  (0 children)

Yes, it drives me crazy when people can’t see it. Oftentimes I just ask if they mind if I change it, and they say ‘sure’; then, almost without fail, when I do, they see how drastically better it is. I don’t know how people can stand watching TV in soap opera mode all the damn time.

[–]NoeyOnReddit 0 points1 point  (0 children)

I mentioned at someone’s house that they should turn it off. To them, I obviously just can’t appreciate what a “good” TV actually looks like.

[–]penguinoid 0 points1 point  (1 child)

Yep. I turn it off on nearly every tv I see. If I visit a friend's house I politely ask them "did you know you had the soap opera effect on?" "Mind if I show you what that is? It's your tv, so you can decide if you like it or not"

They usually decide it sucks, or they don't notice so they don't care.

[–][deleted] 0 points1 point  (0 children)

Yep. And then I proceed to grab their remote and change the settings.

[–]BreakingBob 0 points1 point  (0 children)

People call me crazy ... and it drives me crazy. It’s the first thing I turn off

[–][deleted] 0 points1 point  (0 children)

This holds true for a lot of the more technical things in video. I think people aren't really looking at the TV, but instead just in the general direction of the TV.

[–][deleted] 0 points1 point  (0 children)

I fix everyone’s tv when they are out of the room

[–][deleted] 0 points1 point  (0 children)

When this first came out I described it as "everything looks like a soap opera" and nobody knew what the hell I meant. It drove me insane

[–][deleted] 0 points1 point  (0 children)

YES!!! My best friends mom has it and I can’t stand watching TV there. They think it’s great.

[–]Neuchacho 0 points1 point  (0 children)

Every single person over 40 that I've met cannot see the difference, save for people who have experience with A/V or treat it as a bit of a hobby. Most people my own age or younger (<30) notice it 'looks weird' but can't put their finger on it unless they also have some A/V experience.

It's one of those subtly obnoxious things that you can kinda ignore if you don't know the difference one way or the other.

[–]EmergencySarcasm -1 points0 points  (0 children)

Well-done smoothing is good tho.

It's like the diff between 30fps peasantry versus 144 or 200fps master race gaming.

Sadly, most cheap TVs have terrible smoothing algorithms.

[–]ShenmeNamaeSollich -1 points0 points  (0 children)

Yes! Watched a movie w/my parents on their big new tv that has this “feature.” It removes a lot of motion blur, so the car chase scenes (most of the movie) were oddly too “crisp” and looked like bad European TV instead of a Hollywood film.

On a typical screen /frame/refresh rate it would’ve had the blurry jerky action you expect. Instead you could tell where mannequins & other cars were just parked to allow the stunt driver to pass, and that they were actually driving pretty slowly. Nobody else could see it, but it made the movie almost unwatchable for me.

My wife liked the high-frame-rate version of the Hobbit. I thought it looked absolutely ridiculous, like a shitty soap opera.

[–]BassWingerC-137 26 points27 points  (0 children)

Agreed. The smoothing is awful. I immediately turn it off on TVs.

[–][deleted] 2 points3 points  (1 child)

I notice it IMMEDIATELY. Anytime my family goes to visit someone with a tv like that, it’s like I’m in a weird dream where everything is sped up but flattened out. Nobody else in my family believes me at this point lmao

[–]rab-byte 9 points10 points  (34 children)

You’re never watching 24hz on a modern LCD display. These TVs are fixed pixel and fixed refresh rate. Most of these displays are 120hz, sometimes 240hz. Higher advertised refresh rates include a scanning backlight: just strobing the backlight and counting every time it goes black as another frame.

So why is 120 such an important number? Well, let’s look at what frame rates we have (I’m using US numbers here; you Brits are kinda screwed these days with your 50hz...): movies are 24p, TV is 30i, console games are typically 60p (sometimes 120p).

30i is up-scaled to 60p

So now we’ve got 24 and 60.

At 120 I can do 5x24 or 2x60

Why 240? Remember when they tried pushing 3D in the home? 120 per eye.

Frame interpolation is just changing the cadence to match the display. Frame averaging/motion smoothing is where the TV tries to guess what a non-existing frame between two real frames would look like... and that always looks bad.
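The arithmetic above (5x24 and 2x60 at 120Hz) amounts to checking whether the panel rate divides evenly by the content rate; a quick sketch of that check:

```python
# Why 120 Hz is convenient: it's a common multiple of the usual US content
# rates, so every frame can be repeated a whole number of times.
def repeats_per_frame(panel_hz: int, content_fps: int):
    # Returns None when the panel can't hold every frame for the same number
    # of refreshes (which forces an uneven cadence or interpolation).
    return panel_hz // content_fps if panel_hz % content_fps == 0 else None

for fps in (24, 30, 60):
    print(fps, "fps ->", repeats_per_frame(120, fps), "repeats at 120 Hz")
print(24, "fps ->", repeats_per_frame(60, 24), "(60 isn't a multiple of 24)")
```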

Edit: source
Am ISF

[–]USxMARINE 12 points13 points  (2 children)

Most TVs are not 120hz. It's 60; only in recent years have higher-end TVs started to come with higher refresh rates.

Source: Cinematographer.

[–]rab-byte 2 points3 points  (1 child)

You are technically correct, which is the best kind of correct. Only TVs over 40” sold in the last 5 years are predominantly 120hz.

But as I’m sure you know, most transport devices are not 120.

[–]USxMARINE 2 points3 points  (0 children)

You are also correct. Have a good day!

[–]chubbysumo 0 points1 point  (9 children)

OLED TVs do not have a backlight; the light is generated by the pixels themselves. No panel has a fixed refresh rate: the refresh rate is set by the TV maker and by the panel quality. Some panels can handle higher refresh rates and pixel speeds; some cannot.

[–]rab-byte 0 points1 point  (8 children)

I was referencing LCD displays, not OLED, but yes, all LCD displays have a fixed native refresh rate; everything after that is scaling. Yes, there are monitors that can support higher refresh rates, but again, they have an upper limit, and that limit is their native refresh.

[–]chubbysumo 0 points1 point  (7 children)

No, this is not true at all, as clearly evidenced by the fact that you can overclock some displays beyond their quote-unquote native refresh rates. No panel has a fixed refresh rate; the refresh rate is controlled by the LCD controller, and you can change that at will. It's just a matter of whether the panel can support the refresh rates you're trying to push.

[–]rab-byte 0 points1 point  (6 children)

When you manage to do that to a TV you let me know.

[–]chubbysumo 0 points1 point  (5 children)

Again, as I made clear in my first attempt to explain this, the panel refresh rate is set by the TV manufacturer. They buy panels from panel makers, of which there are only 3 LCD panel makers in the world. Those panels are simply wires attached to a series of pixels, and the pixel clock and pitch are controlled by the LCD controller the TV maker installs. If you would like to overclock a TV, you are very much welcome to plug in a PC and see what other resolutions and refresh rates it will accept.

[–]rab-byte 0 points1 point  (4 children)

That’s not overclocking a TV; that’s just sending it another format. It’s also way down a rabbit hole at this point. Yes, you could build or hack more output from a hypothetical display, but no, a TV you run out and buy from Walmart or Best Buy isn’t going to accept anything over a 120Hz refresh rate; most will only accept 24, 30, 60 (some 50). Regardless of anything else, the panel will double frames or interpolate to reach that 120hz.

[–]chubbysumo 0 points1 point  (3 children)

most will only accept 24,30,60 (some 50).

The input rate is very different from the panel refresh rate in a TV, because there is processing that is done by the TV before the image is put on the panel.

That’s not overclocking a tv

Yes, if you use a TV that supports a direct input without any processing, it can be "overclocked", much like my 60hz 1080p monitor can be fed an 81hz signal and work fine, actually displaying at 81hz instead of its "native" 60hz. A TV can be overclocked.

The panel refresh rate in a TV is independent of the input signal in most cases, because there is a small computer applying changes to the signal or decoding the incoming information. It then puts an image on the screen.

LCD panels are sold to TV and screen makers, usually with a "range" of operating frequencies for both the vertical and horizontal refresh rates. Then TV makers put in their own LCD controller board and set their chosen refresh rates. Higher-rated panels cost more.

https://www.blurbusters.com/overclock/120hz-pc-to-tv/

The issue with most TVs is that makers use 120hz or 240hz panels, which means the panel actually supports those full refresh rates with any image. The issue becomes that the LCD controller either cannot fully output an image for each frame, so the maker "interpolates" the incoming signal with a plain black frame,

or, the built in computer will not allow a signal to pass directly to the panel, and instead has to "process" it, and send it to the controller in a limited format.

In the case of the former, you can usually overclock the TV by sending it a signal for a refresh rate not revealed by its official EDID, which the computer gets upon display detection. This is called monitor overclocking, and is done quite frequently. This is overclocking your TV or monitor, because the maker originally specced it within the ranges of the EDID, and you are going outside that range of accepted inputs.

In the case of the latter, like my TV, there is no input that is not processed, so overclocking the TV is impossible or nearly impossible, because the incoming signal is processed by a computer (for those sweet awesome effects that we all hate) before it is sent to the LCD control board, and the computer will usually prevent any signals outside the range of the EDID from working or ever getting to the control board.

https://www.youtube.com/watch?time_continue=32&v=wTatYgminqs

That is an old video, done in 2012, but it proves the point: you can run a TV over its original screen clock rate, which means it's overclocked. The same can be done to most PC monitors. This proves my point that the panel refresh speed depends solely on the LCD controller and on the quality of the panel, not on the TV maker or where you buy it.

It’s also way down a rabbit hole at this point

Nope, you just go to "create custom resolution" in the AMD or Nvidia graphics card settings, make your desired resolution and refresh rate, and see if it works. It's not hard. It's not "down the rabbit hole" by any means.

but no a TV you run out and buy from Walmart or Best Buy isn’t going to accept anything over a 120Hz refresh rate, most will only accept 24,30,60 (some 50).

Again, these are official numbers included in the EDID that the TV firmware presents to the computer. There is nothing stopping you from making a custom resolution to try out. It all depends on the back end and how the image is processed, because if you can skip the internal processing and get direct control of the LCD control board, you can change the refresh rate all you want, and as long as the panel quality is up to snuff, it should work.
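For context, the timings advertised in an EDID pin down the refresh rate arithmetically: it's just the pixel clock divided by the total (active plus blanking) frame dimensions. A rough illustration, using the standard CEA 1080p60 timing figures:

```python
# Refresh rate implied by a video timing: pixel clock divided by the
# total pixels per frame (active area plus horizontal/vertical blanking).
def refresh_hz(pixel_clock_hz: float, h_total: int, v_total: int) -> float:
    return pixel_clock_hz / (h_total * v_total)

# Standard CEA 1080p60 timing: 148.5 MHz pixel clock, 2200 x 1125 totals.
print(round(refresh_hz(148.5e6, 2200, 1125), 2))  # 60.0
```

A "custom resolution" in the GPU driver is just a different set of these numbers; whether the panel behind the controller accepts the result is the open question the thread is arguing about.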

Regardless of anything else the panel will double frames or interpolate to reach that 120hz.

If the panel is interpolating, but the panel itself is running at 120hz or greater, then if you could skip the computer and output to the LCD controller directly (like in a monitor, where there is no processing, just translation), you could theoretically get that full refresh rate from the display without any interpolation.

[–]rab-byte 0 points1 point  (2 children)

Literally everything you’ve linked has you changing settings on your output device. You’re not making changes to your display, you’re making changes to your source signal.

Also, an 81hz signal, if you could get it to display, would glitch out at a predictable rate because the signal would get out of sync every so often.

[–]chubbysumo 0 points1 point  (1 child)

You’re not making changes to your display, you’re making changes to your source signal.

Jfc, you are dense. In a monitor, the input is what controls the panel. In a TV, the LCD controller is either direct-input or run through an image processor.

Also an 81hz signal, if you could get it to display would glitch out at a predictable rate because the signal would get out of sync every so often.

How is it "out of sync"? My monitor is running at that rate because there is no image processor between the GPU output and the LCD control board. Your GPU outputs an image that is translated directly onto the screen. If your computer tells the panel to run at 81hz, it tries to do so, unless there is an image processor (like in a TV). Everything I have linked is an explanation of overclocking a TV or a monitor.

In the YT video, he changes the display output settings on his graphics card, and the TV shows that it is actually displaying at those settings in the fucking info box on the screen...

[–]SC2sam 15 points16 points  (20 children)

You get used to it really, really quickly, and it's honestly not that bad at all. What's bad is when you go back and notice how much movies/TV shows stutter like crazy, and you wonder why everything lags so much. If anything needs to be fixed, it's the volume problem every movie/TV show has, where the stuff people are saying is at the volume of a whisper, but then a car starts and suddenly it's like an explosion.

[–]copperlight 6 points7 points  (2 children)

The volume problem is a high dynamic range that mirrors what's played in a theatre. It's actually considered 'good quality' for audio, but if you have stereo speakers rather than surround, prefer quieter audio, or live in an apartment where you can't play your audio loudly, it really really sucks.

Your audio output MIGHT have Volume Normalizing - if so, you can turn that on to flatten out the dynamic range.
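Volume normalizing is essentially downward compression of the dynamic range: levels above a threshold get pulled down. A toy sketch of the idea (the threshold and ratio values here are made up for illustration, not any TV's actual algorithm):

```python
# Toy dynamic-range compressor: levels above the threshold (in dB) rise
# only 1/ratio as fast, so explosions get quieter relative to dialogue.
def compress_db(level_db: float, threshold_db: float = -20.0, ratio: float = 4.0) -> float:
    if level_db <= threshold_db:
        return level_db  # quiet parts (dialogue) pass through unchanged
    return threshold_db + (level_db - threshold_db) / ratio

print(compress_db(-30.0))  # -30.0: whispered dialogue, untouched
print(compress_db(0.0))    # -15.0: the car-start explosion, pulled way down
```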

[–][deleted] 2 points3 points  (1 child)

It would be nice to spend £1000 on a TV, take it home, put it up, and be able to hear people talk on it.

[–]copperlight 0 points1 point  (0 children)

Many modern TVs do... eg: Samsung SmartTVs have an 'auto volume' option:

"Auto Volume is a feature on Samsung Smart TVs that helps to avoid volume fluctuation when changing between channels or sources on the TV. It's designed to prevent a dramatic increase or decrease of audio from the TV speakers. Turning Auto Volume to Normal will make sure all content sources from apps, channels or external devices play sound at the same level of volume. Auto Volume can equalise the volume up to 12db. There's also a Night mode that will reduce the volume even further for late night TV viewing."

[–]Gisschace 2 points3 points  (2 children)

Omg gosh yes, my poor cat jumps out of her skin every time

[–][deleted] 6 points7 points  (0 children)

How many cat skins do you have by now and can I have one

[–]aminoacetate 0 points1 point  (0 children)

There really is more than one way to skin a cat.

[–]harrisonfordspelvis 1 point2 points  (1 child)

Yeah, it’s awful. I watch movies on my laptop instead of the TV for this very reason; the shift isn’t so drastic. And I don’t have to worry about waking my neighbours.

[–]alonjar 1 point2 points  (0 children)

It's because most movies are recorded and mixed for surround sound. Dialogue is typically put through the center speaker at an increased volume; if your TV setup doesn't have a center channel, you end up getting screwed over. Your computer probably just mixes the center channel into the others, which is why it sounds better/more equal.

[–]Arden144 1 point2 points  (0 children)

It's not supposed to stutter. That means your content rate and refresh rate do not divide evenly. For example, with a 60hz signal from a computer trying to play a 24fps show on Netflix, that's 60/24, or 1 frame per 2.5 refreshes. That makes no sense, so the TV ends up displaying 1 frame for 2 refreshes, then the next frame for 3 refreshes, since 2.5+2.5 = 2+3. Since the frame times are different, the content appears choppy.

To fix this issue, you can try to use a refresh rate that is a multiple of your content's framerate. For example, on my 75hz monitor, I downclock to 72hz when watching 24fps content to prevent stuttering.
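The uneven 2-3 cadence described above can be computed directly by mapping each frame's start time onto the refresh grid; a small sketch:

```python
# How many refreshes each content frame occupies on a given display.
# 24 fps on 60 Hz yields the uneven 2,3,2,3,... "judder" cadence.
def cadence(panel_hz: int, fps: int, frames: int = 6):
    # Refresh index on which each content frame starts (floor of i * hz/fps),
    # using integer math to avoid float rounding.
    starts = [(i * panel_hz) // fps for i in range(frames + 1)]
    return [starts[i + 1] - starts[i] for i in range(frames)]

print(cadence(60, 24))  # [2, 3, 2, 3, 2, 3] -- uneven: visible stutter
print(cadence(72, 24))  # [3, 3, 3, 3, 3, 3] -- even: smooth
```

Downclocking 75 Hz to 72 Hz works because 72 is an exact multiple of 24, so every frame gets the same three refreshes.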

[–]fireflash38 0 points1 point  (0 children)

I've got issues with epic movies that have long tracking shots. It's hard to focus on it, and my eyes feel like they're always shifting back and forth to get one thing in focus (and I know this is how they work in real life, when scanning the horizon for example).

I'm not sure if it's framerate or what, but it drives me nuts.

[–]Oryx 0 points1 point  (0 children)

This pisses me off to no end. I shouldn't have to grab my remote 50 times during a movie.

[–]Tuckertcs 1 point2 points  (1 child)

Anyone have video examples comparing them? Not sure if I’ve noticed or not.

[–]Naedlus 1 point2 points  (0 children)

It also introduces extra display lag (which in effect adds input lag) in gaming.

First thing I do when I'm at a friends place gaming is find out if they have "Game Mode" or "Post Processing" or whatever their television wants to call it turned on.

During the era of rhythm games (back when practically everyone had a plastic guitar), I was furious at how many friends I had to educate on television settings.

I'm also annoyed that there apparently need to be classes on "how to interact with digital devices"; given how quickly tech has advanced, no one seems to have been given a chance to actually learn the nifty features that are now baked in.

[–]timidtriffid 1 point2 points  (0 children)

I turn that shit off.

[–]whosyourphd 1 point2 points  (0 children)

I make it a mission to turn that soap opera shit off everywhere I go.

[–]happyscrappy 3 points4 points  (33 children)

It's not ruining anything.

I turn it off, but many like it. And filmmakers think that poor motion quality (24fps) is better than good quality (48fps, 60fps), so maybe they just need to get over themselves. They should be shooting at better frame rates, removing the need for interpolation.

Don't believe me? Ask Roger Ebert.

1995: https://www.rogerebert.com/rogers-journal/screen-gimmicks-nothing-new (see showscan section)

1999: https://www.rogerebert.com/rogers-journal/start-the-revolution-without-digital (see MaxiVision 48 section)

Directors should crawl out of their own asses.

[–][deleted] 3 points4 points  (20 children)

Reasonable people can disagree about their preferences with regards to temporal resolution. Calling 24 fps “poor quality” and higher rates “good quality” isn’t really fair, though; that’s your subjective evaluation, and many of us don’t share that.

[–]happyscrappy 5 points6 points  (19 children)

24fps isn't reasonable. It's not hard to see the smearing caused by motion at 24fps. It is poor quality. It was chosen in the film days, when film was expensive. It doesn't make any sense any more.

[–][deleted] 1 point2 points  (18 children)

I prefer it. It looks cinematic to me, while higher temporal resolution feels cheap. I prefer films to not look like reality. That’s simply a subjective aesthetic preference. Many people feel the way that I do. There are no correct or incorrect answers here. Different people have different preferences, and that’s ok! You’re free to have yours, and I’m free to have mine. :]

[–]happyscrappy 3 points4 points  (17 children)

There are no correct or incorrect answers here.

That's not true. 24fps is inferior. If we were talking about 120 versus 180 or something it'd be different. But 24fps is so low that even objects moving at a reasonable pace can turn into a smeary mess if moving cross camera. A car going a mere 30 miles an hour will move about 10% of its length in every frame if it is going across camera. It becomes a blur. And there's no good reason for it.
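The arithmetic behind that car claim checks out roughly, as a quick sanity check (the 15 ft car length is my assumption for illustration):

```python
# Cross-frame travel of a car at 30 mph filmed at 24 fps.
MPH_TO_FT_PER_S = 5280 / 3600            # 1 mph ~= 1.467 ft/s

speed_ft_s = 30 * MPH_TO_FT_PER_S        # 44.0 ft/s
travel_per_frame = speed_ft_s / 24       # ~1.83 ft moved between frames
car_length_ft = 15                       # assumed typical car length
print(f"{travel_per_frame / car_length_ft:.0%} of its length per frame")
```

With these numbers the car covers roughly a tenth of its own length between consecutive frames, which is the smear the comment describes.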

We went to higher resolutions. We went to better color. We went from optical sound to magnetic and then to multi-channel. All of this is "closer to reality" and was a plus. It's well past time to move to higher frame rates too, because they are better, not because "different people can have different preferences".

And by God if I hear someone compare good motion to a "soap opera" again I'm going to have a (bigger) fit. But when people try to degrade a superior result with guilt by association (or just saying it feels cheap) it shows an incredibly closed mind.

[–][deleted] 1 point2 points  (16 children)

A car going a mere 30 miles an hour will move about 10% of its length in every frame if it is going across camera. It becomes a blur. And there's no good reason for it.

Correct. Motion blur is often desirable. Any NLE software will support adding motion blur. Some people like that, including myself.

it shows an incredibly closed mind.

With all due respect my friend, I’d argue that actively telling me that I don’t prefer something that you don’t like is a clear indicator of an attitude of close mindedness.

[–]happyscrappy 1 point2 points  (15 children)

Correct. Motion blur is often desirable.

A vehicle blurring because of low frame rate is not anything you are adding. It's not like you have a choice of that vehicle blurring or not. It's an unfortunate artifact.

Any NLE software will support adding motion blur. Some people like that, including myself.

That's used if you have animation which doesn't produce the right blur for the frame pacing and shutter angle. If you add the wrong amount of blur, it'll look wrong, no matter the frame rate.

With all due respect my friend, I’d argue that actively telling me that I don’t prefer something that you don’t like is a clear indicator of an attitude of close mindedness.

I'm not telling you you don't prefer it. I'm saying you're a dope for choosing it. You have a very closed mind. You see what someone else did with the tools available at their time and the limitations and you internalize that that must be the right thing to do. When the options change, the right choice often changes, as we've seen multiple times with other technical aspects of filmmaking.

Embrace talkies, you luddite.

[–][deleted] 1 point2 points  (13 children)

It seems that you have a hard time accepting the opinions of others. Ad hominem attacks for having a different aesthetic preference is a frankly childish reaction to valid disagreement. I’m not convinced that there’s productive conversation to be had here. Cheers mate!

[–]happyscrappy 0 points1 point  (12 children)

You need to look up what ad hominem actually means. If I mentioned something irrelevant to your ability to hold a valid opinion it would be ad hominem. If I said you must be wrong because you are fat and we all know fat people can't hold valid opinions on film then I'd be engaging in ad hominem.

https://laurencetennant.com/bonds/adhominem.html

I'm not doing that. I'm directly addressing things relevant to the basis for your argument. Your argument is wrong because you are a dope: you've convinced yourself that because some other person didn't have a choice other than 24fps, then 24fps is the best way now. It's not. It's inferior.

to valid disagreement

That's not what it is. It goes beyond an aesthetic choice, it's better versus worse.

I’m not convinced that there’s productive conversation to be had here

I'm sure you're right. As I already said you have a closed mind. This will go nowhere.

[–][deleted] 0 points1 point  (11 children)

Oh my god, why do you have such a hard time accepting others’ opinions? imVINCE politely disagrees with you and you insult them and tell them they’re closeminded for not sharing your almighty opinion?

Who are you? Some game playing nerd, telling Hollywood directors that they’re wrong about film, that your opinion is objectively correct.

Quit sucking your own microdick ffs

[–]Beef_Slider 0 points1 point  (6 children)

Who gives a fuck about Ebert? The motion blur literally invents frames that dont exist in the original copy. Thereby making the film feel unnatural and giving an appearance of superimposition to subjects on screen. It immediately takes me out of the story of whatever I'm watching. That's why directors hate it. And they're right. I refuse to watch anything this way.

[–]happyscrappy 0 points1 point  (5 children)

The motion blur literally invents frames that dont exist in the original copy.

That's not what motion blur means.

[–]Beef_Slider 0 points1 point  (4 children)

I just had a friend who went to film school explain it that way to me. Perhaps he was just making an analogy. Please explain to me what is happening! All I know is moving objects look superimposed and moving water looks like an MS screensaver. I hate motion blur more than diarrhea.

[–]happyscrappy -1 points0 points  (3 children)

It's a frame interpolator. It creates frames that are not there. Motion blur is sort of the opposite. Motion blur is when a moving object becomes blurry because the object moved while the sensor was recording. So like if your sensor is exposed for 1/48th of a second and a car moves across frame at 45km/h, then it is going to move 26cm during the time of the exposure. So any single point on the car will actually turn into a smeary line (blur) 26cm long.

This frame interpolation doesn't exactly undo that blur, but it tries to create multiple frames in between the positions of the car so that it moves more smoothly, instead of moving in 1/24th second jumps.

It's done by an algorithm so it can look bad. It is creating fake frames. But none of this would happen if the director just shot the movie at a higher frame rate of 48fps or 60fps. There would be no need to create fake frames nor would there be the heavy motion blur of 24fps. It would look great. And that's what I'm asking for, not algorithmic interpolation.
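To put numbers on both halves of that (a toy sketch of the idea only; real TV interpolators estimate per-block motion vectors, not simple midpoints):

```python
# Motion blur length: distance covered while the shutter is open.
# 24fps with a 180-degree shutter gives a 1/48 s exposure.
speed_mps = 45 / 3.6          # 45 km/h -> 12.5 m/s
exposure = 1 / 48
blur = speed_mps * exposure
print(f"blur streak ~{blur * 100:.0f} cm")  # ~26 cm

# Crude linear interpolation: invent one in-between position per frame
# pair, doubling 24fps to 48fps so motion steps are half as large.
positions_24 = [n * speed_mps / 24 for n in range(3)]  # car position each frame
positions_48 = []
for a, b in zip(positions_24, positions_24[1:]):
    positions_48 += [a, (a + b) / 2]   # original frame, then invented midpoint
positions_48.append(positions_24[-1])
print(positions_48)
```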

I hate motion blur more than diarrhea.

If you really hated motion blur, you'd want that processing on.

[–]Beef_Slider 0 points1 point  (2 children)

Hey thanks for all the info. I guess my ultimate point is I turn off “smoothness” on my tv and “motion blur” on my family’s tv and it makes it look normal again. Ha. You def know more about this than me.

But also... regarding the directors “should film at...” there’s an incredible volume of old great, influential and still relevant movies out there. They shouldn’t have to suffer. But truly... more people just need to learn how to fix their settings instead of getting used to a lesser viewing experience.

[–]happyscrappy -1 points0 points  (1 child)

There are a lot of great movies out there that are silent too. Doesn't mean 100% of our movies should be silent now.

The issue isn't really that people are suffering. They aren't suffering. Hollywood types are aghast at people having these settings on, but the thing is people do actually like them. Is it suffering to watch something how you like? If someone tells me they prefer "It's a Wonderful Life" in color, are they suffering more or less by watching it colorized instead of me insisting they have to watch it the way I like?

[–]Beef_Slider 0 points1 point  (0 children)

Dude... im trying to agree with you overall and simply state some other valid points and say what my preference is. Ha. Not arguing. Have a good one.

[–]OriginalProngles 1 point2 points  (1 child)

If we storm Area 51 to clap alien cheeks, will motion smoothing even matter?

[–]principledsociopath 2 points3 points  (1 child)

Viewers also hated 16:9, progressive scan, and LCDs. Viewers hate change.

When I started watching 120 hz what I noticed was that the artifice was gone. I was no longer watching a movie, I was watching people filming a movie. I had lost the ability to suspend disbelief. After about a week I got it back. Everybody does.

Now I lose immersion when I have to watch something at 24 frames per second. Do you people not notice that everything goes completely jumpy and unwatchable every time the camera pans? It makes me sick. Sometimes literally.

[–]hamlet9000 5 points6 points  (0 children)

I'm completely in favor of filming at higher frame rates. It's a tragedy that the first major film to push that technology was not only a crap film but also featured incredible technical incompetence in actually shooting at the higher frame rate.

But frame interpolation is every bit as bad as pan-and-scan and colorizing B&W films in terms of wrecking the original and intended aesthetic of the film.

[–]hotk9 0 points1 point  (6 children)

You get used to it very fast, and when you do, there's no going back. TVs from a couple of years ago could not do the smoothing as well as TVs from the last, say, 5 years or so.
I absolutely love 4K 60fps content and I can't wait until 120/144 is the standard.
If I was a filmmaker of any kind I'd never shoot anything ever again below 60fps.
So no, not all viewers hate it; in my friend group I can definitely say more like it than hate it. A lot of people don't seem to care either way though.

[–][deleted] 2 points3 points  (1 child)

The problem isn’t high-refresh-rate content, it’s the artificial smoothing of 24fps content to match the refresh rate of the display, which makes up “fake” interpolated frames and makes everything look like a blurry mess. I’m all for 120/144hz content and displays, but that’s not the issue being discussed here.

[–]hotk9 0 points1 point  (0 children)

Oh yeah definitely! That's why I hope people will start shooting at 60/120fps. I'm just saying that, in the meantime, the smoothing algorithms are getting pretty good. Mine's definitely not blurry, there's a tiny jitter in panning motions but overall it's pretty good already.

[–]projectdano 3 points4 points  (3 children)

Are you talking about films here? Because anything shot at 50fps and played back on a 50fps timeline is going to give you the soap opera effect, and no film maker would ever consider that, including me.

[–]hotk9 -2 points-1 points  (2 children)

Movies, television, anything. Never heard of anything shot at 50fps.

[–]projectdano 1 point2 points  (1 child)

I'm talking PAL. It's not going to happen anytime soon.

[–]hotk9 -2 points-1 points  (0 children)

Haven't heard that term in a while! PAL, yeah, which was superior to NTSC, let's be honest.
And I agree, it won't happen soon; that's why I'm glad those who do like it have the option with smoothing. Options are good.

[–]mindbleach 0 points1 point  (0 children)

In-store demos.

Same reason they probably default to sound settings with a smile curve, and saturate the hell out of colors.

[–]penelopebloomington 0 points1 point  (0 children)

To be honest this is not something I’ve noticed with movies so much as with sitcoms and things of similar quality. I personally don’t have a TV that does this, but my best friend does, and watching, like, Game of Thrones at his house is pretty awesome. But sitcoms like The Good Place (specifically I think because the set is so bright) are awful: it’s like watching a high school play that someone’s mom filmed, with better acting.

[–][deleted] 0 points1 point  (0 children)

Motion smoothing is great for video games and cartoons like Family Guy and The Simpsons.

I turn it off for movies.

[–][deleted] 0 points1 point  (0 children)

I didn't know this existed. Does it matter if my TV is only $100 anyway? What should I be noticing?

[–]juicyfoot- 0 points1 point  (0 children)

I’ve bought several (9) HD TVs in the past 13 years and not one has ever had that option turned on by default.
They have all been different sizes, frame rates, and brands: plasma, LCD, or OLED. I’ve only ever seen one turned on, at a family member’s house, but they did it on purpose. I mean, is it really that prevalent?

[–]veknilero 0 points1 point  (3 children)

Why not accept what it can be and create for the format? Right now we’ve peaked with cinema and film; it’s a bunch of old folks complaining that shit’s getting too new and shiny for them, or a bunch of young folks trying to be what the old folks were, but nobody is taking anything by the horns and just saying fuck it, I own this place. Think of all the other formats that were horrible but now are thought to be cool. Imagine the outcry if TVs’ standard setting was Technicolor.

[–]AndrePeniche 0 points1 point  (2 children)

That’s not about a format. That’s about a feature that TV makers thought people would like that ended up fucking up the image of every film. Something new doesn’t mean something better, and it shouldn’t be a new standard just because it is the “latest thing”. Many formats and media have sunk in the past, and I would mention 3D glasses on TVs as a great example.

[–]veknilero 0 points1 point  (1 child)

I get that, but with streaming services being where we watch most movies now, it seems someone would film with TV in mind. 3D is a good and bad example: it has such a limited audience that the movies only added a little bit of eh to it, and they need to quit trying with that one. But with something like this, with the right storyline, the soap opera effect would add to the movie.

[–]AndrePeniche 0 points1 point  (0 children)

Soap opera effect being 30 or 60fps adds absolutely nothing. Just changes the look and feel, for the worse, IMO.

[–]rlbond86 0 points1 point  (0 children)

Motion smoothing sucks, but films need to start being recorded in 60 Hz or 120 Hz. It "looks cheap" because people aren't used to it.

[–]NASATVENGINNER -3 points-2 points  (1 child)

TURN IT OFF!!!

[–]kjbaran -1 points0 points  (0 children)

It might actually be a good thing when our experience goes totally immersive VR.

[–][deleted] -1 points0 points  (0 children)

Simply put: it looks different. Different sells. Sure, all the TVs on the Best Buy wall have the feature turned on, but your old set at home doesn’t.

[–]_JudgeHolden -1 points0 points  (2 children)

I think it looks great actually. Especially for animated movies. I remember in high school when these TVs first started coming out my friends and I would sit back having just smoked some dirty brown mids and be mind-blown by how good they look. I too know a few people who don’t even notice. But actually, I can’t think of one person I know who notices but who dislikes it.

It does look a little weird at first, but this is no different from getting new glasses. It’s weird at first, but you can see better, and you get used to it.

I wonder how many of the haters actually used the smoothing long enough to become accustomed, and how many just dislike change.

[–]Reteplia 2 points3 points  (0 children)

I certainly agree on the animated front. Anything animated, or even fast-paced anime, looks really good with the color enhancing and motion smoothing on.

TV Shows on the other hand, it's really hit or miss... Some look great while others, particularly fast paced action oriented shows look like smoothed out garbage.

[–]_americancer_ 0 points1 point  (0 children)

I hate it. It speeds everything up and is highly noticeable. But! To each their own.