24 FPS in movies is completely different from 24 FPS in a game.
Movies have natural motion blur, so each frame smears into the next and the sequence looks smooth. Video games don't get that for free, because they draw each frame as a sharp, instantaneous snapshot. When you read reports of people testing a game at 60 FPS versus the same game at 120 FPS, they consistently pick the 120 FPS version as far smoother.
I know, because I can tell the difference myself between a game at 60 FPS and the same game with my monitor's refresh rate set to 144 Hz; faster is noticeably smoother.
"Motion blur" is not a method, technique, or system used to achieve or accomplish anything. It's actually a (generally undesired) side effect of capturing moving objects in still images (or a sequence of 'frames'): https://en.wikipedia.org/wiki/Motion_blur
Yes, it does contribute to fooling your eyes in a way, but that doesn't mean it's intentional, or even desirable. Filmmakers don't add it to smooth anything; it happens regardless of what they do.
In short, it is an effect, not a cause.
Many types of technology have been developed to overcome motion blur. Asus' website, for example, says "NVIDIA Ultra Low Motion Blur (ULMB) technology decreases motion blur..." (https://www.asus.com/support/faq/1014609). It makes no sense that anyone would spend time developing a technology to reduce or eliminate something if it were planned, intended, or even desirable in the first place.
It should be evident, then, that motion blur is not an intentional act of anyone to smooth anything. It wasn't developed, and it isn't used, to smooth moving images. It happens because real-world movement is continuous (analog), while a video frame is a discrete capture: the camera integrates light over a finite exposure time, so anything that moves during that exposure gets smeared.
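To make the "continuous motion, discrete capture" point concrete, here's a minimal sketch (not from this thread; the function and parameter names are my own invention). It models a bright point moving across a 1-D strip of pixels and "exposes" a frame by averaging many sub-frame samples. With a long exposure the point's light smears across multiple pixels (blur); with a very short exposure it stays in one pixel (sharp), just like a game's instantaneous render.

```python
def capture_frame(position_at, t_start, exposure, width=10, samples=100):
    """Accumulate a moving point's light over one exposure interval.

    position_at: function mapping time (s) -> position (pixels).
    Returns per-pixel brightness, normalized so the total is ~1.0.
    """
    strip = [0.0] * width
    for i in range(samples):
        t = t_start + exposure * i / samples  # sub-sample within exposure
        px = int(position_at(t))
        if 0 <= px < width:
            strip[px] += 1.0 / samples
    return [round(v, 2) for v in strip]

# A point moving at 40 pixels per second (arbitrary illustrative speed).
pos = lambda t: 40.0 * t

# Film-like exposure (1/25 s): the point sweeps ~1.6 px, so its light
# is split across two pixels -- that spread IS motion blur.
blurred = capture_frame(pos, t_start=0.0, exposure=1 / 25)

# Very short exposure (1/1000 s): the point barely moves, so all the
# light lands in one pixel -- a sharp, game-like snapshot.
sharp = capture_frame(pos, t_start=0.0, exposure=1 / 1000)
```

The only difference between the two captures is the exposure time; nothing here deliberately "adds" blur, which is the point: blur falls out of integrating continuous motion over a nonzero interval.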