Quote:
...A GPU may handle a game at 2K or 4K, but if that means it will run at a framerate of 10 fps, then it's still unplayable... or a steady framerate of 30 fps that drops to 15-20 fps when things get busy is equally bad...


Agree.

I feel a true framerate of 30+ fps, with no drops into the teens, is "visibly" smooth to most human eyes (some folks have exceptional vision). Movies ran at 24 fps for decades, TV at 30 fps, and home movies at 15 fps, and virtually no one complained. I would claim virtually no one can really tell 90 fps from 60 fps in a blind test. Most monitors in use refresh at 60 Hz, so anything above 60 fps is invisible on those monitors (even though one can "measure" that the card is attempting to put out 100 fps). Yet folks still play and enjoy their games at 60 fps or less.
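
To put numbers on the "smooth" point, here is a minimal sketch (plain Python; the framerates listed and the 60 Hz figure are just illustrative assumptions) converting framerates to per-frame times. The jump from ~33 ms per frame at 30 fps to 50-66 ms during a dip into the teens is exactly what the eye catches, and it also shows why a card "measuring" 100 fps delivers no more visible frames on a 60 Hz panel:

```python
# Rough illustration: per-frame time at various framerates, and how many
# of those frames a typical 60 Hz monitor can actually display.
MONITOR_HZ = 60  # assumed refresh rate of a common monitor

for fps in (15, 20, 24, 30, 60, 90, 100):
    frame_ms = 1000.0 / fps        # how long each frame stays on screen
    shown = min(fps, MONITOR_HZ)   # a panel can't show more frames than it refreshes
    print(f"{fps:>3} fps -> {frame_ms:5.1f} ms/frame, ~{shown} frames/s visible")
```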

Thus, yes, it is a matter of definition.

I go by what I can actually "see" with unaided eyes, not by "measurements", when deciding if a game is playable. However, some folks go by measurements and are unhappy if their GPU is not "measuring" above some number.

Neither way is wrong. We all do it for fun, and beauty is in the eye of the beholder. I simply think most will be happy with a smooth 30 fps (and up), provided they don't spoil it for themselves by "measuring" the FPS.


Sapphire Pulse RX7900XTX, 3 monitors = 23P (1080p) + SAMSUNG 32" Odyssey Neo G7 1000R curve (4K/2160p) + 23P (1080p), AMD R9-7950X (ARCTIC Liquid Freezer II 420), 64GB RAM@6.0GHz, Gigabyte X670E AORUS MASTER MB, (4x M.2 SSD + 2xSSD + 2xHD) = ~52TB storage, EVGA 1600W PSU, Phanteks Enthoo Pro Full Tower, ASUS RT-AX89X 6000Mbps WiFi router, VKB Gladiator WW2 Stick, Pedals, G.Skill RGB KB, AORUS Thunder M7 Mouse, W11 Pro