Allen
Hotshot
Joined: Oct 1999
Posts: 8,854
Ohio USA
Quote:
...The trouble with this sort of propaganda graph is that it doesn't actually show anything relative or even useful...
It wasn't really propaganda. It was an article at a relatively neutral site.
True, a "bang per buck" graph does not tell a gamer enough to know what to pick.
But, it does say, "take a look at this before you decide" -- if money is an issue.
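For anyone curious, a "bang per buck" figure is just average FPS divided by price. Here's a quick sketch of how such a graph gets computed -- the card names and numbers below are made up, purely for illustration:

```python
# "Bang per buck" = average benchmark FPS divided by street price.
# All names and figures here are hypothetical examples, not real benchmarks.
cards = {
    "Card A": {"avg_fps": 60, "price_usd": 200},
    "Card B": {"avg_fps": 90, "price_usd": 450},
}

def fps_per_dollar(card):
    """Return the value metric: frames per second per dollar spent."""
    return card["avg_fps"] / card["price_usd"]

# Rank cards best-value-first, the way those graphs usually sort them.
for name, card in sorted(cards.items(),
                         key=lambda kv: fps_per_dollar(kv[1]),
                         reverse=True):
    print(f"{name}: {fps_per_dollar(card):.3f} fps/$")
```

Note the faster card isn't automatically the better value -- which is exactly why the graph says "look at this before you decide" rather than "buy this."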
My reference RX 480 plays all my games fast enough at each game's "ultra" settings at 2K (1440p) resolution. (I don't play "all the games," so I'm not saying every game would do that well.)
60fps is the fastest my monitor can refresh (like many monitors), and I see 60 up there most of the time (I have Steam set to show FPS at all times). Only if I add high-res 3rd-party textures and so forth do I occasionally drop below 30. Even down to 15fps (true minimum, not average), it still plays almost smoothly to me (my eyes may be slow). If I hit 15fps on rare occasions, it's okay and doesn't bother me. However, I realize others want a "perfect" experience at all times.
Some games (particularly flight simulations) are more CPU limited and the GPU matters less, but none that I am playing lately push my system.
For reference, some TV is 30fps and some movies are 24fps. Back in the day, home-filmed 8mm movies were 15fps and looked okay to most.
With a 60fps refresh monitor, I really couldn't make use of 90fps. I doubt many folks could tell the difference between 60 and 90 in a "blind test" -- I couldn't. However, for those that can, they should go for it.
All the foregoing is to say: we all do what's fun and have different needs and desires. "Bang per buck" matters to some of us (not everyone).