After years of playing with my Predator monitor, it has finally bit the dust. I was wondering if anyone had suggestions on monitors that would up the visual factor, not only for WOFF but for other games as well. I've been googling some to get an idea, but I'd like to hear some reviews and suggestions from experience. Anything would be appreciated.
I got fired as the door man at a sperm bank. Apparently it's in poor taste to tell leaving customers "Thanks for coming."
Former U.S. Army Medic - SGT.
#4497315 - 11/15/19 11:28 PM Re: OT: End of an era with my old gaming monitor
[Re: AceMedic88]
One thing that comes to mind is Gsync. The technology synchronizes GPU frame rates with monitor refresh rates, and the result is most impressive. The only real obstacle to ownership has been cost, thanks mostly to the overhead of licensing through Nvidia. Most people who have this feature will say it's worth the cost (but of course, cost is subjective).
However, just after Gsync became very popular, another very similar technology showed up, and it has now been officially endorsed by Nvidia as well. Variable refresh rate (much like Gsync) is an intrinsic part of the DisplayPort 1.2a spec. AMD uses the same technology in their implementation, called "FreeSync". The difference is that Nvidia's Gsync requires a hardware module in the monitor (and the associated costs), where FreeSync uses DisplayPort 1.2a protocols (software commands), thus avoiding that cost. Interestingly, even though the protocols are part of the DisplayPort specs, FreeSync supposedly works with HDMI connections as well as DP (I haven't explicitly tried this). I've used both, own both types of monitors, and they both work. How well and to what extent can be much more subjective than simply whether they work.
Nvidia has (finally) recognized the DisplayPort 1.2a frame sync method, by testing and approving what they're now calling "Gsync Compatible" monitors. These don't use the hardware module that a true Gsync monitor uses, and they aren't supported by as wide a range of Nvidia GPUs. While Gsync works on many cards all the way back into the 600 series, only the 10-series and newer cards support "Gsync Compatible" monitors. The good news is it's an equally impressive technology, and it's cheaper than true Gsync (because the hardware and associated licensing aren't required). It's technically not as capable as true Gsync; for example, the range of frame synchronization is narrower, but as I said I own and have used both types of monitors and IMHO the Gsync Compatible setup is plenty impressive on its own, especially considering the cost is generally much less than true Gsync.
There is also the AMD arrangement, FreeSync...although I do have some AMD GPUs and have tested them with a FreeSync monitor, I haven't spent a lot of time with it. That said, if you're a fan of AMD graphics, FreeSync is definitely worth considering. It uses the same royalty-free DisplayPort protocols as "Gsync Compatible", so I'd imagine it's impressive as well.
Another feature I've become very fond of after purchasing one myself is an "UltraWide" 21:9 aspect ratio monitor. The extra width really adds to immersion by filling more of your normal field of vision, IMHO. Mine's a curved 34" 1080p 21:9, and even though it's Gsync, the cost (currently around $500) is well worth it for what you get, I feel.
One final thought, IMHO: You sometimes see people bragging about 4k monitors. I would honestly avoid this...it might be great for TV - definitely a remarkable picture, without doubt. But when it comes to gaming, the extra cost associated with driving a 4k monitor at even close to a solid 60FPS (never mind anything beyond 60Hz) is still way on the ridiculous side, to me. If you have money to burn, good for you, but I still don't think it's worth it, and a lot of "competitive gamers" agree. The video "4k Gaming Is Dumb" is dated, but still applies, for the reasons it explains. (And before anyone starts crying about it, that is the video's title as posted by the author, not me - and this guy/the site is fairly widely respected for their opinions, BTW.)
The thing is, as the resolution increases, the number of pixels goes up dramatically, which means it takes a more and more powerful GPU to drive all that at a decent frame rate - this can easily run you up near a $1000(+) budget for a GPU, and that doesn't count the cost of the monitor itself. Also, most gamers strongly prefer the responsiveness of higher refresh rate monitors, and just like with GPUs and 4k resolution, it gets awfully expensive to buy high refresh rate, high resolution monitors. If you want additional premium features like a fast response time, a certain type of panel technology, or a curved screen...well...make sure you've got a ton of cash. And in my experience, as well as that of all the gamers I've built systems for and many of the reputable sources online, it's just not worth what it costs...unless of course your objective is absolute bragging rights. And there's always plenty of that on the internet.
I also recently acquired an Acer ED273 Abidpx unit. I like it a lot, though I haven't worked with it for long. It cost a fraction of what the LG monitor did, and is a great "value"...16:9 27" curved Gsync Compatible 144Hz, and less than $200.
The move up to 2560x1440 really seems to help spot and track targets. It uses the FreeSync technology and yet still keeps frame rates up with a GTX 1060 6GB. I do get a slight ghosting exclusively on WOFF with TrackIR that doesn't show up in other programs, so I assume it is some setting I just need to change. It hasn't bothered me enough to spend the time tracking it down yet, though. (Running it on DisplayPort at the moment, kksnowbear.) You will want to look into the difference between IPS vs TN vs VA displays (there was actually a lot more to picking a new monitor than I thought!). This was really one of the only monitors that checked all my boxes at my price point (under $500). Now, I haven't tried WoTR at that resolution late in the battle yet, which is more likely to cause slowdown, but all my other programs seem to be running at the higher resolution no problem, including The Outer Worlds, which I just got.
The older I get, the more I realize I don't need to be Han, Luke or Leia. I'm just happy to be rebel scum...
Yep, the different panel types are another factor...you're right, there's more to monitors than one might think at first. IPS panels, I think, offer the clearest, brightest, most saturated colors, but that's only a few of the considerations. There's also how black the blacks are, contrast...really a good idea to study online resources to find out what fits your needs. As I understand it, each one has pros and cons.
BTW, unless I'm mistaken, most of the Nvidia GPUs won't do variable refresh over HDMI, where AMD does. The Nvidia 20-series cards support HDMI 2.1, which if I've understood, will support variable refresh rates. AMD said a while ago they would support HDMI 2.1, but I'm not sure if it's happened yet.
Another good point you've made is that 2560x1440 is a reasonable compromise between needing higher resolutions for anything 27"+ on the desktop (due to viewing distance), and the cost/performance hit with 4k. Regardless of the GPU, it will certainly do far better driving 1440p than trying to drive more than twice as many pixels (4k = 8.29 million pixels, where 2560x1440 is only about 3.69 million). Ultimately I decided certain factors (curved, 21:9, Gsync) were going to dictate my budget, and going with 1080 was where I could keep costs down and performance up. Plus I'm blind as a bat anyway, so even though I can visually appreciate higher-res displays, I really don't notice anything lacking about 1080.
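If anyone wants to check the arithmetic for themselves, here's a quick back-of-the-napkin Python sketch of the pixel counts for the resolutions being discussed (nothing vendor-specific in here, just multiplication):

```python
# Pixel-count math for the resolutions discussed in this thread.
resolutions = {
    "1080p (16:9)":   (1920, 1080),
    "UW 1080 (21:9)": (2560, 1080),
    "1440p (16:9)":   (2560, 1440),
    "UW 1440 (21:9)": (3440, 1440),
    "4K (16:9)":      (3840, 2160),
}

base = 1920 * 1080  # plain 1080p as the reference point
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px / 1e6:.2f} MP, {px / base:.2f}x the pixels of 1080p")
```

Running that shows 4K at 8.29 MP versus about 3.69 MP for 1440p - which is where the "more than twice as much" comes from, and why the GPU budget balloons with resolution.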
#4497354 - 11/16/19 01:51 PM Re: OT: End of an era with my old gaming monitor
[Re: AceMedic88]
RAF_Louvert
BOC President; Pilot Extraordinaire; Humble Man
Senior Member
Joined: May 2012
Posts: 4,879
L'Etoile du Nord
Well, since we're talking about this and we have some of the computer gurus weighing in, I'd like an opinion as well. I've been thinking about moving up to a curved 34" 21:9 with G-Sync and would like some input on what might be the best match for my current flying rig, in particular the resolution level. Specs are as follows and have remained the same now for about two years:
CPU: Intel Core i7-2600K Sandy Bridge 3.4GHz LGA 1155 95W Quad-Core, OC’d to 3.8GHz
CPU Fan: Arctic Super Cooler
Memory: Corsair Vengeance 16gb (4 x 4gb) 240-Pin DDR3 SDRAM DDR3 1600 (PC3 12800)
Mobo: ASUS Sabertooth P67
Hard Drives: 2 Western Digital 640 GB Caviar Black SATA
Opti Drive: LG 22X DVD+/RW Dual Layer SATA Rewrite
Video Cards: Two EVGA GeForce GTX 970 04G-P4-3975-KR 4gb cards with HB SLI bridge
PS: Corsair HX 850 Watt
OS: Windows 10 Pro 64-bit
Monitor: LG 27" flat screen LCD, 1920 x 1080 native resolution
Cooling: Four large case fans, plus the PS, CPU, mobo, and card fans
Controls: Saitek AV8R joystick, Saitek Pro Flight rudder pedals
Head tracking: Track IR4 camera with latest IR5 software
Any direction and advice on this will be most appreciated. Also, I'd like to keep it at $500 or less.
Three RFC Brass Hats were strolling down a street in London. Two walked into a bar, the third one ducked. _________________________________________________________________________
Former Cold War Warrior, USAF Security Service 1974-1978, E-4, Morse Systems Intercept, England, Europe, and points above. "pippy-pahpah-pippy pah-pip-pah"
#4497356 - 11/16/19 01:57 PM Re: OT: End of an era with my old gaming monitor
[Re: AceMedic88]
KK, I've always been on the fence about ultrawide. Your 34" is definitely worth looking into. I definitely want something that'll be worth it for years to come, but at the same time, I'm not breaking the bank for it.
#4497361 - 11/16/19 02:53 PM Re: OT: End of an era with my old gaming monitor
[Re: AceMedic88]
Oh, one problem with high res - the dots/pixels are often smaller, so it's harder to see desktop details since your desktop fonts are smaller too. I have to wear glasses more often than I did to read the screen. You can change font sizes in Windows options, though that's not ideal for everyone.
Pol: I had a smaller Acer Predator gsync. I was considering your monitor for the replacement. Ultrawide and curved is something that I am considering.
Luckily, it looks like the jump between 1080 and 1440 isn't that much of a burden. I wasn't going to consider 4K. It appears that a majority of the gamers I know have said don't bother with it, and I'm glad that notion is reaffirmed here. I guess what it will come down to is which product line is more reliable? Acer Predator seems like a common one. I see Alienware offers the same specs but a tad cheaper (but at what cost?).
You folks have me sold on the ultrawide, though. I've been thinking for a long time that since I love flight sims that this is the way to go.
#4497368 - 11/16/19 03:41 PM Re: OT: End of an era with my old gaming monitor
[Re: AceMedic88]
Philips Momentum 32” is 2560x1440 and has a 4-year warranty. Excellent color representation, dark blacks, and no light bleeding. Under $400, and it goes on sale sometimes.
Last edited by orbyxP; 11/16/19 03:42 PM.
#4497370 - 11/16/19 04:12 PM Re: OT: End of an era with my old gaming monitor
[Re: AceMedic88]
Yes, cost is probably the determining factor in most cases. Like you, I didn't want to 'break the bank' and what mine cost at the time ($600 vs $500 now) was pretty much my limit.
FWIW, be aware that going from an FHD (16:9, 1920x1080) display to a 21:9 ultrawide (if you're talking about 21:9 @ 1440, i.e. 3440x1440) actually is a pretty big jump in pixels...about 2.4 times as many (139% more), to be exact. On the other hand, staying at 1080 (2560x1080) only represents a 33% increase. Not as much by far, but it really depends on how well your GPU copes in any case.
Lou, you're already using a 1080p panel, and that being the case, a 21:9 34" model is going to have roughly the same "short side" measurement as a 16:9 27" (about 13.5")...what this means is that the overall size, and the pixel density, would almost certainly look comparable to what you're using now, without having to go to a higher-res monitor - which in turn helps keep costs down.
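For anyone curious where that "short side" number comes from, it's just the Pythagorean relationship between the diagonal and the aspect ratio. A small Python sketch (the 27"/34" sizes are just the examples from this thread):

```python
import math

def panel_dims(diagonal_in, ar_w, ar_h):
    """Width and height (inches) of a panel, from its diagonal and aspect ratio."""
    d = math.hypot(ar_w, ar_h)  # length of the aspect-ratio "diagonal"
    return diagonal_in * ar_w / d, diagonal_in * ar_h / d

w27, h27 = panel_dims(27, 16, 9)   # 27" 16:9
w34, h34 = panel_dims(34, 21, 9)   # 34" 21:9
print(f'27" 16:9: {w27:.1f} x {h27:.1f} in')   # ~23.5 x 13.2
print(f'34" 21:9: {w34:.1f} x {h34:.1f} in')   # ~31.3 x 13.4
```

So the heights come out within about a sixth of an inch of each other; the 34" ultrawide is basically a 27" panel with roughly a third more width bolted on.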
The real change will be in the display's width, as you know, and that means roughly a third more screen width; in my opinion this adds tremendously to the immersion factor since your field of view is now much closer to being filled.
As for performance, you should be fine: SLI typically adds about 50% performance, and (according to data I've collected over the years), two 970s would put you right about at the level of a 1070 or 1070 Ti...so going to 21:9 1080p should be no problem. (I use a 1070 Ti with a 34" 21:9 1080p, and it manages frame rates up over 100 for many games, and ~60 for even the most demanding graphics.)
If you're an eagle-eyed sort who can see pixels from across the room, then going to a 3440x1440 (21:9) might be the choice for you...but I'm not sure you can keep the budget under $500. What I see by way of 1440 34" Gsync curved monitors seems to start around $650, and that's with a 100-120Hz refresh rate (depending on response time).
Even then, as discussed above: Going up in resolution like that will also demand higher GPU performance to keep a reasonable frame rate. Since the bump in resolution is fairly big from 1920x1080 (16:9) to 3440x1440 (21:9), you might have to consider an upgraded GPU. Cost, cost, cost *uggh*
However, if you are OK with your current resolution/viewing distance, then IMHO there's no real need to go higher (and incur all that goes with it).
As wide as possible (resolution-wise) is the way to go. But like others have said - pixel size on a monitor of the same physical size is smaller - which can make things harder to see. I love the wider field of view, being able to glance without turning my head for TrackIR. After a while, the bezels between the screens are not actually noticed.
Three three-year-old 27" Predator XB271HU's running 7780x1440
Last edited by Stache; 11/16/19 05:04 PM.
Insanity: doing the same thing over and over again and expecting different results. A. Einstein
That's a genuinely gorgeous monitor, no doubt. I actually looked at the Predator series and ultimately picked the LG more because of cost (on sale) than anything else.
However, as discussed above: There's a healthy increase in pixels (almost 80% more) going even from 21:9 1080 to 21:9 1440. For many people, that difference is probably going to mean upgrading their GPU. I'd say at a minimum, to get even halfway decent frame rates, it would require a 1070 or better. Pretty sure if you want something near 60FPS, you're probably looking more in the range of the new 2070 "Super", or a 2080. This potentially adds $500+ to the budget. After spending $750 on a monitor, I'm pretty sure Her Royal Highness would be doling out a royal a$$-whipping if I topped it off with another $500 on a new GPU.
Yes, GPU = "Graphics Processing Unit"...sorry, just easier to type than the longer names, but yes, it is the graphics card.
What determines GPU, though...well, that's a very long (and often hotly contested) discussion...I rely on two things, mostly:
- My own database of 350+ GPU benchmark records compiled over time
- The many, many online videos and reviews and comparisons, showing FPS of practically any card, usually as compared to practically any other card (or any other 3+ cards, often, all in the same video).
Mind you, what's important is rather than taking any one set of info as gospel, you're much better off looking at lots of different reports/comparisons. If you see one claim that card X will do (whatever), that's one thing...but if you see similar results from a multitude of sources, then it's probably fairly reliable.
Ah. I have a 1080 ti. With the 27" Predator 1440, I never had any issues with fps and could max out most of the newer titles and never go below 60. I would think going to curved and wide at the percentage increase you mentioned wouldn't hurt me too bad.
I got fired as the door man at a sperm bank. Apparently it's in poor taste to tell leaving customers "Thanks for coming."
Former U.S. Army Medic - SGT.
#4497384 - 11/16/19 05:30 PM Re: OT: End of an era with my old gaming monitor
[Re: HumanDrone]
Oh, no...this isn't a bookmark at all! What made you think so? (My rig was built at the end of 2011; it may be getting to be time here...)
But these curved beauties bring up the old ray-tracing problem of where to put your head vs. screen curvature vs. FOV adjustments....
I see the point regarding curved screens, but I honestly see this as a simple question: Your head pivots essentially around a fixed point, at least horizontally, most of the time. So, I think having some curve in the monitor makes sense in terms of distance to the pivot point. That said, I think the curve would actually have to be a good deal more pronounced (i.e. a smaller radius, closer to your actual viewing distance) than current monitors offer to keep the screen surface at exactly the same distance across the whole range you can turn your head (depending on how big and how close the screen is, of course).
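To put rough numbers on that, here's my own back-of-the-napkin model in Python (the 700mm viewing distance and 400mm half-width are assumptions, roughly matching a 34" ultrawide at desk distance). It compares the eye-to-edge distance for a flat panel, a typical 1500R curve, and a curve whose radius equals the viewing distance - only the last one keeps every point of the screen equidistant from the eye:

```python
import math

def edge_distance(view_mm, half_arc_mm, radius_mm=None):
    """Eye-to-screen-edge distance, with the eye on the screen's center axis.

    half_arc_mm: half the screen width (arc length for a curved panel).
    radius_mm:   curve radius (e.g. 1500 for a "1500R" panel); None = flat.
    """
    if radius_mm is None:                      # flat panel: simple right triangle
        return math.hypot(view_mm, half_arc_mm)
    theta = half_arc_mm / radius_mm            # angle subtended at the curve center
    x = view_mm - radius_mm + radius_mm * math.cos(theta)
    y = radius_mm * math.sin(theta)
    return math.hypot(x, y)

view = 700.0     # assumed viewing distance, mm
half_w = 400.0   # assumed half-width of a 34" ultrawide, mm

print(f"center:     {view:.0f} mm")
print(f"flat edge:  {edge_distance(view, half_w):.0f} mm")
print(f"1500R edge: {edge_distance(view, half_w, 1500):.0f} mm")
print(f"700R edge:  {edge_distance(view, half_w, 700):.0f} mm")
```

With these numbers the flat edge sits over 100mm farther away than the center; a 1500R curve closes roughly half that gap, and a radius equal to the viewing distance closes it entirely - which is why today's 1500R-1800R panels help but don't fully solve the head-pivot geometry.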