Hellshade, remember that most modern games do not need a fast CPU because they are GPU bound. But when you come to flight sims (especially WOFF/CFS3), they are still CPU bound, so it does make a difference for these games.
An example: when I went from my i7-5930K (OC'ed to 4.0 GHz) to an i7-6700K (4.0 to 4.2 GHz), my FPS jumped by an average of 10% across the board using the same GPU, monitor (2560x1440), and WOFF settings. All because I had a faster CPU and nothing else.
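As an editorial aside, that ~10% figure can be roughly decomposed with some back-of-the-envelope arithmetic. This is a sketch only: it assumes a fully CPU-bound sim where FPS scales with single-core throughput (clock speed times IPC), which is an assumption, not something measured.

```python
# Back-of-the-envelope split of the reported ~10% FPS gain.
# Assumption: in a CPU-bound sim, FPS scales with single-core
# throughput = clock speed * IPC (instructions per clock).
old_clock = 4.0    # i7-5930K OC, GHz
new_clock = 4.2    # i7-6700K boost, GHz
total_gain = 1.10  # reported ~10% average FPS increase

clock_gain = new_clock / old_clock   # 1.05x from clock alone
ipc_gain = total_gain / clock_gain   # remainder attributed to IPC
print(f"clock contribution: {clock_gain:.2f}x")
print(f"implied IPC gain:   {ipc_gain:.3f}x")
```

Under those assumptions, roughly half the gain would come from the extra 200 MHz and the rest from the newer core's higher IPC.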
CPU = i9 11900K GPU = RTX 3080 Ti Monitor = ASUS ROG Swift PG32UQX 2160p G-sync
#4326704 - 01/07/17 07:47 PM Re: What's the new standard for PC hardware?
[Re: OvStachel]
Panama/Hellshade.....I'm no expert, but my kids (my Christmas present) kitted me out w/ a GTX 1080, an i7-6700 CPU, and an Asus B150 mobo. After the usual hiccoughs it was installed, WOFF reinstalled.....and the results exceeded my wildest dreams. All settings maxed, and seeing 60 fps at all times...maybe mid 50's in 1918 over the front in a furball....but I'd been looking at a slideshow for so long when it counted most, and read w/ envy about those guys with big fps #'s, that I felt all the $ invested (donations, purchases), plus the agony and angst over crashes and dll bad news, was finally vindicated. WOFF looks and plays as beautifully as I'd ever hoped, so don't hesitate...spend lavishly and carry on!!! What else are we going to spend our ill booten gotty on, aside from whiskey, before we're dead?
#4326715 - 01/07/17 08:38 PM Re: What's the new standard for PC hardware?
[Re: OvStachel]
I went from an i5-2500K OC'ed to 4.8 GHz to an i7-6700 OC'ed to 4.4 GHz, and I do not see any difference in load per core. Guess it is all about plain GHz.
For a hardware upgrade I would wait for AMD's next-gen releases on both CPU and GPU, and if it is another AMD debacle, do not worry: it will still drop Intel's/Nvidia's prices.
For the higher budget, it has all been written down already: the i7-7700/GTX 1080, and maybe the Nvidia Titan.
If this budget is out of range I would go for: CPU, an i5-6xxx or i5-7xxx (skip the K versions if you are not an overclocker); GPU, an AMD RX 480/RX 470 (8 GB or 4 GB) or Nvidia GTX 1060 (6 GB or 3 GB). If buying the RX 480 or the RX 470, always check that it has the extra 6-pin connector, so not all the load goes over the PCIe slot.
For flight sims like WOFF/RoF/BoS/CloD I would not buy the GTX 1050 (Ti) or the RX 460.
#4326721 - 01/07/17 09:08 PM Re: What's the new standard for PC hardware?
[Re: OvStachel]
dutch: You are absolutely correct. As far as WOFF/CFS3 is concerned, you went backwards when you went from 4.8 GHz to 4.4 GHz, since it is all about single-core speed for WOFF/CFS3.
The main benefit of the "K" chips (even though you may never overclock) is that they are faster than the same "non-K" chip, and if you want WOFF/CFS3 to have the highest FPS possible, you need the highest single core GHz you can afford.
My FPS actually went up when I moved to a faster CPU even though the GPU was a step down, which proved again that it is still all about single-core speed for WOFF/CFS3.
CPU = i9 11900K GPU = RTX 3080 Ti Monitor = ASUS ROG Swift PG32UQX 2160p G-sync
#4326731 - 01/07/17 09:47 PM Re: What's the new standard for PC hardware?
[Re: OvStachel]
Yep, it is all about GHz for WOFF, but he also plays other sims. In the case of RoF my old i5-2500K did not need any overclock; there it was the GPU that needed the boost.
Forgot one thing: if you are not an overclocker, modern mobos have OC profiles, so if you have a good CPU cooler it is just a matter of selecting the right OC profile. No rocket science, just clicking.
#4326769 - 01/08/17 12:45 AM Re: What's the new standard for PC hardware?
[Re: OvStachel]
Currently have a 21” (TN) LG monitor with a max resolution of 1080p (1920 x 1080), 60 Hz.
With the new generations coming out, the price on everything above is going down, and I can run this game pretty much flawlessly at full max everything. The only time I see any stutter is occasionally at 12x speed (when I use that), and that is mostly in cloud fog (believe it or not, stuttering and even tearing can be more monitor related than anything else). For immersion, playing the game as intended? Runs great, looks amazing.
Monitors also matter, but no one seems to talk about them, so let me confuse the absolute hell out of you! (My degree is electronic engineering, but I’ve been out of it for some time so I’ve included some geek links to help and clarify).
I’m splurging a bit and buying an (IPS) Predator XB271HU, 1440p (2560 x 1440), that cooks up to 165 Hz. So I’ll be able to unleash my GPU. It uses G-sync (you want a G-sync monitor with a GeForce, and a FreeSync monitor with AMD), so there will be no sim, and hardly any game, that I won’t be able to run on max settings (though not all, especially some of the insanely graphics-intense games on “Ultra”).
TN monitors are most noted for video gaming, mostly because of response time (down to 1 ms GtG). That means one millisecond for pixels to go from one gray to another, or really black to white. They are quicker, but not as pretty, as they cannot reproduce an actual 16.7 million colors (6-bit processing rather than the IPS 8-bit processing) but use “dithering” to approximate the effect. They also suffer from viewing-angle issues and tend to wash out as you move to the sides (top mostly). Not really noticeable on small monitors, but maybe slightly on larger ones.
IPS monitors are slower on the transition (can be up to 10 ms), but offer full color reproduction and amazing viewing angles (in short, they look better). They also suffer slightly from backlight bleed (some light bleeding through at the edges, but only really noticeable on a totally black screen in a dark room, and more so in lower-end models).
That 1 ms speed prevents “ghosting” in twitch-reflex games on crack (the lower the number, the better). Keep in mind, though, that there are two aspects to response time. First is the 1 ms they advertise (GtG). There is also input lag, or signal-processing delay (the delay in responding to mouse or keyboard clicks through your system and to the monitor), which will make the game feel laggy if it is too slow. I’ve seen reviews where, at the top end, an IPS monitor was actually faster than a TN because it made up the difference with less signal-processing delay.
To make it simple? Overall, based on all my reading up on this, unless you are going with a high-end ($800.00 price tag) IPS monitor (with no more than 4 ms GtG response), I’d stay with a TN for gaming. I’ve had them for years and they look good enough: you are usually sitting directly in front of them, you will see less “ghosting” and “blur” (especially now that we can turn our heads quickly via TrackIR-type devices), and without an IPS right next to them to compare colors with, you won’t notice the difference.
That’s what I’ve uncovered in my few days of research on monitors, and it should save you some time (and head shaking) if you start looking into them. They can be confusing as hell, worse than video cards IMHO.
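To put the response-time numbers above in perspective, here is a quick frame-time sketch. The "fits in one frame" rule of thumb is an editorial assumption for illustration, not a formal spec: a transition slower than one frame period tends to smear across frames.

```python
# Frame time at a given refresh rate vs. panel response time.
# Rule-of-thumb assumption: a pixel transition slower than one
# frame period smears across frames and reads as ghosting/blur.
def frame_time_ms(refresh_hz):
    """Duration of one frame, in milliseconds."""
    return 1000.0 / refresh_hz

for hz, gtg in [(60, 10), (165, 10), (165, 4), (165, 1)]:
    ft = frame_time_ms(hz)
    verdict = "fits in one frame" if gtg <= ft else "spills into the next frame"
    print(f"{hz:3d} Hz -> {ft:5.2f} ms/frame; {gtg:2d} ms GtG {verdict}")
```

At 60 Hz a frame lasts about 16.7 ms, so even a 10 ms IPS transition completes comfortably; at 165 Hz a frame is only about 6.1 ms, which is why the advice above caps an IPS pick at roughly 4 ms GtG.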
Hey Blackard, Is that a typo, or are you really running a 6700k with a 512mb 9800 gtx?
In regards to monitors, I haven't used one at home for about 10 years at least. Big HDTVs all the way for me; I don't think I could ever go back to huddling over a monitor.
i7 4770K@4.6GHz WC H80i Asus Maximus Hero VII Corsair 1000w RM1000 PSU 16GB Crucial Ballistix Tactical@2100 Creative ZX soundcard 2 x MSI GTX 970 SLI 55" Sony Bravia 4K HDR Corsair 760T
#4328244 - 01/13/17 03:17 AM Re: What's the new standard for PC hardware?
[Re: OvStachel]
I've had a 27 in IPS (QNIX QX2710) for almost 2.5 years now. It has a display lag between 15 and 20 ms. Besides a small area of backlight bleed in the corners, I have not noticed any ghosting/blurring of images in any of my games. The picture is sharp enough not to need much AA, with great colors, deep blacks, and bright whites. I owned a TN before that for many years, and the difference in picture quality/colors is like night and day: on the TN, blacks are grayish and whites are dull.
Regarding CPUs, I'd recommend any Haswell or Skylake. For GPUs, I'd go with any Maxwell chip from Nvidia, or wait for Vega from AMD.
#4328254 - 01/13/17 04:37 AM Re: What's the new standard for PC hardware?
[Re: gaw1]
Hey Blackard, Is that a typo, or are you really running a 6700k with a 512mb 9800 gtx?
LOL, Typo. Nice catch. Took me a second to realize what you were asking. I'm that dense.
That is what is on my old XP machine. I don't think you could even run this game with 512 MB, could you?
I'm running Nvidia GeForce GTX 980 4 GB cards in SLI. I almost went with a 6 GB MSI, but when I did side-by-sides (at the time) my card pretty much smoked it in everything short of OC, especially at 1440p, which is what my new monitor will be running. I'm sure overclocking the 6 GB card would have moved it past mine, but it just wasn't worth the extra cash to have to OC a card to match mine; the 4 GB does everything I want and more. I'm not much into OC anyway. I know lots of people are, and that's great, but it's just not my thing.
Thanks for the heads up. Bet you were like, Wha?
#4328258 - 01/13/17 05:14 AM Re: What's the new standard for PC hardware?
[Re: OldHat]
Read a lot of good things about the QNIX QX2710, OldHat. Reasonable price and still 4 ms GtG, if I remember correctly.
I think it is actually a Korean PLS panel, rather than true IPS (still comparable), yes?
It's crazy. We are in another one of those tech firestorms, this time with the monitor revolution. So much has changed recently, and with the power draw of games these days, panel tech is jumping. That is the main reason I'm upgrading from my LG: it only supports V-sync, no G-sync or FreeSync, and has no DisplayPort to run G-sync. But I like my GeForce cards, so I paid the extra price for the third-party chip that GeForce needs (though I think that business model will be their undoing eventually; think Voodoo cards).
The imports are starting to produce them now, and I think that it will, once again, bring the prices down on panels across the board.
Last edited by Blackard; 01/13/17 05:16 AM.
#4328267 - 01/13/17 05:57 AM Re: What's the new standard for PC hardware?
[Re: OvStachel]
One thing to be wary of when buying a new monitor now: 2017 is going to be the year of HDR gaming. Support in Windows 10 is coming (already available on the Insider builds), all Nvidia 9- and 10-series cards support HDR, and I think the AMD RX 480 has support too. So far, for me at least, HDR content is more impressive than the actual 4K output itself on enabled TVs.
i7 4770K@4.6GHz WC H80i Asus Maximus Hero VII Corsair 1000w RM1000 PSU 16GB Crucial Ballistix Tactical@2100 Creative ZX soundcard 2 x MSI GTX 970 SLI 55" Sony Bravia 4K HDR Corsair 760T
#4328278 - 01/13/17 08:19 AM Re: What's the new standard for PC hardware?
[Re: OvStachel]
This may be a silly question... I have an AMD 965 OC'ed from a multiplier of 17 (3.4 GHz) to 19, which gives 3.8 GHz. I tried 20, but it was not reliable at that speed. The silly question: bearing in mind that CFS3.exe runs on one CPU core, maybe I could increase the speed on that core ONLY, to see if I could get WOFF UE to run faster, by setting Process Lasso to run CFS3 on that core and everything else on the other three? Normally I overclock all four cores to the same speed.
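For what it's worth, the per-core pinning that Process Lasso does boils down to a CPU affinity bitmask (bit N set means core N is allowed); Windows' built-in `start /affinity <hexmask>` uses the same convention. A minimal sketch of how such a mask is computed; `affinity_mask` is a hypothetical helper name, not part of any tool:

```python
# CPU affinity is just a bitmask: bit N set means core N is allowed.
# Process Lasso's GUI and Windows' `start /affinity <hexmask>` both
# follow this convention. `affinity_mask` is a hypothetical helper,
# not part of any real tool or API.
def affinity_mask(cores):
    """Build an affinity bitmask from a list of core indices."""
    mask = 0
    for core in cores:
        mask |= 1 << core
    return mask

# Pin the sim to core 3 only, and everything else to cores 0-2:
print(hex(affinity_mask([3])))        # -> 0x8
print(hex(affinity_mask([0, 1, 2])))  # -> 0x7
```

So, for example, `start /affinity 8 CFS3.exe` would restrict the sim to the fourth core (core 3), matching the scheme proposed above.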
When I overclock my Intel CPUs, I always go for just one core with WOFF, since it takes less voltage to get it stable, so less heat and stress on the whole CPU.
CPU = i9 11900K GPU = RTX 3080 Ti Monitor = ASUS ROG Swift PG32UQX 2160p G-sync