I still disagree. The brands 'Intel' and 'AMD' are still largely irrelevant while consoles are using lower-spec components, so it's still pointless for devs to put the time and effort into heavily multi-threaded code when it just can't be used to the same extent on those platforms. If a game is coded to use 18 cores, it isn't a simple task to feed that work into schedulers that only have half of that (or much less) available and maintain anything like a similar level of performance, so the time/effort/return equation comes into effect... all the more so when some of today's games rival Triple-A Hollywood blockbusters in terms of cost.
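
To make the scaling point concrete, here's a minimal sketch of my own (not anything from a real engine): a frame's workload split into a fixed number of jobs, tuned for a high core count, then run on fewer worker threads. The job count, per-job cost and thread counts are invented purely for illustration; the parallel portion of the frame roughly stretches out as the thread count drops, while the serial parts don't shrink at all.

```cpp
#include <atomic>
#include <chrono>
#include <iostream>
#include <thread>
#include <vector>

// Simulate one job's worth of CPU work (e.g. an AI batch or a physics island).
void do_job() {
    volatile double x = 0.0;
    for (int i = 0; i < 2'000'000; ++i) x += i * 0.5;
}

// Run `num_jobs` jobs across `num_threads` workers and report the wall time.
double run_frame(int num_jobs, unsigned num_threads) {
    std::atomic<int> next_job{0};
    auto start = std::chrono::steady_clock::now();

    std::vector<std::thread> workers;
    for (unsigned t = 0; t < num_threads; ++t) {
        workers.emplace_back([&] {
            // Each worker pulls jobs until none remain.
            while (next_job.fetch_add(1) < num_jobs) do_job();
        });
    }
    for (auto& w : workers) w.join();

    return std::chrono::duration<double, std::milli>(
        std::chrono::steady_clock::now() - start).count();
}

int main() {
    const int jobs_per_frame = 16;  // hypothetical workload tuned for a 16-core target
    unsigned cores = std::thread::hardware_concurrency();

    // On a 16-core machine the jobs run side by side; on 4 cores the same
    // jobs queue up behind each other, so the parallel part of the frame
    // takes roughly 4x longer.
    for (unsigned threads : {1u, 4u, cores ? cores : 4u}) {
        std::cout << threads << " threads: "
                  << run_frame(jobs_per_frame, threads) << " ms\n";
    }
}
```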

You can't therefore just say that in the future games will run on 4 cores but will run best on 16+ cores either; the same rule applies as above. As we know with today's games, just adding cores doesn't suddenly give a huge performance increase, because the games don't suddenly start using them. They don't even know those cores are there unless the host operating system (software) does some limited manipulation or 'tricks', and that will never be the same or as fast as what the CPU can do at a hardware level when the code is actually written for those cores. Given the typical tasks the CPU undertakes and the parallel computing of GPUs, I still think you're much more likely to see the importance put onto the GPU, especially for games, many times over before you see huge CPU core counts.
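
And the flip side of the sketch above, again just my own illustration rather than anything from a shipped engine: a game whose worker pool is capped at a fixed number of threads gets nothing from a 16-core CPU, because the extra cores are only used if the code explicitly asks for them. The MAX_WORKERS limit and the empty per-thread lambda are hypothetical placeholders.

```cpp
#include <algorithm>
#include <iostream>
#include <thread>
#include <vector>

constexpr unsigned MAX_WORKERS = 4;  // hypothetical engine limit (e.g. its console target)

int main() {
    unsigned cores = std::thread::hardware_concurrency();

    // The engine never launches more threads than it was written for,
    // so cores beyond MAX_WORKERS sit idle no matter what the OS does.
    unsigned workers = std::min(cores ? cores : 1u, MAX_WORKERS);
    std::cout << "CPU reports " << cores << " hardware threads, "
              << "engine will use " << workers << " of them\n";

    std::vector<std::thread> pool;
    for (unsigned i = 0; i < workers; ++i)
        pool.emplace_back([] { /* per-thread game work would go here */ });
    for (auto& t : pool) t.join();
}
```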

Folks who want 'maximum performance' will always pair a top-end GPU with an average-to-good CPU... simply because the GPU is many times more powerful than a top-end CPU, especially for games. It's the same reason a GPU can be bottlenecked by a CPU, but the other way around is rarely an issue.


On the Eighth day God created Paratroopers and the Devil stood to attention.