CPUs are getting more complex, and lately even consumer chips have started gaining more cores. Smartphone processors are a good example of this. Snapdragon chips have always had a lot of cores while Apple's have only a few, yet Apple still outperformed the Snapdragon chips. There are more complex reasons for that, of course, such as faster and larger caches and a larger die, enabled by Apple ordering custom chips instead of buying from Qualcomm like every Android phone maker.
Qualcomm has to make a profit selling that processor, while Apple can take a loss on the processor and make it up in the price of the phone as a bundle. But I don't think this is the reason why multicore systems don't perform as expected.
If you look at Intel and AMD, Intel still has better performance for the average user thanks to a few cores that run fast. AMD, on the other hand, focused on 6- and 8-core processors, and the 16- and 32-core models are only a better option if you are a professional who needs a workstation for specialized multithreaded calculations.
Why is it then that many cores can’t offer a better experience for everyone?
The most important reason has to be software. If software is not optimized to take advantage of many cores, or if that isn't done efficiently, performance is lost. But it's hard to blame anyone in particular, since everyday consumers rely on multiple software makers, like Microsoft and Google, and their software would require a lot of work to run efficiently on every possible system configuration. Although you could also say that such companies do have the budget for it and probably lack the interest.
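A minimal sketch of this limit is Amdahl's Law, the model cited in the sources below: if only a fraction p of a program can run in parallel, the speedup on n cores is capped at 1 / ((1 - p) + p/n), no matter how many cores you add. The 90% figure here is just an illustrative value.

```python
def amdahl_speedup(p: float, n: int) -> float:
    """Theoretical speedup for a program whose parallel fraction is p, on n cores."""
    return 1.0 / ((1.0 - p) + p / n)

# Even a program that is 90% parallel tops out well below the core count:
for cores in (2, 4, 8, 16, 32):
    print(f"{cores:2d} cores -> {amdahl_speedup(0.9, cores):.2f}x speedup")
```

With p = 0.9 the speedup can never exceed 10x, which is why doubling the core count past a point barely moves the needle for unoptimized software.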
Another important reason is hardware design. Adding more cores means the architecture has to change so that communication and synchronization between the cores work flawlessly. The increased complexity of certain processes can itself cause a decrease in performance, and designing a chip around that is not simple. Nor could you sidestep it with exotic materials like graphene or gold: they are expensive, and working with large amounts of such materials could double or triple the final cost of the chip.
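A hedged illustration of that synchronization cost, in software terms: when every worker must agree on shared state, the coordination point (here, a lock) serializes the work. Keeping independent partial results and merging them once at the end avoids most of it. The function names and thread counts are my own for this sketch.

```python
import threading

def count_with_shared_lock(n_threads: int, increments: int) -> int:
    """Every increment contends for one lock, so the hot loop is serialized."""
    total = 0
    lock = threading.Lock()

    def worker():
        nonlocal total
        for _ in range(increments):
            with lock:  # all threads queue up on the same lock
                total += 1

    threads = [threading.Thread(target=worker) for _ in range(n_threads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return total

def count_with_partials(n_threads: int, increments: int) -> int:
    """Each thread works on its own slot; synchronization happens once, at the end."""
    partials = [0] * n_threads

    def worker(i):
        local = 0
        for _ in range(increments):
            local += 1  # no coordination in the hot loop
        partials[i] = local

    threads = [threading.Thread(target=worker, args=(i,)) for i in range(n_threads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return sum(partials)
```

Both functions compute the same answer; the difference is how often the threads have to stop and agree, which is the same trade-off chip designers face when wiring cores together.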
Despite all that, we do see new applications that work very well with many cores, even when they are not as optimized as they could be. Deep learning, image manipulation, and data analysis can all benefit from many cores, and from architectures less like CPUs and more like GPUs. But as always, the everyday consumer cannot use such technologies directly. On the other hand, once those technologies get tested by scientists in research scenarios and then get discovered by software and hardware companies, they can be adapted and introduced into a version of Windows or Adobe After Effects.
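A small sketch of why workloads like image manipulation scale so well: each row of pixels can be processed independently of every other row (data parallelism), which is exactly what many cores, and especially GPU-style architectures, are built for. The 0.8 "darken" factor, the tiny image, and the worker count are illustrative values, not anything from a real imaging library.

```python
from concurrent.futures import ThreadPoolExecutor

def darken_row(row):
    """Scale every pixel in one row; no row depends on any other row."""
    return [int(pixel * 0.8) for pixel in row]

def darken_image(image, workers=4):
    # Each row is an independent task, so they can be mapped across cores.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(darken_row, image))

image = [[100, 200, 50], [255, 0, 10]]
print(darken_image(image))
```

Because no task waits on any other, this kind of work keeps scaling as cores are added, unlike the lock-bound patterns typical of everyday desktop software.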
You also have to keep in mind that hardware designs and even algorithms may be held under patent for years before they become available for others to use, and even then licensing them may cost a lot. That's why big companies like Intel and Microsoft do their own research so they can own their patents. Apple already does something similar, and it has proven very profitable.
For links to sources and more information, check the sources below, and for more science and tech news follow Qul Mind.
Sources: Amdahl’s Law in the Multicore Era.