It turns out that increasing the clock frequency increases the power consumption of the device; in fact, dynamic power scales proportionally with frequency. Increased power consumption heats up the device, which leads to even more power dissipation in the form of increased saturation currents through the devices. If the trend of increasing clock frequency had continued, by now some parts of the chip would be as hot as the Sun's surface!
Clearly, such growth was not sustainable, so engineers found other ways to increase the performance of the chip.
One of the most prominent and widely employed alternatives is to use more than one processor in the same system. This idea is exploited by placing multiple processor cores on the same die (a die is a small piece of semiconductor material on which a functional circuit is fabricated). With a multi-core processor, each core running independently of the others, more than one task can be in execution at the same time. Effectively, in each clock cycle, a dual-core processor can do twice the work of a single-core processor. Thus, operating two cores at a lower frequency gives the same benefit as increasing the clock frequency. Now you must be asking: if I use two cores instead of one, wouldn't I burn more power? Yes, compared to a single-core processor running at the same frequency; but no, if the dual-core processor runs at a lower frequency. This stems from the fact that a higher voltage is required to support higher frequencies, while power dissipation increases as the square of the voltage. (See figure x)
PICTURE
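The trade-off above can be made concrete with the standard dynamic-power relation for CMOS circuits, P = C·V²·f. The capacitance and voltage values below are illustrative assumptions, not figures from the text; the point is only the shape of the comparison.

```python
def dynamic_power(capacitance, voltage, frequency):
    """Dynamic power of a CMOS circuit: P = C * V^2 * f."""
    return capacitance * voltage**2 * frequency

C = 1e-9  # effective switched capacitance (arbitrary illustrative value)

# A single core at 3 GHz needs a higher supply voltage, say 1.3 V.
p_single = dynamic_power(C, 1.3, 3e9)

# Two cores at 1.5 GHz each can run at a lower voltage, say 1.0 V,
# yet together still complete the same work per unit time.
p_dual = 2 * dynamic_power(C, 1.0, 1.5e9)

print(f"single core: {p_single:.2f} W, dual core: {p_dual:.2f} W")
```

Because voltage enters as a square, the frequency reduction more than pays for the second core: with these assumed numbers the dual-core configuration dissipates less power than the single fast core.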
This, however, does not mean one can indiscriminately increase the number of cores. One way to see the problem is to realize that the cores are not completely independent: the tasks they perform often depend on each other. There are also overheads associated with distributing tasks between the cores and ensuring coherency with other subsystems of the computer, most prominently the memory. The crux of the matter is that to exploit multiple cores, there must be parallel instructions. If we have a task that is divided into instructions which must be performed sequentially, then having multiple cores does not improve performance. Only those applications which make extensive use of parallel instructions benefit from multiple cores. But software parallelism is one of the toughest challenges in computing in general, and no general-purpose solution in terms of productivity has been found yet.
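The limit described above is usually quantified by Amdahl's law (not named in the text, but it formalizes exactly this argument): the sequential part of a task caps the speedup no matter how many cores you add. A minimal sketch:

```python
def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's law: overall speedup when only a fraction of the
    work can be spread across cores; the rest runs sequentially."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# A task that is only 50% parallelizable gains little beyond two cores:
for n in (1, 2, 4, 8):
    print(f"{n} cores -> speedup {amdahl_speedup(0.5, n):.2f}x")
```

With a 50% parallel fraction the speedup can never reach 2x even with infinitely many cores, which is why adding cores to mostly sequential software yields diminishing returns.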
In the current scenario, then, the number of such applications is limited. Video games are CPU- and GPU-intensive workloads where parallelism is naturally expected, but here research has focused primarily on improving the GPU, since that is easier than parallelizing complex game engines. Other examples are CAD tools and multimedia applications such as video encoding. On the other hand, web data centers and scientific computing are natural applications where the effort of software parallelization is justified. Recent studies, such as one by ST-Ericsson, have shown that over the last decade the most demanding games, office applications, web browsing and multimedia playback make good use of only two processors. The remaining cores are not utilized to the optimum, and thus the performance gain is not as dramatic as expected. (See figure x2)

So why should you care about all of this? The answer is to appreciate the fact that more cores don't necessarily mean better performance. Right now the dual-core processor is king of the smartphone market, and dual cores are indeed much superior to single-core CPUs. But when quad-core processors for smartphones are launched, don't hasten to buy them, since they might not be as good as you expect.