High Performance Computing: Roadblocks and Ways Forward

Sunday, February 19, 2017: 1:00 PM-2:30 PM
Room 210 (Hynes Convention Center)
Randy Bryant, Carnegie Mellon University, Pittsburgh, PA
For over 50 years, the semiconductor industry has followed the trend known as “Moore’s Law,” in which the number of devices (specifically, transistors) that can be integrated onto a single chip has doubled roughly every two years. This progress now means that the smartphones we carry in our pockets have far more computing power than the largest supercomputers of the mid-1970s. These advances have also enabled continued progress in supercomputers, which remain key tools for scientific research, industrial design, and defense.

It is instructive to ask the question “Can Moore’s Law continue for another 50 years?” Today, the largest chips contain around 5 billion transistors. Extrapolating the trend another 50 years means 25 more doublings, a factor of roughly 30 million, leading to systems with 100 quadrillion (10 to the power 17) devices. We really have no way to imagine how a computer system of such scale would operate or be organized, but there’s no doubt it would have capabilities far exceeding those of today’s most powerful systems.
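
As a back-of-the-envelope check of that extrapolation, the following sketch (added here for illustration) uses only the figures quoted above: roughly 5 billion transistors today and one doubling every two years.

```python
# Back-of-envelope Moore's Law extrapolation (illustrative sketch only).
# Figures from the text: ~5 billion transistors on the largest chips today,
# one doubling every two years, sustained for another 50 years.

current_devices = 5e9           # transistors on today's largest chips
years = 50
doubling_period_years = 2

doublings = years / doubling_period_years        # 25 doublings
future_devices = current_devices * 2 ** doublings

print(f"{doublings:.0f} doublings -> {future_devices:.2e} devices")
# 25 doublings -> 1.68e+17 devices, i.e. on the order of 100 quadrillion
```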

Unfortunately, a straightforward extrapolation of current semiconductor technology, in which transistors are fabricated on the surface of a 2-dimensional chip, leads to physical impossibilities. It would require transistors to be spaced closer together than the individual atoms in a silicon crystal. Indeed, some experts have already stated that “Moore’s Law is dead,” based on the challenges of squeezing more transistors onto a 2-dimensional chip. It’s possible to imagine, however, that new technology will enable devices to be fabricated as layers in a 3-dimensional system. Such a capability would require many advances in manufacturing technology, but it lies within the realm of physical possibility.
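
To see why a flat chip runs into atomic limits, here is a rough illustrative estimate; the 10 square centimeter die area and the 0.54-nanometer silicon lattice constant are assumed values for the sketch, not figures from the abstract.

```python
# Why ~1e17 devices cannot fit on a flat chip: a rough per-device spacing estimate.
# Assumed values (not from the abstract): a generously large 10 cm^2 die and a
# silicon lattice constant of roughly 0.54 nm.

devices = 1e17
chip_area_nm2 = 10 * 1e14        # 10 cm^2 in nm^2 (1 cm^2 = 1e14 nm^2)
silicon_lattice_nm = 0.543       # spacing of the repeating atomic cell in silicon

area_per_device_nm2 = chip_area_nm2 / devices
pitch_nm = area_per_device_nm2 ** 0.5            # side length available per device

print(f"available pitch: {pitch_nm:.2f} nm vs. lattice constant: {silicon_lattice_nm} nm")
# available pitch: 0.10 nm, several times smaller than the atomic spacing
```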

Perhaps the greatest challenge to continued progress in electronic technology is power consumption. Current smartphones consume less than two watts of power. Large supercomputers require over 10 megawatts to operate, equivalent to the total power consumed by a small town. Based on current circuit technology, the scaling trends for power consumption do not look promising.
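
As a purely illustrative sketch of why those trends look bad, suppose power per device stayed at today’s level while device counts grew by the factor from the Moore’s Law extrapolation above; the 10 megawatt starting point comes from the text, and the rest is an assumption made for the sketch.

```python
# Naive scaling: hold power per device constant while the device count grows
# by the same factor as in the Moore's Law extrapolation above.
# The 10 MW figure is from the text; everything else is an illustrative assumption.

current_power_watts = 10e6                 # ~10 MW for a large supercomputer today
growth_factor = 1e17 / 5e9                 # device-count growth from the extrapolation (~2e7)

scaled_power_watts = current_power_watts * growth_factor
print(f"{scaled_power_watts / 1e12:.0f} terawatts")   # ~200 TW
# Hundreds of terawatts, far beyond the few terawatts of average worldwide
# electricity generation, so power per device must fall dramatically.
```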

It is also instructive to look to mammalian brains as inspiration for building low-power systems. The human brain contains around 84 billion neurons and operates on less than 15 watts. As a computer, it relies on very different principles from transistor circuits, storing and processing information through massive parallelism rather than through fast individual devices. Some research projects are building computer systems based on these principles and are achieving greatly reduced power consumption. Even these approaches, however, will not keep the power low enough to enable a 100-quadrillion-device system. Reaching that goal will require major scientific breakthroughs.
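
For scale, a quick per-neuron figure follows from the numbers quoted above (around 84 billion neurons running on under 15 watts); this is simply that division, not a claim from the abstract.

```python
# Power budget per neuron, using only the figures quoted in the abstract.
neurons = 84e9            # ~84 billion neurons in the human brain
brain_power_watts = 15    # upper bound on the brain's power draw

watts_per_neuron = brain_power_watts / neurons
print(f"{watts_per_neuron:.1e} W per neuron")   # ~1.8e-10 W, a fraction of a nanowatt
```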