Computing Community Consortium Blog

The goal of the Computing Community Consortium (CCC) is to catalyze the computing research community to debate longer range, more audacious research challenges; to build consensus around research visions; to evolve the most promising visions toward clearly defined initiatives; and to work with the funding organizations to move challenges and visions toward funding initiatives. The purpose of this blog is to provide a more immediate, online mechanism for dissemination of visioning concepts and community discussion/debate about them.

Could the Next Big Thing “Take Longer to Arrive”?

August 2nd, 2011 / in big science, research horizons, Research News / by Erwin Gianchandani

It’s not every day that the national news media covers computing research. But it happened on Sunday, when New York Times writer John Markoff penned a story about the future of computer architecture — picking up on a paper presented at the International Symposium on Computer Architecture (ISCA 2011) earlier this year that forecast a 24-fold gap from the expectations of Moore’s Law by the year 2024 and concluded, “Regardless of chip organization and topology, multicore scaling is power limited to a degree not widely appreciated by the computing community.”
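To get a rough sense of what a 24-fold gap means, here is a back-of-the-envelope sketch. It assumes, purely for illustration, that the Moore's-Law baseline is capability doubling every two years from a 2011 starting point; this is not the paper's actual methodology, just simple arithmetic on the numbers cited in the story.

```python
# Illustrative arithmetic only -- not the ISCA 2011 paper's methodology.
# If capability doubled every two years starting in 2011, what would a
# 24x shortfall from that trend imply by 2024?

DOUBLING_PERIOD_YEARS = 2  # Moore's-Law cadence cited in the article
GAP = 24                   # shortfall forecast by the ISCA 2011 paper

def trend_multiplier(years):
    """Expected improvement if capability doubles every two years."""
    return 2 ** (years / DOUBLING_PERIOD_YEARS)

expected = trend_multiplier(2024 - 2011)  # roughly 90x over 13 years
achieved = expected / GAP                 # roughly 3.8x if the gap holds

print(f"Trend projects ~{expected:.0f}x; a {GAP}x gap leaves ~{achieved:.1f}x")
```

In other words, a shortfall of that size would turn more than a decade of exponential expectations into only a few-fold improvement, which is the crux of the story that follows.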

Markoff writes:

For decades, the power of computers has grown at a staggering rate as designers have managed to squeeze ever more and ever tinier transistors onto a silicon chip — doubling the number every two years, on average, and leading the way to increasingly powerful and inexpensive personal computers, laptops and smartphones.


Now, however, researchers fear that this extraordinary acceleration is about to meet its limits. The problem is not that they cannot squeeze more transistors onto the chips — they surely can — but instead, like a city that cannot provide electricity for its entire streetlight system, that all those transistors could require too much power to run economically. They could overheat, too.


The upshot could be that the gadget-crazy populace, accustomed to a retail drumbeat of breathtaking new products, may have to accept next-generation electronics that are only modestly better than their predecessors, rather than exponentially faster, cheaper and more wondrous.


Simply put, the Next Big Thing may take longer to arrive.


“It is true that simply taking old processor architectures and scaling them won’t work anymore,” said William J. Dally, chief scientist at Nvidia, a maker of graphics processors, and a professor of computer science at Stanford University. “Real innovation is required to make progress today.”

From there, Markoff delves into the “dark silicon” phenomenon and the potential challenges on the horizon:

…Even today, the most advanced microprocessor chips have so many transistors that it is impractical to supply power to all of them at the same time. So some of the transistors are left unpowered — or dark, in industry parlance — while the others are working. The phenomenon is known as dark silicon.


As early as next year, these advanced chips will need 21 percent of their transistors to go dark at any one time, according to the researchers who wrote the paper. And in just three more chip generations — a little more than a half-decade — the constraints will become even more severe. While there will be vastly more transistors on each chip, as many as half of them will have to be turned off to avoid overheating…


The problem has the potential to counteract an important principle in computing that has held true for decades: Moore’s Law. It was Gordon Moore, a founder of Intel, who first predicted that the number of transistors that could be nestled comfortably and inexpensively on an integrated circuit chip would double roughly every two years, bringing exponential improvements in consumer electronics.


If that rate of improvement lags, much of the innovation that people have come to take for granted will not happen, or will happen at a much slower pace. There will not be new PCs, new smartphones, new LCD TVs, new MP3 players or whatever might become the new gadget that creates an overnight multibillion-dollar industry and tens of thousands of jobs.

But as Markoff also notes, it’s not all doom and gloom. These challenges present a terrific opportunity for the field moving forward:

Today, some of the pioneering designers believe there is still plenty of room for innovation. One of them, David A. Patterson, a computer scientist at the University of California, Berkeley, called dark silicon a “real phenomenon” but said he was skeptical of the authors’ pessimistic conclusions.


“It’s one of those ‘If we don’t innovate, we’re all going to die’ papers,” Dr. Patterson said in an e-mail. “I’m pretty sure it means we need to innovate, since we don’t want to die!”

Read the full New York Times story here, and see the original ISCA ’11 paper — “Dark Silicon and the End of Multicore Scaling” by Hadi Esmaeilzadeh, Emily Blem, Renée St. Amant, Karthikeyan Sankaralingam, and Doug Burger — here.

And please also check out the CCC’s recent Advancing Computer Architecture Research (ACAR) visioning activity — led by Josep Torrellas and Mark Oskin, and with several dozen other leading computer architecture researchers participating — which has resulted in a roadmap that lays out a research agenda for the field in the years ahead.

(Contributed by Erwin Gianchandani, CCC Director)
