Computing Community Consortium Blog

The goal of the Computing Community Consortium (CCC) is to catalyze the computing research community to debate longer range, more audacious research challenges; to build consensus around research visions; to evolve the most promising visions toward clearly defined initiatives; and to work with the funding organizations to move challenges and visions toward funding initiatives. The purpose of this blog is to provide a more immediate, online mechanism for dissemination of visioning concepts and community discussion/debate about them.


Another Perspective on the White House NSCI Workshop

November 3rd, 2015 / in CCC, policy, Research News, workshop reports / by Helen Wright

The following is a special contribution to this blog by Doug Burger, Director of Hardware, Devices, and Experiences at Microsoft Research.

I served as a panelist at the White House National Strategic Computing Initiative (NSCI) Workshop on October 20-21. I took away a number of points about the consensus of the group that I thought were worth sharing with the broader community.

1) It is clear that CMOS is coming to an end. That was a striking consensus of the group, on both per-transistor cost and scaling. Semiconductor researchers are counting on extreme ultraviolet (EUV) lithography to come online, although the industry has been struggling with it for 10 years, still doesn't have it working, and even once it works, cost will still be an issue. It may never get there.

2) It is clear that there is no replacement for CMOS devices within the next ten years (also a strong consensus of the assembled group). None of the nearer-term devices under consideration appears to have a significantly superior power/speed profile to CMOS, and the candidates that show significant theoretical improvements have not yet been demonstrated in the lab.

3) People view quantum as a good long-term bet for a disruptor. Both Microsoft and Google are now making major investments in that space. The group had concerns about whether quantum would produce killer apps (I, for one, do not share those concerns; I think a number of killer apps have already been identified).

4) No new computational models outside of quantum, neuromorphic, and approximate computing were discussed. Approximate computing was viewed as having potential, but it is unlikely to produce enough benefit to pick up the post-CMOS slack. (My own view on the neural space is that it's incredibly important, but that neuromorphic ("brain-inspired") architectures are premature. The next five years will continue to produce large gains in non-neuromorphic machine learning (e.g., CNNs/DNNs/RNNs/LSTMs). On the neuromorphic side, what we really need are advances in neural theory and increased understanding of the brain's underlying computational mechanisms.)

5) There was a general concern among the attendees about how legacy codes can benefit in this "New Normal" of computing, how supercomputers can continue to scale, and how we can continue to increase the talent pool in the HPC community.

6) The scale and explosive growth rate of the cloud market were surprising to many of the attendees. The scale of the cloud market will enable investments in system design that are otherwise unaffordable. This shift may be analogous to how the growth of the PC market allowed commodity ("killer micro") CPUs to wipe out most of the supercomputing companies and custom vector machines starting in the early 1990s. The key question here is: will HPC systems have to shift to the cloud to be competitive?

7) There is going to be increasing complexity in systems as people specialize, move to heterogeneous systems, etc., in response to the slowing or end of Moore's Law. The HPC community is really worried about the challenge of productivity in that environment. The NSCI program should provide significant support for academic research on novel prototype systems and stacks, since the New Normal will require risky and aggressive exploration down many paths. Given the large and growing scale of the cloud providers, our star systems researchers will need significant support to construct alternative systems that can have eventual impact.

See the Computing Community Consortium (CCC) blog post about this workshop for more information, as well as the CCC's newly released white paper in support of the related White House Grand Challenge.


1 comment

  1. Bill Rider says:

    I like your summary of many of the challenges for computing hardware coming at us. Given all of these challenges, I am continually perplexed by the approach we take toward high-performance computing. Why focus so much attention on the hardware when it is likely to be utterly futile? The computers of the future appear to be almost impossible to use and quite impressively suboptimal for many important applications.

    Our codes have produced continually diminishing returns in performance relative to peak for the last 20 years, and this trend will likely accelerate (or decelerate) with new platforms. Other forms of performance progress, such as algorithmic scaling, are almost completely ignored; there, a lack of resources has resulted in stagnation, even though algorithms have historically delivered gains equal to those from hardware. In summary, our HPC efforts are completely unbalanced.

    Our entire strategy seems to be hardware and its impact on everything upstream of it. Simply making existing code work at all on future platforms is going to swallow almost all the available resources. Meanwhile, other proven approaches are completely bypassed.
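
To make the algorithmic-scaling argument in the comment above concrete, here is a rough, hypothetical back-of-the-envelope sketch in Python. The specific improvement (a quadratic algorithm replaced by an n log n one) and the problem sizes are illustrative assumptions, not figures from the workshop or the comment; the point is only that, at large scales, such an algorithmic change can rival a couple of decades of hardware improvement.

```python
import math

# Hypothetical illustration: compare the speedup from an algorithmic
# improvement (O(n^2) -> O(n log n)) with the speedup from hardware alone.

def op_count_quadratic(n):
    # operation count for a quadratic algorithm
    return n ** 2

def op_count_nlogn(n):
    # operation count for an n log n algorithm
    return n * math.log2(n)

for n in (10**4, 10**6, 10**8):
    speedup = op_count_quadratic(n) / op_count_nlogn(n)
    print(f"n = {n:>11,}: algorithmic speedup ~ {speedup:,.0f}x")

# For comparison, ~20 years of doubling performance every ~2 years
# gives roughly 2**10 = 1,024x from hardware alone.
print(f"hardware over ~20 years ~ {2**10:,}x")
```

Under these assumptions, the algorithmic speedup at n = 10^6 is already tens of thousands of times, which is the kind of gain the comment argues is being left on the table when investment focuses on hardware alone.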