Computing Community Consortium Blog

The goal of the Computing Community Consortium (CCC) is to catalyze the computing research community to debate longer range, more audacious research challenges; to build consensus around research visions; to evolve the most promising visions toward clearly defined initiatives; and to work with the funding organizations to move challenges and visions toward funding initiatives. The purpose of this blog is to provide a more immediate, online mechanism for dissemination of visioning concepts and community discussion/debate about them.

Computing Research in Astronomy’s Decadal Survey

August 27th, 2010 / in Uncategorized / by Erwin Gianchandani

Every 10 years, U.S. astronomers come together to generate what has become a highly influential report recommending which astronomy and astrophysics projects should be funded by Federal agencies in the following decade. This year’s “decadal survey” – assimilated from 9 appointed panels, 17 town hall meetings, and 324 white papers – was released by the National Research Council (NRC) on Aug. 13.

Unlike previous decadal surveys, which have been defined by lengthy “wish lists,” the 2010 report unveiled just 8 projects, all focused on the study of dark matter and dark energy. And for the first time, the survey included independently vetted estimates of project costs.

A call for data-driven science

Something else noteworthy about the new NRC report is the unprecedented amount of data astronomers envision collecting through the proposed projects over the next 10 years — and the data analytics tools that will be necessary to make sense of this enormous wealth of information. Consider, for example, the Large Synoptic Survey Telescope (LSST) highlighted in the report. By the time it is completed in 2015, this ground-based facility — a giant 8.4-meter telescope equipped with a 3.2-gigapixel camera — will be capable of sweeping the entire visible sky every three days. With it, astronomers will construct a 100-petabyte database — a single petabyte is one billion megabytes — of the stars and constellations in our galaxy, enabling a three-dimensional model of our universe that will inform our understanding of dark matter and dark energy. Similarly, the decadal survey proposes other telescopes that would yield data containing snippets of information likely to be elucidated only via intelligent data mining and machine learning, as well as improved visualization strategies.
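To get a feel for how a survey telescope reaches the petabyte scale, here is a rough back-of-the-envelope sketch. Only the 3.2-gigapixel camera size comes from the report; the bytes-per-pixel, exposures-per-night, and observing-nights figures are illustrative assumptions, not LSST specifications:

```python
# Back-of-the-envelope estimate of a survey telescope's raw data volume.
# Only the 3.2-gigapixel camera size is from the decadal survey; the
# remaining figures are illustrative assumptions.

PIXELS_PER_EXPOSURE = 3.2e9   # 3.2-gigapixel camera (from the report)
BYTES_PER_PIXEL = 2           # assumed 16-bit raw pixel depth
EXPOSURES_PER_NIGHT = 1000    # assumed observing cadence
NIGHTS_PER_YEAR = 300         # assumed usable observing nights
YEARS = 10                    # one decadal-survey cycle

bytes_per_exposure = PIXELS_PER_EXPOSURE * BYTES_PER_PIXEL      # 6.4 GB
bytes_per_night = bytes_per_exposure * EXPOSURES_PER_NIGHT      # ~6.4 TB
bytes_per_decade = bytes_per_night * NIGHTS_PER_YEAR * YEARS    # ~19 PB raw

print(f"per exposure: {bytes_per_exposure / 1e9:.1f} GB")
print(f"per night:    {bytes_per_night / 1e12:.1f} TB")
print(f"per decade:   {bytes_per_decade / 1e15:.1f} PB raw")
```

Even under these conservative assumptions, raw images alone reach tens of petabytes over a decade; the processed catalogs, calibration products, and derived data that make up a 100-petabyte science database multiply the raw volume several times over — which is exactly why the report's data-analysis needs are so striking.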

In this way, the decadal survey illustrates the need for fundamental research that advances core techniques underlying data-driven science (often termed eScience) — through highly collaborative, interdisciplinary efforts that, in this case, bring astronomers and astrophysicists together with computer scientists. This theme is consistent with a series of white papers, From Data to Knowledge to Action, being produced by the CCC. Over the past several decades, simulation-oriented computational science has joined theory and experiment as a fundamental paradigm for discovery in many branches of science and engineering. Today, whether it’s in astroinformatics or materials informatics or bioinformatics, we’re at the dawn of a second computational revolution in discovery, driven by data and the automated analysis of that data — a revolution that will have an even more pervasive effect than the first.

An example of visioning

And one other point: an editorial in Nature observes that the 2010 report stands in stark contrast to other recent decadal surveys:

The latest survey has clearly rescued the decadal process from torpidity. The list is relevant and affordable… The panelists [also] hope to address… potential flaws with a more flexible decadal process. Rather than treating their document as carved in stone until 2020, the researchers call for a standing committee to carry out periodic reassessments…

As the editorial staff of Nature further writes, the report is so improved that “other disciplines planning their own reviews should follow its lead, as it promises to be a steady guide for a bumpy decade ahead.”

The astronomy decadal survey is important because many aspects of astronomy, like many aspects of physics, are driven by the need for very large-scale instrumentation and infrastructure. Other fields — like computing — don’t have the same need for costly infrastructure support, and thus for the prioritization that must accompany it. But Nature is right about one thing: a relevant, affordable, long-term vision of groundbreaking research can go a long way in marshaling forces, shaping a research agenda, and advancing a field or sub-field by leaps and bounds.

What do you think? Share your thoughts below.

(Contributed by Erwin Gianchandani, Director, CCC)

Comments
  • Djorgovski

    That is indeed the good stuff. But if you believe that science in the 21st century is being profoundly transformed by computation and IT (and by computation I do not mean just the ol’ time number crunching to solve a lot of PDEs, but something having more to do with the manipulation of data and information), then the report shines much less. In fact, I’d say that it reveals a complete lack of vision and understanding in this arena. I have written a bit more about this in: