We’d like your help with a brainstorming exercise: Identify about a dozen game-changing advances from computing research conducted in the past 20 years. Here’s what we mean:
- The advance needs to be “game changing,” in the sense of dramatically altering how we think about computing and its applications.
- The importance of the advance needs to be obvious and easily appreciated by a wide audience.
- There needs to be a clear tie to computing research (or to infrastructure initiatives that build upon research and were sponsored by computing research organizations).
- We’re particularly interested in highlighting the impact of federally funded, university-based research.
We’re focusing on work carried out in the past 20 years or so, in part because of the upcoming 20-year celebrations for the CISE directorate at NSF. Of course, lots of great fundamental research can take more than 20 years before its impact becomes obvious, but even in such cases there are usually continuing influences on more recent research that can be cited here.
To get your juices flowing, here are four game-changers that we definitely think belong on the list. Use them as a springboard for thinking of others, or feel free to argue with our choices.
The Internet and the World Wide Web as we know them today
In 1988 — 20 years ago — ARPANET became NSFNET. At the time, there were only about 50,000 hosts spread across only about 150 networks. In 1989, CNRI connected MCImail to the Internet — the first “commercial use.” In 1993, NCSA Mosaic triggered the explosive growth of the World Wide Web. In 1995, full commercialization of the Internet was achieved, with roughly 6,000,000 hosts spread across roughly 50,000 networks. Today, there are more than half a billion Internet hosts, and an estimated 1.5 billion Internet users.
While many of the underlying technologies (digital packet switching, ARPANET, TCP/IP) predate the 20-year window, the transition from the relatively closed ARPANET to the totally open Internet and World Wide Web as we know them today falls squarely within that window. NSF-supported contributions included CSNET, NSFNET, and NCSA Mosaic.
The Internet and the World Wide Web are game-changers.
Where once we filed, today we search
The vast majority of the world’s information is available online today, and we find what we need — whether across the continent or on our own personal computer — by searching, rather than by organizing the information for later retrieval.
Research on the retrieval of unstructured information is based on decades of fundamental research in both computer science theory and AI. But the paradigm shift that is web crawling and indexing and desktop search is much more recent. It traces its roots to university projects such as WebCrawler, MetaCrawler, Lycos, Excite, Inktomi, and the NSF Digital Libraries Initiative research which begat Google.
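At the heart of that paradigm shift is the inverted index: rather than organizing documents for later retrieval, a crawler visits pages and an indexer maps each term to the documents containing it. A minimal sketch (a toy illustration with hypothetical page names, not any particular engine’s implementation):

```python
from collections import defaultdict

# Toy corpus standing in for crawled pages (hypothetical ids and text).
docs = {
    "page1": "the internet changed how we find information",
    "page2": "search engines index the web by crawling pages",
    "page3": "an inverted index maps each term to its documents",
}

# Build an inverted index: term -> set of documents containing it.
index = defaultdict(set)
for doc_id, text in docs.items():
    for term in text.lower().split():
        index[term].add(doc_id)

def search(query):
    """Return the documents containing every query term (boolean AND)."""
    terms = query.lower().split()
    if not terms:
        return set()
    results = index.get(terms[0], set()).copy()
    for term in terms[1:]:
        results &= index.get(term, set())
    return results
```

Real engines add ranking (e.g., link analysis and term weighting) on top of this basic structure, but the lookup-by-term organization is the same.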
Search is a game-changer.
Cluster computing
At the risk of offending our many computer architect friends, we’re going to assert that cluster computing is the most significant advance in computer architecture in the past 20 years.
A decade ago, Jeff Bezos was featured in magazine advertisements for the DEC AlphaServer, because that’s what Amazon.com ran on — the biggest shared-memory multiprocessor that could be built. Similarly, the AltaVista search engine was designed to showcase the capabilities of big SMPs with 64-bit addressing.
Today, this seems laughable. Companies such as Google and Amazon.com replicate and partition applications across clusters of tens of thousands of cheap commodity single-board computers, using a variety of software techniques to achieve reliability, availability, and scalability.
The notion of hardware “bricks” probably can be traced to Inktomi, a byproduct of the Berkeley Network of Workstations (NOW) project. The software techniques are drawn from several decades of research on distributed algorithms.
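One such software technique is hash-partitioning data across the cluster and replicating each partition on several machines, so that any single cheap node can fail without losing availability. A minimal sketch (node names and replication factor are illustrative, not any specific company’s system):

```python
import hashlib

NODES = [f"node{i}" for i in range(8)]  # stand-ins for commodity machines
REPLICAS = 3  # each key lives on 3 nodes for fault tolerance

def owners(key, nodes=NODES, replicas=REPLICAS):
    """Deterministically pick `replicas` distinct nodes for a key.

    Hash the key to a position in a fixed ring of nodes, then take
    the next nodes in ring order as additional replicas.
    """
    start = int(hashlib.sha256(key.encode()).hexdigest(), 16) % len(nodes)
    return [nodes[(start + i) % len(nodes)] for i in range(replicas)]

def lookup(key, failed=frozenset()):
    """Return the first live replica holding the key, if any survive."""
    for node in owners(key):
        if node not in failed:
            return node
    return None
```

Because placement is a deterministic function of the key, any machine can compute where data lives without central coordination — one reason this style of design scales to tens of thousands of nodes.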
Cluster computing is a game-changer.
The transformation of science via computation
The traditional three legs of the scientific stool are theory, experimentation, and observation. In the past 20 years, computer simulation has joined these as a fundamental approach to science, driven largely by the NSF Supercomputer Centers and PACI programs. Entire branches of physics, chemistry, astronomy, and other fields have been transformed.
Today, a second transformation is underway — a transformation to data-centered eScience, which requires semi-automated discovery in enormous volumes of data using techniques such as data mining and machine learning, much of which is based on years of basic research in statistics, optimization theory, and algorithms.
Computational science is a game-changer.
Two advances that, in our view, don’t (yet) make the list:
- Quantum computing. There is huge potential here, but the impact hasn’t been felt yet.
- Simultaneous multithreading. We claim that this, and many other important advances in computer architecture, are dominated by cluster computing. (Remember, we’re trying to be provocative here! Blame Dave Ditzel, who put this idea into Ed’s head.)
Your part goes here!
What’s your reaction to the four game-changers that we’ve identified? Do you agree that they belong on the list? If not, why not? If so, what do you think were the principal components of each — the key contributing research results?
Even more importantly, give us eight more! What are your nominees for game-changing advances from computing research conducted in the past 20 years?
Give us your thoughts!