Computing Community Consortium Blog

The goal of the Computing Community Consortium (CCC) is to catalyze the computing research community to debate longer range, more audacious research challenges; to build consensus around research visions; to evolve the most promising visions toward clearly defined initiatives; and to work with the funding organizations to move challenges and visions toward funding initiatives. The purpose of this blog is to provide a more immediate, online mechanism for dissemination of visioning concepts and community discussion/debate about them.


Game-Changing Advances from Computing Research

November 4th, 2008 / in Uncategorized / by Peter Lee

We’d like your help with a brainstorming exercise: Identify about a dozen game-changing advances from computing research conducted in the past 20 years. Here’s what we mean:

  • The advance needs to be “game changing,” in the sense of dramatically altering how we think about computing and its applications.
  • The importance of the advance needs to be obvious and easily appreciated by a wide audience.
  • There needs to be a clear tie to computing research (or to infrastructure initiatives that build upon research and were sponsored by computing research organizations).
  • We’re particularly interested in highlighting the impact of federally-funded university-based research.

We’re focusing on work carried out in the past 20 years or so, in part because of the upcoming 20-year celebrations for the CISE directorate at NSF. Of course, lots of great fundamental research can take more than 20 years before its impact becomes obvious, but even in such cases there are usually continuing influences on more recent research that can be cited here.

To get your juices flowing, here are four game-changers that we definitely think belong on the list. Use these to think about others that belong on the list, or feel free to argue with our choices.

The Internet and the World Wide Web as we know them today

In 1988 — 20 years ago — ARPANET became NSFNET. At the time, there were only about 50,000 hosts spread across about 150 networks. In 1989, CNRI connected MCImail to the Internet — the first “commercial use.” In 1993, NCSA Mosaic triggered the explosive growth of the World Wide Web. In 1995, full commercialization of the Internet was achieved, with roughly 6,000,000 hosts spread across roughly 50,000 networks. Today, there are more than half a billion Internet hosts, and an estimated 1.5 billion Internet users.

While many of the underlying technologies (digital packet switching, ARPANET, TCP/IP) predate the 20-year window, the transition from the relatively closed ARPANET to the totally open Internet and World Wide Web as we know them today falls squarely within that window. NSF-supported contributions included CSNET, NSFNET, and NCSA Mosaic.

The Internet and the World Wide Web are game-changers.

Where once we filed, today we search

The vast majority of the world’s information is available online today, and we find what we need — whether across the continent or on our own personal computer — by searching, rather than by organizing the information for later retrieval.

Research on the retrieval of unstructured information builds on decades of fundamental work in both computer science theory and AI. But the paradigm shift that is web crawling and indexing and desktop search is much more recent. It traces its roots to university projects such as WebCrawler, MetaCrawler, Lycos, Excite, Inktomi, and the NSF Digital Libraries Initiative research that begat Google.
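To make the crawl-and-index mechanism concrete, here is a minimal inverted-index sketch in Python. The toy documents, the query, and the boolean-AND retrieval are our own illustration, not any particular engine’s design:

    from collections import defaultdict

    # Toy corpus standing in for crawled pages (illustrative only).
    documents = {
        "page1": "the internet became the nsfnet",
        "page2": "mosaic triggered growth of the web",
        "page3": "search the web instead of filing",
    }

    # Build an inverted index: term -> set of document ids containing that term.
    index = defaultdict(set)
    for doc_id, text in documents.items():
        for term in text.split():
            index[term].add(doc_id)

    def search(query):
        """Return the documents that contain every query term (boolean AND)."""
        terms = query.lower().split()
        if not terms:
            return set()
        results = set(index.get(terms[0], set()))
        for term in terms[1:]:
            results &= index.get(term, set())
        return results

    print(search("the web"))  # {'page2', 'page3'}

Real systems add large-scale crawling, ranking (link analysis, click behavior), and index compression, but the inverted index is the common core that lets us search rather than file.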

Search is a game-changer.

Cluster computing

At the risk of offending our many computer architect friends, we’re going to assert that cluster computing is the most significant advance in computer architecture in the past 20 years.

A decade ago, Jeff Bezos was featured in magazine advertisements for the DEC AlphaServer, because that’s what Amazon.com ran on — the biggest shared-memory multiprocessor that could be built. Similarly, the AltaVista search engine was designed to showcase the capabilities of big SMPs with 64-bit addressing.

Today, this seems laughable. Companies such as Google and Amazon.com replicate and partition applications across clusters of tens of thousands of cheap commodity single-board computers, using a variety of software techniques to achieve reliability, availability, and scalability.

The notion of hardware “bricks” probably can be traced to Inktomi, a byproduct of the Berkeley Networks of Workstations project. The software techniques are drawn from several decades of research on distributed algorithms.
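To make partition-and-replicate concrete, here is a minimal sketch in Python of placing keys on a cluster of commodity nodes, with each key stored on several replicas so that losing a single machine costs neither data nor availability. The node names, replica count, and simple modulo placement are our own illustration, not Google’s or Amazon.com’s actual machinery:

    import hashlib

    NODES = [f"node{i:02d}" for i in range(16)]  # stand-ins for cheap commodity machines
    REPLICAS = 3                                 # copies kept of each key

    def replicas_for(key):
        """Choose REPLICAS distinct nodes for a key by hashing (simple modulo placement)."""
        h = int(hashlib.sha1(key.encode()).hexdigest(), 16)
        start = h % len(NODES)
        return [NODES[(start + i) % len(NODES)] for i in range(REPLICAS)]

    # Writes go to every replica; reads can be served by any live one.
    for key in ["user:1001", "cart:42", "page:/index.html"]:
        print(key, "->", replicas_for(key))

Production systems layer on consistent hashing, failure detection, and load balancing, but the underlying idea of spreading and duplicating work across many cheap machines is the same.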

Cluster computing is a game-changer.

The transformation of science via computation

The traditional three legs of the scientific stool are theory, experimentation, and observation. In the past 20 years, computer simulation has joined these as a fundamental approach to science, driven largely by the NSF Supercomputer Centers and PACI programs. Entire branches of physics, chemistry, astronomy, and other fields have been transformed.

Today, a second transformation is underway — a transformation to data-centered eScience, which requires semi-automated discovery in enormous volumes of data using techniques such as data mining and machine learning, much of which is based on years of basic research in statistics, optimization theory, and algorithms.

Computational science is a game-changer.

Some non-inclusions

Quantum computing. There is huge potential here, but the impact hasn’t been felt yet.

Simultaneous multithreading. We claim that this, and many other important advances in computer architecture, are dominated by cluster computing. (Remember, we’re trying to be provocative here! Blame Dave Ditzel, who put this idea into Ed’s head.)

Your part goes here!

What’s your reaction to the four game-changers that we’ve identified? Do you agree that they belong on the list? If not, why not? If so, what do you think were the principal components of each — the key contributing research results?

Even more importantly, give us eight more! What are your nominees for game-changing advances from computing research conducted in the past 20 years?

Give us your thoughts!

Ed Lazowska and Peter Lee


56 comments

  1. Peter Lee says:

    By the way, you can think of this as an update of the National Academies’ exercises that led to the famous “Tire Tracks Diagrams” of 1995 and 2003 (see http://books.nap.edu/openbook.php?record_id=10795&page=6 and http://books.nap.edu/openbook.php?record_id=10795&page=7 for the 2003 version), except that “it’s a billion dollar business” is not a requirement for inclusion.

  2. It seems like cryptography is a significant game changer. Specifically, public-key cryptography. Now the main results were more than 20 years ago, but since you’re including information retrieval, this would be another reasonable inclusion.

    Arguably the web would not have been a friendly place for commerce without RSA and the like.

    Another area on which the jury might still be out (but is further along than (say) quantum computing in terms of impact) is the computational perspective on game theory and economics, and how it has led to developments in auction theory, search and advertising.

  3. “Computational science” seems to be evidence of a changed game rather than a “game changer”. The enabling technologies for computational science are widespread and diffuse.

    I agree with Suresh that crypto is a game changer.

    I would also put forward wireless networking (broadly defined) as a game changer. Being able to connect on a laptop or cell phone without being tethered to a building is certainly changing my game!

  4. Andrew Ferguson says:

    The text analytics work from the WebFountain project, which defined the UIMA platform, should definitely be considered.

  5. Paul Resnick says:

    I think you should consider recommender systems/collaborative filtering as a game changer.

    It dramatically altered how we think about computing applications by introducing the idea that the actions and preferences of other people could be a useful resource in computations intended to support someone else’s activities.

    It is easily appreciated by a broad audience (anyone who has used Amazon’s “people who bought this also bought…” or other social features; a somewhat narrower audience will also appreciate that a major improvement in search engine performance occurred when they started taking into account link structures and then click behaviors).

    There’s a clear tie to computing research, both in work on algorithms for using data from other people, and in interfaces for collecting it and presenting predictions or recommendations. The idea was first articulated in CACM and in the ACM CSCW and CHI conferences, and there are now thousands of papers about it and for the last two years an ACM RecSys conference devoted just to it.

    There has been a lot of NSF-funded university research in this area, including some of the early work. (There has also been a lot of work in industry).

  6. Joshua Grochow says:

    I would include the Large Hadron Collider as an important example under computational science. Without massive computing infrastructure, the data we get from the LHC would be far too massive for us to have a hope of learning something from it.

    Other things that we can’t quite do but can almost do, thanks to computing research: in silico drug discovery and genomics-based medicine.

    In response to Rance Cleaveland: I agree that computational science is a changed game rather than a game-changer. But perhaps the title “Computational Science” is the best way to describe the confluence of ideas that went into changing this game: cluster computing, numerical analysis, lots and lots of algorithms research, … In that sense, computational science might be a good argument for CS in general, rather than any specific game-changing aspect of CS.

  7. Micha Hofri says:

    Image processing: Our interaction with visual inputs of nearly any type has been changed, in many cases transformed.

    I am uncertain about the role of NSF in video game development (DOD is big there), but it is a large industry shaping behavior on a global scale. Think medical imaging, satellite imaging for weather and climate research, and digital photography (pace Kodak).

    Should desktop publishing come here too? A major game-changer, though its main impact came when tied with the web, and the role of Federal funding was not seminal here, I suppose.

  8. How about:

    Machine Learning
    Statistical Machine Translation
    Data-driven Natural Language Parsing
    Mobile Computing
    Relational Databases
    Cryptography
    Social/Online Networking
    Social Media
    Collaboration Software (e.g., Wikis)

  9. John Hules says:

    If computational science makes your final list, you should acknowledge the role of DOE supercomputer centers. After all, DOE founded the first unclassified supercomputer center (now known as NERSC) in 1974, which served as a model for the NSF centers.

  10. Kevin Sullivan says:

    Software design and architecture.

    Federally funded research in software design and architecture has had a significant practical impact on industrial development of software and software-intensive systems, and a major intellectual impact on software engineering and languages research.

    Continuing advances in this area appear to be needed to manage complexity and harness the power of computation in highly complex, cyber-physical-social systems of the future: in transportation, environment, energy, health care, defense, communication, and so on.

  11. 1) GPUs and associated graphics/games algorithms.

    2) Computational Science has become a third pillar of the scientific enterprise, a peer alongside theory and physical experiment. See the PITAC Report: http://www.nitrd.gov/pitac/reports/20050609_computational/computational.pdf

  12. Robert Drost says:

    Mobile computing has been a game changer. Bringing around a digital persona (a laptop) that contains more than 300 GB of work data, projects, pictures, and videos, along with the GUI and processing power to modify and use it, has changed the way that we work, live, and communicate. Occasional wireless access is clearly a bonus that makes it easy to update and communicate often with others, but the raw ability to carry around our digital life and work, as well as substantial processing power, changes how we live and work and what we accomplish on a daily basis.

    Just for fun, here’s one of my wish-list items that I hope is on the follow-on list in 20 years: autonomous, intelligent cars that can drive us to work or wherever. Not only could these coordinate better with intelligent traffic systems, keep a cooler head, and be safer by not being susceptible to distractions, but many of us would gain around an additional hour a day, or 250 hours a year, to work or rest, adding about 12% to the typical 2000-hour work-year.

  13. Software Everywhere

    Over the last 20 years, software has crept into all aspects of our lives and now controls everything: computers, phones, networks, communications, airplanes, power, traffic lights, cars, elevators, elections, etc. Today, millions of software engineers develop all this software. Software as a whole is arguably the most complex artifact ever engineered by human beings. Software engineering jobs (developers, testers, etc.) might very well outnumber engineering jobs in all other engineering disciplines combined. How is all this software being written? Well, poorly, some might joke. But the truth is, the software industry would not have been able to sustain its tremendous growth over the past 20 years without significant advances in software engineering: better languages and abstractions, common platforms and standardization, shared libraries and the Internet, and modern design and analysis tools that are required to master complexity and boost productivity.

    Just my (biased) 2 cents…

  14. Bob Futrelle says:

    The development of methods to analyze huge databases of genome data. This has led to enormous advances in biology and medicine and to our understanding of the living world.

    There are important related techniques that have been developed as well, e.g., molecular structure visualization (though that is more than 20 years old).

  15. Lawrence Brandt says:

    The ubiquity of computing. Its familiarity to all sectors of society and the related potential for changing the world via remote education, health, entertainment, etc. The invisibility of computers in, for example, automobiles, communications devices, and medical implants.

  16. Jim Waldo says:

    Two networks that I would add to the list: sensor networks, which allow new ways of gathering data, and the cell phone network, which we will soon (or perhaps are beginning to) realize is the largest sensor network of them all.

    And, of course, GPS is definitely a game changer.

  17. anon says:

    I strongly believe that the list must include at least one theoretical game changer, though it will be hard to sell to non-computer scientists.

    The use of logic in various areas of computer science was unexpected, at least to non-theorists: formal methods, AI, NLP, … Just take a look at the Handbook of Logic in Computer Science and the Handbook of Logic in AI and Logic Programming.

    Martin-Löf’s type theory, denotational semantics, …

    The open source movement was/is also an important game changer at a higher level.

  18. Ruzena Bajcsy says:

    Wireless sensors and their distributed processing are a game changer in monitoring the elderly and children (the most vulnerable) and in geographically distributed communication (meeting and interacting in virtual worlds).

  19. Keith Cooper says:

    The complex of events (both theoretical advances and deployment of practical, useful software) that allow a user to type a credit card number into a web browser and be reasonably assured of its safety deserves consideration. Here, I am thinking of the theory and practice of public-key encryption, up to and including the tools that allow my mom to obtain a public key without needing expertise in software engineering.

    Clearly, this family of related results changed the game, making secure communication and secure commerce a reality for (potentially) all users of the Internet. Without these artifacts, we would have no amazon.com, no ebay, …

  20. Ken Forbus says:

    Off-the-shelf representation resources. WordNet, for example, revolutionized natural language research and applications by providing a broad-scale, open-license resource that anyone could use. VerbNet and OpenCyc/ResearchCyc, both of which are much newer, look like they will have similar impact over time. Want a million-fact knowledge base? Download it from SourceForge! That changes what researchers can do.

  21. Eugene Charniak says:

    Statistical machine learning and the reformulation of many aspects of AI (natural-language processing, computer vision) as applied statistical learning.

  22. Edward Feigenbaum says:

    In the robotics area of AI, the Stanley autonomous vehicle that won the first DARPA Grand Challenge, and the CMU vehicle that won the second (urban) DARPA Grand Challenge.

  23. Bruce Buchanan says:

    Expert Systems Become Ubiquitous

    Thousands of routine decisions daily are made by computer systems that have specialized knowledge of a problem area. In the past, rule changes at a central office — e.g., the IRS, or the headquarters for a corporation — were incorporated slowly into practice. With expert systems, the people making the decisions have the benefit of codified knowledge bases that reflect current policy and practices.

    Research on expert systems began in the 1970s with support from DARPA, the National Institutes of Health, and NSF. Expert systems have subsequently become an essential part of the IT toolkit for every major company. Help desks, credit checking, and equipment troubleshooting are examples of systems that have been replicated many times over and are routinely saving money for businesses and public institutions.

    Expert systems technology is a game changer.

  24. Mead and Conway’s introduction of VLSI to the academic community through their influential book, followed by the revolution in CAD/VLSI (design rule checkers, routers, switch-level simulators, binary decision diagrams and verification technology, all the way through modern delay and power estimators, and the founding of MOSIS for university design projects, …), has been a game changer.