Computing Community Consortium Blog

The goal of the Computing Community Consortium (CCC) is to catalyze the computing research community to debate longer range, more audacious research challenges; to build consensus around research visions; to evolve the most promising visions toward clearly defined initiatives; and to work with the funding organizations to move challenges and visions toward funding initiatives. The purpose of this blog is to provide a more immediate, online mechanism for dissemination of visioning concepts and community discussion/debate about them.


Symposium on Accelerating Science: A Grand Challenge for AI

July 28th, 2016 / in CCC, Research News, robotics / by Helen Wright

The following is a guest blog by Vasant G. Honavar, a Computing Community Consortium (CCC) council member and a Pennsylvania State University professor.

The emergence of “big data” offers unprecedented opportunities for not only accelerating scientific advances but also enabling new modes of discovery. Some have gone so far as to suggest that “big data” renders obsolete the scientific method that has characterized natural science since the 17th century, consisting of systematic observation, measurement, and experiment, and the formulation, testing, and modification of hypotheses. Nothing could be farther from the truth.


The reality is that, in many disciplines, the emergence of big data exacerbates the gap between our ability to acquire, store, and process data and our ability to make effective use of the data to advance discovery. Despite successful automation of routine aspects of data management and analytics, most elements of the scientific process currently require considerable human expertise and effort.

As a recent CCC white paper noted, rather than making the scientific method obsolete, accelerating science to keep pace with the rate of data acquisition and processing calls for the development of a suite of computational lenses, i.e., algorithmic or information processing abstractions, coupled with formal methods and tools for modeling and simulating natural processes. It also underscores the dire need for cognitive tools for scientists, i.e., computational tools that leverage and extend the reach of human intellect and partner with humans on a broad range of tasks in scientific discovery. Examples of such tasks include mapping the state of knowledge in a discipline and identifying gaps; formulating and prioritizing questions; designing, prioritizing, and executing experiments; drawing inferences and constructing explanations and hypotheses from the literature, databases, and knowledge bases; expressing and reasoning with scientific arguments of variable certainty and provenance; synthesizing findings from disparate observational and experimental studies; and formulating new questions, in a closed-loop fashion.

Meeting this need calls for the formalization, development, and analysis of algorithmic or information processing abstractions of various aspects of the scientific process; the development of computational artifacts (representations, processes, software) that embody such understanding; and the integration of the resulting cognitive tools into collaborative human-machine systems and infrastructure to advance science. Such infrastructure includes tools for documenting, replicating, and communicating scientific studies; for collaboration and team formation (incentivizing participants, decomposing tasks, combining results, and engaging participants with different levels of expertise and abilities); for communicating scientific results across disciplinary boundaries and levels of abstraction; and for tracking scientific progress and impact.

The resulting cognitive tools could dramatically increase the efficiency of the scientific process, improve the quality of the science that is carried out (by reducing error and enhancing reproducibility), and enable new modes of discovery that leverage large amounts of data, knowledge, and automated inference.

Successful development of such cognitive tools presents a grand challenge for Artificial Intelligence (AI), the enterprise of understanding and building intelligent systems, because it requires fundamental, integrative, and coordinated advances across virtually all subfields of AI, including perception; knowledge representation; automated inference; information integration (including the semantic web); machine learning; natural language processing; planning; decision making; distributed problem solving; robotics; and human-human and human-machine (including human-robot) communication, interaction, and coordination.

Against this background, the Symposium on Accelerating Science: A Grand Challenge for AI (co-sponsored by the Association for the Advancement of Artificial Intelligence (AAAI), the Computing Community Consortium, and the North East Big Data Hub) aims to bring together researchers in relevant areas of artificial intelligence, high performance data and computing infrastructures and services, and selected application areas to discuss progress on, and articulate a research agenda aimed at addressing, the AI grand challenge of accelerating science. The symposium is part of the 2016 AAAI Fall Symposia, to be held in Arlington, VA, USA, November 17-19, 2016.

The symposium will consist of an opening session introducing the symposium's topics, goals, participants, and expected outcomes; several sessions of invited and contributed talks and panels; breakout sessions; and a concluding plenary session summarizing the symposium.

Abstracts for consideration for contributed talks are due on August 7, 2016. Additional details can be found on the symposium webpage.

