Computing Community Consortium Blog

The goal of the Computing Community Consortium (CCC) is to catalyze the computing research community to debate longer range, more audacious research challenges; to build consensus around research visions; to evolve the most promising visions toward clearly defined initiatives; and to work with the funding organizations to move challenges and visions toward funding initiatives. The purpose of this blog is to provide a more immediate, online mechanism for dissemination of visioning concepts and community discussion/debate about them.


Humanitarian Response and CRICIS — A Report from a Grassroots Workshop

September 24th, 2012 / in policy, research horizons, workshop reports / by Kenneth Hines

Robin Murphy, Texas A&M University

The following is a contribution to this blog from Robin Murphy, Raytheon Professor of Computer Science and Engineering and director of the Center for Robot-Assisted Search and Rescue at Texas A&M University. Back in April, Robin co-organized a visioning workshop about the role of computing in disaster management (including preparedness, prevention, response, and recovery). In this blog entry, Robin describes her participation in a workshop held last week in DC on Connecting Grassroots to Government for Disaster Management.

I participated in the Wilson Center’s workshop on Connecting Grassroots to Government for Disaster Management last week, where I briefed 60 in-person and 150 remote participants on the NSF/CCC Workshop on Computing for Disaster Management and the subsequent report, Critical Real-Time Computing and Information Systems (CRICIS). The NSF/CCC CRICIS report succinctly identified the fundamental computing research questions unique to disasters. The Wilson Center workshop illustrated why humanitarian response needs those fundamental research questions answered — and soon.

The humanitarian response community truly lives socially, placing great emphasis on rapidly engaging cadres of volunteer experts, such as Crisismappers and NetHope, to mine, transform, and display disaster data. The term “disaster” is taken in its broadest sense: earthquakes, tsunamis, and hurricanes, to be sure, but also infant mortality and nutrition, medical care in political refugee camps, and preservation of the environment.

Just as the humanitarian community lives socially, they also think computationally. I saw too many useful and creative ideas to list here, but one in particular was a striking instance of computational thinking. A marine biologist built a web-based system, MarineMap (now SeaSketch), that allows California residents to go to a map of the shore and propose a marine protected area by interactively drawing its boundaries. The system then automatically evaluates the appropriateness of the area, provides feedback to the designer, and ranks the proposal for the agency. Think of the hidden advances in practically every aspect of computing, and in education, that enabled that program!
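To make that workflow concrete, here is a minimal sketch of the kind of automatic evaluation such a system performs: a user-drawn boundary comes in, a score and feedback go back out. This is not SeaSketch’s actual code or API; the function names, criteria, thresholds, and the habitat_fraction input are illustrative assumptions standing in for the GIS layers and policy rules a real system would use.

```python
# Illustrative sketch only -- not the actual MarineMap/SeaSketch implementation.
# Assumes a proposed marine protected area arrives as a list of (lon, lat) vertices.

def polygon_area_km2(vertices):
    """Rough planar area via the shoelace formula (fine for a small coastal polygon)."""
    area = 0.0
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        area += x1 * y2 - x2 * y1
    # ~111 km per degree of latitude/longitude; a crude conversion for illustration.
    return abs(area) / 2.0 * 111 * 111

def evaluate_proposal(vertices, habitat_fraction):
    """Score a proposed protected area and return feedback for the designer.

    habitat_fraction is a hypothetical input: the share of the polygon that
    overlaps sensitive habitat, which a real system would compute from GIS layers.
    """
    area = polygon_area_km2(vertices)
    feedback = []
    score = 0.0

    if area < 20:  # hypothetical minimum-size guideline
        feedback.append("Area is below the 20 km^2 guideline; consider enlarging it.")
    else:
        score += 0.5

    if habitat_fraction < 0.3:  # hypothetical habitat-coverage criterion
        feedback.append("Less than 30% of the area covers sensitive habitat.")
    else:
        score += 0.5 * habitat_fraction

    return {"area_km2": round(area, 1), "score": round(score, 2), "feedback": feedback}

# Example: a small rectangle off the coast, 40% of which overlaps sensitive habitat.
proposal = [(-122.0, 36.5), (-121.9, 36.5), (-121.9, 36.6), (-122.0, 36.6)]
print(evaluate_proposal(proposal, habitat_fraction=0.4))
```

Even a toy version like this hides real research questions: accurate geospatial computation, scoring models that experts trust, and interfaces that let non-programmers drive the analysis.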

The big lesson from the workshop is that the humanitarian response community needs computing advances beyond the low-hanging fruit of consumer-oriented apps and infrastructure. Their computing needs (and creativity) now exceed what an average engineer or scientist can build in a year, putting solutions out of reach of their budgets and of the time an expert can donate. These are largely the same issues identified in the NSF/CCC workshop. The problems of extreme scales of time, space, data, and number of stakeholders were discussed repeatedly. Volunteers are overloaded with data, and their algorithms are no longer fast enough. Security and privacy take on new meaning when working with displaced political refugees. Who owns the data collected by international responders? How many layers in a mashup are enough? Are mashups the right visualization for all situations? What are the metrics for measuring the value to overall decision making in complex socio-technical systems? How can solutions be tested before a disaster? What data is really needed by which agencies, and when do they need it in the prevention, preparedness, response, and recovery lifecycle? The list is practically endless.

Two years ago, a prominent computer scientist told me that disasters didn’t require any special computing; a response agency or humanitarian group just needed to grab a smart kid for a couple of months to generate mashups. The humanitarian response community once shared this view, but through its own success it is now encountering real barriers as it works at extreme scales of time, geography, volumes of data, and numbers of agents. CRICIS computing will break down those barriers and enable more users to apply computational thinking to society’s problems.
