Our friend and colleague, Information Technology and Innovation Foundation (ITIF) President Rob Atkinson, has written a great blog post over on the Innovation Policy Blog arguing for the utility of “Big Data”/analytics in the public sector:
Recently more attention has been drawn to the emergence of “Big Data” — large-scale data sets that businesses and government are using to unlock new value with today’s computing and communications power. As a McKinsey Global Institute (MGI) study recently showed, Big Data offers a wide range of commercial opportunities in virtually every sector of the U.S. economy. To take one example, MGI estimates that better use of big data in health care could generate an additional $300 billion, with approximately two-thirds of that value coming from more efficient delivery of health care.
The use of Big Data should not be confined to just the private sector; data offers incredible new opportunities to the public sector as well. Policymakers have the opportunity to use Big Data to improve government in areas such as public safety, public health, public utilities and public transportation…
Better use of data can help government agencies, from city agencies to federal bureaucracies, operate more efficiently, create more transparency, and make more informed decisions. And government can use cloud computing to more efficiently develop online systems that provide anytime, anywhere access to information. However, government officials should do more to spur uses of data. Taking advantage of these opportunities will require federal government leadership, such as the Department of Commerce creating a data policy office to spur data innovation and overcome obstacles to adoption, all the while protecting privacy. And going forward, government agencies will increasingly have to deal with issues such as data security and identity management, so these issues do not become impediments to successful utilization of data analytics. Local governments can help pioneer the use of data as well. For example, the city of Boston sponsored the development of a mobile app, “Street Bump,” to automatically determine where potholes are based on data collected using citizens’ smartphones equipped with GPS and accelerometers. Tools like these are helping create “smart cities” and build a world that is alive with information.
Although there have been many successes in this area, much more can be done…
Overall, more investment in data infrastructure and analytics will enable government to more efficiently deliver value and services to its citizens…
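The “Street Bump” idea mentioned above lends itself to a simple illustration: flag a pothole candidate whenever the phone’s vertical accelerometer reading spikes past a threshold, and tag the event with the current GPS fix. The sketch below is our own hypothetical simplification, not the actual app’s algorithm; the threshold value and data shapes are assumptions for illustration.

```python
# Hypothetical sketch of accelerometer-based pothole detection (not the
# real Street Bump algorithm). Each sample pairs a GPS fix with a
# vertical-acceleration reading in g's.

THRESHOLD_G = 2.5  # assumed jolt threshold; a real app would tune this

def detect_potholes(samples):
    """Return GPS coordinates where a sharp vertical jolt was recorded.

    samples: list of (lat, lon, vertical_accel_g) tuples.
    """
    events = []
    for lat, lon, accel in samples:
        if abs(accel) > THRESHOLD_G:  # jolt exceeds threshold
            events.append((lat, lon))  # record where it happened
    return events

# Illustrative readings from a short drive (coordinates near Boston):
readings = [
    (42.3601, -71.0589, 1.0),   # smooth road
    (42.3605, -71.0592, 3.1),   # sharp jolt -> likely pothole
    (42.3610, -71.0595, 0.9),   # smooth road
]
print(detect_potholes(readings))  # → [(42.3605, -71.0592)]
```

A production system would also need to filter out non-road jolts (dropped phones, speed bumps), typically by aggregating reports from many drivers before flagging a location.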
Atkinson’s list of public-sector examples is quite impressive:
- Electric power utilities can use data analytics and smart meters to better manage resources and avoid blackouts;
- Food inspectors can use data to better track meat and produce safety from farm to fork;
- Public health officials can use health data to detect infectious disease outbreaks;
- Regulators can track pharmaceutical and medical device safety and effectiveness through better data analytics;
- Police departments can use data analytics to target crime hotspots and prevent crime waves;
- Public utilities can use sensors to collect data on water and sewer usage to detect leaks and reduce water consumption;
- First responders can use sensors, GPS, cameras and better communication systems to let police and fire fighters better protect citizens when responding to emergencies; [and]
- State departments of transportation can use data to reduce traffic, more efficiently deploy resources, and implement congestion pricing systems.
Meanwhile, IBM’s Vice President of Technology Policy Tim Sheehy spoke to The Washington Post about the value of Big Data analytics for government:
The big data analytics piece is particularly good for government. They have a lot of data and can tap a good bit of it at the moment. But big data analytics lets you have all that data at your disposal in real time. We’ve already started to see applications of it. It’s very good for preventing fraud or just finding mistakes in government programs. Right now, we’re working with states such as New York to root out errors in their Medicare and Medicaid systems. We’ve been having success in applying complicated models of data streams to these systems. This is about squeezing as much out of the data as you can.
It’s also useful for things like smarter transportation. Known technology we already have in the pipeline can predict traffic flow and traffic disruption. We’re using that in Rio de Janeiro, which has got both the Olympics and the World Cup coming up. Right now, we’re working with public safety officials to apply some of this stuff there.
For more information, check out Atkinson’s blog post or Sheehy’s interview in The Washington Post, linked above.
And as we’ve previously noted in this space, the CCC produced a series of white papers last fall articulating the “Big Data” challenges in several areas of national priority, including healthcare, energy, transportation, and national defense. Check out the white papers here.
(Contributed by Erwin Gianchandani, CCC Director)