Computing Community Consortium Blog

The goal of the Computing Community Consortium (CCC) is to catalyze the computing research community to debate longer range, more audacious research challenges; to build consensus around research visions; to evolve the most promising visions toward clearly defined initiatives; and to work with the funding organizations to move challenges and visions toward funding initiatives. The purpose of this blog is to provide a more immediate, online mechanism for dissemination of visioning concepts and community discussion/debate about them.


Pacemaker Recall Exposes National Need for Research and Education in Embedded Security

September 8th, 2017 / in Announcements, pipeline, policy, research horizons, Research News / by Helen Wright

Kevin Fu and Denis Foo Kune test pacemaker security with a synthetic cadaver as part of the NSF THAW.org project.

The following is a guest blog post from CCC Council Member and Cybersecurity Task Force Chair Kevin Fu from the University of Michigan.

“From pacemakers to autonomous vehicles, national computing research and education initiatives for embedded security will lay a crucial foundation for the Internet of Everything era,” says Fu.

Last month, the Food and Drug Administration (FDA) issued the first major recall of a medical device because of a cybersecurity risk. Nearly half a million pacemakers were recalled for a software update that clinicians will apply during a patient's in-clinic visit. Our team has spent a decade analyzing security problems and solutions in pacemakers and other medical devices. While software recalls by the FDA are not new, this recall highlights the need to build security into the Internet of Everything that is powered by embedded computing systems. Pacemakers represent the canary in the coal mine, warning of the pressing challenges for embedded security research and education.

I interviewed Computing Community Consortium (CCC) Council members and distinguished researchers on how the nation can lead embedded security research and education rather than follow and be surprised by the next medical device or automobile cybersecurity recall.

“It used to be that just computers were computers. Today, almost everything is a computer—your phone, your TV, your car, your child’s toys, your home, your refrigerator, and your medical device,” explains CCC Council member Dr. Nadya Bliss, Director of the Global Security Initiative at Arizona State University. “While the Internet of Things brings exciting advances to various aspects of our lives ranging from healthcare to transportation, defense, finance, manufacturing, and critical infrastructure, it also creates vulnerabilities in critical infrastructure at an unprecedented scale,” warns Dr. Farinaz Koushanfar of Electrical and Computer Engineering at UC San Diego. Dr. Sam H. Fuller, CTO Emeritus of Analog Devices and member of the National Academy of Engineering, points out, “As devices and systems are increasingly interconnected, security has become a critical property of their embedded hardware and software.”

In addition to these challenges, there is a large gap in the research and education needed to build security into embedded systems ranging from medical devices and autonomous vehicles to the Internet of Things.

My colleagues from Dartmouth College, Illinois, Johns Hopkins, Michigan, and Vanderbilt investigated how to move beyond hacking medical devices to discovering constructive approaches that improve embedded security for healthcare, as part of the Trustworthy Health and Wellness Frontiers Project sponsored by the National Science Foundation (NSF). A number of research groups, companies, and federal agencies now serve this marketplace of ideas, products, and services. While the world calls on leaders and organizations in the U.S. for advice on medical device security, our country struggles to be on speed dial for world leadership in embedded security.

Because of early investments in embedded security by industry and government, Europe and Asia are home to the top embedded security university labs in the world such as COSIC in Belgium, the EMSEC Lab in Germany, the TAMPER Lab in the U.K., and the USS Lab in China. Embedded security faculty have departed the U.S. for these brighter research opportunities overseas. The U.S. government even outsourced the manufacture of electronic passports to a European-based embedded security company. While an economic boon for some countries, this outsourcing trend will likely worsen as U.S. households and businesses embrace IoT technology, and seek products and security solutions from vendors overseas.

“If the U.S. hopes to be competitive in any technology area, it cannot cede embedded security research and education,” says Michael Westra who leads connected and in-vehicle embedded security at Ford Motor Company.

What needs to be done for the U.S. to catch up?

Dr. Srini Devadas from the Massachusetts Institute of Technology advises, “We need a coordinated, nation-wide effort in embedded security research and education to address these challenges.”

Academic partnerships with industry are key to improving embedded security. “Industry needs to develop the equivalent of a building code for embedded systems to withstand attacks already at our doorstep. Embedded software must be up to code,” cautions Dr. Carl Landwehr who previously developed and managed major cybersecurity programs at NSF and IARPA.

Dr. Greg Morrisett, Dean of Computing and Information Sciences at Cornell University, reminds us to respect safety, “Wherever computation can have a direct effect on human safety (embedded or otherwise), and especially where that computation is connected to the broader Internet (which is the common case now), we desperately need new research ideas and new practical methodologies for gaining assurance.”

Safety is not the only casualty from lack of embedded security. “For embedded systems, hacking can also result in loss of data that could be used to hurt the individual or their property,” according to Dr. Keith Marzullo, Dean of the University of Maryland College of Information Studies. “Not only do we need to educate designers, developers, and operators, we also need to educate legislators, lawyers, and policy makers on the current limitations and the possible ways forward.”

In the U.K., Dr. Ross Anderson, Professor of Security Engineering at Cambridge University, is known for analyzing embedded security as well as the vulnerabilities in bank cards and ATMs. Anderson poses an unsettling question: “What does it mean for a thermometer to be secure, or an air-conditioner?” It’s not realistic to ask ordinary citizens to worry about the cybersecurity of their domestic appliances, their cars, or their medical devices. Embedded security must be built in.

But there is a skills and workforce gap. While the number of students who can write apps has exploded, there aren’t enough university programs that equip students with the skills necessary to protect pacemakers and other embedded devices. This unmet need for education led Dr. Sujeet Shenoi to create interdisciplinary embedded security programs at the University of Tulsa. “The work is super technical. It needs strong hardware, software, and domain engineering expertise. This takes years of learning and doing. My customers have an insatiable need for Ph.D.s in embedded security,” explains Shenoi.

Will another medical device or automobile or airplane security hack emerge? Almost certainly. The cost will be borne by everyday citizens, but there is a small window of opportunity to fix these problems before the security vulnerabilities are permanently baked into millions of devices that will persist for decades in our homes, cars, and bodies.

It will take forethought and leadership to make strategic national investments in embedded security research and education so that people can trust in their cars, pacemakers, and the coming tsunami of IoT products.
