Computing Community Consortium Blog

The goal of the Computing Community Consortium (CCC) is to catalyze the computing research community to debate longer range, more audacious research challenges; to build consensus around research visions; to evolve the most promising visions toward clearly defined initiatives; and to work with the funding organizations to move challenges and visions toward funding initiatives. The purpose of this blog is to provide a more immediate, online mechanism for dissemination of visioning concepts and community discussion/debate about them.


NSF/VMware Partnership on Edge Data Computing Infrastructure

February 27th, 2018 / in Announcements, NSF, pipeline, policy / by Helen Wright

Contributions to this blog were provided by Gera Jochum, Communications Specialist in the Computer and Information Science and Engineering (CISE) Directorate at the National Science Foundation, and by CCC Vice Chair Mark D. Hill of the University of Wisconsin-Madison.

The National Science Foundation (NSF) and VMware have come together to create a public/private partnership on edge data computing infrastructure.

See the synopsis of the program below.

The proliferation of mobile and Internet-of-Things (IoT) devices, and their pervasiveness across nearly every sphere of our society, continue to raise questions about the architectures that will organize tomorrow’s compute infrastructure. At the heart of this trend is the data that will be generated as myriad devices and application services operate simultaneously to digitize a complex domain such as a smart building or smart industrial facility. A key shift is from edge devices consuming data produced in the cloud to edge devices becoming voluminous producers of data. This shift reopens a broad variety of system-level research questions concerning data placement, movement, processing, and sharing. Importantly, it also opens the door to compelling new applications with significant industrial and societal impact in domains such as healthcare, manufacturing, transportation, public safety, energy, buildings, and telecommunications.

Edge computing is broadly defined as a networked systems architectural approach in which compute and storage resources are placed at the network edge, in proximity to mobile and IoT devices. The approach offers several advantages: improved scalability, since local computation reduces the volume of data transported; reduced network latency and faster response times, since data is processed on local compute nodes; and arguably improved security and privacy where data requirements preclude access and exchange beyond the edge. Edge computing infrastructure may consist of IoT gateways, telephone central offices, cloudlets, micro data centers, or any number of schemes that provision communication, compute, and storage resources near edge devices.
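
As a simplified illustration of this data-reduction benefit (not part of the solicitation), the hypothetical Python sketch below aggregates raw sensor readings on an edge node and forwards only compact per-device summaries upstream; all class and function names are invented for illustration.

    # A minimal, hypothetical sketch of edge-side aggregation. Raw readings stay on
    # the edge node; only compact per-device summaries travel to the cloud.
    from dataclasses import dataclass
    from statistics import mean
    from typing import Dict, List

    @dataclass
    class SensorReading:
        device_id: str
        value: float  # e.g., a temperature sample in a smart building

    @dataclass
    class EdgeSummary:
        device_id: str
        count: int
        mean_value: float
        max_value: float

    def summarize_at_edge(readings: List[SensorReading]) -> List[EdgeSummary]:
        """Aggregate raw readings per device on the local edge node."""
        by_device: Dict[str, List[float]] = {}
        for r in readings:
            by_device.setdefault(r.device_id, []).append(r.value)
        return [
            EdgeSummary(device_id=d, count=len(vals),
                        mean_value=mean(vals), max_value=max(vals))
            for d, vals in by_device.items()
        ]

    if __name__ == "__main__":
        # 1,000 raw readings are processed locally; only 4 summary records leave the edge.
        raw = [SensorReading(f"sensor-{i % 4}", 20.0 + (i % 7)) for i in range(1000)]
        summaries = summarize_at_edge(raw)
        print(f"raw readings: {len(raw)}, records sent upstream: {len(summaries)}")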

This solicitation seeks to advance the state of the art in end-to-end networked systems architecture that includes edge infrastructures. The central challenge is to design and develop data-centric edge architectures, programming paradigms, runtime environments, and data sharing frameworks that will enable compelling new applications and fully realize the opportunity of big data in tomorrow’s mobile and IoT device environments. Researchers are expected to carefully consider the implications of edge computing’s multi-stakeholder context, and the need for security and privacy as first order design and operational considerations.
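
To make the "first-order" framing of security and privacy a bit more concrete (this example is likewise not part of the solicitation), the hypothetical Python sketch below shows an edge node applying per-stakeholder sharing policies before any data leaves the edge; the stakeholder names, policies, and record fields are invented for illustration.

    # A hypothetical illustration of treating privacy as a first-order design concern
    # in a multi-stakeholder edge setting: before an edge node shares data with another
    # stakeholder, it applies a per-stakeholder policy that strips or coarsens fields.
    from typing import Any, Dict

    # Per-stakeholder sharing policies: which fields may leave the edge, and at what granularity.
    SHARING_POLICIES = {
        "building_operator": {"fields": ["room", "occupancy_level"], "coarsen_occupancy": False},
        "energy_utility":    {"fields": ["occupancy_level"],          "coarsen_occupancy": True},
    }

    def redact_for_stakeholder(record: Dict[str, Any], stakeholder: str) -> Dict[str, Any]:
        """Return only the fields the stakeholder's policy permits, coarsened if required."""
        policy = SHARING_POLICIES[stakeholder]
        shared = {k: v for k, v in record.items() if k in policy["fields"]}
        if policy["coarsen_occupancy"] and "occupancy_level" in shared:
            # Replace an exact head count with a coarse band before it leaves the edge.
            shared["occupancy_level"] = "high" if shared["occupancy_level"] > 10 else "low"
        return shared

    if __name__ == "__main__":
        raw = {"room": "B-214", "occupancy_level": 17, "badge_ids": ["a91", "c02"]}
        print(redact_for_stakeholder(raw, "building_operator"))  # keeps room + exact count
        print(redact_for_stakeholder(raw, "energy_utility"))     # coarse occupancy only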

Researchers are expected to develop prototypes of their proposed approaches to explore implementation aspects of their designs and to demonstrate the effectiveness of their approaches empirically. Prototypes should leverage existing software tools and frameworks where possible and avoid unnecessary re-invention. Notable multi-tenancy frameworks include VM-based cloudlets, EdgeX Foundry from the Linux Foundation, and Open Edge Computing. That said, researchers are welcome to develop new software frameworks when their investigations require it; such proposals should identify the gap in existing frameworks and justify the need to develop something entirely new.

Researchers are encouraged to use existing testbeds for deploying and testing prototypes of their work, and for collecting data to demonstrate the effectiveness of their approach. Such testbeds include but are not limited to:

  • Global Environment for Network Innovations (GENI); and
  • NSF FutureCloud with projects Chameleon and CloudLab.

CCC Vice Chair Mark Hill adds, “These public-private partnerships can be a win-win-win for companies, researchers, and the public they both serve. Just ask Chris Ramming from VMware who is playing a key role facilitating these at VMware and his former employer Intel.”

The full proposal deadline is May 22, 2018. For more information, see the full solicitation here.
