Archive for the ‘big science’ category


Ebola-Fighting Robots

October 22nd, 2014

Credit: Worcester Polytechnic Institute

Could robots really aid in the Ebola fight?

On November 7th, robotics researchers from around the country will come together to try to answer that question. They will explore whether robots can help prevent the spread of Ebola, for example by decontaminating equipment or even burying victims.

Robin Murphy, a professor of computer science and engineering at Texas A&M University and former CCC council member, is helping to set up this Safety Robotics for Ebola Workers workshop. The workshop will bring together health care workers, relief workers and roboticists. It is co-hosted by the White House Office of Science and Technology Policy, Texas A&M, Worcester Polytechnic Institute and the University of California, Berkeley.

The goal of the workshop is for the roboticists to hear directly from those who have been working on the outbreak. That way they can learn what is needed to help patients, prevent the spread of the virus, and protect aid workers from infection.

Click here to learn more and see the Computerworld article.

NIST Global City Teams Challenge Report

October 20th, 2014


The National Institute of Standards and Technology (NIST) launched its Global City Teams Challenge with a great deal of energy and enthusiasm last month. The kickoff workshop ended with more than a dozen presentations by potential Global City Teams Challenge teams and provided an opportunity for interested parties to discuss Internet-of-Things deployments in a smart city environment.

From the workshop report:

The US Ignite website now contains materials related to 20+ potential Global City Teams projects or Action Clusters.

  1. If you would like to learn more about one of the listed projects, or if you are interested in becoming associated with one of the projects, please email; and
  2. If you have an existing or new smart city project that you would like to add or conduct under the auspices of the Global City Teams Challenge, please email; and
  3. If you have contributed to a project on the list but have not yet been contacted by a team leader, please email.

If you are interested in the Challenge and were not able to attend the kick-off workshop, a webinar is scheduled for Wednesday, October 22 at 10:00am (US Eastern Time). Please use this link for the upcoming webinar. To call into the webinar, please dial 1 (408) 650-3131 with passcode 832557933#. The webinar will run no more than one hour and will include status updates from NIST and a Q&A session.

For more information, see the Global City Teams Challenge website and the Smart America Global City Teams website.


Accelerating the Big Data Innovation Ecosystem

September 4th, 2014


In March 2012, the Obama Administration announced the “Big Data Research and Development Initiative.” The goal is to help solve some of the Nation’s most pressing challenges by improving our ability to extract knowledge from large and complex collections of digital data. The Administration encouraged multiple stakeholders including federal agencies, private industry, academia, state and local government, non-profits, and foundations, to develop and participate in Big Data innovation projects across the country.

The National Science Foundation (NSF) is exploring the establishment of a national network of “Big Data Regional Innovation Hubs.” These Hubs will help to sustain new regional and grassroots partnerships around Big Data. Potential roles for Hubs include, but are not limited to:

  • Accelerate the ideation and development of Big Data solutions to specific global and societal challenges by convening stakeholders across sectors to partner in results-driven programs and projects.
  • Act as a matchmaker between the various academic, industry, and community stakeholders to help drive successful pilot programs for emerging Big Data technology.
  • Coordinate across multiple regions of the country, based on shared interests and industry sector engagement to enable dialogue and share best practices.
  • Aim to increase the speed and volume of technology transfer between universities, public and private research centers and laboratories, large enterprises, and small and medium-sized businesses (SMBs).
  • Facilitate engagement with opinion and thought leaders on the societal impact of Big Data technologies so as to maximize the positive outcomes of adoption while reducing unwanted consequences.
  • Support the education and training of the entire Big Data workforce, from data scientists to managers to data end-users.

The National Science Foundation seeks input from stakeholders across academia, state and local government, industry, and non-profits across all parts of the Big Data innovation ecosystem on the formation of Big Data Regional Innovation Hubs. Please submit a response of no more than two pages outlining:

  1. The goals of interest for a Big Data Regional Hub, with metrics for evaluating the success or failure of the Hub to meet that goal;
  2. The multiple stakeholders that would participate in the Hub and their respective roles and responsibilities;
  3. Plans for initial and long-term financial and in-kind resources that the stakeholders would need to commit to this hub; and
  4. A principal point of contact.

Please submit responses no later than Nov 1, 2014. For more information see the NSF announcement.


Computing a Cure for HIV

June 27th, 2014

On June 26, the National Science Foundation (NSF) released a Discovery article titled Computing a Cure for HIV, written by Aaron Dubrow, Public Affairs Specialist in the Office of Legislative & Public Affairs. The article provides an overview of the disease and how it continues to afflict millions of people worldwide.

Over the past decade, scientists have been using the power of supercomputers “to better understand how the HIV virus interacts with the cells it infects, to discover or design new drugs that can attack the virus at its weak spots and even to use genetic information about the exact variants of the virus to develop patient-specific treatments.”

Here are nine projects using supercomputing and computational power to help fight the disease:

  1. Modeling HIV: from atoms to actions
  2. Discovery of hidden pocket in HIV protein leads to ideas for new inhibitors
  3. Preventing HIV from reaching its mature state
  4. Crowdsourcing a cure
  5. Virtual screening of HIV inhibitors
  6. Membrane effects
  7. Computing patient-specific treatment methods
  8. Preparing the next generation to continue the fight
  9. A boy and the BEAST

You can read more about these projects in the full article here.

Recent ISAT/DARPA Workshop Targeted Approximate Computing

June 23rd, 2014

The following is a special contribution to this blog by CCC Executive Council Member Mark Hill and workshop organizers Luis Ceze, Associate Professor in the Department of Computer Science and Engineering at the University of Washington, and James Larus, Full Professor and Head of the School of Computer and Communication Sciences at the École Polytechnique Fédérale de Lausanne.

Luis Ceze and Jim Larus organized a DARPA ISAT workshop on Approximate Computing in February 2014. The goal was to discuss how to obtain 10-100x improvements in performance, and similar improvements in MIPS/watt, from future hardware by carefully trading off the accuracy of a computation for these other goals. The focus was not the underlying technology shifts, but rather the likely radical shifts required in hardware, software, and basic computing systems properties to pervasively embrace accuracy trade-offs.

Below we provide more-detailed motivation for approximate computing, while the publicly-released slides are available here.

Given the end of Moore’s Law performance improvements and the imminent end of Dennard scaling, it is imperative to find new ways to improve the performance and energy efficiency of computer systems, so as to permit larger and more complex problems to be tackled within constrained power envelopes, package sizes, and budgets. One promising approach is approximate computing, which relaxes the traditional digital orientation of precisely stated and verified algorithms reproducibly and correctly executed on hardware, in favor of approximate algorithms that produce “sufficiently” correct answers. The sufficiency criterion can either be a probabilistic one, that results are usually correct, or a more complex correctness criterion, that the most “significant” bits of an answer are correct.

Approximation introduces another degree of freedom that can be used to improve computer system performance and power efficiency. For example, at one end of the spectrum of possible approximations, one can imagine computers whose circuit implementations employ aggressive voltage and timing optimizations that might introduce occasional non-deterministic errors. At another end of the spectrum, one can use analog computing techniques in select parts of the computation. One can also imagine entirely new ways of “executing” programs that are inherently approximate, e.g., what if we used neural networks to carry out “general” computations like browsing the web, running simulations, or doing search, sorting, and compression of data? Approximation opportunities go beyond just computation, since we can also imagine ways of storing data approximately that lead to potential retrieval errors, but are much denser, faster, and more energy efficient. Relaxing data communication is another possibility, since almost all forms of communication (on-chip, off-chip, wireless, etc.) use resources to guarantee data integrity, which is often unnecessary from the application point of view.
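To make the accuracy-for-efficiency trade concrete, here is a minimal sketch (not from the workshop) of loop perforation, one of the simplest software approximation techniques: skip a fraction of a loop’s iterations and accept a slightly noisier answer in exchange for proportionally less work. All names and parameters below are illustrative.

```python
import random

def mean_exact(xs):
    """Baseline: examine every element."""
    return sum(xs) / len(xs)

def mean_perforated(xs, keep=0.25):
    """Loop perforation: process only a fraction `keep` of the
    iterations, trading accuracy for proportionally less work."""
    sample = xs[::int(1 / keep)]  # e.g. keep=0.25 -> every 4th element
    return sum(sample) / len(sample)

random.seed(0)
data = [random.gauss(100.0, 15.0) for _ in range(100_000)]

exact = mean_exact(data)       # touches 100,000 elements
approx = mean_perforated(data) # touches 25,000 elements
# For many "sufficiently correct" uses, the small relative error
# is a fair price for doing a quarter of the work.
print(abs(exact - approx) / exact)
```

The same pattern applies wherever an aggregate answer degrades gracefully with sampling; the open question the workshop raises is how a system could apply such relaxations automatically and safely rather than by hand.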

Obviously approximation is not a new idea, as it has been used in many areas such as lossy compression and numeric computation. However, these applications of the ideas were implemented in specific algorithms, which ran as part of a large system on a conventional processor. Much of the benefit of approximation may accrue from taking a broader systems perspective, for example by relaxing storage requirements for “approximate data”. But there has been little contemplation of what an approximate computer system would look like. What happens to the rest of the system when the processor evolves to support approximate computation? What is a programming model for approximate computation? What will programming languages and tools that directly support approximate computation look like? How do we prove approximate programs “correct”? Is there a composability model for approximate computing? How do we debug them? What will the system stack that supports approximate computing look like? How do we handle backward compatibility?
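As a thought experiment on the programming-model question above, one direction explored in research type systems (such as EnerJ) is to mark data as approximate and forbid it from silently flowing into precise computations. The toy Python sketch below illustrates the idea only; the names `Approx`, `endorse`, and `precise_add` are invented for this example and are not an existing API.

```python
class Approx:
    """Wrapper marking a value as approximate: it may be stored or
    computed with relaxed guarantees."""
    def __init__(self, value):
        self.value = value

def endorse(x):
    """Explicit, programmer-audited escape hatch from approximate
    back to precise -- the only legal crossing point."""
    return x.value if isinstance(x, Approx) else x

def precise_add(a, b):
    # Information-flow rule: approximate data may not silently
    # enter a precise computation.
    if isinstance(a, Approx) or isinstance(b, Approx):
        raise TypeError("approximate value in precise context; endorse() it first")
    return a + b

pixel = Approx(200)  # e.g. image data tolerant of small errors
total = precise_add(endorse(pixel), 55)  # OK: explicitly endorsed
print(total)  # prints 255
```

The design choice worth noting is that the approximate/precise boundary is visible in the program text, which is one plausible answer to how programmers might reason about, debug, and compose approximate components.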

DARPA Officially Launches Robotics Grand Challenge – Watch Pet-Proto Robot in Action

October 24th, 2012

Today, the Defense Advanced Research Projects Agency (DARPA) officially kicked off its newest Grand Challenge, the DARPA Robotics Challenge (DRC). As we’ve blogged previously, the Grand Challenge calls for “a humanoid robot (with a bias toward bipedal designs) that can be used in rough terrain and for industrial disasters.” DARPA also released a video of Pet-Proto, a humanoid robot manufactured by Boston Dynamics. Pet-Proto, a predecessor to DARPA’s Atlas robot, is an example of what the agency envisions for the challenge.

Watch Pet-Proto in action, as it navigates obstacles:


More about the challenge from DARPA:

The Department of Defense’s strategic plan calls for the Joint Force to conduct humanitarian, disaster relief and related operations.  The plan identifies requirements to extend aid to victims of natural or man-made disasters and conduct evacuation operations.  Some disasters, however, due to grave risks to the health and wellbeing of rescue and aid workers, prove too great in scale or scope for timely and effective human response.  The DARPA Robotics Challenge (DRC) will attempt to address this capability gap by promoting innovation in robotic technology for disaster-response operations.


The primary technical goal of the DRC is to develop ground robots capable of executing complex tasks in dangerous, degraded, human-engineered environments.  Competitors in the DRC are expected to focus on robots that can use standard tools and equipment commonly available in human environments, ranging from hand tools to vehicles, with an emphasis on adaptability to tools with diverse specifications.


To achieve its goal, the DRC aims to advance the current state of the art in the enabling technologies of supervised autonomy in perception and decision-making, mounted and dismounted mobility, dexterity, strength, and platform endurance.  Success with supervised autonomy, in particular, could allow control of robots by non-expert operators, lower the operator’s workload, and allow effective operation even with low-fidelity (low bandwidth, high latency, intermittent) communications.


The DRC consists of both robotics hardware and software development tasks and is structured to increase the diversity of innovative solutions by encouraging participation from around the world, including universities, small, medium and large businesses, and even individuals and groups with ideas on how to advance the field of robotics.  Detailed descriptions of the participant tracks are available in the DRC Broad Agency Announcement.


A secondary goal of the DRC is to make software and hardware development for ground-robot systems more accessible to interested contributors, thereby lowering the cost of acquisition while increasing capabilities.  DARPA seeks to accomplish this by creating and providing government-furnished equipment (GFE) to some DRC participants in the form of a robotic hardware platform with arms, legs, torso and head.  Availability of this platform will allow teams without hardware expertise or hardware to participate.  Additionally, all teams will have access to a government-furnished simulator created by DARPA and populated with models of robots, robot components and field environments.  The simulator will be an open-source, real-time, operator-interactive virtual test bed, and the accuracy of the models used in it will be rigorously validated on a physical test bed.  DARPA hopes the creation of a widely available, validated, affordable, and community supported and enhanced virtual test environment will play a catalytic role in development of robotics technology, allowing new hardware and software designs to be evaluated without the need for physical prototyping.


The DRC Broad Agency Announcement was released on April 10, 2012.


The DRC kicked off on October 24, 2012, and is scheduled to run for approximately 27 months with three planned competitions, one virtual followed by two live. Events are planned for June 2013, December 2013 and December 2014.

To learn more, check out the DARPA Robotics Challenge page.

(Contributed by Kenneth Hines, CCC Program Associate)