Archive for the ‘awards’ category


Miriah Meyer Named a 2013 TED Fellow

November 12th, 2012

Miriah Meyer TED Fellow [credit: University of Utah]

Every year since 2007, the TED Fellows program has recognized young innovators from around the globe for their “insightful, bold ideas that have the potential to influence our world.” Last week, Miriah Meyer, one of our 2009 Computing Innovation Fellows, was named a 2013 TED Fellow – one of 20 selected from more than 1,200 applicants – for her pioneering efforts in interactive visualization:

Miriah Meyer (USA) – Science visualization designer
American designer who creates interactive visualization systems that help scientists make sense of complex data.

Miriah has the option of participating in either the TED Conference in Long Beach, CA, or TEDGlobal in Edinburgh, U.K.

It’s worth noting that Miriah was also featured in Technology Review’s annual list of 35 Innovators Under 35 last year.

Congratulations, Miriah, on another fantastic accomplishment!

And read more about all of the 2013 Fellows on the TED blog – “The proud, the few: the 2013 TED Fellows.”

(Contributed by Kenneth Hines, CCC Program Associate)

NSF Awards $21 Million to Enable Use of Big Data

October 15th, 2012

NSF Big Data Award Photo [credit: Thinkstock]

Last week, the National Science Foundation (NSF) awarded $21.6 million to 34 institutions across the country through the foundation’s Campus Cyberinfrastructure-Network Infrastructure and Engineering (CC-NIE) program. The projects will seek to improve the U.S. university and college computer networks that are necessary for movement of the large data sets required for data-intensive scientific research. The awards, which span 34 institutions across 23 states, fall into two categories:

Network Integration and Applied Innovation awards provide support of up to $1 million for up to two years. These awards address the challenges of achieving end-user network performance across complex, distributed research and education environments. They seek to integrate existing and new technologies with applied innovations by taking advantage of network research results, prototypes and emerging innovations – and using them to achieve higher levels of performance, reliability and predictability.


Data Driven Networking Infrastructure awards provide support of up to $500,000 for up to two years and address network infrastructure improvements at the campus level. These awards, for example, support upgrading and re-architecting campus networks to support movement of a wide range of large science data sets, including large files, sensor networks, and distributed and real-time data.


Read the full press release from the NSF below:

The National Science Foundation recently awarded nearly $21.6 million to 34 campus-level networking projects to adapt and improve U.S. university and college computer networks that are necessary for movement of the large data sets required for data-intensive scientific research.


Made through NSF’s Campus Cyberinfrastructure-Network Infrastructure and Engineering (CC-NIE) program, the awards will enable academic research networks to run applications and share large, complex data, which are part of an expanding Big Data revolution.


Twenty-three states and 34 institutions across the country received awards.


“It’s good that so many academic institutions are taking advantage of this opportunity,” said Alan Blatecky, director of NSF’s Office of Cyberinfrastructure (OCI), which funded the awards. “We are building a phenomenal portfolio that benefits NSF’s academic research communities.”


The CC-NIE program was developed from a series of community discussions and input to enable NSF academic research communities to upgrade their campus-level fiber optic infrastructure and make improved, dynamic networks a reality. It leverages emerging networking capacities and capabilities, including research and innovation from the NSF-funded Global Environment for Network Innovations (GENI) project, which is some 250 times faster than networks available today.


“One of the goals of CC-NIE is to take advantage of network research and development results and to explore how they can be integrated and applied at the campus level,” said Kevin Thompson, CC-NIE program manager for NSF. “We see enormous potential to drive innovation in data networking this way and, at the same time, deliver usable networking services and capabilities to the NSF research and education community.”


The 34 CC-NIE projects support two categories of awards:


  • Network Integration and Applied Innovation projects that aim to achieve higher levels of performance, reliability and predictability for science applications and distributed research projects, and
  • Data Driven Networking Infrastructure for the Campus and Researcher projects that invest in improvements and re-engineering at the campus level to make use of dynamic network services that support a range of scientific data transfers and movement.

Network Integration and Applied Innovation awards provide support of up to $1 million for up to two years.  These awards address the challenges of achieving end-user network performance across complex, distributed research and education environments.  They seek to integrate existing and new technologies with applied innovations by taking advantage of network research results, prototypes and emerging innovations–and using them to achieve higher levels of performance, reliability and predictability.


Data Driven Networking Infrastructure awards provide support of up to $500,000 for up to two years and address network infrastructure improvements at the campus level. These awards, for example, support upgrading and re-architecting campus networks to support movement of a wide range of large science data sets to include large files, sensor networks, distributed and real-time data.


A complete list of awardees and projects is available on the NSF website.


(Contributed by Kenneth Hines, CCC Program Associate)

Computer Science Projects Among Popular Mechanics’ Breakthrough Awardees

October 4th, 2012

Popular Mechanics, the American magazine that features regular articles on science and technology, released its annual Breakthrough Award winners earlier this week. These awards highlight innovations that have the potential to make the world smarter, safer and more efficient. A total of ten awards were announced, and at least four of the awardees feature computer science research. Those four projects are featured below; all awardees are listed on Popular Mechanics’ webpage.

Popular Mechanics' Breakthrough Award [credit: Popular Mechanics]


MABEL, Teaching Robots to Walk – Jessy Grizzle, University of Michigan, Ann Arbor, and Jonathan Hurst, Oregon State University

Walking, that fundamental human activity, seems simple: Take one foot, put it in front of the other; repeat. But to scientists, bipedalism is still largely a mystery, involving a symphony of sensory input (from legs, eyes, and inner ear), voluntary and involuntary neural commands, and the synchronized pumping of muscles hinged by tendons to a frame that must balance in an upright position. That makes building a robot that can stand up and walk in a world built for humans deeply difficult.


But it’s not impossible. Robots such as Honda’s ASIMO have been shuffling along on two feet for more than a decade, but the slow, clumsy performance of these machines is a far cry from the human gait. Jessy Grizzle of the University of Michigan, Ann Arbor, and Jonathan Hurst of Oregon State University have created a better bot, a 150-pound two-legged automaton named MABEL that can walk with a surprisingly human dexterity. MABEL is built to walk blindly (without the aid of laser scanners or other optical technologies) and fast (it can run a 9-minute mile). To navigate its environment, MABEL uses contact switches on its “feet” that send sensory feedback to a computer. “When MABEL steps off an 8-inch ledge, as soon as its foot touches the floor, the robot can calculate more quickly and more accurately than a human the exact position of its body,” explains Grizzle. MABEL uses passive dynamics to walk efficiently—storing and releasing energy in fiberglass springs—rather than fighting its own momentum with its electric motors.
The quest for walking robots is not purely academic. The 2011 Fukushima Daiichi nuclear disaster highlighted the need for machines that could operate in hazardous, unpredictable environments that would stop wheeled and even tracked vehicles. Grizzle and Hurst are already working on MABEL’s successor, a lighter, faster model named ATRIAS. But there’s still plenty of engineering to be done before walking robots can be usefully deployed, walking into danger zones with balance and haste but no fear.

IBM Blue Gene/Q Sequoia Supercomputer – Bruce Goodwin, Michel McCoy, Lawrence Livermore National Laboratory, IBM Research and IBM Systems & Technology Group

What is it? Sequoia, an IBM Blue Gene/Q supercomputer newly installed at Lawrence Livermore National Laboratory (LLNL) in Livermore, Calif. In June it officially became the most powerful supercomputer in the world.
How powerful are we talking about? Sequoia is currently capable of 16.32 petaflops—that’s more than 16 quadrillion calculations a second—55 percent faster than Japan’s K Computer, which is ranked No. 2, and more than five times faster than China’s Tianhe-1A, which surprised the world by taking the top spot in 2010. Sequoia’s processing power is roughly equivalent to that of 2 million laptops.
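These comparisons are simple ratio arithmetic and can be sanity-checked. A quick sketch, assuming the June 2012 Top500 figures of roughly 10.51 petaflops for the K Computer and 2.57 petaflops for Tianhe-1A, and a rough 8-gigaflop estimate for a 2012 laptop (those reference figures are assumptions, not from the article itself):

```python
# Rough sanity check of the article's comparisons. Assumed figures:
# K Computer ~10.51 PFLOPS, Tianhe-1A ~2.57 PFLOPS (June 2012 Top500),
# and ~8 GFLOPS for a typical 2012 laptop.
SEQUOIA_PFLOPS = 16.32
K_COMPUTER_PFLOPS = 10.51
TIANHE_1A_PFLOPS = 2.57
LAPTOP_GFLOPS = 8.0

speedup_over_k = SEQUOIA_PFLOPS / K_COMPUTER_PFLOPS - 1.0        # ~0.55 ("55 percent faster")
speedup_over_tianhe = SEQUOIA_PFLOPS / TIANHE_1A_PFLOPS          # ~6.4 ("more than five times")
laptop_equivalents = SEQUOIA_PFLOPS * 1e15 / (LAPTOP_GFLOPS * 1e9)  # ~2 million

print(f"{speedup_over_k:.0%} faster than K, "
      f"{speedup_over_tianhe:.1f}x Tianhe-1A, "
      f"~{laptop_equivalents / 1e6:.1f} million laptops")
```

Under those assumed reference numbers, the article's three claims all check out.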


What is it used for? The Department of Energy, which runs LLNL, has a mandate to maintain the U.S. nuclear weapons stockpile, so Sequoia’s primary mission is nuclear weapons simulations. But the DOE is also using computers like Sequoia to help U.S. companies do high-speed R&D for complex products such as jet engines and medical research. The goal is to help the country stay competitive in a world where industrial influence matters as much to national security as nukes do.

Brain-Computer Interface – Michael Boninger, Jennifer Collinger, Alan Degenhart, Andrew Schwartz, Elizabeth Tyler-Kabara, Wei Wang, University of Pittsburgh; Tim Hemmes

On the evening of July 11, 2004, Tim Hemmes, a 23-year-old auto-detail-shop owner, tucked his 18-month-old daughter, Jaylei, into bed and roared off for a ride on his new motorcycle. As he pulled away from a stop sign, a deer sprang out. Hemmes swerved, clipped a mailbox, and slammed headfirst into a guardrail. He awoke choking on a ventilator tube, terrified to find he could not lift his arms to scratch his itching nose.


Seven years later Hemmes was invited to participate in a University of Pittsburgh research project aimed at decoding the link between thought and movement. Hemmes enthusiastically agreed and last year made history by operating a robotic arm only with his thoughts.


The science was adapted from work done by Pitt neurobiologist Andrew Schwartz, who spent nearly three decades exploring the brain’s role in movement in animal trials. In 2008 his research group trained monkeys with brain microchips to feed themselves using a robotic arm controlled by signals from the creatures’ brains. Movement, Schwartz explains, is how we express our thoughts. “The only way I know what’s going on between your ears is because you’ve moved,” he says.


To apply this technology to humans, Schwartz teamed up with University of Pittsburgh Medical Center clinician Michael Boninger, physician/engineer Wei Wang, and engineer/surgeon Elizabeth Tyler-Kabara, who attached an electrocorticography (ECoG) grid to Hemmes’s brain surface. Wang then translated the electrical signals generated by Hemmes’s thoughts into computer code. The researchers hooked his implant to a robotic arm developed by the Johns Hopkins University Applied Physics Laboratory (which itself won a 2007 Breakthrough Award). Hemmes moved the robotic arm in three dimensions, giving Wang a slow but determined high-five.


The team’s ultimate goal is to embed sensors in the robotic arm that can send signals back to the brain, allowing subjects to “feel” whether an object the arm touches is hot, cold, soft, hard, heavy, or light. Hemmes has an even more ambitious but scientifically feasible goal. “I want to move my own arms, not just a robotic arm,” he says. If that happens, the first thing he’ll do is hug his daughter.


[A] Thought To touch the apple, the patient imagines a simple action, such as flexing a thumb, to move the arm in a single direction.


[B] Signal Pickup A postage-stamp-size implant picks up electrical activity generated by the thought and sends the signals to a computer.


[C] Interpretation A computer program parses signals from the chip and, once it picks up on specific activity patterns, sends movement data to the arm.


[D] Action The patient can move the arm in any direction by combining multiple thoughts—flexing a thumb while bending an elbow—guiding the arm toward the apple.
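The four steps above amount to a decoding loop: recognized thought patterns map to directions, and simultaneous patterns combine into one movement command. Here is a toy sketch of that idea (the pattern names, direction mapping, and combination rule are all hypothetical illustrations, not the Pitt team's actual decoder):

```python
import math

# Toy sketch of steps [A]-[D]: each learned thought pattern maps to a
# direction, and simultaneously detected patterns combine into a single
# movement command. All names and mappings here are hypothetical.
DIRECTION_FOR_PATTERN = {
    "flex_thumb": (1.0, 0.0, 0.0),  # [A] one imagined action -> one axis
    "bend_elbow": (0.0, 1.0, 0.0),
    "close_hand": (0.0, 0.0, 1.0),
}

def decode(detected_patterns):
    """[C]/[D]: sum the directions of all patterns detected this frame
    and normalize to a unit-length velocity command for the arm."""
    command = [0.0, 0.0, 0.0]
    for pattern in detected_patterns:
        for axis, component in enumerate(DIRECTION_FOR_PATTERN[pattern]):
            command[axis] += component
    norm = math.sqrt(sum(c * c for c in command))
    return [c / norm for c in command] if norm > 0 else command

# Combining "flex thumb" and "bend elbow" yields a diagonal movement,
# as described in step [D].
print(decode(["flex_thumb", "bend_elbow"]))
```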

CORNAR Camera – Ramesh Raskar and Andreas Velten, MIT Media Lab

Through two centuries of technological change, one limitation of photography has remained constant: A camera can only capture images in its line of sight. But now a team of university researchers led by MIT Media Lab professor Ramesh Raskar has built a camera that sees around corners. The CORNAR system bounces high-speed laser pulses off any opaque surface, such as a door or wall. These pulses then reflect off the subject and bounce back to a camera that records incoming light in picosecond intervals. The system measures and triangulates distance based on this time-of-flight data, creating a point cloud that visually represents the objects in the other room. Essentially the camera measures and interprets reflected echoes of light.


“For many people, being able to see around corners has been this science-fiction dream scenario,” says longtime New York University computer science professor Ken Perlin, who was not involved in the research. “Well, dream or nightmare, depending on how people use it.”


[A] Laser A beam is reflected off the door, scattering light into the room.


[B] Camera Light reflects off subject and bounces back to camera, which records time-of-flight data.


[C] Computer Algorithms create a composite image from camera data.
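The time-of-flight arithmetic behind steps [A] through [C] can be sketched for a single pulse path. This is an illustrative calculation under an assumed geometry, not the CORNAR reconstruction algorithm, which intersects many such measurements to build its point cloud:

```python
# Illustrative time-of-flight arithmetic, not the CORNAR algorithm itself.
# A pulse travels laser -> wall spot -> hidden object -> wall spot -> camera.
# Total path length is c * t; subtracting the two known legs (laser to
# wall, wall to camera) leaves the round trip to the hidden object.
C = 299_792_458.0  # speed of light, m/s

def hidden_object_distance(arrival_time_s, laser_to_wall_m, wall_to_camera_m):
    total_path_m = C * arrival_time_s
    round_trip_m = total_path_m - laser_to_wall_m - wall_to_camera_m
    return round_trip_m / 2.0  # wall spot -> hidden object, one way

# Assumed geometry: 2 m from laser to wall and wall to camera, and a
# hidden object 1 m past the wall spot, giving a 6 m total path.
d = hidden_object_distance(arrival_time_s=6.0 / C,
                           laser_to_wall_m=2.0, wall_to_camera_m=2.0)
print(round(d, 6))  # ~1.0 m
```

A 1-picosecond timing error shifts the recovered distance by only about 0.15 mm, which is why the camera's picosecond-interval recording is essential.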



(Contributed by Kenneth Hines, CCC Program Associate)

NSF Invests Nearly $15 Million in Big Data; New Interagency Challenge Announced

October 3rd, 2012

Tech America Logo [credit: Tech America]

Today at a briefing on Capitol Hill titled “Big Data, Bigger Opportunities,” hosted by Tech America, the National Science Foundation (NSF) and the National Institutes of Health (NIH) announced $15 million in funding for Big Data research. These awards come nearly six months after the Obama Administration released its substantial Big Data R&D initiative in March of this year.

The initiative committed more than $200 million in new funding through six agencies and departments to improve the nation’s “ability to extract knowledge and insights from large and complex collections of digital data.”

Subra Suresh, Director of the National Science Foundation:

“I am delighted to provide such a positive progress report just six months after fellow federal agency heads joined the White House in launching the Big Data Initiative,” said NSF Director Subra Suresh. “By funding the fundamental research to enable new types of collaborations–multi-disciplinary teams and communities–and with the start of an exciting competition, today we are realizing plans to advance the foundational science and engineering of Big Data, fortifying U.S. competitiveness for decades to come.”

Suzanne Iacono, Senior Advisor at the National Science Foundation, announced the eight award recipients, listed below:

BIGDATA: Mid-Scale: DCM: A Formal Foundation for Big Data Management
University of Washington, Dan Suciu
This project explores the foundations of big data management with the ultimate goal of significantly improving the productivity in Big Data analytics by accelerating data exploration. It will develop open source software to express and optimize ad hoc data analytics. The results of this project will make it easier for domain experts to conduct complex data analysis on Big Data and on large computer clusters.


BIGDATA: Mid-Scale: DA: Analytical Approaches to Massive Data Computation with Applications to Genomics
Brown University, Eli Upfal
The goal of this project is to design and test mathematically well-founded algorithmic and statistical techniques for analyzing large-scale, heterogeneous and so-called noisy data. This project is motivated by the challenges in analyzing molecular biology data. The work will be tested on extensive cancer genome data, contributing to better health and new health information technologies, areas of national priority.


BIGDATA: Mid-Scale: DA: Distribution-based Machine Learning for High-dimensional Datasets
Carnegie Mellon University, Aarti Singh
The project aims to develop new statistical and algorithmic approaches to natural generalizations of a class of standard machine learning problems.  The resulting novel machine learning approaches are expected to benefit other scientific fields in which data points can be naturally modeled by sets of distributions, such as physics, psychology, economics, epidemiology, medicine and social network-analysis.


BIGDATA: Mid-Scale: DA: Collaborative Research: Genomes Galore – Core Techniques, Libraries, and Domain Specific Languages for High-Throughput DNA Sequencing
Iowa State University, Srinivas Aluru
Stanford University, Oyekunle Olukotun
Virginia Polytechnic Institute and State University, Wu-chun Feng
The goal of the project is to develop core techniques and software libraries to enable scalable, efficient, high-performance computing solutions for high-throughput DNA sequencing, also known as next-generation sequencing. The research will be conducted in the context of challenging problems in human genetics and metagenomics, in collaboration with domain specialists.


BIGDATA: Mid-Scale: DA: Collaborative Research: Big Tensor Mining: Theory, Scalable Algorithms and Applications
Carnegie Mellon University, Christos Faloutsos
University of Minnesota, Twin Cities, Nikolaos Sidiropoulos
The objective of this project is to develop theory and algorithms to tackle the complexity of language processing, and to develop methods that approximate how the human brain works in processing language. The research also promises better algorithms for search engines, new approaches to understanding brain activity, and better recommendation systems for retailers.


BIGDATA: Mid-Scale: ESCE: Collaborative Research: Discovery and Social Analytics for Large-Scale Scientific Literature
Rutgers University, Paul Kantor
Cornell University, Thorsten Joachims
Princeton University, David Blei
This project will focus on the problem of bringing massive amounts of data down to the human scale by investigating the individual and social patterns that relate to how text repositories are actually accessed and used. It will improve the accuracy and relevance of complex scientific literature searches.


The event also featured a new challenge announcement on Big Data science and engineering. The challenge is an interagency collaboration between the NSF, National Aeronautics and Space Administration (NASA) and the Department of Energy (DOE). More information on the challenge from the NSF press release:

The competition will be run by the NASA Tournament Lab (NTL), a collaboration between Harvard University and TopCoder, a competitive community of digital creators.

The NTL platform and process allow U.S. government agencies to conduct high-risk/high-reward challenges in an open and transparent environment with predictable cost, measurable outcomes-based results and the potential to move quickly into unanticipated directions and new areas of software technology. Registration is open through Oct. 13, 2012, for the first of four idea generation competitions in the series. Full competition details and registration information are available at the Ideation Challenge Phase website.

Read the entire press release on the National Science Foundation’s website.
(Contributed by Kenneth Hines, CCC Program Associate)

Mozilla and NSF Announce First Round of Winners for Brainstorming Phase of Ignite Challenge

September 26th, 2012

As we’ve previously blogged, Mozilla and the National Science Foundation (NSF) have teamed up for a challenge, called “Mozilla Ignite,” which focuses on the development of apps for the faster, smarter internet of the future. Apps were designed to address needs in advanced manufacturing, education and workforce technologies, emergency preparedness and public safety, healthcare technologies, and clean energy and transportation. The brainstorming round closed on August 23rd, bringing in over 300 ideas from the community. Mozilla announced a total of eight winners, with one grand prize award.

Mozilla Ignite [credit: Mozilla Foundation]

The grand prize from the brainstorming round went to Jeremy Cooperstock, director of the Shared Reality Lab at McGill University in Canada. Here is the description of his idea, titled “Real-time emergency response observation and supervision,” from the Mozilla Ignite blog:

This app saves lives. The goal: arm firefighters, rescue workers and first-responders with powerful new real-time data and communications, combining live, high-quality video from multiple feeds with real-time sensor data — like heat and smoke levels — plus massive computing capacity to improve decision-making and coordination.

Read about the other winning apps on the Mozilla Ignite blog.

The next round of the challenge is the development round, in which $85,000 will be awarded. Developers can enter the next round to help build upon one of the winners, or submit a new idea.

(Contributed by Kenneth Hines, CCC Program Associate)

NSF Awards $50 million for Cybersecure Research Projects

September 25th, 2012

Today, the National Science Foundation (NSF) awarded $50 million for research projects designed to help build a secure cyber society and protect the US infrastructure. The awards come from the Secure and Trustworthy Cyberspace Program (SaTC), which “builds on [NSF’s] long-term support for a wide range of cutting edge interdisciplinary research and education activities to secure critical infrastructure that is vulnerable to a wide range of threats that challenge its security.”

U.S. National Science Foundation (NSF) [image courtesy NSF].

The funded projects include two frontier awards. The first, titled “Beyond Technical Security: Developing an Empirical Basis for Socio-Economic Perspectives,” is a multi-institution collaboration among Stefan Savage (University of California, San Diego), Vern Paxson (International Computer Science Institute) and Damon McCoy (George Mason University). The second frontier award, titled “Privacy Tools for Sharing Research Data,” goes to Salil Vadhan and his team of multi-disciplinary researchers at Harvard University.

Read more from the NSF press release below.

» Read more: NSF Awards $50 million for Cybersecure Research Projects