Computing Community Consortium Blog

The goal of the Computing Community Consortium (CCC) is to catalyze the computing research community to debate longer range, more audacious research challenges; to build consensus around research visions; to evolve the most promising visions toward clearly defined initiatives; and to work with the funding organizations to move challenges and visions toward funding initiatives. The purpose of this blog is to provide a more immediate, online mechanism for dissemination of visioning concepts and community discussion/debate about them.


NY Times Keeps Talking Computing

December 8th, 2011 / in big science, research horizons, Research News / by Erwin Gianchandani

In addition to Tuesday’s special Science Times describing the future of computing, The New York Times has featured several other articles this week about cutting-edge work in the field.

For instance, yesterday the Times covered University of Washington Computer Science and Engineering Professor Oren Etzioni’s price-prediction startup Decide — which applies data mining and machine learning to electronics prices to help consumers determine when it’s best to buy the gadgets on their wish lists:

Shopping with a smartphone [image courtesy Matthew Ryan Williams for The New York Times via NYTimes.com].

If only shopping for electronics were as easy as buying a car.

There was a time not so long ago that buying a car was one of the worst shopping experiences. As you drove off the dealer’s lot, you couldn’t escape the feeling that you hadn’t gotten the best deal.

Then something changed that made shopping for a car almost, well, fun, like shopping is supposed to be.

That something was information. Much to the annoyance of the dealers, a car buyer could obtain reliable data on auto prices. For only a few dollars, Consumer Reports estimates what a dealer paid for the car and what, after a fair markup, the buyer should pay. A shopper often got enough information to know when it would be the best time to buy the car.

Buying a camera or a smartphone isn’t as easy because we lack information about prices. There is the illusion that the Internet has provided what we need to know about the prices, but that jumble of information makes any buying decision more confusing and anxiety-producing.

The fact of the matter is, the shopper is not on a level playing field with the retailers and the manufacturers. They know, thanks to the Internet, more about you and your intentions than you could ever know about theirs. Every time you look at a product on a site, every time you buy a product online, you are providing valuable signals.

When companies scrape the Web and collect those billions of signals and sophisticated software collates the data and interprets it, you don’t stand a chance. It is just too difficult and expensive for you to gather all the information. And in any case, you probably lack the degree in mathematics and computer science needed to parse the data.

Yet nearly everything you spend money on is determined by the algorithms they create. The prices of your airline ticket and hotel room, even your rent, are determined this way. Electronics and other consumer goods are priced the same way…

[So] perhaps the biggest consumer weapon arrived this year in the form of Decide.com. It is a Web site, and more recently an app for mobile devices, that collects and mines billions of transactions to determine what the best price is and whether there will be an even better price soon.

“It’s the first time when-to-buy is addressed,” said Mike Fridgen, Decide’s chief executive officer.

For example, Decide.com said last week with 81 percent confidence that the Panasonic Viera 50-inch plasma TV (model TC-P50S30), a popular model, would drop in price within the next two weeks. It also predicted, with 62 percent confidence, that a new model would come along within three months…

“We are not clairvoyants,” said [Etzioni]. “We give consumers visibility.”

Decide is run by many of the same people who built Farecast, a site that gave consumers a fighting chance against the airlines, which are constantly changing prices to match demand…
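
The when-to-buy idea in the excerpt, mining past price behavior to estimate whether a better price is coming soon, can be illustrated with a small sketch in Python. This is a toy, not Decide’s actual system; the price history, the 14-day horizon, and the 5 percent drop threshold are assumptions made purely for illustration.

```python
"""
Toy illustration (not Decide.com's actual method) of a when-to-buy signal:
estimate the probability that a product's price drops within the next 14 days,
based on how often similar situations in its price history led to a drop.
All data and names here are made up for the sketch.
"""

from typing import List


def drop_within_horizon(prices: List[float], start: int, horizon: int, threshold: float) -> bool:
    """True if the price falls below prices[start] * (1 - threshold) within `horizon` days."""
    floor = prices[start] * (1.0 - threshold)
    end = min(start + horizon + 1, len(prices))
    return any(p <= floor for p in prices[start + 1:end])


def prob_drop(prices: List[float], horizon: int = 14, threshold: float = 0.05) -> float:
    """
    Empirical probability of a >= threshold drop within `horizon` days,
    estimated from every past day in the series that has a full horizon
    of later observations. A real system would condition on many more
    signals (model age, seasonality, retailer behavior, and so on).
    """
    labeled_days = range(len(prices) - horizon)
    if not labeled_days:
        raise ValueError("price history too short for the chosen horizon")
    drops = sum(drop_within_horizon(prices, day, horizon, threshold) for day in labeled_days)
    return drops / len(labeled_days)


if __name__ == "__main__":
    # Hypothetical daily prices for a TV over roughly ten weeks.
    history = [899, 899, 899, 879, 879, 899, 899, 849, 849, 849,
               869, 869, 869, 829, 829, 829, 849, 849, 799, 799,
               799, 819, 819, 819, 779, 779, 799, 799, 799, 749,
               749, 769, 769, 769, 729, 729, 749, 749, 749, 699]

    p = prob_drop(history)
    print(f"Estimated chance of a 5%+ price drop in the next 14 days: {p:.0%}")
```

The real service presumably works over billions of observations and many products at once; the sketch only shows the basic shape of the question being asked of the data.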

And on Monday, Times’ tech writer Steve Lohr authored a story about bioinspired design — a topic we’ve previously featured on this Blog — and some groundbreaking work being funded by the Defense Advanced Research Projects Agency (DARPA) that looks at utilizing biology to advance artificial intelligence. In his article, Lohr writes:

Dharmendra S. Modha of IBM leads a team developing chips that structurally resemble the brain [image courtesy Tony Avelar/Bloomberg News via NYTimes.com].

Ever since the early days of modern computing in the 1940s, the biological metaphor has been irresistible. The first computers — room-size behemoths — were referred to as “giant brains” or “electronic brains,” in headlines and everyday speech. As computers improved and became capable of some tasks familiar to humans, like playing chess, the term used was “artificial intelligence.” DNA, it is said, is the original software.

For the most part, the biological metaphor has long been just that — a simplifying analogy rather than a blueprint for how to do computing. Engineering, not biology, guided the pursuit of artificial intelligence. As Frederick Jelinek, a pioneer in speech recognition, put it, “airplanes don’t flap their wings.”

Yet the principles of biology are gaining ground as a tool in computing. The shift in thinking results from advances in neuroscience and computer science, and from the prod of necessity.

The physical limits of conventional computer designs are within sight — not today or tomorrow, but soon enough. Nanoscale circuits cannot shrink much further. Today’s chips are power hogs, running hot, which curbs how much of a chip’s circuitry can be used. These limits loom as demand is accelerating for computing capacity to make sense of a surge of new digital data from sensors, online commerce, social networks, video streams and corporate and government databases.

To meet the challenge, without gobbling the world’s energy supply, a different approach will be needed. And biology, scientists say, promises to contribute more than metaphors. “Every time we look at this, biology provides a clue as to how we should pursue the frontiers of computing,” said John E. Kelly, the director of research at IBM.

Dr. Kelly points to Watson, the question-answering computer that played “Jeopardy!” and beat two human champions earlier this year. IBM’s clever machine consumes 85,000 watts of electricity, while the human brain runs on just 20 watts. “Evolution figured this out,” Dr. Kelly said.

Several biologically inspired paths are being explored by computer scientists in universities and corporate laboratories worldwide. But researchers from IBM and four universities — Cornell, Columbia, the University of Wisconsin, and the University of California, Merced — are engaged in a project that seems particularly intriguing.

The project, a collaboration of computer scientists and neuroscientists begun three years ago, has been encouraging enough that in August it won a $21 million round of government financing from [DARPA], bringing the total to $41 million in three rounds. In recent months, the team has developed prototype “neurosynaptic” microprocessors, or chips that operate more like neurons and synapses than like conventional semiconductors…
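
For readers unfamiliar with what “operating more like neurons and synapses” means in practice, here is a deliberately simplified sketch of a leaky integrate-and-fire neuron, a standard textbook model of spiking computation. It is not IBM’s neurosynaptic chip architecture; the threshold, leak, and input values are all illustrative.

```python
"""
A generic leaky integrate-and-fire neuron, sketched to give a flavor of the
neuron-and-synapse style of computation the article describes. This is a
textbook model, not IBM's neurosynaptic chip design; parameters are made up.
"""


def simulate_lif(input_current, threshold=1.0, leak=0.9, reset=0.0):
    """
    Integrate input over time; when the membrane potential crosses the
    threshold, emit a spike (1) and reset. Otherwise the potential leaks
    back toward rest each step and no spike (0) is emitted.
    """
    potential = reset
    spikes = []
    for current in input_current:
        potential = leak * potential + current   # leaky integration
        if potential >= threshold:
            spikes.append(1)                     # fire
            potential = reset                    # reset after a spike
        else:
            spikes.append(0)
    return spikes


if __name__ == "__main__":
    # Strong drive early and in the middle, weak drive elsewhere: the neuron
    # spikes only when the accumulated input crosses the threshold.
    drive = [0.3, 0.4, 0.5, 0.5, 0.1, 0.1, 0.6, 0.6, 0.05, 0.05]
    print(simulate_lif(drive))  # e.g. [0, 0, 1, 0, 0, 0, 1, 0, 0, 0]
```

The appeal the article points to is that computation of this event-driven kind can, in principle, be far more power-efficient than continuously clocked conventional logic.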

Read the rest of these articles — on Decide and bioinspired AI — and don’t forget to check out this week’s New York Times’ Science Times.

(Contributed by Erwin Gianchandani, CCC Director)
