Computing Community Consortium Blog

The goal of the Computing Community Consortium (CCC) is to catalyze the computing research community to debate longer range, more audacious research challenges; to build consensus around research visions; to evolve the most promising visions toward clearly defined initiatives; and to work with the funding organizations to move challenges and visions toward funding initiatives. The purpose of this blog is to provide a more immediate, online mechanism for dissemination of visioning concepts and community discussion/debate about them.

“Imagining Tomorrow’s Computers Today”

July 26th, 2012 / in big science, research horizons, Research News / by Erwin Gianchandani

Following a talk at the Euroscience Open Forum earlier this month, Intel principal engineer and futurist Brian David Johnson sat down with ScienceNOW to discuss his forecasts about “the interaction between humans and computers.” Noting he’s focused on the year 2020, Johnson had the following to say as part of the Q&A:

Brian David Johnson, Intel Corporation [image courtesy EuroScience Open Forum via ScienceNOW].

Q: You study the interaction between humans and computers. What do you foresee in the next 10, 15 years?


B.D.J.: Looking at the past, technology has been about command and control. In the future it will be about relationships. Our technologies will get to know us and we’ll become more tightly connected. That has an impact on what we do productivity-wise, but even more it connects us to the things and people we love. Siri, the personal assistant built into your iPhone, is an early example of that. You literally talk with your phone and it can talk back to you.


Q: In what way does the development of chips play a role in this?


B.D.J.: As we move closer to 2020, the size of computational chips is becoming so small that it is approaching zero. This means we could literally turn anything into a computer. Your tea glass, the table, you name it. A switch is coming, where we no longer have to ask, “Can we turn that into a computer?” We know we can, and we wonder instead: Is there a use for doing it? That is what we have the social scientists for. We do not study markets, we study people.


Q: Have the technical challenges to do that been overcome?


B.D.J.: They will be. Siri is far from perfect. Currently, if you have an accent different from people like me who live around Silicon Valley, it will not work well. With more computational power, things will improve, but there is a lot more to be done.


Q: There are downsides to the rapid development of technology. People are said to develop attention deficit hyperactivity disorder (ADHD) due to the information and interaction overload. Do you study that as well?


B.D.J.: Yes, we do. That project is called The Future of Fear. The thing is: These communication technologies are very new. We do not have rules and norms on how to handle them. Those norms will take shape in the coming years. I have already talked to people who tell me they have set limits on their screen time.


Q: And what about the ability to think deeply or to remember? Do you consider these issues?


B.D.J.: Yes, we do. An interesting study out of Columbia University published in Science magazine last July showed that we are indeed off-loading our memory to our devices. It is already happening, they said. We have lower recall rates for the information itself, but higher recall rates for where to access that information. And they also noted that this is not new: We have been off-loading our oral history to books. That is not bad; it is progress.

Check out the full interview here. And share your thoughts on Johnson’s perspective in the space below.

(Contributed by Erwin Gianchandani, CCC Director)


Comments are closed.