Tuesday, July 12, 2011

Intel: Human and computer intelligence will merge in 40 years

On the company's anniversary, a future of sensors, robots and new thinking
By Sharon Gaudin
July 23, 2008

Computerworld - At Intel Corp., which has just passed its 40th anniversary with myriad chips in its historical roster, a top company executive looks 40 years into the future, to a time when human intelligence and machine intelligence have begun to merge.

Justin Rattner, CTO and a senior fellow at Intel, told Computerworld that perhaps as early as 2012 we'll see the lines between human and machine intelligence begin to blur. Nanoscale chips or machines will move through our bodies, fixing deteriorating organs or unclogging arteries. Sensors will float through our internal systems, monitoring our blood sugar levels and heart rates and alerting doctors to potential health problems.

Virtual worlds will become increasingly realistic, while robots will develop enough intelligence and human-like characteristics that they'll become companions, not merely vacuum cleaners and toys.

Most aspects of our lives, in fact, will be very different as we close in on the year 2050. Computing will be less about launching applications and more about living lives in which computers are inextricably woven into our daily activities.

"What we think of as a computer and what we think of as IT, in general, is likely to change," said Rattner, who has been at Intel for 35 of the company's 40 years. "The intelligent systems will move from being information systems to intelligent systems that will carry out a whole variety of tasks that we just won't think of as computing tasks.... The technology will find its way into so many things we do, and we won't even think about it. The explicit way we've done computing in the past will be there, but it will be a very small subset of what we'll be doing."

Intel hit its 40th anniversary last Friday. The company launched its first microprocessor in 1971, developed a processor with more than 1 million transistors in 1989, and late in 2007 packed 820 million transistors onto a single chip.

While chip advancements will continue throughout the semiconductor industry, the nature of technology advancement in general will begin to change, according to Rattner.

"When you think back on where we were [decades ago] ... computers were still things that largely sat in big rooms behind big windows and were attended to by computing gurus or priests," he added. "In the 40 years, we've just completely changed the way people think about computers and computing. It's gone from a very expensive, very exclusive kind of technology to something that is unquestionably ubiquitous -- from the computers on our desks to the computers in our cell phones."

In the next 40 years, computer chips will extend beyond our computers and phones as people seek to become more immersed in virtual worlds and computers learn to react to our motions and thoughts.

"When you see how intense the reaction is to things like the iPhone, with its use of touch and its sensitivity to motion, you begin to get a sense of, 'Gee, if machines understand the physical world and were capable of reacting to our voices, to our movements and gestures and touch, how much closer would we feel to them?'" asked Rattner. "At the same time, of course, we would like the ability to become more a part of these artificial or virtual worlds that are created entirely within the machine. We're starting to see, with things like Second Life and now Lively from Google, the ability of these machines to create these worlds that are much more comfortable for us to experience and be a part of."

As machine learning and computer perception progress, machines will take on more and more human-like characteristics, he added. Recently, scientists have been implanting electrodes into neurons in living brains, but some researchers are working on ways to read brain signals and organic information without electrodes, an approach that wouldn't be physically intrusive.

"You can imagine a future where, in fact, not just our very senses will be engaged, but our thoughts will drive machine behavior," said Rattner. "You can see how that boundary starts to soften and begins to blur.... There's no question in my mind that the technology will bring these two unique and distinct forms of intelligence together."

http://www.computerworld.com/s/article/9110578/Intel_Human_and_computer_intelligence_will_merge_in_40_years
