Tuesday, December 13, 2011

“Airplanes Don’t Flap Their Wings”


When inventors mirror, expand and “improve” on inherent abilities found in nature, the general trend is to create functioning machines that are quite different from their inspirational counterparts in nature. Imagine a computer that forgot stuff or got angry easily. And sitting in an aircraft with flapping wings would make for a pretty uncomfortable journey. The above quote pretty much says it all, but sometimes a return to nature’s model may be just the ticket in complex technology.

As the sum total of data seems to double every couple of days, the quest for the Holy Grail of computing seems to be taking three distinct tracks: smaller machines for personal tasks; supercomputers for the biggest and most complex analytics and calculations; and self-contained machines that teach themselves (learning computers) and don’t require the massive programming inherent in most such instruments. The last is commonly referred to as “artificial intelligence” and has obsessed science fiction writers for quite some time: Commander Data from one of the Star Trek franchises and Steven Spielberg’s film A.I. are prime examples, although both explored the complexity of replicating human emotions. But such “human thinking” capacities have also obsessed scientists, whether combined with robotics or focused simply on computing power and functionality.

A computer with anything like limitless intelligence is not particularly containable, so computer engineers pursuing artificial intelligence have turned to the fuzzy logic of human brains. They have long since realized that even with the tiniest microcircuits, computers that approach human functionality without the frailties of human thinking are simply too big, run too hot and consume too much power to be of much use in a smaller autonomous unit. “To meet the challenge, without gobbling the world’s energy supply, a different approach will be needed. And biology, scientists say, promises to contribute more than metaphors. ‘Every time we look at this, biology provides a clue as to how we should pursue the frontiers of computing,’ said John E. Kelly, the director of research at I.B.M.

Dr. Kelly points to Watson, the question-answering computer that played ‘Jeopardy!’ and beat two human champions earlier this year. I.B.M.’s clever machine consumes 85,000 watts of electricity, while the human brain runs on just 20 watts. ‘Evolution figured this out,’ Dr. Kelly said.” New York Times, December 5th. Working with a number of major universities and feeding off government (Defense Dept.) funding, IBM “has developed prototype ‘neurosynaptic’ microprocessors, or chips that operate more like neurons and synapses than like conventional semiconductors.” NY Times.

A brain does its computing with a design drastically different from today’s computers. Its processors — neurons — are, in computing terms, massively distributed; there are billions in a human brain. These neuron processors are intertwined with the brain’s memory devices — synapses — so that its paths of communication, the neurons’ axons that conduct electrical impulses, are extremely efficient and diverse.
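For the technically inclined, here is a toy sketch of that idea in Python (my own illustration, not anything drawn from the article or from I.B.M.’s design): a leaky integrate-and-fire neuron whose synaptic weights, its memory, live inside the very unit that does the processing.

```python
# A toy "neuron as processor, synapse as memory" unit: a leaky
# integrate-and-fire neuron. The weights (synapses) are stored inside
# the unit itself, so compute and memory sit together rather than
# being separated by a bus. Purely illustrative; not IBM's design.

class LIFNeuron:
    def __init__(self, weights, threshold=1.0, leak=0.9):
        self.weights = weights      # synapses: local memory, one per input
        self.potential = 0.0        # membrane potential (internal state)
        self.threshold = threshold  # firing threshold
        self.leak = leak            # per-step decay of the potential

    def step(self, spikes):
        """Integrate incoming spikes (0/1 per input) and maybe fire."""
        self.potential = self.leak * self.potential + sum(
            w * s for w, s in zip(self.weights, spikes)
        )
        if self.potential >= self.threshold:
            self.potential = 0.0    # reset after firing
            return 1                # a spike travels down the axon
        return 0

# A brain runs billions of these in parallel; here, just one.
neuron = LIFNeuron(weights=[0.4, 0.3, 0.5])
for t, spikes in enumerate([[1, 0, 1], [0, 1, 0], [1, 1, 1]]):
    print(t, neuron.step(spikes))
```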

“A machine that adopts that approach, Dr. [Dharmendra S. Modha, the I.B.M. computer scientist leading the project] said, would represent ‘a crucial shift away from von Neumann computing.’ He was referring to a design with processor and memory physically separated and connected by a narrow communications channel, or bus, and operating according to step-by-step sequential methods — the von Neumann architecture used in current computers, named after the mathematician John von Neumann.
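To see the contrast from the other side, here is an equally toy sketch of the von Neumann layout the article describes, with a made-up instruction set: one memory holds both program and data, and the processor must fetch every instruction and operand across a single narrow channel, strictly one step at a time.

```python
# A minimal von Neumann machine: processor and memory are separate,
# and everything crosses one narrow channel (the "bus") sequentially.
# The instruction set is invented purely for illustration.

memory = {0: ("LOAD", 10), 1: ("ADD", 11), 2: ("STORE", 12),
          3: ("HALT", None), 10: 2, 11: 3, 12: 0}

def run(memory):
    pc, acc = 0, 0                    # program counter, accumulator
    while True:
        op, addr = memory[pc]         # fetch: one trip over the bus
        pc += 1
        if op == "LOAD":
            acc = memory[addr]        # another trip for the operand
        elif op == "ADD":
            acc += memory[addr]
        elif op == "STORE":
            memory[addr] = acc
        elif op == "HALT":
            return acc

print(run(memory))  # -> 5, computed strictly one step at a time
```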

“It is an appealing vision, but there are [still] formidable obstacles. The prototype chip has 256 neuron-like nodes, surrounded by more than 262,000 synaptic memory modules. That is impressive, until one considers that the human brain is estimated to house up to 100 billion neurons. In [I.B.M.’s San Jose-based] Almaden research lab, a computer running the chip has learned to play the primitive video game Pong, correctly moving an on-screen paddle to hit a bouncing cursor. It can also recognize numbers 1 through 10 written by a person on a digital pad — most of the time. But the project still has a long way to go.” NY Times.
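The article doesn’t describe how the chip actually learns, so take this as a hypothetical stand-in rather than a peek at I.B.M.’s method: a single perceptron, about the most primitive learning machine there is, that learns from its own mistakes whether to push a Pong paddle up or down.

```python
# A hypothetical stand-in for "learning to play Pong": a one-weight
# perceptron trained by trial and error to move a paddle toward the
# ball. No claim that the neurosynaptic chip works this way.

import random

w, b = 0.0, 0.0                      # the machine's entire "memory"
random.seed(0)

for _ in range(200):                 # practice rounds
    ball, paddle = random.randint(0, 9), random.randint(0, 9)
    x = ball - paddle                # where the ball sits relative to us
    if x == 0:
        continue                     # already lined up; nothing to learn
    target = 1 if x > 0 else -1     # the correct move: up (+1) or down (-1)
    guess = 1 if w * x + b > 0 else -1
    if guess != target:              # classic perceptron update on error
        w += target * x
        b += target

# After practice, the learned rule reliably chases the ball.
print("ball above -> move", "up" if w * 3 + b > 0 else "down")
print("ball below -> move", "up" if w * -3 + b > 0 else "down")
```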

Given the rate at which computing power has advanced over the past few decades, I can only wonder what will happen in this field when the big breakthrough skyrockets this technology to breathtaking levels. Meanwhile, anyone for a game of Pong? Can I have a beer with that?

I’m Peter Dekom, and looking at technology before it breaks through can be most amusing.
