Train Your Smartphone Like You Train Your Dog: Coming Soon

Is your smartphone waking you up with late-night calls? Tell that phone it's a “bad phone” and someday, in the not-too-distant future, it may learn on its own not to do it again.
Want a special alert whenever your boyfriend texts? A little nudge of “good phone” may teach that device a few new tricks.
The technology industry is taking a cue from the natural world to make devices smarter, modeling new computer processors after the design of biological brains. In just a few years, you may be able to train a device with a Zeroth chip from Qualcomm (QCOM), for example, just like you train your dog.
“We’re looking at a processor more adept at processing sensory information, recognizing patterns and making predictions,” says Anthony Lewis, lead engineer on the Zeroth project.
Promising new directions

Computer scientists have been playing with brain-like neural networks for decades, mainly as complex software simulations running on massive supercomputers. But advancing research in both chip design and the workings of the brain is pointing in promising new directions for consumer devices.
“The next evolution of our smart devices is turning them into genius devices,” says Francis Sideco, a senior director at IHS who follows developments in consumer electronics components. “They will be gaining the capability to understand and receive stimuli from the real world and act upon it.”
Qualcomm's Matt Grob
Phones have already gained a measure of these abilities from software running in big data centers. Google’s (GOOG) Google Now and Apple’s (AAPL) Siri, for example, are steadily improving at voice recognition thanks to machine-learning software. But the intelligence lives entirely in the cloud, which requires a constant Internet connection and limits what the devices can do.
Computer processors designed after structures in the brain, with silicon neurons and synapses, should in theory be able to handle tasks such as learning or recognizing objects in real time with vastly less computing power than software solutions require. The chips are also designed to draw far less power, extending battery life.
Standard computer chips are based on what’s called the von Neumann design, named for the famed mathematician John von Neumann. They process a sequence of instructions in order, or, if they have multiple cores, a few sequences at a time. Inside the chip, a constant stream of electricity is needed to move information around.
The human brain has 86 billion neurons, each linked to other neurons by up to thousands of synapses, all working in parallel. Power is needed only intermittently, as neurons fire messages across those synapses in short spikes.
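To see the contrast in miniature, consider the leaky integrate-and-fire neuron, a textbook model of the spiking cells these chips emulate in silicon. The Python sketch below is purely illustrative; the constants and names are assumptions for explanation, not any chip maker's actual design.

```python
# A minimal leaky integrate-and-fire neuron: a textbook model of the
# spiking behavior neuromorphic chips emulate in silicon. The constants
# and names here are illustrative assumptions, not any vendor's design.

def simulate_lif(inputs, threshold=1.0, leak=0.9):
    """Accumulate input current each step; fire a spike and reset
    whenever the membrane potential crosses the threshold."""
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = potential * leak + current  # charge leaks away over time
        if potential >= threshold:
            spikes.append(1)   # a brief spike: power is drawn only here
            potential = 0.0    # reset after firing
        else:
            spikes.append(0)   # silent step: essentially no activity
    return spikes

# A weak, steady input makes the neuron fire only occasionally;
# between spikes it sits idle, unlike a constantly clocked processor.
print(simulate_lif([0.3] * 20))
```

The point of the model is the duty cycle: the neuron does meaningful work only at the moments it spikes, which is why silicon versions of it can sip power instead of drawing it continuously.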
Brain-like architectures
That’s the model Qualcomm and other chip makers such as IBM (IBM) and Intel (INTC) are using to create more brain-like architectures right on the silicon chips.
Qualcomm’s Zeroth research project isn’t producing smartphone chips yet. That’s likely a few years away. But it has shown how the technology will work in a series of demonstrations.
At the EmTech conference at MIT this month, chief technology officer Matt Grob showed video of a robot learning to navigate its way around a carpet of multicolored squares. Human controllers gave the robot positive feedback only when it encountered white squares. Soon the robot learned to seek out only those sections.
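Under the hood, that demo is classic reinforcement learning: reward the behavior you want, and the system discovers it on its own. The sketch below shows the idea on a toy strip of colored squares using a tabular Q-learning update; the grid, rewards, and parameters are illustrative assumptions, not Qualcomm's actual spiking-network approach.

```python
import random

# Toy reward-driven learner in the spirit of the demo: an agent on a
# strip of colored squares gets positive feedback only on white squares
# and gradually learns to seek them out. The grid, rewards, and tabular
# Q-learning update are illustrative assumptions, not Qualcomm's method.

CARPET = ["white", "red", "white", "blue", "green", "white"]
ACTIONS = [-1, 1]  # step left or step right

def reward(pos):
    return 1.0 if CARPET[pos] == "white" else 0.0  # "good robot" on white

q = {(p, a): 0.0 for p in range(len(CARPET)) for a in ACTIONS}
alpha, gamma, epsilon = 0.5, 0.9, 0.1  # learning rate, discount, exploration

pos = 0
for step in range(5000):
    # Epsilon-greedy: mostly exploit what's been learned, sometimes explore.
    if random.random() < epsilon:
        a = random.choice(ACTIONS)
    else:
        a = max(ACTIONS, key=lambda act: q[(pos, act)])
    nxt = min(max(pos + a, 0), len(CARPET) - 1)  # stay on the carpet
    best_next = max(q[(nxt, act)] for act in ACTIONS)
    q[(pos, a)] += alpha * (reward(nxt) + gamma * best_next - q[(pos, a)])
    pos = nxt

# The learned policy steers each square toward a nearby white one.
policy = [max(ACTIONS, key=lambda act: q[(p, act)]) for p in range(len(CARPET))]
print(policy)
```

After a few thousand steps, the printed policy points each square's best move toward a neighboring white square, echoing the robot's learned preference in the demo.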
The Zeroth project is still in the lab at Qualcomm, but the company is looking for partners to start creating more prototypes. It has even reached out to the Obama administration’s brain research initiative.
IBM has demonstrated microchips with the brainpower of a garden worm. The company says the goal of its “TrueNorth” program is nothing less than building a brain in a box: a system with 10 billion virtual neurons and 100 trillion simulated synapses that takes up less than 2 liters of space and runs on 1 kilowatt of power. Demonstration applications include a computer learning to play Pong and to recognize handwriting.
Last year Intel disclosed a design for a neuromorphic chip built with tiny spinning magnets. The result could be a chip that consumes anywhere from one-fifteenth to one-three-hundredth the power of a conventional processor.
Timing
None of the current projects are ready for phones just yet. It could take another two to five years before actual chips are available for mass-market devices, IHS’s Sideco says. “What we’re seeing now is the primordial soup of these elements,” he says. “At some point, they’ll start coalescing.”
Not everyone is confident the research projects will result in usable breakthroughs, however.
Similar claims have been made in the past, warns Isabelle Guyon, an expert in machine learning who worked at AT&T’s famed Bell Labs and now runs her own consulting firm. In earlier cases such as speech recognition, software running on standard computers, which can be reprogrammed for many tasks, proved the most economical approach.
“Dedicated hardware makes sense in only few cases when really it is economically viable,” Guyon says. “Versatility is important to win a sufficient market share, even at the expense of some lesser efficiency.”
Still, traditional chip design, which advances along Moore’s Law by squeezing ever greater numbers of transistors into the same space, will soon reach the limits of physics. Brain-like designs could continue to improve in other ways, says Tim Chang, a venture capitalist at the Mayfield Fund.

The growth of mobile computing puts a premium on small, low-power designs, while the advance of cloud computing brings greater computing power. Both will be needed, Chang says: “The evolution of computer, network and mobile technology is the ongoing yin and yang.”