SAN FRANCISCO – In 2004, Geoffrey Hinton doubled down on his pursuit of a technological idea called a neural network.
It was a way for machines to see the world around them, recognize sounds and even understand natural language. But scientists had spent more than 50 years working on the concept of neural networks, and machines could not really do any of that.
Dr. Hinton, a computer science professor at the University of Toronto, organized a new research community with other academics who also embraced the concept. They included Yann LeCun, a professor at New York University, and Yoshua Bengio at the University of Montreal.
Over the past decade, the big idea nurtured by these researchers has reinvented the way technology is built, accelerating the development of face-recognition services, talking digital assistants, warehouse robots and self-driving cars. Dr. Hinton is now at Google, and Dr. LeCun works for Facebook. Dr. Bengio has remained in academia.
"What we have seen is nothing short of a paradigm shift in science," said Oren Etzioni, the chief executive of the Allen Institute for Artificial Intelligence in Seattle and a prominent voice in the A.I. community. "History turned their way, and I am in awe."
Loosely modeled on the web of neurons in the human brain, a neural network is a complex mathematical system that can learn discrete tasks by analyzing vast amounts of data.
This has allowed many artificial intelligence technologies to progress at a rate that was not possible in the past. Rather than hand-coding behavior into a machine one rule at a time, researchers could build systems that learned that behavior on their own from data.
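The contrast between hand-coded rules and learned behavior can be sketched with a single artificial neuron trained on the logical AND function. This is a toy illustration in plain Python, not a reconstruction of any of the researchers' actual systems; all names and parameters here are illustrative:

```python
import math
import random

def train_neuron(data, epochs=5000, lr=0.5, seed=0):
    """Fit one logistic neuron (two weights plus a bias) by gradient descent.

    Nothing here is hand-coded logic: the neuron starts with random
    weights and adjusts them only by looking at labeled examples.
    """
    rng = random.Random(seed)
    w = [rng.uniform(-1, 1), rng.uniform(-1, 1)]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in data:
            z = w[0] * x1 + w[1] * x2 + b
            pred = 1.0 / (1.0 + math.exp(-z))   # sigmoid activation
            err = pred - target                  # gradient of squared-ish loss
            w[0] -= lr * err * x1
            w[1] -= lr * err * x2
            b -= lr * err
    return w, b

def predict(w, b, x1, x2):
    """Return the neuron's output in (0, 1) for a pair of inputs."""
    z = w[0] * x1 + w[1] * x2 + b
    return 1.0 / (1.0 + math.exp(-z))

# Four labeled examples of logical AND: output is 1 only when both inputs are 1.
AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_neuron(AND)
```

After training, `predict(w, b, 1, 1)` is close to 1 while the other three input pairs score close to 0; the behavior was never written as a rule, only learned from the examples. Real neural networks stack millions of such units, but the principle is the same.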
The London-born Dr. Hinton, 71, first embraced the idea as a graduate student in the early 1970s, a time when most artificial intelligence researchers had turned against it. Even his own Ph.D. adviser questioned the choice.
"We met once a week," Dr. Hinton said in an interview. "Sometimes it ended in a shouting match, sometimes not."
Neural networks had a brief revival in the late 1980s and early 1990s. After a year of postdoctoral research with Dr. Hinton in Canada, the Paris-born Dr. LeCun moved to AT&T's Bell Labs in New Jersey, where he designed a neural network that could read handwritten letters and numbers. AT&T later sold the system to banks, and at one point it read about 10 percent of all the checks written in the United States.
But neural networks were unable to make much headway with bigger A.I. tasks, like recognizing faces and objects in photos, identifying spoken words and understanding the natural way people talk.
"They worked well only when you had lots of training data, and there were few areas with lots of training data," Dr. LeCun, 58, said.
But some researchers persisted, including the Paris-born Dr. Bengio, 55, who worked alongside Dr. LeCun at Bell Labs before taking a professorship at the University of Montreal.
In 2004, with less than $400,000 in funding from the Canadian Institute for Advanced Research, Dr. Hinton created a research program dedicated to what he called "neural computation and adaptive perception." He invited Dr. Bengio and Dr. LeCun to join him.
By the end of the decade, the idea had caught up with its potential. In 2010, Dr. Hinton and his students helped Microsoft, IBM, and Google push the boundaries of speech recognition.
"He is a genius and knows how to create one impact after another," said Li Deng, a former speech researcher at Microsoft who brought Dr. Hinton's ideas into the company.
Dr. Hinton's image-recognition breakthrough drew on Dr. LeCun's earlier research. In late 2013, Facebook hired the N.Y.U. professor to build a research lab around the idea. Dr. Bengio declined offers from the tech giants, but the research he oversaw in Montreal helped advance systems that aim to understand natural language. Though these systems have undeniably accelerated the progress of artificial intelligence, they are still a long way from true intelligence. But Dr. Hinton, Dr. LeCun and Dr. Bengio believe that new ideas will come.
"We need to be able to do all of that," Dr. Bengio said.