Artificial Intelligence (AI) is often described as a major breakthrough that demands attention, with all companies encouraged to have an “AI Strategy.” The core AI technology producing the results that drive this view is machine learning using deep neural networks (“deep learning”). Deep learning has in fact provided significant breakthroughs, including improving the speech recognition and natural language understanding of digital assistants such as Apple’s Siri, Google Assistant, and Amazon Alexa. Deep learning has also driven more company-specific applications such as reducing energy consumption in Google’s data centers or helping Facebook remove objectionable content more quickly.
But deep learning is not based on new inventions; the core technology has been around for decades. For example, the technique of “backpropagation” used to fit a deep neural network to a data set was developed by Rumelhart, Hinton, and Williams in 1986. A 729-page book from MIT Press, Neurocomputing: Foundations of Research, published in 1988, reprints 43 papers on the subject. As the power of deep learning became obvious, researchers have of course made substantial methodological improvements, but no one claims to have recently “discovered” the core technology.
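The backpropagation idea is compact enough to sketch. The toy example below (network size, learning rate, and the XOR task are all illustrative choices, not from the text) trains a tiny two-layer network by repeatedly pushing the output error backward through the chain rule:

```python
import numpy as np

# Toy training set: the XOR function, a classic test for tiny networks.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 4))  # input -> hidden weights
W2 = rng.normal(size=(4, 1))  # hidden -> output weights

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

losses = []
for step in range(5000):
    # Forward pass: compute the network's prediction.
    h = sigmoid(X @ W1)
    out = sigmoid(h @ W2)

    # Backward pass: propagate the error gradient from output to input.
    err = out - y
    losses.append(float(np.mean(err ** 2)))
    d_out = err * out * (1 - out)       # chain rule through the output sigmoid
    d_h = (d_out @ W2.T) * h * (1 - h)  # chain rule through the hidden sigmoid

    # Gradient-descent step on both weight matrices.
    W2 -= 0.5 * h.T @ d_out
    W1 -= 0.5 * X.T @ d_h

print(f"loss fell from {losses[0]:.3f} to {losses[-1]:.3f}")
```

The same gradient recipe, scaled to millions of weights and run on modern hardware, is essentially what today’s deep learning systems do.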
If the methodology isn’t new, what caused AI to have such an impact? The answer is simple: increasing computer power and memory have passed a threshold that makes the methods practical. When this author wrote a book on machine learning for computer pattern recognition in 1972, computer power was roughly a billion times more expensive than it is today. When a company I founded applied the technology to speech recognition development a decade later, a single machine learning analysis typically ran for several months before it converged. Such limitations contributed to what has been called the “AI Winter,” a period in which research into the core technology was discredited.
Part of the growth in computer power came from the improvement in computing chips described by “Moore’s Law”: the number of transistors on a chip doubles roughly every two years. The cost of an hour of computing has also declined at roughly the same rate.
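The compounding behind that doubling is worth making concrete. A minimal sketch (the doubling period and time spans are illustrative assumptions):

```python
def growth_factor(years, doubling_period=2.0):
    """Multiplicative increase after `years`, given one doubling per `doubling_period`."""
    return 2.0 ** (years / doubling_period)

# Five decades at a two-year doubling compounds to tens of millions:
print(f"{growth_factor(50):,.0f}x")
```

At one doubling every two years, fifty years yields 2^25, roughly 34 million. A billion-fold change over a similar span, as the 1972 comparison above suggests, corresponds to a doubling roughly every 20 months; that is, the cost of computation fell somewhat faster than transistor counts alone would imply.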
Deep learning is but one technology that increasing computing power has enabled over time. Smartphones, which have significantly impacted our lives, are an obvious example.
Will the growth in computer power continue, or is it reaching the limits of Moore’s Law? Other trends may actually be accelerating the growth of affordable computing power. One longer-term trend is quantum computing, but shorter-term trends drive improvement as well. These include the growth of cloud computing, where computer power can be rented rather than requiring expensive investment in a server farm. In addition, specialized chips such as Graphics Processing Units (GPUs) are being incorporated into computing centers, providing parallel computing for specialized tasks such as deep learning. By doing multiple processes simultaneously, parallel computing provides a significant acceleration of appropriate tasks.
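How much acceleration parallelism can deliver is commonly reasoned about with Amdahl’s Law, a standard formula not mentioned above; the 95% parallel fraction used here is purely illustrative:

```python
def amdahl_speedup(parallel_fraction, n_workers):
    """Idealized speedup when only part of a task can run in parallel (Amdahl's Law)."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_workers)

# A task that is 95% parallelizable speeds up greatly on many cores,
# but the serial 5% caps the total gain at 20x:
for n in (1, 10, 100, 1000):
    print(n, round(amdahl_speedup(0.95, n), 1))
```

This is why GPUs pay off so handsomely for deep learning: its matrix arithmetic is almost entirely parallelizable, so the serial bottleneck is small and thousands of simple cores can be kept busy.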
Further, the devices connecting cloud computing to individuals, such as smartphones and automobiles, get more on-device computing power with each new model, further increasing the total computing power available.
AI is nothing new; it is just a reflection of increasing “computer intelligence,” as I characterized it in my recent book. If the impact of AI is but one example of the exponential growth of computer power over time, we can expect future breakthroughs as that growth passes new thresholds, perhaps passing them even faster than in the past.
The continuation of today’s trends, for example, will lead to digital assistants that are increasingly personalized and easy to use, providing ever more information and services. Interacting with computers will become increasingly like a human conversation. Children growing up with a digital assistant constantly at hand through smartphones or smartwatches will find that such “augmented intelligence” becomes almost part of being human.
Artificial Intelligence is a category of applications symptomatic of what is coming, driven by the long-term trend of increasing computer power. More generally, computer intelligence will impact our lives in increasingly surprising ways.