AI Gets Edgy: How New Chips Are Pushing AI from the Cloud to Network Nodes – Part 1


editor note: This is part 1 of a series of articles that AI Trends is publishing based on our recent webinar on the same topic, presented by Mark Bunger, VP Research, Lux Research. If you missed the webinar, go to www.aitrends.com/webinar. The webinar series is being presented as part of AI World’s webinar, publishing and research coverage of the enterprise AI marketplace.

In my role at Lux I lead research in future computing platforms and in industrial big data and analytics. Both of those have a lot to bring to this particular topic. AI, network structures, chips, software – all of those things are changing their roles, which isn’t, in and of itself, anything new. It’s always been a moving target. But as the invitation to this webinar suggested, a specific change is happening right now that we want to look at around the field of AI.

In the past, AI has required a lot of fast processors, a lot of energy, and big data sets to train the neural networks – basically, to become intelligent – and that has presupposed centralized computing structures. But more and more we’re seeing, as we have in lots of other generations of technology, that power moving out to the edges of networks. There are a lot of incumbents making bets in this space and a lot of startups coming into it. What we wanted to do in this webinar is go through some of the things we’re seeing, and hopefully say something about the future and where AI is headed. Think about mobile telephony in, say, 2006: there were some smartphones out there, but consider the changes that have happened in society, in business, and certainly in the technology landscape since they came into full force. We’re on the cusp of a really similar type of transformation. That’s why it’s important to understand it and try to get some bearings on where things might go.

As I mentioned, AI has kind of always required big brains. We’ve assumed that anything as important, big, and smart as a human would need a lot of people tending it. People would talk about the priesthood of computer scientists that would attend to these big AI systems. Back in the ’50s, ’60s, and even ’70s, when all computers were big – basically mainframes – this was the picture of what AI was going to be: the evil picture, which made for good movies like this one, “Colossus: The Forbin Project.” In this movie, scientists build an artificially intelligent computer brain meant to run the U.S. military; it gets its hands on the nuclear codes, gets very feisty, and threatens us with massive destruction.

As you can see from this movie poster (there was also a book, by the way), “Obey or” – you can guess what. We’ve built a supercomputer with its own mind and now we have to fight it to save the world.

Again, this picture of computing, and in particular of AI, is part of a pendulum swing between centralized and decentralized structures that a lot of technologies have gone through, and once again reality is starting to outpace fiction. What has driven that centralized-distributed pendulum over the last roughly 50-plus years has been specific technologies, from the vacuum tube to the transistor. And since the ’60s and ’70s, specific chips have made computing power that used to be centralized available to smaller and smaller devices – and ultimately, in some cases, made it essentially vanish, becoming just a component in another, bigger system.

Today we have things like Arduino and Raspberry Pi, which are literally things that middle school kids can use, with a lot more processing power than much of the military-grade equipment their parents grew up with. We don’t know exactly what this next generation of chips will bring us or what the landmark chip is going to be. But if we look at the computing architectures that past pendulum swings have made possible – client-server, mesh networking, mobile computing, and the ambient, ubiquitous computing we see more and more of – this next generation of chips, the dozen or so that I’ll be talking about here, all have the potential to be that landmark in the history of technology hardware.

to be continued in AI Trends