AI Gets Edgy: How New Chips and Code are Pushing Artificial Intelligence from the Centralized Cloud Out to Network Nodes

July 20, 2016

AI has long relied on energy-hungry processors and large datasets for training neural networks, both of which presupposed centralized computing architectures. But today, more powerful chips are letting AI escape from centralized, cloud-based systems and move out to devices at the edge of the network. Among incumbents, Intel spent a whopping $16.7 billion to acquire FPGA maker Altera; Google has developed an AI chip, the Tensor Processing Unit, and is working with Movidius to put AI on a USB stick; and Nvidia has spent some $2 billion so far on its Tesla graphics processors for machine vision and other AI tasks.
