Misconceptions about IoT Systems and Artificial Intelligence


Artificial Intelligence (AI) has become the latest buzzword in the IT industry. Everything from dishwashers and fridges to TVs and cars is now connected thanks to the Internet of Things (IoT), and with AI in the mix, many expect these products to think like human beings. Computers have certainly become more intelligent: in 2016, Google’s AI software AlphaGo finally beat one of the top Go players in the world, a feat long thought impossible given the game’s complexity.

Given these advances in technology, IoT systems equipped with AI would likely bring even greater value – or would they? To answer this question, let’s look at the state of AI and how we use it today.

Magnitude of compute resources

AIs for IoT may require expensive, powerful computers. AlphaGo brought 1,920 CPUs, 280 GPUs and millions of moves and positions from past professional games to bear to beat Lee Sedol, a professional 9-dan Go player. That is an enormous pool of compute resources pitted against a single human player on one specific challenge: playing Go. Technically, it is possible to scale the resources down for less complex challenges and bring the value of AI to a wide variety of IoT use cases, though some of those use cases are just as demanding.

Training before use

AI cannot solve a problem instantly today, at least not without training: a great deal of work goes into the system before it ever goes live. In AlphaGo’s case, algorithms had to be designed to search for useful Go moves, and deep learning techniques drawing on extensive data sets from past professional games were used to help the system “see”, or evaluate, a move in relation to the current board situation.
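The train-before-use pattern can be sketched in a few lines. The example below is a deliberately simplified stand-in for real deep learning (a nearest-centroid classifier on invented data), meant only to show the two distinct phases: an offline training step that digests historical examples, and a fast inference step that runs after the system goes live.

```python
# Minimal sketch of the "train before use" workflow: a nearest-centroid
# classifier is fit offline on labelled historical data, and only the
# trained model is queried at run time. All data and labels are invented.
from statistics import mean

def train(samples):
    """Compute one centroid per label from (features, label) pairs."""
    by_label = {}
    for features, label in samples:
        by_label.setdefault(label, []).append(features)
    return {label: tuple(mean(dim) for dim in zip(*rows))
            for label, rows in by_label.items()}

def predict(model, features):
    """Return the label whose centroid is closest to the input."""
    def dist(centroid):
        return sum((a - b) ** 2 for a, b in zip(features, centroid))
    return min(model, key=lambda label: dist(model[label]))

# Offline training phase: slow, data-hungry, done before deployment.
model = train([((1.0, 1.0), "ok"), ((1.2, 0.9), "ok"),
               ((5.0, 5.1), "fault"), ((4.8, 5.3), "fault")])

# Online inference phase: the trained model answers quickly.
print(predict(model, (1.1, 1.0)))  # → ok
```

The split matters for IoT: the expensive training can run in a data centre, while the cheap inference step is all that needs to run near the device.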

To apply AI in IoT systems, we would have to take the time to develop custom algorithms and train the AI for specific use cases. Such solutions are already available, including Fujitsu’s own Operational-Data Management & Analytics (ODMA) business application, which uses machine learning to detect anomalies in data received from sensors in real time.
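To make the idea of real-time anomaly detection on sensor data concrete, here is a minimal sketch in the spirit of such systems (not ODMA’s actual implementation): each new reading is scored against a sliding window of recent values, and readings whose z-score exceeds a threshold are flagged. The stream, window size and threshold are all illustrative assumptions.

```python
# Hypothetical sketch: flag sensor readings that deviate sharply from
# the recent past, using a rolling z-score over a sliding window.
from collections import deque
from statistics import mean, stdev

def detect_anomalies(readings, window=5, threshold=3.0):
    """Yield (index, value) for readings far outside the recent window."""
    recent = deque(maxlen=window)
    for i, value in enumerate(readings):
        if len(recent) == window:
            mu, sigma = mean(recent), stdev(recent)
            if sigma > 0 and abs(value - mu) / sigma > threshold:
                yield (i, value)
        recent.append(value)

# A steady temperature stream with one sudden spike at index 6.
stream = [20.1, 20.3, 19.9, 20.2, 20.0, 20.1, 35.0, 20.2]
print(list(detect_anomalies(stream)))  # → [(6, 35.0)]
```

A production system would use a trained model rather than a fixed statistical rule, but the shape is the same: lightweight per-reading scoring that can keep up with a live sensor feed.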

Things will also improve as the IT industry enhances its problem-solving capabilities over time. Fujitsu Laboratories, for example, has recently developed new techniques for faster deep learning and larger neural networks, which have been applied to determine whether drivers are too drowsy to drive and to forecast equipment breakdowns from vibration data.

Read the source article at EnterpriseInnovation.net.