The wave of AI disruption

Information technology evolves through waves of disruption. First the computer, then the web, and eventually social networks and smartphones each revolutionized how people live and how businesses operate, destroying companies that were unable to adapt while creating new winners in growing markets.

While the exact timing and form of such waves of disruption are hard to predict, the pattern they follow is easy to recognize. Take the web/digital disruption, for example: a technological breakthrough (Sir Tim Berners-Lee's WWW) built on existing technologies (the TCP/IP protocols and the installed base of computers) and gave rise, seemingly slowly yet in fact exponentially, to new applications and platforms that disrupted existing markets (e.g. Amazon) or created new ones (e.g. Google).

Today a new wave is emerging. Much like the web took advantage of existing technologies, this new wave builds on trends such as the decline in the cost of computing hardware, the emergence of the cloud, the fundamental consumerization of the enterprise and, of course, the mobile revolution.

Furthermore, the proliferation and diversity of smart devices and "things" have enabled constant communication and sharing, while social networking natives (Snapchatters of the world, unite!) have turned constant sharing and self-expression into a "need." The result is the emergence of what we call pervasive connectivity.

Pervasive connectivity leads to an explosion of ever richer and more personalized data, which in turn creates opportunities for entirely new ways to process that data and extract valuable, actionable insights. Artificial intelligence allows for just that.

The AI opportunity — why now and how to harness it

AI is defined, rather broadly, as the capacity of machines to exhibit intelligence. It has several components, such as learning, reasoning, planning and perception, all of which have improved greatly in the last few years.
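To make the "learning" component concrete, here is a minimal, self-contained Python sketch (our illustration, not something from the article) in which a perceptron, one of the simplest learning algorithms, learns the logical AND function from labeled examples. The dataset, learning rate and epoch count are arbitrary illustrative choices.

    # Training data for logical AND: (inputs, expected output).
    examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

    weights = [0.0, 0.0]   # one weight per input, adjusted during training
    bias = 0.0
    learning_rate = 0.1    # how strongly each mistake shifts the weights

    def predict(inputs):
        """Return 1 if the weighted sum of the inputs crosses the threshold."""
        total = bias + sum(w * x for w, x in zip(weights, inputs))
        return 1 if total > 0 else 0

    # Learning: repeatedly nudge the weights toward the correct answers.
    for epoch in range(20):
        for inputs, expected in examples:
            error = expected - predict(inputs)
            bias += learning_rate * error
            for i, x in enumerate(inputs):
                weights[i] += learning_rate * error * x

    # The trained perceptron now reproduces AND without being told the rule.
    for inputs, expected in examples:
        print(inputs, "->", predict(inputs), "expected", expected)

The program is never told the rule for AND; it infers the rule from examples, which is the essence of the learning component described above.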
