IBM’s Watson begins to accelerate in the retail zone


by Peter White, ReThink Research, AI Trends Contributing Editor

IBM this week lifted its skirts a fraction on its work with retailers and on just how it can bring artificial intelligence, and specifically its Watson project, to bear on the sector.

What began as a sleepy analyst briefing rapidly heated up with the news that IBM has just completed a survey taking in 35,000 shoppers across 500 retailers (mostly bricks and mortar, but some pure-play online) in a dozen countries. IBM ranked each retailer across a number of categories, with the store and call center experiences scoring lowest at an average of 20% and 21% respectively, rising through web experiences (41%) and customer understanding (46%) to social engagement (53%). Other categories measured were flexibility of fulfillment and the mobile experience.

Although IBM was reluctant to quantify just how it allocated points to each category, the scores all come from customer responses, which can be compared between retailers. Now it wants to use Watson to help retailers improve those scores.
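IBM did not disclose its scoring method, but the basic mechanics can be sketched. Below is a minimal, purely illustrative version assuming each category score is simply the mean of shopper ratings normalised to a percentage; all retailer names and ratings are invented:

```python
# Hypothetical sketch of category scoring: average 1..scale shopper
# ratings into a 0-100% score, then compare retailers category by
# category. None of these figures come from IBM's study.

def category_score(ratings, scale=5):
    """Average 1..scale shopper ratings into a 0-100% score."""
    return 100 * (sum(ratings) / len(ratings)) / scale

# Invented survey responses: retailer -> category -> list of ratings.
responses = {
    "retailer_a": {"store": [1, 1, 2], "web": [2, 3, 2], "social": [3, 3, 2]},
    "retailer_b": {"store": [2, 1, 1], "web": [3, 2, 3], "social": [4, 3, 3]},
}

scores = {r: {c: category_score(v) for c, v in cats.items()}
          for r, cats in responses.items()}
```

Under this toy scheme, low store ratings produce exactly the kind of 20-odd-percent store scores the study reported, while friendlier social ratings land in the 50s.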

The study also showed wide regional variation, with the UK scoring highest at a mere 50% across all categories, followed by Sweden and Norway on much the same score, and a second division of France, Ireland and the US (44% to 45%). Much of the rest of Europe and Australia followed, while Latin America, China and Asia recorded scores around the 33% mark, with India lagging at 16%.

IBM Watson can take all the existing structured, retailer-supplied data, add to it the less structured data IBM has collected from the internet, social media and partners, and use the combination to come up with recommendations for change. This spans a broad spectrum: Watson can analyze the language used in social media about the retailer and about the top 100 products people are talking about, and it can even analyze video from the stores. It can take the proprietary weather data which IBM has speculatively acquired and use it to predict buying trends. And it can take this down to a local level.

IBM says that, using hyperlocal data, it can calculate what an inch of snow does to the sale of snow shovels in one county versus the next, and not just do this reactively: using a weather database, it can predict it.
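The snow-shovel claim amounts to fitting a per-county demand model against weather history and then applying a forecast to it. A minimal sketch, with invented sales figures and ordinary least squares standing in for whatever model IBM actually uses:

```python
# Illustrative only: fit, per county, a simple linear model of
# snow-shovel sales against inches of snowfall, then apply a forecast.
# All county names and sales figures are invented.

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x (one predictor)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b  # intercept, slope

# Historical (snowfall inches, shovels sold) pairs, one series per county.
history = {
    "county_a": ([0, 1, 2, 4], [5, 30, 55, 105]),
    "county_b": ([0, 1, 3, 5], [2, 12, 32, 52]),
}

models = {c: fit_line(x, y) for c, (x, y) in history.items()}

def predict(county, forecast_inches):
    """Expected shovel sales for a forecast snowfall in a county."""
    a, b = models[county]
    return a + b * forecast_inches
```

The point of the hyperlocal framing is visible in the fitted slopes: an inch of snow moves 25 shovels in one invented county but only 10 in the next, so the same forecast yields different stocking decisions.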

Again, this theme runs through the entire presentation: while some retailers can react to the news, the weather and the top 100 trending products, Watson can predict what is going to happen across each of these categories. IBM delivers this through a combination of traditional analytics, big data and AI, with Watson bearing the brunt of the forecasting, and doing it using 28 AI APIs.

These fall into four basic categories – Language, Speech, Vision and Data Insights – but there are multiple versions of each; within Language, for example, there are document conversion and translation, and natural language parsing or classification. The retailer is guided to what other retailers have done with each API, and suddenly, as if by magic, they have hypotheses to test which may, or in some cases may not, improve the business. Where they don't, it is for lack of data: feed more data, such as the results of trials, into Watson and you get a more refined set of hypotheses, and so on.
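That feed-trial-results-back loop can be sketched as a ranking of candidate hypotheses that sharpens as more trial data arrives. Everything below (the hypothesis names, the hidden uplift figures) is invented for illustration; it is a toy stand-in for the process described, not Watson's mechanism:

```python
import random

random.seed(1)

# Hypothetical hypotheses a retailer might test, each with a true
# (unknown to the retailer) conversion uplift. Values are invented.
TRUE_UPLIFT = {
    "bundle_discount": 0.04,
    "free_shipping":   0.07,
    "loyalty_points":  0.02,
}
BASE_RATE = 0.10  # baseline conversion rate without any promotion

results = {h: [0, 0] for h in TRUE_UPLIFT}  # hypothesis -> [successes, trials]

def run_trial(hypothesis):
    """Simulate one customer exposure; record success or failure."""
    ok = random.random() < BASE_RATE + TRUE_UPLIFT[hypothesis]
    results[hypothesis][0] += ok
    results[hypothesis][1] += 1

def best_hypothesis():
    """Rank hypotheses by observed success rate so far."""
    return max(results, key=lambda h: results[h][0] / max(results[h][1], 1))

# Each round of trials feeds more data back, sharpening the ranking.
for h in TRUE_UPLIFT:
    for _ in range(5_000):
        run_trial(h)
```

With only a handful of trials the ranking is noise; after thousands it reliably surfaces the genuinely best-performing hypothesis, which is the "more data, more refined hypotheses" dynamic in miniature.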

The truth is that Watson, like all the other hungry AI beasts of Google and Microsoft, Facebook and others, needs lots of data or it needs time reviewing actions (more data). And if we had tried all of this 25 years ago when AI was last in vogue, the datasets just weren’t out there to create this kind of insight – but the internet has created them.

IBM wants retailers to use this across a huge range of applications: gathering insights on people, behavior, demand, trends, products, pricing and supply lines, and using these to adjust product innovation; measuring marketing performance, even across multiple channels; personalizing the shopper's experience; improving fraud detection and security; and making price promotions more effective by finding the right price points. It should affect eCommerce strategies, the way stores are actually run, and how inventory and orders are controlled.

Remarkably, IBM has put all of this together in just the last few years. It has gone from a single programming partner to 200, and it is the growing experience of this cohort of implementers that represents the head start Watson has.

In essence, all AI is about taking on more complex tasks – tasks that were never really beyond a computer, just beyond us humans, who had to manually build systems to winkle out every dimension of a particular problem. That usually took a huge amount of manual intervention, getting to know and understand the lines of business involved. Throw out that idea, toss data at Watson instead, and you can get there with very little manual intervention.

One of the points made after the recent victory of the AlphaGo program from Google's DeepMind over the Go world champion was that two versions of AlphaGo played tens of thousands of games against each other. That is what accelerated the build-up of the data that let it play better and better, more rapidly: it had more games to analyze and tell it the outcome of any given move. And that is what everyone has to think of here – ways of accelerating the learning, by getting the machine to learn by playing itself, by reaching out across the internet for more data, or in some other way.
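The self-play idea is easy to demonstrate on a toy game. Below, two copies of the same agent share one value table and learn the take-away game Nim-21 (take 1–3 stones from a pile of 21; taking the last stone wins) purely by playing each other. This is a hand-rolled miniature of the principle, not a description of AlphaGo's actual architecture:

```python
import random

random.seed(0)

ACTIONS = [1, 2, 3]  # stones a player may remove per turn
START = 21           # initial pile size; taking the last stone wins

# Shared value table: Q[(pile, action)] -> estimated value of that move.
Q = {}

def best_action(pile):
    """Greedy move: the highest-valued legal action for this pile."""
    legal = [a for a in ACTIONS if a <= pile]
    return max(legal, key=lambda a: Q.get((pile, a), 0.0))

def episode(epsilon=0.2, alpha=0.1):
    """One game of self-play; both sides update the same Q table."""
    pile = START
    moves = []  # (pile, action) per ply, players alternating
    while pile > 0:
        legal = [a for a in ACTIONS if a <= pile]
        a = random.choice(legal) if random.random() < epsilon else best_action(pile)
        moves.append((pile, a))
        pile -= a
    # The player who made the last move took the final stone and won.
    # Walking backwards, the winner's moves get +1 and the loser's -1.
    reward = 1.0
    for pile, a in reversed(moves):
        old = Q.get((pile, a), 0.0)
        Q[(pile, a)] = old + alpha * (reward - old)
        reward = -reward

for _ in range(50_000):
    episode()
```

The game count is the whole story: each episode is a complete, labeled outcome, so 50,000 self-played games generate 50,000 lessons without any human opponent – the agent reliably learns the endgame (take everything when three or fewer stones remain) this way.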

IBM gave three examples of retail programs, but never gave us enough of a feel for them to know whether they are actually successes – let's call them experiments. These were StatSocial, a company which links multiple social profiles, tracking 600 million social personae (one person on one platform) to give a view of buying patterns across demographics and offer personality insights; CognitiveScale, a company offering marketing insights and ways to personalize shopping – IBM said that every web page you visit could be reconfigured to be personalized for you; and a site to help people find a wine that tastes like another one they like, or brings out a flavor they describe, or which suggests wines to go with different foods at the point of sale.

The truth is that all of this is desirable, but right now retailers need sufficient scale to afford the cost of relying on AI to drive change. IBM declined to give the cost of a transaction on each of its 28 APIs, but we would suggest that it depends on how much data needs to be held in order to get a good AI prediction, and how many processing cycles that particular API needs to drive a hypothesis through that data.

The feeling IBM gave was that trying to be a retailer in the future without Watson would mean an operation massively less efficient, and less accurate in its marketing, than its rivals. But given that IBM has spent most of its life as a company working with the top 100 accounts globally, we cannot help thinking that the price of reaching these giddy heights can only be afforded by those who already have their heads in the cloud – figuratively, technically and financially.