The State of Natural Language Processing – Giant Prospects, Great Challenges


By Pawel Godula, Director of Customer Analytics

Natural Language Processing (NLP) is one of the most dynamic areas of AI and provides business with a host of opportunities. Yet there are many challenges to overcome.

In 1950, Alan Turing proposed a test for a machine’s intelligence. In order to pass it, a machine should be able to hold a conversation that would be indistinguishable from one a human would produce.

Putting aside machines’ capacity to think, Turing considered the ability to respond appropriately and follow a conversation to be the point at which automation ends and intelligence begins. “If they find a parrot who could answer to everything, I would claim it to be an intelligent being without hesitation,” claimed Denis Diderot two hundred years before Turing came along.

The idea behind the Turing test makes natural language processing a frontier for Artificial Intelligence. History and the modern approach to text processing prove that analyzing, understanding and thinking are entirely different abilities.

Where NLP is used

On the most basic level, NLP doesn’t require Artificial Intelligence or machine learning. The autocorrect function found in most of today’s text software is a simple, rules-based way to analyze natural language: it compares written text against a dictionary database. It improves countless lives every day, yet has nothing to do with AI.
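A dictionary-comparison autocorrect of this kind can be sketched in a few lines of Python, here using the standard library’s difflib to find the closest dictionary entry. The word list and the similarity cutoff below are illustrative assumptions, not any real product’s settings:

```python
import difflib

# Toy dictionary; real autocorrect systems ship far larger word lists.
DICTIONARY = ["hello", "world", "language", "processing", "natural"]

def autocorrect(word, dictionary=DICTIONARY):
    """Return the word unchanged if known, else the closest dictionary entry."""
    if word in dictionary:
        return word
    matches = difflib.get_close_matches(word, dictionary, n=1, cutoff=0.6)
    return matches[0] if matches else word

print(autocorrect("langauge"))  # close match: "language"
```

No learning is involved here, which is exactly the point: a lookup plus string similarity is enough to be useful, and nothing about it is intelligent.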

Automatic filtering is another example of NLP, one used primarily by media outlets seeking to weed out swearing and hate speech in the comment sections beneath articles. Because such filters can only match specific words, they are unable to understand context or see through the simple tricks used to throw them off. Is “duck” a dirty word, after all?
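A minimal sketch of why word-matching filters fail, using a hypothetical banned word (“bass”) purely for illustration: naive substring matching both flags innocent words and misses trivially obfuscated ones.

```python
BANNED = ["bass"]  # hypothetical banned word, for illustration only

def naive_filter(text, banned=BANNED):
    """Flag text if it contains any banned substring; completely context-blind."""
    lowered = text.lower()
    return any(word in lowered for word in banned)

# False positive: "bass" appears inside an innocent word.
print(naive_filter("I love embassy receptions"))  # True, wrongly flagged
# Easy evasion: simple obfuscation slips past.
print(naive_filter("b a s s solo"))               # False, missed
```

Understanding whether a word is actually offensive requires the surrounding context, which is precisely what a rules-based filter lacks.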

NLP business applications

Even with its flaws, today’s NLP is maturing rapidly and getting increasing attention from business. Automated translation (AT) is the best benchmark of the current state of NLP. While far from perfect, as it tends to confuse words and lose the context of the translation, often to comic effect, it is already a very useful tool.

Naturally, thousands of people use NLP every day. There are mobile apps that can translate text from a video camera in real time. Just such a tool was put to use during a court hearing, when a British court failed to provide an interpreter for Mandarin speaker Xiu Ping Yang.

NLP is also used to enhance the customer experience, with chatbots being one of the most popular tools. According to a study by Walker, 88% of buyers are willing to pay more for a better customer experience. That knowledge compelled Domino’s Pizza to give its Dom chatbot a job helping customers order pizza through Facebook Messenger or Google Home. Royal Bank of Scotland claims that use of its mobile app increased 20% after a chatbot was incorporated.

But building a chatbot may not require that much Artificial Intelligence. Sometimes the chatbot provides only a predefined set of answers to the most common questions or automates processes like ordering a product or inquiring about opening hours.

Reputation monitoring

Tasks like online reputation monitoring or sentiment analysis require not only the ability to identify particular words in a sentence, but to understand the sentence itself. We faced just such a challenge while building a solution for the United Nations Office of Information and Communications Technology, a project set up to uncover propaganda disseminated via Twitter. With fake news among today’s most pressing concerns, both companies and governments need to constantly keep abreast of stories being published online. After all, fake stories outperform their non-fake counterparts in reaching audiences on every subject, from business to science to technology. With NLP, it is possible not only to analyze the constant flow of language-based data from social media, but also to determine whether the news being shared is real or fake.

The legal way

Another cutting-edge example comes from the consultancy giants EY, PwC and Deloitte, which use NLP to review massive numbers of contracts for compliance with lease accounting standards. All that information, not only contracts and legal agreements but also emails, conversations and other text-based data, is unstructured. Gartner estimates that up to 80% of business data today is unstructured. NLP is one of the most powerful tools for analyzing it and gathering meaningful insights. As research done at the University of Rome Tor Vergata suggests, adopting NLP-based techniques results in 12% less effort spent on classifying equivalent requirements.

The NLP market has been forecast to reach $22.3 billion by 2025. The main driver behind this acceleration is the building of new, scalable solutions to real-life problems for both businesses and consumers.

NLP – key challenges and how it works

What sets NLP apart from other machine learning challenges is the nature of its data. Unlike images, which can be resized to, say, 500×500 pixels, there is no way to standardize sentences to be “always seven words long”. Because sentences vary in length, natural language is typically processed by recurrent neural networks, which consume words one by one, in sequence. Some algorithms work character by character, while others process a group of characters at once.
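The idea can be sketched with a toy, single-unit recurrent cell in Python. The scalar “embeddings” and weights below are illustrative assumptions; the point is that the same cell handles sequences of any length, because it is simply applied once per token:

```python
import math

def simple_rnn(tokens, embeddings, w_h=0.5, w_x=0.5):
    """Toy one-unit recurrent cell: fold a variable-length token
    sequence into a single hidden state, one token at a time."""
    h = 0.0
    for token in tokens:
        x = embeddings.get(token, 0.0)    # scalar stand-in for an embedding
        h = math.tanh(w_h * h + w_x * x)  # same weights reused at every step
    return h

emb = {"the": 0.1, "cat": 0.9, "sat": 0.4}
# Sequences of different lengths pass through the same cell unchanged.
h_short = simple_rnn(["the", "cat"], emb)
h_long = simple_rnn(["the", "cat", "sat"], emb)
```

Because the weights are shared across time steps, no resizing or padding is strictly required, which is exactly why recurrent architectures suit variable-length text.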

Transfer learning, which reuses pre-trained neural networks, is a huge challenge in NLP. So huge, in fact, that some consider it impossible. For images, the first layers of a network are mostly responsible for finding the general structure of the image, while information specific to the task tends to be extracted in the deeper layers. That is why, when using a pre-trained net, data scientists often “freeze” the first layers during further training. Recurrent nets, however, do not produce such a clear distinction: there is no obvious way to freeze part of a recurrent neural network. Furthermore, the choice of training data is vital. There is a tangible difference between the language used in newspaper articles and, say, tweets, possibly an even bigger disparity than between photos and cartoon-style drawings. In NLP, context is a crucial factor.
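Schematically, freezing in a feedforward network amounts to excluding the early layers from training, a distinction with no clean analogue when a single recurrent cell’s weights are reused at every time step. A toy sketch, with layer names that are purely illustrative and not any real framework’s API:

```python
def trainable_parameters(layers):
    """Collect the names of layers that will be updated during fine-tuning."""
    return [layer["name"] for layer in layers if layer["trainable"]]

# Feedforward net: distinct per-layer weights, so early layers can be frozen.
feedforward = [
    {"name": "conv1", "trainable": False},  # frozen: generic edge detectors
    {"name": "conv2", "trainable": False},  # frozen: generic textures
    {"name": "head", "trainable": True},    # fine-tuned on the new task
]

# Recurrent net: ONE shared weight set reused at every time step, so there
# is no "first layer" to freeze independently of the rest.
recurrent = [{"name": "shared_cell", "trainable": True}]

print(trainable_parameters(feedforward))  # ['head']
```

The feedforward case cleanly separates general features from task-specific ones; the recurrent case offers no such boundary to cut along.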

The need to interpret everything in a given context is the greatest challenge NLP faces. Chatbots and automated assistants are gaining in popularity: according to PwC data, 72% of business execs use automated assistants, and 27% of consumers weren’t sure whether their last customer service interaction was with a human or a chatbot. While chatbots are a convenient way to answer the most common questions, they struggle with questions that call for deeper context or with analyzing more complex feedback. If a customer writes on social media that “this place is way too cool for me”, are they referring to the temperature or to how groovy the place is?

These examples are only the tip of the language contextuality iceberg. Conversation commonly incorporates irony, puns and cultural references, which makes the message more human but blurry and unclear to machines. In NLP, the language has to be natural.

Future perspectives – word embeddings and grammatical gender

The challenge of making words understandable to computers has been tackled by the fastText and word2vec models. In these models, every word is represented by a multidimensional vector encoding its meaning and connotations (commonly known as a word embedding). The vectors for “cat”, “kitten” and “kitty” are similar to one another and dissimilar from those for “car”, “computer” or “telephone”. With that knowledge, it is easier to tune the model.
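The closeness of embedding vectors is typically measured with cosine similarity. A short Python sketch with made-up 3-dimensional vectors (real word2vec or fastText embeddings have hundreds of dimensions):

```python
import math

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# Toy embeddings, invented for illustration only.
emb = {
    "cat":    [0.90, 0.80, 0.10],
    "kitten": [0.85, 0.90, 0.15],
    "car":    [0.10, 0.20, 0.90],
}

print(cosine(emb["cat"], emb["kitten"]) > cosine(emb["cat"], emb["car"]))  # True
```

Words with related meanings end up near each other in the vector space, which is what lets a downstream model exploit semantic similarity it was never explicitly taught.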

The key difference between image recognition and NLP is that the former makes it possible to train a model using additional images, as our recent publication showed. With NLP, on the other hand, all the words in the network are already described, so there is no need to expand its knowledge. Instead, it is entirely up to the model and the data scientist’s skill to perform tasks with the data provided. In that sense, word embeddings can be compared to the first layers of a pre-trained image recognition network.

Because of the highly contextual data it must analyze, Natural Language Processing poses an enormous challenge. Language is an amalgam of culture, history and information; the ability to understand and use it is purely human.

Other challenges are associated with the diversity of languages, with their morphology and inflection. Finnish grammar, with its sixteen noun cases, is hard to compare with English. In many European languages, there is also grammatical gender to deal with.

What’s more, words’ meanings tend to vary depending not only on the context of the rest of the text, but also on the social background or lifestyle of a conversation partner. Is “savage rap concert” a positive or negative review? It depends on whether it was written by a teen or a grandma, so even correctly identifying the language and understanding the whole sentence doesn’t guarantee a correct interpretation.

The Turing test proves that the ability to respond in a human way is not merely a question of understanding.

For more information, go to