Although sometimes overrated to the detriment of its non-verbal counterpart, language is an essential part of communication. It is so apparent to us that we rarely consider it worth explaining.
It is a shame that some things are so obvious that we fail to appreciate the gift we have been given.
Fear not, though. This article is not about gratitude. Rather, it is about what you can do once you are conscious that the ability to wield natural language is not trivial.
First of all, think of language as an interface: a verbal, linguistic interface. Why verbal and linguistic? We will come back to that later. For now, it is enough to note that natural language is not just another raw textual protocol. Textual interfaces were introduced long ago and have been used almost since the beginning of computing; natural language is something far more ancient.
Besides that, the abstract commands (words) of a textual interface may be understandable, at some level, even to non-specialists; their meaning is hard-coded. It is not always apparent at first glance, but it leaves no room for interpretation. It is language, but it is not natural.
The first program that mimicked “understanding” a human was probably STUDENT, written in 1964. It could “decode” input phrased in simple language (basic algebra word problems) and produce the answer. However beautiful and universal, the language of mathematics cannot be counted as “the Language” as such. But solving math problems seems like something humanlike, doesn’t it?
If something performs actions usually associated with humans, we tend to think of it as intelligent.
Hence, mathematics is not natural language either. Still, STUDENT was only the beginning; things were about to pick up speed.
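To get a feel for what STUDENT-style “decoding” looks like, here is a toy sketch of the idea: hard-coded patterns map a narrow class of English arithmetic questions onto operations. This is a didactic illustration of the pattern-matching approach, not Bobrow’s actual 1964 program.

```python
import re

# Toy STUDENT-style solver: a tiny, hard-coded "grammar" of English
# arithmetic questions, matched with regular expressions.
OPS = {
    "plus": lambda a, b: a + b,
    "minus": lambda a, b: a - b,
    "times": lambda a, b: a * b,
}

def solve(question):
    m = re.search(r"what is (\d+) (plus|minus|times) (\d+)", question.lower())
    if not m:
        return None  # anything outside the hard-coded grammar is opaque
    a, op, b = int(m.group(1)), m.group(2), int(m.group(3))
    return OPS[op](a, b)

print(solve("What is 12 plus 30?"))  # 42
print(solve("Why do birds sing?"))   # None
```

The program looks like it “understands” English, but a single word outside its patterns breaks the illusion.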
Shortly after, ELIZA saw the light of day. She was able to conduct a conversation with a human using adequate language. What can we say about the quality of that conversation? Honestly, it is hard to tell; there is a lot of “it depends” involved. But let’s start with an overview of ELIZA.
One thing is for sure: when introduced, ELIZA amazed the scientific community (and not only the scientific one). It is crucial at this point to remember that it was the first chatbot ever created, so it is no shock that many users were fooled and did not notice they were chatting with a machine.
Another thing is that ELIZA was meant, according to Joseph Weizenbaum, its creator, to play a “non-directional psychotherapist in an initial psychiatric interview.” When you take a look at an example conversation, you quickly notice the trick behind it.
ELIZA: Please tell me what’s been bothering you.
ME: I have problem
ELIZA: You say you have problem ?
ELIZA: You seem to be quite positive.
ME: not entirely
ELIZA: Do you feel strongly about discussing such things ?
ELIZA: You are sure.
ME: I’m confident
ELIZA: Do you enjoy being confident ?
ME: It helps a lot
ELIZA: Please go on.
The secret lies in asking matching questions: adequate usage of language, as I mentioned above. But behind the fun of chatting with “AI”, was my conversation (brief, but never mind that) meaningful? It leads, more or less, to the following mode of conversation (a made-up example):
BOT: Do you like apples?
ME: I don’t like its peel.
ME: Hurts my gums.
And shortly after…
ME: Well, the government does not provide sufficient support to small entrepreneurs, giving an advantage to the largest corporations and increasing the gap in the textile market even more. Rampaging globalization […]
ELIZA’s questions aim at keeping the conversation going forever. We do not want that! Such deconstruction can indeed be helpful, but it needs to be meaningful and lead to a point where enough is enough. For that, the questions need to be relevant. We address this issue in the following paragraphs, but I feel obliged to do ELIZA some justice before we move on.
This chatbot was meant to be a parody: a criticism of such an approach to psychotherapy.
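The “matching questions” trick can be sketched in a few lines. The following is a simplified illustration of the rule-based idea behind ELIZA (the rules and wording here are invented for the example, not Weizenbaum’s original script):

```python
import re

# A minimal ELIZA-style responder: an ordered list of (pattern, response)
# rules. Captured text is echoed back inside a question, which is the
# whole trick behind the apparent "understanding".
RULES = [
    (r"i have (.*)", "You say you have {0} ?"),
    (r"i'?m (.*)", "Do you enjoy being {0} ?"),
    (r"i feel (.*)", "Why do you feel {0} ?"),
]
FALLBACK = "Please go on."  # keeps the conversation open-ended

def reply(utterance):
    text = utterance.lower().strip(" .!?")
    for pattern, template in RULES:
        m = re.match(pattern, text)
        if m:
            return template.format(*m.groups())
    return FALLBACK

print(reply("I have problem"))  # You say you have problem ?
print(reply("I'm confident"))   # Do you enjoy being confident ?
print(reply("It helps a lot"))  # Please go on.
```

Note that nothing here models meaning: the fallback alone guarantees the conversation never ends, which is exactly the problem described above.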
The exploration of natural language, and of conversation itself, continued with better or worse results until the first breakthrough: the introduction of machine learning techniques (broadly labelled artificial intelligence) to human language recognition.
Sadly, for a long time there were no astonishing results. By astonishing, I mean results matching our expectations.
Before I present the state of the art in NLU, let me explain the main obstacle in this challenge. The explanation is not entirely intuitive, but it is relatively easy to follow.
Moravec’s paradox is responsible for our misconception of the importance of natural speech (and not only that). What is it exactly? In short: when it comes to automating something with AI, easy tasks are challenging, and difficult tasks are simple.
How come?
There are a few reasons, but they all come down to one simple fact: we underestimate the millennia of evolution during which our brains adapted to genuinely hard tasks such as face recognition, keeping balance, manipulating objects, and, as you might guess, speaking and interpreting speech. Basically, everything we do without conscious effort.
Over the centuries, these skills became so firmly rooted in our “operating system” that we perceive them as trivial.
On the other hand, we have high-level skills like mathematics, quantum physics, solving abstract problems, and playing games such as Go or chess. These are relatively new to our brains; they are not, in fact, all that challenging, we simply perceive them that way.
Where are we now?
Compared to the 1960s? In a promising place! The paradox remains an obstacle, but we have figured out several methods that help us overcome it. Even more importantly, problems such as lacking data or computational power are not nearly as critical as they were 60 years ago.
The current state of the art in natural language processing is a model called BERT.
The technique is based on so-called “transformers”: deep learning models that, simply speaking, weigh the significance of each piece of the input, focusing on the elements that matter. Transformers are especially popular in computer vision and, as mentioned, in natural language processing, where they provide the base for the widely admired BERT.
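The weighting mechanism at the heart of a transformer, scaled dot-product attention, can be sketched in plain Python. This is a didactic toy with made-up vectors; real transformers use learned projections, many attention heads in parallel, and tensor libraries.

```python
import math

# Toy scaled dot-product attention. Each query scores every key,
# the scores become a softmax distribution, and the output is a
# weighted mix of the values: this is how a transformer decides
# which pieces of the input matter.

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def attention(queries, keys, values):
    d_k = len(keys[0])
    outputs = []
    for q in queries:
        scores = [dot(q, k) / math.sqrt(d_k) for k in keys]
        weights = softmax(scores)  # weights sum to 1 over the input pieces
        out = [sum(w * v[i] for w, v in zip(weights, values))
               for i in range(len(values[0]))]
        outputs.append(out)
    return outputs

# Three 2-dimensional "tokens"; a query aligned with the first key
# draws most of its output from the first value.
keys = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
values = [[10.0, 0.0], [0.0, 10.0], [5.0, 5.0]]
print(attention([[4.0, 0.0]], keys, values))
```

The softmax weights are exactly the “significance of pieces of the input” mentioned above: the query attends strongly to some tokens and mostly ignores the rest.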
Introduced in 2018 by Google, BERT was initially trained on BookCorpus (a huge collection of 11,038 free books written by amateur authors) and English Wikipedia. Depending on the user’s needs, BERT can then be fine-tuned on a desired dataset.
In the course of AVA R&D, we used the ULMFiT method rather than BERT. Even though it cannot beat BERT in accuracy, it is more practical in business applications, since its computational requirements are much lower.
What does all of this have to do with natural language usage in marketing?
Let’s start with SEO. Google uses BERT to correctly interpret and adequately respond to users’ queries, which is quite a game-changer for the search optimization industry, as former SEO best practices are becoming less and less relevant. Speaking of search: eCommerce search bars.
Customers have already adapted to a specific “search convention” in which they use keywords without bothering to form them into meaningful sentences. It is a sad regression, as the first search engines were able to answer complete sentences.
Nowadays, we can answer product search queries given in natural form. This approach is critical for modern commerce, as a product’s mere availability (once the main eCommerce growth factor) is no longer a differentiator.
NLP allows users to search for products the way they would ask a salesperson in a brick-and-mortar shop. Here are three main game-changing factors:
- Third-party cookies will soon no longer be accessible, starting in 2022. Simultaneously, zero-party data will gain significance. What is zero-party data? Data provided by users willingly, on their own initiative: through a search bar, a chat widget, and so on.
- Natural language queries are a source of information far richer than regular keyword-based ones. Customers can ask for products freely, without limiting themselves to the chosen keywords we have sadly grown used to. Good for them, but that is not all.
- Freely asked questions often provide more information to the answerer.
I’m searching for three for the wedding; however, I’m not the bride.
Oh, definitely not!
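The contrast between keyword matching and a natural query can be illustrated with a toy sketch. The catalog, attribute names, and extraction rules below are all invented for the example; a real system (BERT, ULMFiT, or similar) would learn these mappings from data rather than hard-code them.

```python
import re

# Toy contrast: keyword search vs. a natural-language product query.
# Hypothetical catalog with hand-labelled attributes.
CATALOG = [
    {"name": "classic navy suit", "occasion": "wedding", "role": "guest"},
    {"name": "ivory bridal gown", "occasion": "wedding", "role": "bride"},
    {"name": "linen summer shirt", "occasion": "casual", "role": "guest"},
]

def keyword_search(query):
    # Classic search bar: bag-of-words overlap with product names.
    words = set(query.lower().split())
    return [p["name"] for p in CATALOG
            if words & set(p["name"].split())]

def natural_search(query):
    # Hard-coded stand-in for learned intent extraction.
    q = query.lower()
    occasion = "wedding" if "wedding" in q else None
    # A negation like "I'm not the bride" carries real information
    # that keyword matching would throw away.
    role = "guest" if re.search(r"not the bride", q) else None
    return [p["name"] for p in CATALOG
            if (occasion is None or p["occasion"] == occasion)
            and (role is None or p["role"] == role)]

print(keyword_search("suit"))
print(natural_search("I'm searching for something for the wedding, but I'm not the bride"))
```

The point is the negation: the freely phrased sentence tells the “answerer” who the customer is *not*, which narrows the results in a way keywords never could.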
As I mentioned, edrone is conducting an R&D project. The aim is to develop a natural-language-processing-based smart customer assistant for eCommerce, capable of conversations that are not predefined, for the sake of customer service and sales.
Such an assistant will be fully customizable and tailored to each type of eCommerce via the AVA platform, which enables the deployment of such assistants and significantly reinforces an eShop’s key features with regard to Customer Care, User Experience, and Fulfillment.
You can learn more about it here:
Digital marketer and copywriter, experienced in and specializing in AI, design, and digital marketing itself. Science and holistic-approach enthusiast, after-hours musician, and occasional actor.