Thinking Beyond Alexa – the Future of Human / AI Interaction
As AI and Machine Learning technology continues to advance, so do the ways in which humans interact with machines.
Last year, the world went voice search crazy, and we all became modern-day Jean-Luc Picards, wowing friends by switching on the heating, playing Bohemian Rhapsody or checking the latest cricket score, all with a verbal command to Alexa or Siri.
It is all good fun, and there is no doubt that voice search is here to stay and will only get better. From a business perspective, however, the real action is taking place in the area of text analytics. NLP consultants will be quick to tell you that there are three enablers needed for AI to exist:
- Data – the system needs to be fed with information in order for it to “know” things.
- Computation – this is the processing system that allows the AI to draw conclusions and make decisions.
- Communication – it is no use having intelligence, artificial or otherwise, if you cannot receive and transmit knowledge.
The third aspect is often the last to cross people’s minds, but is absolutely critical to success. How can we hope to create successful AI without providing it with information and knowledge – and how can we derive any benefits from it if it cannot reciprocate the communication?
Natural language processing
Natural Language Processing (NLP) is defined as “the ability of machines to understand and interpret human language in the way it is written or spoken.” The optimal NLP system would produce a machine as capable as humans at understanding language.
It is easy to go down a blind alley with the current voice technology. Saying that Alexa understands you because you can use the technology to switch on the lights or check the weather forecast is a little like assuming a Spanish waiter speaks fluent English because he understands whether you want tea or coffee and can wish you a nice day when you leave the cafe.
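To make the point concrete, here is a hypothetical sketch of how a voice assistant can appear to “understand” a command using nothing more than keyword matching. The intent names and keyword lists are entirely made up for illustration; real assistants use far more sophisticated models, but the underlying gap between matching and understanding is the same.

```python
# Toy intent matcher: picks the intent whose keyword set overlaps
# most with the words in the utterance. No grammar, no meaning.

INTENTS = {
    "lights_on": {"switch", "lights"},
    "weather": {"weather", "forecast"},
    "play_music": {"play"},
}

def match_intent(utterance: str) -> str:
    # Lowercase and strip trailing punctuation before comparing words.
    words = {w.strip("?.,!").lower() for w in utterance.split()}
    best, overlap = "unknown", 0
    for intent, keywords in INTENTS.items():
        hits = len(words & keywords)
        if hits > overlap:
            best, overlap = intent, hits
    return best

print(match_intent("Switch on the lights"))          # lights_on
print(match_intent("What's the weather forecast?"))  # weather
```

A system like this handles the café-level exchanges perfectly well, which is exactly why it is tempting to mistake it for fluency.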
Data Mining Social Media
One area that is rapidly developing is data mining and analysis of social media. Massimo Poesio from the School of Computer Science and Electronic Engineering, University of Essex, co-published a paper examining how Natural Language Processing techniques were used to predict the EU referendum outcome. Where traditional polls failed, social media mining and NLP predicted that a majority of people wanted to leave the EU.
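The idea behind this kind of social media mining can be sketched very simply. The snippet below is a minimal, made-up illustration: the lexicons, tweets and stance labels are all assumptions for the sake of the example, whereas a real study would use trained classifiers over millions of posts.

```python
from collections import Counter

# Tiny hand-built lexicons (illustrative only).
LEAVE_TERMS = {"leave", "brexit", "independence", "out"}
REMAIN_TERMS = {"remain", "stay", "stronger"}

def stance(tweet: str) -> str:
    # Normalise words and count overlaps with each lexicon.
    words = {w.strip("#?.,!").lower() for w in tweet.split()}
    leave = len(words & LEAVE_TERMS)
    remain = len(words & REMAIN_TERMS)
    if leave > remain:
        return "leave"
    if remain > leave:
        return "remain"
    return "neutral"

tweets = [
    "Time to leave, vote for independence #Brexit",
    "We are stronger if we remain",
    "Interesting debate tonight",
]
print(Counter(stance(t) for t in tweets))
```

Aggregating millions of such stance judgements is what let researchers see a signal that the pollsters’ much smaller samples missed.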
Advances in NLP
Creating algorithms that effectively bridge the gap between machine language and human language represents a significant step in AI. Perhaps the greatest advance to date has been in the shape of IBM Watson. Plenty has been written about the business applications Watson can bring, in areas including medicine and cyber security.
But Watson can only add value through accurately understanding the data it is being fed, and that is where NLP comes in. To use a medical example, it has been estimated that it would take 160 hours of reading every week just to keep up with new medical knowledge as it is published. Watson can absorb this information in seconds, but it is only worth doing so if the software understands every nuance of what is being written.
Watson was given a very public demonstration when it took on two champions in the TV quiz show Jeopardy! and emerged victorious. But while the stunt showcased just how smart AI can be, it also demonstrated that there is still a long road ahead.
Still, recent news that Japanese insurance firm Fukoku Mutual has replaced 30 humans with a Watson-based AI system shows which way the wind is blowing. A backlash is inevitable, but it is worth remembering that the first robots on manufacturing assembly lines probably received a similarly chilly reception.