Question-answering and Chatbots using Memory Networks

Abstract

Natural language understanding (NLU) is an umbrella term encompassing research areas that involve reasoning over text in both syntactic and semantic ways, such as text summarization, machine translation, and conversation modelling (that is, chatbots). An interesting line of natural language processing (NLP) research decomposes all NLU tasks into a single Question-Answer (QA) framework, in which a model must reason over an input text (for example, a Wikipedia article on dogs) to answer questions such as: What is the most common breed of dog? What is the summary of the article? What is the translation into French? In this chapter, we will describe the QA task and introduce a class of deep learning models known as memory networks for building QA systems. We will then learn about the various components of an end-to-end trained chatbot model and extend memory networks to build conversational chatbots.
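To make the memory network idea concrete, below is a minimal, untrained sketch of the forward pass of a single-hop end-to-end memory network, in the spirit of Sukhbaatar et al. (2015). The toy story, vocabulary, dimensions, and random embedding matrices are illustrative assumptions and are not taken from the chapter.

# Sketch of a single-hop end-to-end memory network forward pass.
# All shapes, the toy story, and the random (untrained) embeddings
# are illustrative assumptions, not the chapter's actual code.
import numpy as np

rng = np.random.default_rng(0)

vocab = {w: i for i, w in enumerate(
    "mary went to the kitchen john moved garden where is".split())}
V, d = len(vocab), 16                      # vocabulary size, embedding dimension

A = rng.normal(size=(V, d))                # input (memory) embedding
B = rng.normal(size=(V, d))                # question embedding
C = rng.normal(size=(V, d))                # output (memory) embedding
W = rng.normal(size=(d, V))                # final answer projection

def embed(sentence, E):
    # Bag-of-words sentence embedding: sum of word vectors.
    return sum(E[vocab[w]] for w in sentence.split())

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

story = ["mary went to the kitchen", "john moved to the garden"]
question = "where is mary"

m = np.stack([embed(s, A) for s in story])   # one memory vector per sentence
c = np.stack([embed(s, C) for s in story])   # corresponding output vectors
u = embed(question, B)                       # question vector

p = softmax(m @ u)                           # attention over the memories
o = p @ c                                    # attention-weighted sum of outputs
answer_scores = softmax((o + u) @ W)         # distribution over the vocabulary

print("attention over sentences:", p)

In a trained model the attention would concentrate on the sentence relevant to the question, and the answer distribution would place most of its mass on the correct word; multi-hop variants repeat the attention step with the updated query o + u.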

Publication
Book chapter, Hands-On NLP with Python (R. Arumugam and R. Shanmugamani, eds.), pp. 175-199, Packt Publishing

Related