CREATING A WHATSAPP CHATBOT
Abstract
Conversational modeling is an important task in natural language processing as well as machine
learning. Like most important tasks, it's not easy. Previously, conversational models have focused on specific domains, such as booking hotels or recommending restaurants, and were built using hand-crafted rules, as in ChatScript, a popular rule-based conversational framework. In 2014, the sequence-to-sequence model, originally used for machine translation, opened the possibility of framing dialogue as a translation problem: translating from an utterance to its response. Systems built on this principle, while conversing fairly fluently, aren't very convincing because they lack personality and a consistent persona. In this paper, we experiment with building an open-domain response generator with personality and identity. We built chatbots that imitate characters in
popular TV shows: Barney from How I Met Your Mother, Sheldon from The Big Bang Theory, Michael from
The Office, and Joey from Friends. A successful model of this kind can have a lot of applications, such as
allowing people to speak with their favorite celebrities, creating more life-like AI assistants, or creating virtual
alter-egos of ourselves. The model was trained end-to-end without any hand-crafted rules. The bots talk
reasonably fluently, have distinct personalities, and seem to have learned certain aspects of their identities. Standard automated evaluation metrics for translation models yielded very low scores. However, we designed an evaluation metric with a human-judgment element, on which the chatbots performed well: we show that, given a bot's response, a human judge is more than 50% likely to believe that the response actually came from the real character.
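To make the utterance-to-response framing concrete, the sketch below shows a minimal sequence-to-sequence encoder-decoder of the kind described above. It is illustrative only, not our actual architecture: the vocabulary size, embedding and hidden dimensions, the choice of GRUs, and the PyTorch setup are all assumptions made for the example.

```python
# Minimal sketch (not the exact model used in this work): dialogue framed as
# "translation" from an utterance to its response with a sequence-to-sequence
# encoder-decoder. All hyperparameters here are illustrative assumptions.
import torch
import torch.nn as nn

VOCAB_SIZE, EMB_DIM, HID_DIM = 10_000, 256, 512  # assumed sizes

class Seq2Seq(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB_SIZE, EMB_DIM)
        self.encoder = nn.GRU(EMB_DIM, HID_DIM, batch_first=True)
        self.decoder = nn.GRU(EMB_DIM, HID_DIM, batch_first=True)
        self.out = nn.Linear(HID_DIM, VOCAB_SIZE)

    def forward(self, utterance_ids, response_in_ids):
        # Encode the input utterance into a fixed-size hidden state.
        _, state = self.encoder(self.embed(utterance_ids))
        # Decode the response conditioned on that state (teacher forcing).
        dec_out, _ = self.decoder(self.embed(response_in_ids), state)
        return self.out(dec_out)  # logits over the response vocabulary

model = Seq2Seq()
utterance = torch.randint(0, VOCAB_SIZE, (1, 12))  # token ids of the input line
response = torch.randint(0, VOCAB_SIZE, (1, 16))   # token ids of the character's reply

# The decoder sees response tokens 0..n-1 and is trained to predict tokens 1..n.
logits = model(utterance, response[:, :-1])
loss = nn.CrossEntropyLoss()(logits.reshape(-1, VOCAB_SIZE), response[:, 1:].reshape(-1))
loss.backward()  # trained end-to-end, with no hand-crafted rules
```

In this framing, training simply minimizes the cross-entropy between the predicted and reference response tokens over pairs of character dialogue lines, which is what allows the model to be learned end-to-end without any hand-crafted rules.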