A Knowledge-Grounded Neural Conversation Model

MSR, AAAI 2018

TL;DR

Fixes two problems with chatbot replies:

  1. too generic
  2. providing no useful information

Example

User input: Going to Kusakabe tonight.

Neural model: Have a great time!

Human: You’ll love it! Try omasake, the best in town.

Dialogue generation pipeline

  1. identify the focus of the input (by keyword matching, NER, etc.)
  2. retrieve facts relevant to the focus
  3. feed (1) the conversation and (2) the relevant facts into the neural network
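The three steps above can be sketched as follows; `FACTS_DB`, `find_focus`, and the other names are hypothetical stand-ins, not the paper's code:

```python
# Illustrative sketch of the grounding pipeline; names are hypothetical.
FACTS_DB = {
    "kusakabe": ["Kusakabe serves omakase sushi.", "Reservations recommended."],
}

def find_focus(utterance, entities):
    """Step 1: identify the focus by simple keyword matching."""
    for token in utterance.lower().split():
        token = token.strip(".,!?")
        if token in entities:
            return token
    return None

def retrieve_facts(focus):
    """Step 2: look up facts relevant to the focus."""
    return FACTS_DB.get(focus, [])

def build_model_input(utterance):
    """Step 3: pair the conversation with the retrieved facts for the NN."""
    focus = find_focus(utterance, FACTS_DB)
    return {"conversation": utterance, "facts": retrieve_facts(focus)}

example = build_model_input("Going to Kusakabe tonight.")
```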

Model

Dialogue Encoder / Decoder

Seq2Seq: GRU-RNN
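A minimal, randomly initialized GRU cell of the kind used by the encoder and decoder. This sketches the standard GRU equations only; it is not the paper's trained model:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUCell:
    """Minimal GRU cell, the recurrent unit of the seq2seq encoder/decoder.
    Weights are random; a sketch of the gating math, not a trained model."""
    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        w = lambda rows, cols: rng.normal(0.0, 0.1, (rows, cols))
        h, d = hidden_size, input_size
        self.Wz, self.Uz = w(h, d), w(h, h)   # update gate
        self.Wr, self.Ur = w(h, d), w(h, h)   # reset gate
        self.Wh, self.Uh = w(h, d), w(h, h)   # candidate state

    def step(self, x, h):
        z = sigmoid(self.Wz @ x + self.Uz @ h)          # how much to update
        r = sigmoid(self.Wr @ x + self.Ur @ h)          # how much past to keep
        h_cand = np.tanh(self.Wh @ x + self.Uh @ (r * h))
        return (1.0 - z) * h + z * h_cand               # convex mix of old/new

def encode(cell, xs, hidden_size):
    """Run the encoder over a token sequence; the final state summarizes it."""
    h = np.zeros(hidden_size)
    for x in xs:
        h = cell.step(x, h)
    return h

cell = GRUCell(input_size=5, hidden_size=8)
xs = np.random.default_rng(1).normal(size=(4, 5))   # 4 embedded tokens
state = encode(cell, xs, 8)
```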

Facts Encoder

Memory Network (Sukhbaatar et al., 2015)

Facts Encoder (Cont'd)

Memory Network - RNN
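A single-hop memory-network read over fact vectors, roughly in the style of Sukhbaatar et al. (2015): the dialogue state attends over fact keys and adds back the attention-weighted fact values. All weights and dimensions below are illustrative, not from the paper:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def facts_encoder(u, facts, A, C):
    """Single-hop memory-network read (sketch).
    u: dialogue encoder state, shape (d,)
    facts: bag-of-words fact vectors, shape (k, v)
    A, C: key/value embedding matrices, shape (d, v), randomly initialized here."""
    keys = facts @ A.T        # (k, d) memory keys
    values = facts @ C.T      # (k, d) memory values
    p = softmax(keys @ u)     # attention weights over the k facts
    o = p @ values            # weighted sum of fact values
    return u + o              # knowledge-grounded state fed to the decoder

rng = np.random.default_rng(0)
d, v, k = 8, 20, 3
u = rng.normal(size=d)
facts = rng.normal(size=(k, v))
A, C = rng.normal(size=(d, v)), rng.normal(size=(d, v))
out = facts_encoder(u, facts, A, C)
```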

Experiment

Data

Knowledge: Foursquare tips

Conversation

  • General dataset: 23M 3-turn Twitter conversations
  • Grounded dataset: 1M 2-turn Twitter conversations containing an @handle, augmented with Foursquare facts

Tasks (Multi-task learning)

  • NoFact: without the fact encoder (plain seq2seq)
  • Fact: trained to predict the response from the conversation plus retrieved facts
  • AutoEncoder: trained to reconstruct a fact from the retrieved facts

 

Multi-Task Learning (MTL)

Hard parameter sharing

Soft parameter sharing
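A toy illustration of hard parameter sharing, assuming (as is common in MTL) one encoder shared by every task and a private decoder per task; the class and attribute names are hypothetical:

```python
class SharedEncoder:
    """Stand-in for the shared encoder; one instance serves all tasks."""
    def __init__(self):
        self.params = {"W_enc": [0.0] * 4}  # placeholder for GRU weights

class TaskModel:
    """One task head: hard sharing means every task holds the SAME
    encoder object, while decoder parameters stay task-specific."""
    def __init__(self, name, encoder):
        self.name = name
        self.encoder = encoder                       # shared (same object)
        self.decoder_params = {"W_dec": [0.0] * 4}   # private per task

encoder = SharedEncoder()
tasks = [TaskModel(t, encoder) for t in ("NoFact", "Fact", "AutoEncoder")]
# Training any task updates encoder.params in place, so every other task
# immediately sees the update; only the decoders diverge.
```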


Systems

  • Seq2Seq: NoFact-General
  • MTask: NoFact-General + NoFact-Grounded
  • MTask-R: NoFact-General + Fact-Grounded
  • MTask-F: NoFact-General + AutoEncoder-Grounded
  • MTask-RF: NoFact-General + Fact-Grounded + AutoEncoder-Grounded


Perplexity
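For reference, perplexity is the exponential of the average negative log-likelihood per token (lower is better):

```python
import math

def perplexity(token_log_probs):
    """Perplexity = exp(mean negative log-likelihood per token)."""
    n = len(token_log_probs)
    return math.exp(-sum(token_log_probs) / n)

# A model assigning probability 0.25 to every token has perplexity 4.
lp = [math.log(0.25)] * 10
```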

BLEU-4 & Diversity
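A common lexical-diversity metric of this kind is distinct-n, the ratio of unique n-grams to total n-grams across the generated responses; a minimal sketch, assuming that is the ratio intended here:

```python
def distinct_n(responses, n):
    """Distinct-n: unique n-grams / total n-grams across all responses."""
    total, unique = 0, set()
    for r in responses:
        toks = r.split()
        grams = [tuple(toks[i:i + n]) for i in range(len(toks) - n + 1)]
        total += len(grams)
        unique.update(grams)
    return len(unique) / total if total else 0.0
```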

Human Eval.

Example

By qitar888

the lab report of a paper published by MSR @ AAAI '18