Simulating User Satisfaction for the Evaluation of Task-Oriented Dialogue Systems - 2021


Research Area:  Machine Learning

Abstract:

Evaluation is crucial in the development process of task-oriented dialogue systems. As an evaluation method, user simulation allows us to tackle issues such as scalability and cost-efficiency, making it a viable choice for large-scale automatic evaluation. To help build a human-like user simulator that can measure the quality of a dialogue, we propose the following task: simulating user satisfaction for the evaluation of task-oriented dialogue systems. The purpose of the task is to increase the evaluation power of user simulations and to make the simulation more human-like. To overcome a lack of annotated data, we propose a user satisfaction annotation dataset, USS, that includes 6,800 dialogues sampled from multiple domains, spanning real-world e-commerce dialogues, task-oriented dialogues constructed through Wizard-of-Oz experiments, and movie recommendation dialogues. All user utterances in those dialogues, as well as the dialogues themselves, have been labeled based on a 5-level satisfaction scale. We also share three baseline methods for user satisfaction prediction and action prediction tasks. Experiments conducted on the USS dataset suggest that distributed representations outperform feature-based methods. A model based on hierarchical GRUs achieves the best performance in in-domain user satisfaction prediction, while a BERT-based model has better cross-domain generalization ability.
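
The abstract reports that a model based on hierarchical GRUs achieves the best in-domain satisfaction prediction. As an illustration only, and not the authors' implementation, the minimal PyTorch sketch below shows the general shape of such a model: a word-level GRU encodes each utterance into a vector, a dialogue-level GRU runs over the sequence of utterance vectors, and a linear head scores every turn on the 5-level satisfaction scale. All class names, layer sizes, and the toy input are hypothetical.

import torch
import torch.nn as nn

class HierarchicalGRUSatisfaction(nn.Module):
    """Hypothetical hierarchical-GRU baseline for per-turn satisfaction prediction."""
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256, num_levels=5):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # Word-level encoder: compresses each utterance into a single vector.
        self.utterance_gru = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        # Dialogue-level encoder: runs over the per-utterance vectors in turn order.
        self.dialogue_gru = nn.GRU(hidden_dim, hidden_dim, batch_first=True)
        # Maps each turn's dialogue state to logits over the 5 satisfaction levels.
        self.classifier = nn.Linear(hidden_dim, num_levels)

    def forward(self, dialogues):
        # dialogues: (batch, turns, tokens) tensor of token ids, 0 = padding.
        batch, turns, tokens = dialogues.shape
        words = self.embedding(dialogues.view(batch * turns, tokens))
        _, utt_state = self.utterance_gru(words)           # (1, batch*turns, hidden)
        utt_vecs = utt_state.squeeze(0).view(batch, turns, -1)
        turn_states, _ = self.dialogue_gru(utt_vecs)       # (batch, turns, hidden)
        return self.classifier(turn_states)                # per-turn satisfaction logits

# Toy usage: 2 dialogues, 4 turns each, 10 tokens per utterance.
model = HierarchicalGRUSatisfaction(vocab_size=1000)
logits = model(torch.randint(1, 1000, (2, 4, 10)))
print(logits.shape)  # torch.Size([2, 4, 5])

Training such a model against the per-utterance USS labels would be a standard cross-entropy classification setup; the BERT-based variant with better cross-domain generalization would, in the same spirit, swap the word-level GRU for a pretrained encoder.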

Keywords:  
Task-oriented dialogue systems
Scalability
Cost-efficiency
Prediction
GRUs

Author(s) Name:  Weiwei Sun, Shuo Zhang, Krisztian Balog, Zhaochun Ren

Conference name:  Proceedings of the 44th International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR '21)

Publisher name:  ACM

DOI:  10.1145/3404835.3463241