'No, they did not': Dialogue response dynamics in pre-trained language models

Bibliographic Details
Title: 'No, they did not': Dialogue response dynamics in pre-trained language models
Authors: Kim, Sanghee J.; Yu, Lang; Ettinger, Allyson
Publication Year: 2022
Collection: Computer Science
Subject Terms: Computer Science - Computation and Language
Abstract: A critical component of linguistic competence is the ability to identify the relevant parts of an utterance and reply appropriately. In this paper we examine the extent of such dialogue response sensitivity in pre-trained language models, conducting a series of experiments with a particular focus on sensitivity to dynamics involving the phenomena of at-issueness and ellipsis. We find that models show clear sensitivity to the distinctive role of embedded clauses, and a general preference for responses that target the main clause content of prior utterances. However, the results indicate mixed and generally weak trends with respect to capturing the full range of dynamics involved in targeting at-issue versus not-at-issue content. Additionally, models show fundamental limitations in their grasp of the dynamics governing ellipsis, and response selections show clear interference from superficial factors that outweigh the influence of principled discourse constraints.
Comment: 12 pages, 8 figures, COLING 2022; see https://github.com/sangheek16/dialogue-response-dynamics for code and materials
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2210.02526
Accession Number: edsarx.2210.02526
Database: arXiv
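
Note: The paper's experiments compare model preferences among candidate dialogue responses; the authors' actual code is in the repository linked above. As a rough illustration of the general probing setup only, the sketch below scores two candidate rejoinders under GPT-2 by summing the log-probabilities the model assigns to each response given the dialogue context. The model choice, the example item, and the scoring function are illustrative assumptions, not the authors' materials.

import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def response_logprob(context: str, response: str) -> float:
    """Sum of token log-probabilities the model assigns to `response`,
    conditioned on `context` (higher = more preferred by the model)."""
    context_ids = tokenizer.encode(context)
    full_ids = tokenizer.encode(context + response)
    input_ids = torch.tensor([full_ids])
    with torch.no_grad():
        log_probs = torch.log_softmax(model(input_ids).logits, dim=-1)
    # Score only the response tokens; each token at position `pos` is
    # predicted from the logits at position `pos - 1`.
    total = 0.0
    for pos in range(len(context_ids), len(full_ids)):
        total += log_probs[0, pos - 1, full_ids[pos]].item()
    return total

# Hypothetical test item: does the model prefer a rejoinder targeting
# the main clause ("Mary said") or the embedded clause ("John bought")?
context = 'A: "Mary said that John bought a car." B: '
main_clause_reply = '"No, she didn\'t."'
embedded_reply = '"No, he didn\'t."'

print("main clause reply:", response_logprob(context, main_clause_reply))
print("embedded reply:   ", response_logprob(context, embedded_reply))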