IntentionQA: A Benchmark for Evaluating Purchase Intention Comprehension Abilities of Language Models in E-commerce

Bibliographic Details
Title: IntentionQA: A Benchmark for Evaluating Purchase Intention Comprehension Abilities of Language Models in E-commerce
Authors: Ding, Wenxuan; Wang, Weiqi; Kwok, Sze Heng Douglas; Liu, Minghao; Fang, Tianqing; Bai, Jiaxin; Liu, Xin; Yu, Changlong; Li, Zheng; Luo, Chen; Yin, Qingyu; Yin, Bing; He, Junxian; Song, Yangqiu
Publication Year: 2024
Collection: Computer Science
Subject Terms: Computer Science - Computation and Language
Abstract: Enhancing Language Models' (LMs) ability to understand purchase intentions in E-commerce scenarios is crucial for their effective assistance in various downstream tasks. However, previous approaches that distill intentions from LMs often fail to generate meaningful and human-centric intentions applicable in real-world E-commerce contexts. This raises concerns about whether LMs truly comprehend and utilize purchase intentions. In this paper, we present IntentionQA, a double-task multiple-choice question answering benchmark that evaluates LMs' comprehension of purchase intentions in E-commerce. Specifically, LMs are tasked to infer intentions from purchased products and to utilize those intentions to predict additional purchases. IntentionQA consists of 4,360 carefully curated problems across three difficulty levels, constructed with an automated pipeline to ensure scalability on large E-commerce platforms. Human evaluations demonstrate the high quality and low false-negative rate of the benchmark. Extensive experiments across 19 language models show that they still struggle in certain scenarios, such as accurately understanding products and intentions and jointly reasoning over both, where they fall far behind human performance. Our code and data are publicly available at https://github.com/HKUST-KnowComp/IntentionQA.
Comment: Findings of EMNLP 2024
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2406.10173
Accession Number: edsarx.2406.10173
Database: arXiv