Self-aware cycle curriculum learning for multiple-choice reading comprehension

PeerJ Comput Sci. 2022 Dec 5:8:e1179. doi: 10.7717/peerj-cs.1179. eCollection 2022.

Abstract

The multiple-choice reading comprehension task has recently attracted significant interest. The task provides several options for each question and requires the machine to select one of them as the correct answer. Current approaches normally follow a pre-training and then fine-tuning procedure that treats all data equally, ignoring the varying difficulty of training examples. To address this issue, curriculum learning (CL) has shown effectiveness in improving model performance. However, previous curriculum learning methods have two problems. First, most are rule-based, insufficiently flexible, and usually suited only to specific tasks such as machine translation. Second, they arrange data either from easy to hard or from hard to easy, overlooking the fact that humans usually learn from easy to difficult and then review from difficult to easy when performing reading comprehension tasks. In this article, we propose a novel Self-Aware Cycle Curriculum Learning (SACCL) approach, which evaluates data difficulty from the model's perspective and trains the model with a cycle training strategy. Experiments show that the proposed approach achieves better performance than the baseline on the C³ dataset, which verifies the effectiveness of SACCL.
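To make the cycle training idea concrete, below is a minimal sketch (not the authors' implementation) of cycle-curriculum ordering: each training example is scored by the current model's loss as a "self-aware" difficulty signal, then examples are presented easy-to-hard in the first phase and hard-to-easy in the second. The function names, the loss-based scoring, and the two-phase split are illustrative assumptions.

```python
# Minimal sketch of cycle-curriculum ordering. The difficulty score here
# is a stand-in for a per-example loss computed by the current model.

from typing import Callable, List, Sequence, Tuple


def score_difficulty(examples: Sequence[str],
                     loss_fn: Callable[[str], float]) -> List[Tuple[str, float]]:
    """Attach a difficulty score (e.g., current model loss) to each example."""
    return [(ex, loss_fn(ex)) for ex in examples]


def cycle_curriculum(scored: List[Tuple[str, float]]) -> List[List[str]]:
    """Return two training phases: easy-to-hard, then hard-to-easy."""
    easy_to_hard = [ex for ex, _ in sorted(scored, key=lambda pair: pair[1])]
    hard_to_easy = list(reversed(easy_to_hard))
    return [easy_to_hard, hard_to_easy]


if __name__ == "__main__":
    data = ["q1", "q2", "q3", "q4"]
    # Placeholder difficulty: pretend higher-numbered questions are harder.
    fake_loss = lambda ex: float(ex[1:])
    phases = cycle_curriculum(score_difficulty(data, fake_loss))
    for i, phase in enumerate(phases, 1):
        print(f"phase {i}: {phase}")
```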

Keywords: Curriculum learning; Cycle training strategy; Machine reading comprehension; Multiple-choice; Self-aware.

Grants and funding

This work was supported by the Research Program of Science and Technology at Universities of Inner Mongolia Autonomous Region (No. NJZY22189), the Science Foundation for Young Scholars of Chifeng University (No. cfxyqn202149), and the Research and Innovation Team of Complex Analysis and Nonlinear Dynamic Systems of Chifeng University (No. cfxykycxtd202005). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.