Paper Title


Continual BERT: Continual Learning for Adaptive Extractive Summarization of COVID-19 Literature

Author

Park, Jong Won

Abstract


The scientific community continues to publish an overwhelming amount of new research related to COVID-19 on a daily basis, leaving much literature with little to no attention. To aid the community in understanding the rapidly flowing array of COVID-19 literature, we propose a novel BERT architecture that provides a brief yet original summarization of lengthy papers. The model continually learns on new data in an online fashion while minimizing catastrophic forgetting, thus fitting the needs of the community. Benchmarks and manual examination of its performance show that the model provides a sound summary of new scientific literature.
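The abstract's claim of "continually learning on new data while minimizing catastrophic forgetting" can be made concrete with a regularization-based continual-learning sketch. The snippet below is a minimal, hypothetical illustration using an Elastic Weight Consolidation (EWC)-style quadratic penalty (Kirkpatrick et al., 2017) on a toy linear model; it is not the paper's actual mechanism, and all function names and parameters here are illustrative assumptions.

```python
import numpy as np

# Hypothetical EWC-style sketch: after training on an old task, penalize
# movement of parameters that were important for that task when training
# on a new one. This is one common way to reduce catastrophic forgetting;
# Continual BERT's actual method may differ.

def grad_task_loss(w, X, y):
    # Gradient of the mean squared error 0.5 * mean((X @ w - y)^2).
    return X.T @ (X @ w - y) / len(y)

def fisher_diagonal(X):
    # For a linear-Gaussian model the Fisher information is E[x x^T] / sigma^2;
    # keeping only its diagonal is the standard EWC approximation.
    return (X ** 2).mean(axis=0)

def ewc_update(w, X_new, y_new, w_old, fisher, lam, lr=0.05, steps=400):
    # Gradient descent on: new-task loss + (lam/2) * sum_i F_i * (w_i - w_old_i)^2.
    for _ in range(steps):
        g = grad_task_loss(w, X_new, y_new) + lam * fisher * (w - w_old)
        w = w - lr * g
    return w

rng = np.random.default_rng(0)
d = 3
X_a, X_b = rng.normal(size=(100, d)), rng.normal(size=(100, d))
y_a = X_a @ np.array([1.0, -1.0, 0.5])   # "old task" (earlier literature)
y_b = X_b @ np.array([-1.0, 1.0, 0.5])   # "new task" (newly arriving papers)

# Train on the old task, then estimate parameter importance.
w_a = ewc_update(np.zeros(d), X_a, y_a, np.zeros(d), np.zeros(d), lam=0.0)
F = fisher_diagonal(X_a)

# Continue training on the new task, with and without the EWC penalty.
w_plain = ewc_update(w_a.copy(), X_b, y_b, w_a, F, lam=0.0)
w_ewc = ewc_update(w_a.copy(), X_b, y_b, w_a, F, lam=10.0)
```

With the penalty active, `w_ewc` stays much closer to the old-task solution `w_a` than the unregularized `w_plain`, which is the forgetting-reduction behavior the abstract refers to; in the actual model the same idea would apply to BERT's weights rather than a toy regressor.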
