Paper Title
Towards continually learning new languages
Paper Authors
Paper Abstract
Multilingual speech recognition with neural networks is often implemented with batch learning, where all of the languages are available before training. The ability to add new languages after prior training sessions can be economically beneficial, but the main challenge is catastrophic forgetting. In this work, we combine the qualities of weight factorization and elastic weight consolidation to counter catastrophic forgetting and to facilitate learning new languages quickly. This combination allowed us to eliminate catastrophic forgetting while still achieving, for the new languages, performance comparable to having all languages at once: in experiments growing from an initial 10 languages to 26, the model exhibited no catastrophic forgetting and reached reasonable performance compared to training on all languages from scratch.
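To make the two ingredients named in the abstract concrete, below is a minimal PyTorch sketch, not the authors' code: it assumes one common form of weight factorization (a shared weight matrix modulated by cheap per-language scaling vectors) and the standard diagonal-Fisher form of elastic weight consolidation. The class and function names, the rank-1 factorization, and the `fisher`/`anchor` dictionaries are all illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn

class FactorizedLinear(nn.Module):
    """A linear layer whose weight is a shared matrix modulated by small
    per-language scaling vectors, so adding a language adds only two
    vectors instead of a full weight matrix (an assumed factorization)."""
    def __init__(self, d_in: int, d_out: int, n_langs: int):
        super().__init__()
        self.shared = nn.Parameter(torch.randn(d_out, d_in) * 0.02)
        # One pair of factor vectors per language.
        self.row = nn.Parameter(torch.ones(n_langs, d_out))
        self.col = nn.Parameter(torch.ones(n_langs, d_in))

    def forward(self, x: torch.Tensor, lang: int) -> torch.Tensor:
        # Effective weight for this language: a rank-1 modulation
        # of the shared matrix.
        w = self.row[lang].unsqueeze(1) * self.shared * self.col[lang].unsqueeze(0)
        return x @ w.t()

def ewc_penalty(model: nn.Module, fisher: dict, anchor: dict,
                lam: float) -> torch.Tensor:
    """Elastic weight consolidation: a quadratic penalty that keeps
    parameters important to earlier languages (per a diagonal Fisher
    estimate) close to their values after the previous session."""
    loss = torch.zeros(())
    for name, p in model.named_parameters():
        if name in fisher:  # only parameters seen in earlier sessions
            loss = loss + (fisher[name] * (p - anchor[name]) ** 2).sum()
    return 0.5 * lam * loss
```

Under these assumptions, adding a language would mean appending one new row to `self.row` and `self.col`, training mostly those cheap factors, and adding `ewc_penalty(...)` to the task loss so the shared weights stay close to what the earlier languages require.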