Paper Title

Speeding Up Question Answering Task of Language Models via Inverted Index

Paper Authors

Xiang Ji, Yesim Sungu-Eryilmaz, Elaheh Momeni, Reza Rawassizadeh

Paper Abstract

Natural language processing applications, such as conversational agents and their question-answering capabilities, are widely used in the real world. Despite the wide popularity of large language models (LLMs), few real-world conversational agents take advantage of LLMs. The extensive resources consumed by LLMs prevent developers from integrating them into end-user applications. In this study, we leverage an inverted-index mechanism combined with LLMs to improve the efficiency of question-answering models for closed-domain questions. Our experiments show that using the index improves the average response time by 97.44%. In addition, due to the reduced search scope, the average BLEU score improved by 0.23 when using the inverted index.
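
The abstract describes pairing an inverted index with an LLM-based question-answering model so that answers are searched only within documents that match the question, rather than the whole corpus. The following minimal Python sketch illustrates the general idea of such index-based candidate retrieval; the whitespace tokenizer, the set-union lookup, and the function names are illustrative assumptions for this sketch, not the authors' implementation.

```python
from collections import defaultdict


def tokenize(text):
    # Naive lowercase/whitespace tokenizer; the paper's preprocessing may differ.
    return text.lower().split()


def build_inverted_index(documents):
    """Map each term to the set of document ids that contain it."""
    index = defaultdict(set)
    for doc_id, text in enumerate(documents):
        for term in tokenize(text):
            index[term].add(doc_id)
    return index


def candidate_documents(question, index, documents):
    """Return documents sharing at least one term with the question.

    Restricting the QA model to these candidates is what shrinks the
    search scope compared with scanning every document.
    """
    doc_ids = set()
    for term in tokenize(question):
        doc_ids |= index.get(term, set())
    return [documents[i] for i in sorted(doc_ids)]


if __name__ == "__main__":
    docs = [
        "The inverted index maps terms to documents.",
        "Large language models answer open-domain questions.",
        "Closed-domain question answering uses a fixed corpus.",
    ]
    index = build_inverted_index(docs)
    # Only the matching candidates, not the whole corpus, would be
    # handed to the question-answering model.
    print(candidate_documents("What is an inverted index?", index, docs))
```

In a pipeline like the one the abstract outlines, only the retrieved candidates would be passed to the question-answering model, which is the source of the reported reduction in response time.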
