Paper Title

Investigating Fairness Disparities in Peer Review: A Language Model Enhanced Approach

Paper Authors

Jiayao Zhang, Hongming Zhang, Zhun Deng, Dan Roth

Abstract

The double-blind peer review mechanism has become the backbone of academic research across multiple disciplines, including computer science, yet several studies have questioned the quality of peer reviews and raised concerns about potential biases in the process. In this paper, we conduct a thorough and rigorous study of fairness disparities in peer review with the help of large language models (LMs). We collect, assemble, and maintain a comprehensive relational database for the International Conference on Learning Representations (ICLR) from 2017 to date by aggregating data from OpenReview, Google Scholar, arXiv, and CSRanking, and by extracting high-level features using language models. We postulate and study fairness disparities on multiple protected attributes of interest, including author gender, geography, and author and institutional prestige. We observe that the level of disparity differs across attributes and that textual features are essential for reducing bias in predictive modeling. We distill several insights from our analysis on studying the peer review process with the help of large LMs. Our database also provides avenues for studying new natural language processing (NLP) methods that facilitate the understanding of the peer review mechanism. We study a concrete example toward automatic machine review systems and provide baseline models for the review generation and scoring tasks so that the database can be used as a benchmark.
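To make the review-scoring baseline mentioned above concrete, here is a minimal sketch, assuming a standard setup rather than the authors' actual model or data schema: a pretrained encoder (bert-base-uncased is an assumed choice) is fine-tuned with a single-output regression head to predict a numeric review score from review text. The toy texts and scores are placeholders; in practice they would come from the ICLR database described in the abstract.

```python
# Minimal sketch (not the authors' implementation): fine-tune a pretrained LM
# with a regression head to predict a numeric review score from review text.
# The model name, toy examples, and hyperparameters below are assumptions.
import torch
from torch.utils.data import Dataset, DataLoader
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_NAME = "bert-base-uncased"  # assumed backbone; any encoder LM would do


class ReviewScoreDataset(Dataset):
    """Pairs of (review text, numeric score), e.g. ICLR ratings on a 1-10 scale."""

    def __init__(self, texts, scores, tokenizer, max_len=512):
        self.enc = tokenizer(texts, truncation=True, padding=True,
                             max_length=max_len, return_tensors="pt")
        self.scores = torch.tensor(scores, dtype=torch.float)

    def __len__(self):
        return len(self.scores)

    def __getitem__(self, i):
        item = {k: v[i] for k, v in self.enc.items()}
        item["labels"] = self.scores[i]
        return item


def train_baseline(texts, scores, epochs=1, lr=2e-5):
    tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
    # num_labels=1 makes the sequence-classification head act as a regressor
    # (the model computes an MSE loss against the float labels).
    model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=1)
    loader = DataLoader(ReviewScoreDataset(texts, scores, tokenizer),
                        batch_size=8, shuffle=True)
    optim = torch.optim.AdamW(model.parameters(), lr=lr)
    model.train()
    for _ in range(epochs):
        for batch in loader:
            optim.zero_grad()
            out = model(**batch)  # loss is returned alongside the predicted scores
            out.loss.backward()
            optim.step()
    return model, tokenizer


if __name__ == "__main__":
    # Toy placeholder data; real texts/scores would come from the ICLR database.
    texts = ["The paper is well written and the experiments are convincing.",
             "The contribution is incremental and the evaluation is limited."]
    scores = [8.0, 4.0]
    train_baseline(texts, scores)
```

Using a single-output regression head rather than a classifier matches the scoring (rather than accept/reject classification) framing of the benchmark task described in the abstract.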
