Paper Title

YouTube, The Great Radicalizer? Auditing and Mitigating Ideological Biases in YouTube Recommendations

Paper Authors

Muhammad Haroon, Anshuman Chhabra, Xin Liu, Prasant Mohapatra, Zubair Shafiq, Magdalena Wojcieszak

Paper Abstract

Recommendation algorithms of social media platforms are often criticized for placing users in "rabbit holes" of (increasingly) ideologically biased content. Despite these concerns, prior evidence on this algorithmic radicalization is inconsistent. Furthermore, prior work lacks systematic interventions that reduce the potential ideological bias in recommendation algorithms. We conduct a systematic audit of YouTube's recommendation system using a hundred thousand sock puppets to determine the presence of ideological bias (i.e., are recommendations aligned with users' ideology), its magnitude (i.e., are users recommended an increasing number of videos aligned with their ideology), and radicalization (i.e., are the recommendations progressively more extreme). Furthermore, we design and evaluate a bottom-up intervention to minimize ideological bias in recommendations without relying on cooperation from YouTube. We find that YouTube's recommendations do direct users -- especially right-leaning users -- to ideologically biased and increasingly radical content, both on homepages and in up-next recommendations. Our intervention effectively mitigates the observed bias, leading to more recommendations to ideologically neutral, diverse, and dissimilar content, yet debiasing is especially challenging for right-leaning users. Our systematic assessment shows that while YouTube recommendations lead to ideological bias, such bias can be mitigated through our intervention.
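To make the audit's three measurements concrete, below is a minimal sketch of how the presence and escalation of ideological bias could be quantified from a sock puppet's recommendation trace. It assumes each recommended video has been assigned a slant score in [-1, 1] (negative = left-leaning, positive = right-leaning, near 0 = neutral); the abstract does not specify the paper's actual scoring pipeline, and the function names and toy data here are illustrative, not the authors' implementation.

```python
from statistics import mean

# Assumed convention (not from the paper): slant scores lie in [-1.0, 1.0],
# where negative values are left-leaning, positive values are right-leaning,
# and values near 0 are ideologically neutral.

def bias_toward_user(user_slant: float, recommended_slants: list[float]) -> float:
    """Mean alignment between a sock puppet's ideology and its recommendations.

    A positive result means recommendations lean the same way as the user
    (the 'presence of ideological bias' measurement); larger values suggest
    a larger share of ideologically congenial videos (its 'magnitude')."""
    sign = 1.0 if user_slant >= 0 else -1.0
    return mean(sign * s for s in recommended_slants)

def radicalization_trend(slants_per_step: list[list[float]]) -> list[float]:
    """Mean absolute slant of the recommendations shown at each watch step.

    A rising sequence indicates that recommended content grows progressively
    more extreme as the puppet keeps watching (the 'radicalization' check)."""
    return [mean(abs(s) for s in step) for step in slants_per_step]

if __name__ == "__main__":
    # Toy trace for one right-leaning puppet (user_slant = +0.6): the lists
    # are the slant scores of recommendations after each of three videos.
    steps = [[0.1, 0.3, -0.2], [0.4, 0.5, 0.2], [0.6, 0.7, 0.5]]
    all_slants = [s for step in steps for s in step]
    print(bias_toward_user(0.6, all_slants))   # ~0.34: recommendations lean right
    print(radicalization_trend(steps))          # rising: [0.2, 0.37, 0.6]
```

Under this framing, the paper's bottom-up intervention can be read as acting on the puppet's side of the loop (e.g., injecting watches of neutral or counter-attitudinal videos) and then re-running the same measurements to test whether the bias and radicalization curves flatten.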
