Paper Title
Feminist Perspective on Robot Learning Processes
Paper Authors
Paper Abstract
As different research works report and daily life experiences confirm, learning models can result in biased outcomes. Biased learned models usually replicate historical discrimination in society and typically affect less-represented identities negatively. Robots are equipped with these models, which allow them to operate and perform increasingly complex tasks every day. The learning process consists of different stages that depend on human judgment. Moreover, the resulting models for robot decision-making rely on recorded labeled data or demonstrations. Therefore, the robot learning process is susceptible to biases linked to human behavior in society. This poses a potential danger, especially when robots operate around humans and the learning process can reflect the social unfairness present today. Different feminist proposals study social inequality and provide essential perspectives toward removing bias in various fields. Furthermore, feminism has allowed, and still allows, the reconfiguration of numerous social dynamics and stereotypes, advocating for equality among people through their diversity. Consequently, in this work we provide a feminist perspective on the robot learning process. We base our discussion on intersectional feminism, community feminism, decolonial feminism, and pedagogical perspectives, and we frame our work within a feminist robotics approach. In this paper, we present an initial discussion to emphasize the relevance of feminist perspectives for exploring, foreseeing, and eventually correcting biased robot decisions.