Title

Marginal loss and exclusion loss for partially supervised multi-organ segmentation

Authors

Shi, Gonglei, Xiao, Li, Chen, Yang, Zhou, S. Kevin

Abstract

Annotating multiple organs in medical images is both costly and time-consuming; therefore, existing labeled multi-organ datasets are often small in sample size and mostly partially labeled, that is, a dataset has a few organs labeled but not all organs. In this paper, we investigate how to learn a single multi-organ segmentation network from a union of such datasets. To this end, we propose two types of novel loss functions, particularly designed for this scenario: (i) marginal loss and (ii) exclusion loss. Because the background label for a partially labeled image is, in fact, a `merged' label of all unlabeled organs and the `true' background (in the sense of full labels), the probability of this `merged' background label is a marginal probability, summing the relevant probabilities before merging. This marginal probability can be plugged into any existing loss function (such as cross-entropy loss, Dice loss, etc.) to form a marginal loss. Leveraging the fact that the organs are non-overlapping, we propose the exclusion loss to gauge the dissimilarity between labeled organs and the estimated segmentation of unlabeled organs. Experiments on a union of five benchmark datasets in multi-organ segmentation of liver, spleen, left and right kidneys, and pancreas demonstrate that using our newly proposed loss functions brings a conspicuous performance improvement for state-of-the-art methods without introducing any extra computation.
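The two losses described above can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; the function names, label layout (class 0 as the true background, label 0 in a partially labeled image as the merged background), and the simple overlap form of the exclusion term are assumptions for illustration only. The key idea shown is that the merged background's probability is the *sum* of the true-background and unlabeled-organ probabilities, taken before the logarithm:

```python
import numpy as np

def marginal_cross_entropy(probs, target, labeled_classes, eps=1e-8):
    """Marginal cross-entropy for one partially labeled image (sketch).

    probs: (N, C) per-pixel class probabilities; class 0 is the true background.
    target: (N,) labels; 0 denotes the merged background (true background
            plus every organ left unlabeled in this dataset).
    labeled_classes: indices, in the full label space, annotated in this dataset.
    """
    n, c = probs.shape
    unlabeled = [k for k in range(1, c) if k not in labeled_classes]
    # Marginal probability of the merged background: sum the probabilities of
    # the true background and all unlabeled organs BEFORE taking the log.
    merged_bg = probs[:, 0] + (probs[:, unlabeled].sum(axis=1) if unlabeled else 0.0)
    picked = np.where(target == 0, merged_bg, probs[np.arange(n), target])
    return float(-np.log(picked + eps).mean())

def exclusion_loss(probs, gt_mask, unlabeled_classes, eps=1e-8):
    """Exclusion loss (sketch): organs are non-overlapping, so probability mass
    assigned to unlabeled organs inside a labeled organ's ground-truth mask
    is penalized.

    gt_mask: (N,) binary ground-truth mask of one labeled organ.
    """
    overlap = (probs[:, unlabeled_classes].sum(axis=1) * gt_mask).sum()
    return float(overlap / (gt_mask.sum() + eps))
```

For instance, with classes {background, liver, spleen} and only the liver labeled, probability predicted for the spleen class contributes in full to the merged background under the marginal loss, while any spleen probability falling inside the liver's ground-truth mask is penalized by the exclusion loss. As in the paper, the same marginalization can equally be plugged into a Dice loss.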
