Paper Title

Interpreting Uncertainty in Model Predictions For COVID-19 Diagnosis

Paper Authors

Gayathiri Murugamoorthy, Naimul Khan

Paper Abstract

COVID-19, due to its accelerated spread, has brought about the need for assistive tools for faster diagnosis in addition to typical lab swab testing. Chest X-rays of COVID-19 cases tend to show changes in the lungs, such as ground glass opacities and peripheral consolidations, which can be detected by deep neural networks. However, traditional convolutional networks use point estimates for predictions and do not capture uncertainty, which makes them less reliable for adoption. There have been several works so far on predicting COVID-positive cases from chest X-rays. However, not much has been explored on quantifying the uncertainty of these predictions, interpreting that uncertainty, and decomposing it into model or data uncertainty. To address these needs, we develop a visualization framework for the interpretability of uncertainty and its components, with uncertainty in predictions computed with a Bayesian Convolutional Neural Network. The framework aims to explain the contribution of individual features in chest X-ray images to predictive uncertainty. Provided as an assistive tool, it can help the radiologist understand why the model came up with a prediction and whether the regions of interest captured by the model for that prediction are of significance in diagnosis. We demonstrate the usefulness of the tool in chest X-ray interpretation through several test cases from a benchmark dataset.
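
The abstract does not spell out how the predictive uncertainty is computed or split into model and data components, so the sketch below is only an illustration of one common approach for Bayesian CNNs: Monte Carlo dropout, where the predictive entropy from repeated stochastic forward passes is decomposed into an aleatoric (data) term and an epistemic (model) term. The function name decompose_uncertainty, the use of MC dropout, and the toy inputs are assumptions for illustration, not the paper's exact method.

```python
import numpy as np

def decompose_uncertainty(mc_probs, eps=1e-12):
    """Split predictive uncertainty into data (aleatoric) and model (epistemic) parts.

    mc_probs: array of shape (T, C) with class probabilities from T stochastic
    forward passes of a Bayesian CNN (e.g. dropout left active at test time).
    """
    mean_probs = mc_probs.mean(axis=0)                      # averaged predictive distribution
    total = -np.sum(mean_probs * np.log(mean_probs + eps))  # predictive entropy H[E_w[p]]
    aleatoric = -np.mean(
        np.sum(mc_probs * np.log(mc_probs + eps), axis=1)
    )                                                       # expected entropy E_w[H[p]]
    epistemic = total - aleatoric                           # mutual information (model uncertainty)
    return total, aleatoric, epistemic

# Toy usage: 30 stochastic passes over a binary COVID / non-COVID output.
# In practice mc_probs would come from repeated model(x) calls with dropout enabled.
mc_probs = np.random.dirichlet([6.0, 2.0], size=30)
total, aleatoric, epistemic = decompose_uncertainty(mc_probs)
print(f"total={total:.3f}  aleatoric={aleatoric:.3f}  epistemic={epistemic:.3f}")
```

The per-feature contributions to these quantities, which the framework visualizes over the chest X-ray, could then be attributed, for example, by occluding image regions and re-measuring the uncertainty, though the paper's actual attribution method may differ.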
