Paper Title
Analytical Composition of Differential Privacy via the Edgeworth Accountant
Paper Authors
Paper Abstract
Many modern machine learning algorithms are composed of simple private algorithms; thus, an increasingly important problem is to efficiently compute the overall privacy loss under composition. In this study, we introduce the Edgeworth Accountant, an analytical approach to composing differential privacy guarantees of private algorithms. The Edgeworth Accountant starts by losslessly tracking the privacy loss under composition using the $f$-differential privacy framework, which allows us to express the privacy guarantees using privacy-loss log-likelihood ratios (PLLRs). As the name suggests, this accountant next uses the Edgeworth expansion to upper- and lower-bound the probability distribution of the sum of the PLLRs. Moreover, by relying on a technique for approximating complex distributions using simple ones, we demonstrate that the Edgeworth Accountant can be applied to the composition of any noise-addition mechanism. Owing to certain appealing features of the Edgeworth expansion, the $(ε, δ)$-differential privacy bounds offered by this accountant are non-asymptotic, with essentially no extra computational cost, as opposed to prior approaches, whose running times increase with the number of compositions. Finally, we demonstrate that our upper and lower $(ε, δ)$-differential privacy bounds are tight in federated analytics and in certain regimes of training private deep learning models.
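To make the abstract's pipeline concrete, the following minimal sketch (not the paper's implementation) illustrates the two ingredients it describes: a first-order Edgeworth expansion for the CDF of a sum of $n$ i.i.d. PLLRs given their per-step cumulants, and the standard conversion from the two PLLR tail probabilities to a $\delta(\varepsilon)$ value. The pairing of the two hypotheses via a sign-flipped mean (as for the Gaussian mechanism, where the PLLR is $\mathcal{N}(\mu^2/2, \mu^2)$ under one hypothesis and $\mathcal{N}(-\mu^2/2, \mu^2)$ under the other) is an illustrative assumption, and the function names are hypothetical.

```python
import math

def _phi(z):
    """Standard normal density."""
    return math.exp(-z * z / 2) / math.sqrt(2 * math.pi)

def _Phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def edgeworth_cdf(x, n, mean, var, third_cumulant):
    """First-order Edgeworth approximation to the CDF of the sum of n
    i.i.d. random variables with the given per-step cumulants.
    With third_cumulant == 0 this reduces to the CLT (normal) approximation."""
    z = (x - n * mean) / math.sqrt(n * var)
    skew = third_cumulant / var ** 1.5  # standardized per-step skewness
    correction = skew / (6 * math.sqrt(n)) * (z * z - 1) * _phi(z)
    return _Phi(z) - correction

def approx_delta(eps, n, mean, var, third_cumulant):
    """delta(eps) from the usual privacy-loss conversion
        delta = P[X > eps] - e^eps * Q[X > eps],
    where X is the PLLR sum under the two hypotheses. The sign-flipped
    cumulants for the second hypothesis mirror the Gaussian mechanism and
    are an illustrative assumption, not the paper's general construction."""
    tail_p = 1 - edgeworth_cdf(eps, n, mean, var, third_cumulant)
    tail_q = 1 - edgeworth_cdf(eps, n, -mean, var, -third_cumulant)
    return tail_p - math.exp(eps) * tail_q
```

For a single Gaussian mechanism with noise multiplier 1 (so `mean = 0.5`, `var = 1.0`, zero third cumulant), `approx_delta(1.0, 1, 0.5, 1.0, 0.0)` recovers the classical Gaussian DP value $\Phi(\mu/2 - \varepsilon/\mu) - e^{\varepsilon}\Phi(-\mu/2 - \varepsilon/\mu) \approx 0.1269$; the Edgeworth correction term matters when the per-step PLLRs are skewed, as for non-Gaussian noise.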