Paper Title

Metric-valued regression

Paper Authors

Dan Tsir Cohen, Aryeh Kontorovich

Paper Abstract

We propose an efficient algorithm for learning mappings between two metric spaces, $\X$ and $\Y$. Our procedure is strongly Bayes-consistent whenever $\X$ and $\Y$ are topologically separable and $\Y$ is "bounded in expectation" (our term; the separability assumption can be somewhat weakened). At this level of generality, ours is the first such learnability result for unbounded loss in the agnostic setting. Our technique is based on metric medoids (a variant of Fréchet means) and presents a significant departure from existing methods, which, as we demonstrate, fail to achieve Bayes-consistency on general instance- and label-space metrics. Our proofs introduce the technique of {\em semi-stable compression}, which may be of independent interest.
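As a point of reference for the terms above (a sketch of the standard notions; the paper's precise definition of a metric medoid may differ, and conventions vary on whether distances or squared distances are averaged), the Fréchet mean of labels $y_1,\dots,y_n$ in a metric space $(\Y,\rho)$ minimizes the average squared distance over the whole space, whereas a medoid-type estimator restricts the search to the sample itself:
\[
\hat{y}_{\mathrm{med}} \in \operatorname*{arg\,min}_{y \in \{y_1,\dots,y_n\}} \frac{1}{n}\sum_{i=1}^{n} \rho(y, y_i),
\qquad
\hat{y}_{\mathrm{Fr}} \in \operatorname*{arg\,min}_{y \in \Y} \frac{1}{n}\sum_{i=1}^{n} \rho^2(y, y_i).
\]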
