Paper Title

Gaussian Kernel Variance For an Adaptive Learning Method on Signals Over Graphs

Authors

Yue Zhao and Ender Ayanoglu

Abstract

This paper discusses a simple yet potentially powerful algorithm called the single-kernel Gradraker (SKG), an adaptive learning method that predicts unknown nodal values in a network from known nodal values and the network structure. We aim to determine how to configure this kind of model when applying the algorithm. More specifically, we focus on SKG with a Gaussian kernel and show how to find a suitable variance for the kernel. To do so, we introduce two variables with which we can set up requirements on the variance of the Gaussian kernel to achieve (near-)optimal performance, and with which we can better understand how SKG works. Our contributions are to introduce these two variables as analysis tools, to illustrate how predictions are affected under different Gaussian kernels, and to provide an algorithm that finds a suitable Gaussian kernel for SKG given knowledge of the training network. Simulation results on real datasets are provided.
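To give intuition for why the Gaussian kernel variance matters, the sketch below shows a generic kernel-weighted (Nadaraya-Watson-style) prediction of an unknown nodal value from known neighbor values. This is only an illustrative toy, not the authors' SKG algorithm; the function names and 1-D features are invented for the example. It shows the effect the abstract alludes to: a small variance concentrates weight on the nearest neighbors, while a very large variance washes the weights out toward a plain average.

```python
import numpy as np

def gaussian_kernel(x, y, sigma2):
    """Gaussian kernel k(x, y) = exp(-||x - y||^2 / (2 * sigma2))."""
    return np.exp(-np.linalg.norm(np.asarray(x) - np.asarray(y)) ** 2 / (2.0 * sigma2))

def predict_nodal_value(feature, neighbor_features, neighbor_values, sigma2):
    """Kernel-weighted average of known neighbor values (illustrative only,
    not the SKG update itself)."""
    weights = np.array([gaussian_kernel(feature, f, sigma2) for f in neighbor_features])
    return float(weights @ np.asarray(neighbor_values) / weights.sum())

# Hypothetical known neighbors: 1-D features and their observed nodal values.
neighbor_features = [0.9, 1.1, 5.0]
neighbor_values = [10.0, 12.0, 40.0]

# Target node with feature 1.0, close to the first two neighbors.
close = predict_nodal_value(1.0, neighbor_features, neighbor_values, sigma2=0.1)
flat = predict_nodal_value(1.0, neighbor_features, neighbor_values, sigma2=1e6)

print(round(close, 2))  # small variance: dominated by the two nearby neighbors -> 11.0
print(round(flat, 2))   # huge variance: near the plain mean (10 + 12 + 40) / 3 -> 20.67
```

With `sigma2=0.1` the distant neighbor's weight is effectively zero, so the prediction is the average of the two close neighbors; with `sigma2=1e6` all weights are essentially equal. Picking a variance between these extremes is exactly the tuning problem the paper addresses.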
