Paper Title

Fast and Structured Block-Term Tensor Decomposition For Hyperspectral Unmixing

Authors

Meng Ding, Xiao Fu, Xi-Le Zhao

Abstract

The block-term tensor decomposition model with multilinear rank-$(L_r,L_r,1)$ terms (or "LL1 tensor decomposition" for short) offers a valuable alternative for hyperspectral unmixing (HU) under the linear mixture model. Particularly, the LL1 decomposition ensures the endmember/abundance identifiability in scenarios where such guarantees are not supported by the classic matrix factorization (MF) approaches. However, existing LL1-based HU algorithms use a three-factor parameterization of the tensor (i.e., the hyperspectral image cube), which leads to a number of challenges including high per-iteration complexity, slow convergence, and difficulties in incorporating structural prior information. This work puts forth an LL1 tensor decomposition-based HU algorithm that uses a constrained two-factor re-parameterization of the tensor data. As a consequence, a two-block alternating gradient projection (GP)-based LL1 algorithm is proposed for HU. With carefully designed projection solvers, the GP algorithm enjoys a relatively low per-iteration complexity. As in MF-based HU, the factors under our parameterization correspond to the endmembers and abundances. Thus, it is natural for the proposed framework to incorporate physics-motivated priors that arise in HU. The proposed algorithm often attains orders-of-magnitude speedup and substantial HU performance gains compared to the existing three-factor parameterization-based HU algorithms.
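To make the model concrete, the following is a minimal sketch of the LL1 (block-term) model referred to in the abstract, written in generic notation that is not quoted from the paper: the hyperspectral image cube $\underline{\mathbf{Y}} \in \mathbb{R}^{I \times J \times K}$ (with $I \times J$ pixels and $K$ spectral bands) is modeled as

$$\underline{\mathbf{Y}} \;\approx\; \sum_{r=1}^{R} (\mathbf{A}_r \mathbf{B}_r^{\top}) \circ \mathbf{c}_r \;=\; \sum_{r=1}^{R} \mathbf{S}_r \circ \mathbf{c}_r, \qquad \operatorname{rank}(\mathbf{S}_r) \le L_r,$$

where $\circ$ denotes the outer product, $\mathbf{S}_r \in \mathbb{R}^{I \times J}$ is the spatial abundance map of the $r$-th material, and $\mathbf{c}_r \in \mathbb{R}^{K}$ is its endmember spectrum. Under this notation, the two-factor re-parameterization described above amounts to treating $\{\mathbf{S}_r\}_{r=1}^{R}$ and $\mathbf{C} = [\mathbf{c}_1, \ldots, \mathbf{c}_R]$ as the two optimization blocks (instead of the three factor sets $\{\mathbf{A}_r\}$, $\{\mathbf{B}_r\}$, $\mathbf{C}$), with the rank-$L_r$ and physically motivated constraints (e.g., nonnegativity) enforced by the projection steps of the alternating gradient projection loop; the exact constraint sets and projection solvers are specified in the paper itself.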
