[1] ZHOU G X,CICHOCKI A,XIE S L.Fast nonnegative matrix/tensor factorization based on low-rank approximation[J].IEEE Transactions on Signal Processing,2012,60(6):2928-2940.
[2] CICHOCKI A,MANDIC D,DE LATHAUWER L,et al.Tensor decompositions for signal processing applications:from two-way to multiway component analysis[J].IEEE Signal Processing Magazine,2015,32(2):145-163.
[3] QIU Y C,ZHOU G X,ZHANG Y,et al.Canonical Polyadic Decomposition (CPD) of big tensors with low multilinear rank[J].Multimedia Tools and Applications,2021,80(15):22987-23007.
[4] QIU Y C,SUN W J,ZHANG Y,et al.Approximately orthogonal nonnegative Tucker decomposition for flexible multiway clustering[J].Science China Technological Sciences,2021,64(9):1872-1880.
[5] KOLDA T G,BADER B W.Tensor decompositions and applications[J].SIAM Review,2009,51(3):455-500.
[6] LIU Y,LONG Z,HUANG H,et al.Low CP rank and Tucker rank tensor completion for estimating missing components in image data[J].IEEE Transactions on Circuits and Systems for Video Technology,2020,30:944-954.
[7] ZHAO Q B,ZHANG L Q,CICHOCKI A.Bayesian CP factorization of incomplete tensors with automatic rank determination[J].IEEE Transactions on Pattern Analysis and Machine Intelligence,2015,37(9):1751-1763.
[8] QIU Y N,ZHOU G X,CHEN X Q,et al.Semi-supervised non-negative Tucker decomposition for tensor data representation[J].Science China Technological Sciences,2021,64(9):1881-1892.
[9] PAN J J,NG M K,LIU Y,et al.Orthogonal nonnegative Tucker decomposition[J].SIAM Journal on Scientific Computing,2021,43(1):55-81.
[10] OSELEDETS I V.Tensor-train decomposition[J].SIAM Journal on Scientific Computing,2011,33(5):2295-2317.
[11] BENGUA J A,PHIEN H N,TUAN H D,et al.Efficient tensor completion for color image and video recovery:low-rank tensor train[J].IEEE Transactions on Image Processing,2017,26(5):2466-2479.
[12] YANG Y,KROMPASS D,TRESP V.Tensor-train recurrent neural networks for video classification[EB/OL].[2022-04-11].https://arxiv.org/abs/1707.01786.
[13] LEE N,PHAN A H,CONG F,et al.Nonnegative tensor train decompositions for multi-domain feature extraction and clustering[C]//Proceedings of International Conference on Neural Information Processing.Berlin,Germany:Springer,2016:1-9.
[14] SHCHERBAKOVA E,TYRTYSHNIKOV E.Nonnegative tensor train factorizations and some applications[M].Berlin,Germany:Springer,2020.
[15] YUAN L H,ZHAO Q B,GUI L H,et al.High-order tensor completion via gradient-based optimization under Tensor Train format[J].Image Communication,2019,73(C):53-61.
[16] LI S Z,HOU X W,ZHANG H J,et al.Learning spatially localized,parts-based representation[C]//Proceedings of IEEE Computer Society Conference on Computer Vision and Pattern Recognition.Washington D.C.,USA:IEEE Press,2003:1-10.
[17] CAI D,HE X F,WU X Y,et al.Non-negative matrix factorization on manifold[C]//Proceedings of the 8th IEEE International Conference on Data Mining.Washington D.C.,USA:IEEE Press,2009:63-72.
[18] LENG C C,ZHANG H,CAI G R,et al.Graph regularized LP smooth non-negative matrix factorization for data representation[J].IEEE/CAA Journal of Automatica Sinica,2019,6(2):584-595.
[19] ROWEIS S T,SAUL L K.Nonlinear dimensionality reduction by locally linear embedding[J].Science,2000,290(5500):2323-2326.
[20] TENENBAUM J B,DE SILVA V,LANGFORD J C.A global geometric framework for nonlinear dimensionality reduction[J].Science,2000,290(5500):2319-2323.
[21] BELKIN M,NIYOGI P.Laplacian eigenmaps for dimensionality reduction and data representation[J].Neural Computation,2003,15(6):1373-1396.
[22] CAI D,HE X F,HAN J W,et al.Graph regularized nonnegative matrix factorization for data representation[J].IEEE Transactions on Pattern Analysis and Machine Intelligence,2011,33(8):1548-1560.
[23] LI X L,CUI G S,DONG Y S.Graph regularized non-negative low-rank matrix factorization for image clustering[J].IEEE Transactions on Cybernetics,2017,47(11):3840-3853.
[24] QIU Y N,ZHOU G X,ZHANG Y,et al.Graph regularized nonnegative Tucker decomposition for tensor data representation[C]//Proceedings of 2019 IEEE International Conference on Acoustics,Speech and Signal Processing.Washington D.C.,USA:IEEE Press,2019:8613-8617.
[25] SOFUOGLU S E,AVIYENTE S.Graph regularized tensor train decomposition[C]//Proceedings of 2020 IEEE International Conference on Acoustics,Speech and Signal Processing.Washington D.C.,USA:IEEE Press,2020:3912-3916.
[26] 吴泽鑫.图约束非负张量列分解算法及其在特征提取中的应用[D].广州:广东工业大学,2021.
WU Z X.Graph constrained nonnegative tensor train decomposition algorithm and its application in feature extraction[D].Guangzhou:Guangdong University of Technology,2021.(in Chinese)
[27] GAO Y,ZHANG Z Z,LIN H J,et al.Hypergraph learning:methods and practices[J].IEEE Transactions on Pattern Analysis and Machine Intelligence,2022,44(5):2548-2566.
[28] SHENG H,AHMED E,DAN Y.On the effect of hyperedge weights on hypergraph learning[J].Image and Vision Computing,2017,57:89-101.
[29] ZHOU D Y,HUANG J Y,SCHÖLKOPF B.Learning with hypergraphs:clustering,classification,and embedding[C]//Advances in Neural Information Processing Systems 19.Cambridge,USA:MIT Press,2007:1601-1608.
[30] ZENG K,YU J,LI C H,et al.Image clustering by hyper-graph regularized non-negative matrix factorization[J].Neurocomputing,2014,138:209-217.
[31] 陈璐瑶,刘奇龙,许云霞,等.基于超图正则化非负Tucker分解的图像聚类算法[J].计算机工程,2022,48(4):197-205.
CHEN L Y,LIU Q L,XU Y X,et al.Image clustering algorithm based on hypergraph regularized nonnegative Tucker decomposition[J].Computer Engineering,2022,48(4):197-205.(in Chinese)
[32] XU W,LIU X,GONG Y H.Document clustering based on non-negative matrix factorization[C]//Proceedings of the 26th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval.New York,USA:ACM Press,2003:267-273.
[33] MACQUEEN J B.Some methods for classification and analysis of multivariate observations[EB/OL].[2022-04-11].https://www.semanticscholar.org/paper/Some-methods-for-classification-and-analysis-of-MacQueen/ac8ab51a86f1a9ae74dd0e4576d1a019f5e654ed?p2df.