[1] ZHU F, CHEN X C, CHEN S, et al. Relative margin induced support vector ordinal regression[J]. Expert Systems with Applications, 2023, 231: 120766.
[2] CHU W, KEERTHI S S. Support vector ordinal regression[J]. Neural Computation, 2007, 19(3): 792-815.
[3] BELLMANN P, SCHWENKER F. Ordinal classification: working definition and detection of ordinal structures[J]. IEEE Access, 2020, 8: 164380-164391.
[4] 王雅辉, 钱宇华, 刘郭庆. 基于模糊优势互补互信息的有序决策树算法[J]. 计算机应用, 2021, 41(10): 2785.
WANG Y H, QIAN Y H, LIU G Q. Ordered decision tree algorithm based on fuzzy dominance complementary mutual information[J]. Journal of Computer Applications, 2021, 41(10): 2785. (in Chinese)
[5] ZHU H P, SHAN H M, ZHANG Y H, et al. Convolutional ordinal regression forest for image ordinal estimation[J]. IEEE Transactions on Neural Networks and Learning Systems, 2021, 33(8): 4084-4095.
[6] WANG J H, CHENG Y, CHEN J T, et al. Ord2Seq: regarding ordinal regression as label sequence prediction[C]//Proceedings of the IEEE/CVF International Conference on Computer Vision. Washington D. C., USA: IEEE Press, 2023: 5865-5875.
[7] CRAMMER K, SINGER Y. PRanking with ranking[J]. Advances in Neural Information Processing Systems, 2001, 14: 641-647.
[8] SHEN L B, JOSHI A K. Ranking and reranking with perceptron[J]. Machine Learning, 2005, 60: 73-96.
[9] VARGAS V M, GUTIÉRREZ P A, HERVÁS-MARTÍNEZ C. Cumulative link models for deep ordinal classification[J]. Neurocomputing, 2020, 401: 48-58.
[10] SHI W L, GU B, LI X, et al. Quadruply stochastic gradient method for large scale nonlinear semi-supervised ordinal regression AUC optimization[C]//Proceedings of the AAAI Conference on Artificial Intelligence. [S. l.]: AAAI Press, 2020: 5734-5741.
[11] TSUCHIYA T, CHAROENPHAKDEE N, SATO I, et al. Semisupervised ordinal regression based on empirical risk minimization[J]. Neural Computation, 2021, 33(12): 3361-3412.
[12] CHEN H Y, JIA Y Z, GE J M, et al. Incremental learning algorithm for large-scale semi-supervised ordinal regression[J]. Neural Networks, 2022, 149: 124-136.
[13] STEWART N, BROWN G D A, CHATER N. Absolute identification by relative judgment[J]. Psychological Review, 2005, 112(4): 881.
[14] SADER M, VERWAEREN J, PÉREZ-FERNÁNDEZ R, et al. Integrating expert and novice evaluations for augmenting ordinal regression models[J]. Information Fusion, 2019, 51: 1-9.
[15] TANG M Z, PÉREZ-FERNÁNDEZ R, DE BAETS B. A comparative study of machine learning methods for ordinal classification with absolute and relative information[J]. Knowledge-Based Systems, 2021, 230: 107358.
[16] LIU T Y. Learning to rank for information retrieval[J]. Foundations and Trends® in Information Retrieval, 2009, 3(3): 225-331.
[17] FOGEL F, D'ASPREMONT A, VOJNOVIC M. Spectral ranking using seriation[J]. Journal of Machine Learning Research, 2016, 17(88): 1-45.
[18] 张会影, 圣文顺. 基于标记适应的人脸年龄识别优化算法[J]. 计算机工程, 2025, 51(1): 174-181.
ZHANG H Y, SHENG W S. Improved algorithm for facial age recognition based on label adaptation[J]. Computer Engineering, 2025, 51(1): 174-181. (in Chinese)
[19] TANG M, PÉREZ-FERNÁNDEZ R, DE BAETS B. Combining absolute and relative information with frequency distributions for ordinal classification[C]//Proceedings of the 18th International Conference on Information Processing and Management of Uncertainty in Knowledge-Based Systems. Lisbon, Portugal: Springer International Publishing, 2020: 594-602.
[20] TANG M, PÉREZ-FERNÁNDEZ R, DE BAETS B. Ordinal classification with a spectrum of information sources[J]. Expert Systems with Applications, 2022, 208: 118163.
[21] TANG M, PÉREZ-FERNÁNDEZ R, DE BAETS B. Fusing absolute and relative information for augmenting the method of nearest neighbors for ordinal classification[J]. Information Fusion, 2020, 56: 128-140.
[22] TANG M, PÉREZ-FERNÁNDEZ R, DE BAETS B. Distance metric learning for augmenting the method of nearest neighbors for ordinal classification with absolute and relative information[J]. Information Fusion, 2021, 65: 72-83.
[23] ARLEGI R, DIMITROV D. Fair elimination-type competitions[J]. European Journal of Operational Research, 2020, 287(2): 528-535.
[24] BRADLEY R A, TERRY M E. Rank analysis of incomplete block designs: I. the method of paired comparisons[J]. Biometrika, 1952, 39(3/4): 324-345.
[25] LUCE R D. Individual choice behavior: a theoretical analysis[M]. [S. l.]: Courier Corporation, 2005.
[26] CARON F, DOUCET A. Efficient Bayesian inference for generalized Bradley–Terry models[J]. Journal of Computational and Graphical Statistics, 2012, 21(1): 174-196.
[27] KENETT D Y, PERC M, BOCCALETTI S. Networks of networks – an introduction[J]. Chaos, Solitons & Fractals, 2015, 80: 1-6.
[28] CUCURINGU M. Sync-rank: robust ranking, constrained ranking and rank aggregation via eigenvector and SDP synchronization[J]. IEEE Transactions on Network Science and Engineering, 2016, 3(1): 58-79.
[29] DE BACCO C, LARREMORE D B, MOORE C. A physical model for efficient ranking in networks[J]. Science Advances, 2018, 4(7): eaar8260.
[30] D'ASPREMONT A, CUCURINGU M, TYAGI H. Ranking and synchronization from pairwise measurements via SVD[J]. Journal of Machine Learning Research, 2021, 22(19): 1-63.
[31] RIGUTINI L, PAPINI T, MAGGINI M, et al. SortNet: learning to rank by a neural preference function[J]. IEEE Transactions on Neural Networks, 2011, 22(9): 1368-1380.
[32] KÖPPEL M, SEGNER A, WAGENER M, et al. Pairwise learning to rank by neural networks revisited: reconstruction, theoretical analysis and practical performance[C]//Proceedings of the European Conference on Machine Learning and Knowledge Discovery in Databases. Würzburg, Germany: Springer International Publishing, 2020: 237-252.
[33] BURGES C, SHAKED T, RENSHAW E, et al. Learning to rank using gradient descent[C]//Proceedings of the 22nd International Conference on Machine Learning. New York, USA: ACM Press, 2005: 89-96.
[34] MAURYA S K, LIU X, MURATA T. Graph neural networks for fast node ranking approximation[J]. ACM Transactions on Knowledge Discovery from Data, 2021, 15(5): 1-32.
[35] HE Y X, GAN Q, WIPF D, et al. GNNRank: learning global rankings from pairwise comparisons via directed graph neural networks[C]//Proceedings of the International Conference on Machine Learning. [S. l.]: PMLR, 2022: 8581-8612.
[36] TONG Z, LIANG Y, SUN C, et al. Digraph inception convolutional networks[J]. Advances in Neural Information Processing Systems, 2020, 33: 17907-17918.
[37] WERRIJ P, KAPTEIN R. Clustering ordinal survey data in a highly structured ranking[C]//Proceedings of the Dutch-Belgian Information Retrieval Workshop 2016. New York, USA: ACM, 2016: 1-5.
[38] ALDAHDOOH R T, ASHOUR W. DIMK-means "distance-based initialization method for k-means clustering algorithm"[J]. International Journal of Intelligent Systems and Applications, 2013, 5(2): 41.
[39] SHIN N H, LEE S H, KIM C S. Moving window regression: a novel approach to ordinal regression[C]//Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. Washington D. C., USA: IEEE Press, 2022: 18760-18769.
[40] SHI X T, CAO W Z, RASCHKA S. Deep neural networks for rank-consistent ordinal regression based on conditional probabilities[J]. Pattern Analysis and Applications, 2023, 26(3): 941-955.
[41] SHEN W, GUO Y L, WANG Y, et al. Deep regression forests for age estimation[C]//Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. Washington D. C., USA: IEEE Press, 2018: 2304-2313.
[42] HU Z J, YANG Z Y, HU X F, et al. SimPLE: similar pseudo label exploitation for semi-supervised classification[C]//Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. Washington D. C., USA: IEEE Press, 2021: 15099-15108.
[43] SOHN K, BERTHELOT D, CARLINI N, et al. FixMatch: simplifying semi-supervised learning with consistency and confidence[J]. Advances in Neural Information Processing Systems, 2020, 33: 596-608.