
15 October 2019, Volume 45 Issue 10
    

  • LI Jianpeng, SHI Guozhen, LI Li, SUN Deyang, ZHENG Gewei
    Computer Engineering. 2019, 45(10): 1-7. https://doi.org/10.19678/j.issn.1000-3428.0053556
    This paper proposes a two-level scheduling strategy to address the performance differences, various combinations of cipher service processing commands and cipher algorithms, and random high concurrency of computing nodes in a heterogeneous cloud environment. The strategy supports various cipher processing commands and algorithms. Considering the multiple attributes of the request tasks of users and computing nodes in the cloud, it optimizes the Quality of Service (QoS) and task scheduling success rate of the whole scheduling system from both the task viewpoint and the node viewpoint. Through the mapping of function attributes between tasks and computing nodes, cipher service requests can be served correctly. On this basis, a node priority algorithm is used to improve the real-time performance of task processing and the success rate of task scheduling in a cipher service system under random high concurrency. Simulation results show that the proposed strategy can guarantee the success rate of task scheduling and effectively improve execution efficiency and task load balancing. Compared with the Dynamic Priority Assignment (DPA) scheme and the Genetic Algorithm (GA), it shortens task execution time by around 4% and 17% respectively.
  • ZHU Guohui, KANG Xiaoxuan, LEI Lanjie
    Computer Engineering. 2019, 45(10): 8-12. https://doi.org/10.19678/j.issn.1000-3428.0053268
    Aiming at the problem that the fragmentation of physical resources leads to the rejection of embedding requests and reduces the utilization of physical resources during virtual network mapping, a Virtual Network Mapping (VNM) algorithm based on the optimal subnet is proposed. It coarsens the network topology using the Band Heavy Edge Matching (B-HEM) algorithm by merging virtual nodes that meet the constraints. A set of candidate physical subnets is created by the Breadth First Search (BFS) algorithm, and the coarsened virtual network request is mapped to the optimal subnet. Simulation results show that the proposed algorithm can reduce the hops of link mapping and improve the request acceptance ratio and the revenue/cost ratio of virtual networks.
  • JIANG Zetao, SHI Chen
    Computer Engineering. 2019, 45(10): 13-18. https://doi.org/10.19678/j.issn.1000-3428.0053469
    In the hybrid cloud environment, to meet the cross-domain authentication requirements of identity authentication schemes between different cryptosystems, a cross-domain identity authentication scheme based on Public Key Infrastructure (PKI) and Certificateless Cryptography (CLC) is proposed. A PKI-based multi-center authentication management mechanism is introduced to control and track the anonymous identities of users in security domains with different cryptosystems. In the bidirectional authentication process between the user and the cloud service provider, the negotiation of the session key and the conversion of anonymous identities between different cryptosystems are completed. The analysis results show that the scheme can resist replay attacks, replacement attacks and man-in-the-middle attacks while achieving cross-domain identity authentication between different cryptosystems, and it has high security and computational efficiency.
  • LI Wenxin, ZHOU Xiaobo, XU Renhai, QI Heng, LI Keqiu
    Computer Engineering. 2019, 45(10): 19-25,32. https://doi.org/10.19678/j.issn.1000-3428.0054041
    Aiming at the Coflow scheduling problem when prior knowledge is unknown, an Approximate Smallest-Effective-Bottleneck-First (A-SEBF) Coflow scheduling method is proposed. The scheduling order is determined by combining the current size and width of each Coflow, characterizing Coflows by features such as large or small, fat or thin, and long or short, so as to enlarge the space for scheduling optimization. Experimental results show that, compared with the Aalo method when prior knowledge is unknown, the proposed method reduces the average Coflow completion time by 33.2%. Compared with the SEBF method when prior knowledge is known, its average Coflow completion time lags by only 7.3%.
  • ZHENG Chuhong, PENG Yong, XU Yiming, LIAO Yi
    Computer Engineering. 2019, 45(10): 26-32. https://doi.org/10.19678/j.issn.1000-3428.0053218
    To solve the multi-objective task scheduling problem in the cloud manufacturing environment, this paper proposes a User Preference Task Scheduling Algorithm (UPTSA) by improving the Non-dominated Sorting Biogeography-Based Optimization (NSBBO) algorithm. The quality of a manufacturing task scheduling scheme is evaluated by a user preference defined through a uniform weight allocation strategy, so that UPTSA can find the optimal solution reflecting the user's preference, and a trapezoidal migration rate calculation model is designed to expand the search neighborhood and avoid falling into local optima. Example analysis results show that UPTSA can effectively solve the multi-objective task scheduling problem in the cloud manufacturing environment, and provides users with a set of scheduling schemes to assist their decision-making, so as to meet highly personalized user requirements.
  • LIU Kainan
    Computer Engineering. 2019, 45(10): 33-39. https://doi.org/10.19678/j.issn.1000-3428.0053889
    Changing the relationship between virtual machine selection and placement in cloud data centers can improve their overall performance. To this end, this paper proposes a virtual machine selection strategy based on task mapping. The strategy takes the granularity of tasks, the size of a virtual machine and the computing capability of a physical machine as indexes. Four selection algorithms, Simple, Multiple(k), Maxsize(u) and Relation, are designed by integrating the selection and placement of virtual machines, so as to construct mathematical models for virtual machine selection based on task mapping. Experimental results on the CloudSim simulator show that, by using the proposed strategy to optimize the virtual machine selection and placement processes, energy consumption and the number of virtual machine migrations can be reduced, saving costs for cloud service providers.
  • RAN Decheng, WU Dong, QIAN Lei
    Computer Engineering. 2019, 45(10): 40-45. https://doi.org/10.19678/j.issn.1000-3428.0052372
    An integer matrix multiplication accelerator based on the Zynq SoC platform is proposed to satisfy the computing requirements of matrix multiplications of different sizes in deep learning inference. Its parallel architecture based on bus broadcasting makes full use of the reusability of on-chip data and minimizes the movement of intermediate accumulation results to reduce accesses to external DRAM. By dynamically adjusting the size of matrix blocks, the accelerator maintains high efficiency when calculating matrix multiplications with irregular shapes. Experimental results show that under the DeepBench benchmark, the accelerator achieves an 8.4× speedup for matrix multiplication over a dual-core ARM Cortex-A9 CPU.
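The blocking idea behind the accelerator can be sketched in software. The following is a minimal illustration of the technique, not the hardware design: partial products for one output tile are accumulated locally (standing in for on-chip accumulation) before being written back, and the `tile` parameter plays the role of the dynamically chosen block size.

```python
import numpy as np

def tiled_matmul(A, B, tile=64):
    """Block-tiled integer matrix multiply.

    Each output tile's partial sums are accumulated in a local buffer
    (`acc`) across the inner dimension before a single write-back,
    mirroring how an accelerator keeps intermediate results on chip
    to reduce external DRAM traffic.
    """
    m, k = A.shape
    k2, n = B.shape
    assert k == k2
    C = np.zeros((m, n), dtype=np.int64)
    for i in range(0, m, tile):
        for j in range(0, n, tile):
            acc = np.zeros((min(tile, m - i), min(tile, n - j)), dtype=np.int64)
            for p in range(0, k, tile):
                acc += A[i:i+tile, p:p+tile] @ B[p:p+tile, j:j+tile]
            C[i:i+tile, j:j+tile] = acc
    return C

rng = np.random.default_rng(0)
A = rng.integers(-8, 8, size=(100, 70))
B = rng.integers(-8, 8, size=(70, 90))
assert np.array_equal(tiled_matmul(A, B, tile=32), A @ B)
```

Shrinking `tile` for tall/skinny ("irregular") operands keeps the tiles full, which is the software analogue of the dynamic block-size adjustment the abstract describes.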
  • JIANG Rengui, YANG Siyu, XIE Jiancang, YAN Dongfei, WANG Xiaojie
    Computer Engineering. 2019, 45(10): 46-51. https://doi.org/10.19678/j.issn.1000-3428.0053289
    To improve urban flood control and disaster relief capabilities, a three-dimensional visualization emergency management information system for urban waterlogging is designed, integrating information technologies including Beidou satellite monitoring, big data analysis, cloud services and comprehensive integration. The system architecture consists of a perception layer, data layer, service layer, application layer and client layer. Beidou technology is used to implement all-weather, multi-dimensional dynamic monitoring and transmission of multi-source information. Massive data is analyzed and stored on the Hadoop platform to propel the construction of the three-dimensional visualization integrated environment. Taking Xi'an city as the study object, the system provides function modules including multi-source information resource analysis, waterlogging monitoring, alarming and preplanning. The results indicate that this system can provide all-weather, all-round waterlogging emergency management services. It has advantages such as strong usability and good scalability, and can provide decision support for scientific urban waterlogging response.
  • YANG Zhenglong, GAO Jianhua
    Computer Engineering. 2019, 45(10): 52-56,63. https://doi.org/10.19678/j.issn.1000-3428.0053243
    Facing the huge amount of information on the Internet, users lack an objectively recognized oracle to verify whether the results returned by a search engine are correct. Therefore, metamorphic testing is applied to search engine performance testing. For the search engines Baidu, Bing and 360, corresponding metamorphic relations are defined by combining search operators. Their retrieval ability and ranking stability are tested, and the test results are quantified by the abnormality rate and the average Jaccard coefficient. Analysis results show that, among Baidu, Bing and 360, Bing has the lowest abnormality rate and Baidu has the highest ranking stability. Meanwhile, Baidu, Bing and 360 differ little in keyword search performance across different fields, but differ greatly in search performance across different languages. The results provide a reference for users in different domains when choosing a suitable search engine, and help search engine developers find and remove errors in their programs.
  • SUN Meidong, LIU Qinrang, LIU Chongyang
    Computer Engineering. 2019, 45(10): 57-63. https://doi.org/10.19678/j.issn.1000-3428.0052444
    To address the problems of long transmission paths and the tendency toward network congestion of existing multicast routing algorithms in non-fully interconnected 3D Network-on-Chip (3D NoC), a multicast routing algorithm based on region partition is proposed. According to the distribution of destination nodes, packet transmission is divided into inter-layer transmission and intra-layer transmission. For inter-layer transmission, the nearest Through-Silicon Via (TSV) in the TSV table is selected as the transmission channel. For intra-layer transmission, the network area is divided according to the coordinates of the source node, and the row of the source node is regarded as a common path on which the initial packets carrying the addresses of the destination nodes are transmitted. On this basis, destination address column checking and packet replication are carried out. Experimental results show that, compared with the 3D LADPM algorithm and the 3D HOE algorithm, the proposed algorithm can shorten the transmission delay and reduce the network packet loss rate.
  • QI Longyun, Lü Xiaoliang, LU Hong, HUANG Hao
    Computer Engineering. 2019, 45(10): 64-69,77. https://doi.org/10.19678/j.issn.1000-3428.0053152
    Formal verification of software is an important means to guarantee the provability, reliability and security of software, but the generation process of traditional formal verification scripts is complex and requires a large amount of manual work by formal verification experts. To improve proof efficiency, this paper constructs an automatic proof model and, on this basis, proposes a semantic automatic specification algorithm and an automatic generation algorithm for proof scripts. Tests implemented in C++ and Python on 10 programs randomly selected from the benchmark data, run with the interactive theorem prover Isabelle 2017, show that compared with fully manual operation, the algorithms have higher verification efficiency and can implement automatic specification and verification of sequential statements.
  • LIN Rongfeng, SHI Jian, ZHU Yanqing, SHEN Yiwei, ZHOU Yu
    Computer Engineering. 2019, 45(10): 70-77. https://doi.org/10.19678/j.issn.1000-3428.0054411
    Design Verifier, the formal verification component of the Safety Critical Application Development Environment (SCADE), can be used to verify the safety properties of embedded software systems in the aerospace field, but it cannot adequately describe safety requirements with complex temporal properties. To solve this problem, a verification method of temporal properties for the SCADE state machine is constructed, which transforms the SCADE model into a NuSMV model. Linear Temporal Logic (LTL) and Computation Tree Logic (CTL) are introduced into the SCADE model requirements specification. Analysis results show that, with the aid of the NuSMV model checker and its verification results, complex temporal safety properties can be verified, reducing bugs at the requirements phase of the development cycle and improving system security and reliability.
  • LONG Ken, QIAN Meiling, YU Xiang, CHEN Kan
    Computer Engineering. 2019, 45(10): 78-83. https://doi.org/10.19678/j.issn.1000-3428.0052219
    To allocate bandwidth resources efficiently in a wireless virtualization environment, the bandwidth resource orchestration problem for enhanced Mobile Broadband (eMBB) and ultra-Reliable Low-Latency Communication (uRLLC) is studied. For small-scale networks, the bandwidth resource orchestration problem is simplified to a bandwidth resource allocation problem and the global optimal solution is given. For large-scale networks, the problem is transformed into a relaxed bandwidth resource orchestration problem based on relaxation theory, and a heuristic algorithm, HPGH, based on greedy theory is proposed. Simulation results show that by solving the bandwidth resource orchestration problem, bandwidth resources can be dynamically allocated to users on demand, and the HPGH algorithm can effectively improve the throughput of eMBB services and reduce the maximum transmission delay of uRLLC services.
  • LAN Yawen, LI Qiang, DENG Shutao, HUANG Shiya
    Computer Engineering. 2019, 45(10): 84-89. https://doi.org/10.19678/j.issn.1000-3428.0054355
    To address the problems that a single attribute poorly characterizes nodes in Wireless Sensor Network (WSN) and that the threshold range is difficult to determine, this paper proposes a distributed fault detection method based on correlative multi-attribute decision-making. Based on a non-uniform clustering network structure, the confidence interval of the cluster head is estimated by combining the distance factor with the result of correlation analysis between nodes. Under the condition of a reliable cluster head, the method makes feedback decisions on the states of member nodes in the cluster through significance testing, and analyzes the source of abnormal data through a multi-attribute relevance degree defined by relative entropy theory. Experimental results show that, for different fault types, the proposed method can effectively improve the fault detection accuracy of network nodes and determine the source of abnormal data. While saving energy, it can ensure the stable operation of the network.
  • ZHENG Wei, ZHANG Zifeng, PAN Hao
    Computer Engineering. 2019, 45(10): 90-95. https://doi.org/10.19678/j.issn.1000-3428.0052257
    To better describe the dynamic evolution of mobile social network time series, based on the multifractal features of four typical mobile social networks (Camb lab, MIT, Inf 05 and Roller), this paper proposes a multifractal analysis method for mobile social networks based on the box-covering algorithm. Through analysis of the probability density distribution and the partition function of the network, the maximum value f(α), the spectral width W and the degree of symmetry B of the multifractal spectrum are calculated, which proves that mobile social networks have multifractal features. On this basis, network metric indexes are calculated, and the intrinsic factors influencing the multifractality of mobile social networks are compared and analyzed. Experimental results show that the network degree distribution follows a power-law distribution. When the assortativity coefficient r is smaller than 0, the multifractal characteristics of the network are more obvious, and the internal structure distribution of the network is more irregular.
  • LIANG Qing, SHANGGUAN Yiwei, ZHANG Wenfei, XIONG Wei
    Computer Engineering. 2019, 45(10): 96-100,109. https://doi.org/10.19678/j.issn.1000-3428.0052477
    In the Unmanned Aerial Vehicle Ad Hoc Network (UANET), a Greedy Perimeter Stateless Routing protocol based on Neighbor Node Screening (GPSR-NS) is proposed to solve the problems of inaccurate neighbor node locations and low data forwarding efficiency in GPSR protocols. The protocol uses a failed-node screening mechanism to predict the current positions of neighbor nodes, eliminating failed neighbor nodes and reducing the probability that data is forwarded to them. At the same time, it uses a hole-node screening mechanism to eliminate neighbor nodes that may become hole nodes at the next hop, avoiding forwarding data to hole nodes in advance. Thus the protocol can establish a more stable and reliable communication network. Simulation results show that, compared with the GPSR and MP-GPSR protocols, the average end-to-end delay of the GPSR-NS protocol is reduced by 56.79% and 21.94% respectively, its routing overhead is reduced by 50.67% and 38.81%, and the network throughput is increased by 147.86% and 102.12%.
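The failed-node screening step described above can be sketched as dead reckoning over per-neighbor state. This is a hedged illustration under assumed field names (`x`, `y`, `vx`, `vy`, `t` from a HELLO message), not the protocol's actual packet format or screening rule:

```python
import math

def predict_position(x, y, vx, vy, t_last, t_now):
    """Dead-reckon a neighbor's current position from its last report."""
    dt = t_now - t_last
    return x + vx * dt, y + vy * dt

def screen_neighbors(neighbors, me, dest, radio_range, t_now):
    """Drop neighbors whose predicted position has left radio range
    (failed nodes), then greedily pick the survivor closest to the
    destination. Returns the chosen neighbor id, or None when no
    neighbor qualifies (a local maximum for greedy forwarding)."""
    alive = []
    for n in neighbors:
        px, py = predict_position(n["x"], n["y"], n["vx"], n["vy"], n["t"], t_now)
        if math.hypot(px - me[0], py - me[1]) <= radio_range:
            alive.append((px, py, n["id"]))
    if not alive:
        return None  # would fall back to perimeter mode in GPSR
    return min(alive, key=lambda p: math.hypot(p[0] - dest[0], p[1] - dest[1]))[2]
```

The hole-node screening mechanism would add a second filter here, rejecting survivors predicted to have no forward progress at the next hop.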
  • HUANG Xiaobing, NIE Lanshun
    Computer Engineering. 2019, 45(10): 101-109. https://doi.org/10.19678/j.issn.1000-3428.0052620
    Aiming at the high downlink cost and power consumption of Low Power Wide Area Network (LPWAN), this paper proposes a new network architecture, OPEN-MAC. The operation of the OPEN-MAC network is defined in Time Division Multiple Access (TDMA) time slots, and a star network protocol stack adapted to the low rate of LPWAN is designed. Commercial low-cost LPWAN devices are used to ensure a reliable network uplink and a guaranteed network downlink. Experimental results show that nodes can receive downlink data at any time, the maximum delay of the downlink phase does not exceed 6 s, node power consumption is low, and the duty cycle is within 0.3%.
  • SU Jiali, WU Zhongdong, DING Longbin, ZHU Jing
    Computer Engineering. 2019, 45(10): 110-115,121. https://doi.org/10.19678/j.issn.1000-3428.0052670
    In LTE-R handover, the A3 event-based handover algorithm is prone to causing the ping-pong effect and Radio Link Failure (RLF) at high speed. Therefore, an RBF neural network-based handover optimization algorithm is proposed. The algorithm collects hys and ttt parameter sets with good handover performance when the train runs at different speeds in a specific environment, and feeds them to an RBF neural network for training to obtain a nonlinear expression of hys and ttt at different speeds. Based on the quality of the signal received by the train, a self-correcting term is added to perform secondary adjustment and optimization of hys and ttt. Simulation results on Matlab show that the proposed algorithm reduces the call drop rate and ping-pong handover rate, and improves the handover success rate and robustness of the train in high-speed operation environments.
  • LI Daoquan, ZHANG Yuxia, WEI Yanting
    Computer Engineering. 2019, 45(10): 116-121. https://doi.org/10.19678/j.issn.1000-3428.0052719
    To reduce and balance the energy consumption of sensor nodes in Wireless Sensor Network (WSN), a WSN clustering algorithm based on the optimal transmission distance and K-means clustering is proposed. According to a hierarchical clustering algorithm, a Clustering Feature (CF) tree is established, and each leaf node of the CF tree is regarded as a cluster, so that each cluster is kept within the optimal transmission distance and the energy consumption of nodes within a cluster is balanced. The K-means clustering is optimized through its objective function to ensure a uniform distribution of the number of nodes in each cluster, and data transmission between nodes is completed taking residual energy and geographical location into account. Experimental results show that the algorithm can effectively extend the network life cycle while balancing network energy consumption.
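As a rough sketch of the K-means stage, the following partitions node coordinates into clusters and elects one head per cluster. The energy-minus-distance score used for head election is an assumption for illustration only, not the paper's objective function:

```python
import numpy as np

def kmeans(points, k, iters=50, seed=0):
    """Plain Lloyd's K-means over 2-D sensor node coordinates."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        # distance of every point to every center, shape (n, k)
        d = np.linalg.norm(points[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = points[labels == j].mean(axis=0)
    return labels

def elect_heads(points, energy, labels, k):
    """Within each cluster, pick the node with the best residual-energy
    vs. centroid-distance trade-off (illustrative weighting)."""
    heads = []
    for j in range(k):
        idx = np.where(labels == j)[0]
        c = points[idx].mean(axis=0)
        score = energy[idx] - np.linalg.norm(points[idx] - c, axis=1)
        heads.append(idx[score.argmax()])
    return heads
```

In the paper's full algorithm the initial clusters come from CF-tree leaves rather than random seeding, which is what bounds each cluster by the optimal transmission distance.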
  • DAI Xianbo, WANG Na, LIU Ying
    Computer Engineering. 2019, 45(10): 122-129. https://doi.org/10.19678/j.issn.1000-3428.0052713
    By abstracting the Border Gateway Protocol (BGP) update message augmentation anomaly problem into a binary classification problem, an Improved Gaussian Kernel function-based BGP Anomaly Detection (IGKAD) method is proposed. The Fisher-Markov Selector (FMS) feature selection algorithm is used to select features that simultaneously maximize the between-class distance and minimize the within-class distance, and to obtain feature weights that measure classification ability. An improved Gaussian kernel function based on the Manhattan distance and feature weights is used to construct a Support Vector Machine (SVM) classification model, combined with a parameter optimization method based on grid search and cross-validation to improve the classification accuracy of the SVM model. By designing a feature efficiency function, a construction method for the optimal feature subset, which is used as the training dataset, is given. Experimental results show that when the training set contains the TOP10 and TOP8 features, the classification accuracy of the IGKAD method is 91.65% and 90.37% respectively. Compared with machine learning-based BGP anomaly detection methods, its classification performance is better.
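A kernel of the general shape described above (a Gaussian form over a feature-weighted Manhattan distance) might look as follows; the exact functional form and weighting used by IGKAD may differ, so treat this as a sketch of the idea:

```python
import numpy as np

def weighted_manhattan_gaussian(X, Y, w, gamma=0.5):
    """Gaussian-shaped kernel over a feature-weighted Manhattan distance:
        K(x, y) = exp(-gamma * sum_i w_i * |x_i - y_i|)
    X: (n, d), Y: (m, d), w: (d,) non-negative feature weights.
    Returns the (n, m) Gram matrix."""
    # pairwise weighted L1 distances via broadcasting, shape (n, m)
    D = np.abs(X[:, None, :] - Y[None, :, :]) @ w
    return np.exp(-gamma * D)
```

Such a callable can be plugged into scikit-learn's SVC as a custom Gram-matrix kernel, e.g. `SVC(kernel=lambda A, B: weighted_manhattan_gaussian(A, B, w))`, with `gamma` and the SVM's `C` tuned by the grid search the abstract mentions.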
  • ZHANG Yiwei, LIN Lin, ZHAO Jian, LI Fajun, LIANG Lixin
    Computer Engineering. 2019, 45(10): 130-133. https://doi.org/10.19678/j.issn.1000-3428.0051952
    The data access path of a cryptographic System on Chip (SoC) is an important target of intrusive probe analysis. To resist intrusive analysis, an SoC memory encryption bus is designed using the LBlock algorithm. Every 4 rounds of the LBlock algorithm are unrolled in hardware into 1 clock cycle, so that 32-bit encryption and decryption timing is compressed to 8 clock cycles, and the 32-bit bus generally used by data memory is buffered to 64 bits to match the block operation of the LBlock algorithm. FPGA verification results show that with this design, the bus data of memory embedded in a chip (RAM, Flash, etc.) cannot be interpreted even if it is acquired by a probe attack. The throughput of data access reaches 533 KB/s with 64-bit data blocks encrypted in 8 clock cycles, exhaustive attacks against 32-bit block encryption are avoided, and the implementation cost is reduced.
  • SHI Shuying, HE Jun
    Computer Engineering. 2019, 45(10): 134-138. https://doi.org/10.19678/j.issn.1000-3428.0052520
    The GRANULE algorithm is an ultra-lightweight block cipher with good hardware and software implementation performance, but it lacks security evaluation results under impossible differential analysis. Therefore, using the miss-in-the-middle technique, multiple 5-round impossible differential distinguishers are found. Based on the obtained distinguishers, an impossible differential analysis of 11-round GRANULE64/80 is presented by extending three rounds upward and three rounds downward. The attack can recover the 80-bit master key with a time complexity of 2^73.3 11-round encryptions and a data complexity of 2^64 chosen plaintexts.
  • CHEN Yuwan, JIA Xiangdong, FAN Qiaoling, XIE Mangang, JI Shanshan
    Computer Engineering. 2019, 45(10): 139-143,149. https://doi.org/10.19678/j.issn.1000-3428.0052221
    Based on a non-optimal User Cascade (UC) scheme, the physical layer security performance of multi-tier Heterogeneous Networks (HetNets) is studied. The network nodes are modeled as homogeneous Poisson Point Processes (PPP) using stochastic geometry, and the system security probability expression of multi-tier HetNets is derived by means of probability and statistics tools. The influence of transmission power, security threshold and eavesdropper density on the system security probability is analyzed. Simulation results show that when the Base Station (BS) transmission power is relatively small, the security performance of the non-optimal UC scheme outperforms the traditional optimal UC scheme; when the transmission power is relatively large, the security performance of the two schemes tends to be the same. Moreover, the system security probability of the non-optimal UC scheme first decreases and then increases with increasing transmission power, and its stability is higher.
  • XIA Wentao, PAN Senshan, WANG Liangmin
    Computer Engineering. 2019, 45(10): 144-149. https://doi.org/10.19678/j.issn.1000-3428.0052512
    To address the low security of Radio Frequency Identification (RFID) systems, an ultra-lightweight stream cipher algorithm for RFID tags, Willow, is proposed. Function taps are selected according to the properties of positive difference sets to increase the complexity of guess-and-determine attacks. To reduce the circuit area and power consumption of the algorithm, a dynamic initialization method is adopted, and a counter with fewer bits is used for key indexing and initialization. Comparative experiments on Design Compiler show that the delay and power consumption of the Willow algorithm are lower than those of the Grain-v1 and Plantlet algorithms, achieving a good balance between hardware performance and security.
  • ZHANG Wei, WANG Yihuai
    Computer Engineering. 2019, 45(10): 150-154,159. https://doi.org/10.19678/j.issn.1000-3428.0051762
    Current security protection for cloud systems is mainly carried out in a single aspect, making comprehensive protection difficult and leaving no way to evaluate the security of the cloud system. Therefore, a security enhancement algorithm is proposed. The throughput and parallelism of the cloud system are improved by four-stage pipelining based on PF_RING, an analysis method based on neural networks is applied to detect the security of the cloud system, and an optimal selection algorithm is used to evaluate its security. Experimental results show that the proposed algorithm can process network traffic capture and vulnerability detection in the cloud system in real time, the average intrusion protection rate reaches 94.1%, the packet capture rate of the cloud system reaches 97.6%, and its real-time analysis and processing ability for network data in high-speed network environments is improved.
  • PANG Tianyang, LI Yonggui, NIU Yingtao, XIA Zhi, HAN Chen
    Computer Engineering. 2019, 45(10): 155-159. https://doi.org/10.19678/j.issn.1000-3428.0052390
    To detect the high-efficiency interference of structured data, linear block codes are taken as the research object, and malicious interference signals are added to the noise model of the classical energy detection algorithm to derive the mathematical expression of the test statistic in a binary hypothesis model. On this basis, taking the minimum sum of the false alarm rate and the missed detection rate as the criterion, an energy detection algorithm based on the optimal detection threshold is proposed. Simulation results show that the algorithm can detect high-efficiency interference at a low signal-to-interference ratio, and its detection probability is higher than that of an algorithm based on CFAR threshold theory.
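The minimum-(false alarm + missed detection) criterion can be illustrated numerically. Under the usual Gaussian approximation of the energy statistic T = Σ y[n]², that sum of error probabilities is minimized where the two conditional densities intersect (its derivative is f₁(λ) − f₀(λ)). The sketch below grid-searches that point; the standard deviations `s0` and `s1` stand for assumed noise-only and noise-plus-interference levels, not values from the paper:

```python
import numpy as np

def normal_pdf(x, mu, sigma):
    return np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

def optimal_threshold(N, s0, s1):
    """Threshold minimizing P(false alarm) + P(missed detection) for an
    energy detector over N real samples, using the Gaussian
    approximation of T = sum(y^2): mean N*s^2, variance 2N*s^4 under
    each hypothesis. Found where the two conditional pdfs intersect."""
    mu0, sd0 = N * s0 ** 2, np.sqrt(2 * N) * s0 ** 2
    mu1, sd1 = N * s1 ** 2, np.sqrt(2 * N) * s1 ** 2
    grid = np.linspace(mu0, mu1, 20001)
    pdf0 = normal_pdf(grid, mu0, sd0)
    pdf1 = normal_pdf(grid, mu1, sd1)
    # Pfa + Pm is stationary where the conditional densities cross
    return grid[np.argmin(np.abs(pdf0 - pdf1))]
```

This differs from the CFAR approach, which fixes Pfa alone and therefore places the threshold without regard to the missed-detection rate.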
  • WANG Zhanwan, LI Guangqiu, QIAN Hui
    Computer Engineering. 2019, 45(10): 160-165,170. https://doi.org/10.19678/j.issn.1000-3428.0051994
    To evaluate the physical layer security performance of a Transmit Antenna Selection with channel Prediction (TASP)/receive Maximum Ratio Combining (MRC) wireless communication system over Rayleigh block fading channels in a multi-eavesdropper scenario, exact analytic expressions for the non-zero security capacity probability, security outage probability and asymptotic security outage probability of TASP/MRC systems based on a Minimum Mean Square Error (MMSE) channel predictor are derived. On this basis, the impacts of parameters such as the number of eavesdroppers, the normalized feedback delay and the number of transmit/receive antennas on the physical layer security performance are analyzed. Numerical calculation and simulation results show that, in a TASP/MRC system, reducing the normalized delay and the number of eavesdroppers or increasing the number of legitimate receive antennas can improve physical layer security performance. At high Signal-to-Noise Ratio (SNR), the physical layer security diversity gain is equal to the number of legitimate receive antennas, and is independent of the number of eavesdroppers and transmit antennas.
  • DILXAT Ghopur, CHEN Cheng, NURMAMAT Helil
    Computer Engineering. 2019, 45(10): 166-170. https://doi.org/10.19678/j.issn.1000-3428.0052085
    In Ciphertext-Policy Attribute-Based Encryption (CP-ABE), the time required for data encryption and decryption is related to the complexity of the access structure, so implementing CP-ABE on mobile devices puts them under great computational pressure. Therefore, this paper proposes a verifiable outsourced-decryption Online/Offline Ciphertext-Policy Attribute-Based Encryption (OO-CP-ABE) access control scheme with hidden policy. Considering the large amount of computation in the encryption phase, through the online/offline encryption method, the data owner uses a high-performance server to complete a large number of computational operations ahead of time, before the plaintext and access structure are determined. After the plaintext and attributes are determined, the rest of the encryption is completed with little computation on the mobile device, reducing its computational burden during the encryption phase. At the same time, the scheme uses a proxy server to decrypt the data and introduces a short signature method to verify the correctness of the decrypted data. Analysis results show that the scheme reduces the computational burden of the mobile device and verifies the correctness of data decrypted by the proxy server.
  • WANG Yang, WU Zhongdong, HUO Zhongcai
    Computer Engineering. 2019, 45(10): 171-175,182. https://doi.org/10.19678/j.issn.1000-3428.0052314
    Traditional machine learning algorithms need manually constructed sample features and classify poorly when dealing with massive multi-source intrusion data in heterogeneous networks. To solve this problem, a hybrid deep learning intrusion detection algorithm named DBN-KELM is proposed, combining a Deep Belief Network (DBN) with a Kernel Extreme Learning Machine (KELM). It uses the DBN to extract abstract features from high-dimensional historical network data, so as to obtain a low-dimensional representation of the original data. On this basis, it uses the KELM to perform supervised learning on the low-dimensional data to accurately identify network attacks. Simulations are carried out on the NSL-KDD dataset, and the experimental results show that the DBN-KELM algorithm can improve classification accuracy, reduce the false alarm rate on small-sample attacks and shorten the training time of the classifier.
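    The KELM stage amounts to regularized kernel least squares; a minimal numpy sketch (the RBF kernel and the `gamma`/`C` values are illustrative, not the paper's settings):

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    # Pairwise squared distances, then the Gaussian (RBF) kernel
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kelm_train(X, T, C=100.0, gamma=0.5):
    # Output weights beta = (K + I/C)^-1 T (regularized kernel least squares)
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + np.eye(len(X)) / C, T)

def kelm_predict(X_train, beta, X_new, gamma=0.5):
    # Scores for new points; class = argmax over one-hot columns
    return rbf_kernel(X_new, X_train, gamma) @ beta
```

    In the paper's pipeline, `X` would be the low-dimensional DBN features rather than raw inputs.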
  • DENG Shuhua, LU Zebin, LI Zhengfa, GAO Xieping
    Computer Engineering. 2019, 45(10): 176-182. https://doi.org/10.19678/j.issn.1000-3428.0052435
    Aiming at the limited throughput of the Software Defined Networking (SDN) switch control agent, a switch control agent denial-of-service attack method is proposed. To protect the SDN switch control agent resources, a hierarchical multi-threshold attack detection scheme is designed, which detects attacks by calculating the rate of Packet-In messages associated with each port of the SDN switch and comparing it with a given threshold. Experimental results show that the algorithm can detect switch control agent denial-of-service attacks in real time. Compared with similar attack detection methods, the proposed method can not only identify the attack traffic but also locate the attacks without changing the network architecture.
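    The per-port rate check can be sketched as a sliding-window counter. This is a single-threshold simplification of the hierarchical multi-threshold scheme, and the threshold value is illustrative:

```python
from collections import defaultdict

class PacketInRateDetector:
    """Flag a switch port when its Packet-In message rate exceeds a threshold."""

    def __init__(self, threshold_per_sec, window_sec=1.0):
        self.threshold = threshold_per_sec
        self.window = window_sec
        self.events = defaultdict(list)  # port -> arrival timestamps

    def observe(self, port, timestamp):
        # Record the message, drop events outside the sliding window,
        # and report whether the current rate exceeds the threshold.
        ts = self.events[port]
        ts.append(timestamp)
        self.events[port] = [t for t in ts if timestamp - t <= self.window]
        rate = len(self.events[port]) / self.window
        return rate > self.threshold  # True -> suspected DoS on this port
```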
  • CHI Yaping, LING Zhiting, WANG Zhiqiang, YANG Jianxi
    Computer Engineering. 2019, 45(10): 183-188,202. https://doi.org/10.19678/j.issn.1000-3428.0051976
    The Intrusion Detection System (IDS) has a high false alarm rate and weak generalization ability in the case of large amounts of data, and a single machine learning algorithm cannot cope well with multiple attack types. To address this problem, this paper designs an IDS based on the Support Vector Machine (SVM) and Adaboost algorithms. It relies on the Snort system, uses the Principal Component Analysis (PCA) method to reduce the dimension of the extracted features, and uses the SVM-Adaboost ensemble algorithm as the detection engine. The NSL-KDD dataset is used for training and testing. Experimental results show that the accuracy of the proposed system reaches 97.3%, which is improved by 4.8% and 14.3% compared with the SVM algorithm and the Adaboost algorithm respectively.
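    The PCA dimension-reduction step can be sketched via the SVD of the centered data matrix (a generic sketch, not specific to the Snort features):

```python
import numpy as np

def pca_reduce(X, k):
    # Center the data, then project onto the top-k right singular vectors
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T, Vt[:k]   # scores and principal directions
```

    If the data actually lie in a k-dimensional subspace, the projection loses nothing and the original samples can be reconstructed exactly.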
  • REN Shengbing, XIE Ruliang
    Computer Engineering. 2019, 45(10): 189-195. https://doi.org/10.19678/j.issn.1000-3428.0050909
    In regularized multi-kernel learning, sparse kernel function weights lead to the loss of useful information and degraded generalization performance, while selecting all kernel functions through non-sparse models generates more redundant information and is sensitive to noise. Aiming at these problems, an elastic-net regularized multi-kernel learning algorithm based on the AdaBoost architecture is proposed. When the basic classifier is selected at each iteration, an elastic-net regularization, that is, a mixed L1-norm and Lp-norm constraint, is imposed on the kernel function weights. The basic classifiers are constructed from optimal convex combinations of multiple basic kernels and are integrated into the final strong classifier. Experimental results show that the proposed algorithm can balance the sparsity and non-sparsity of the kernel function weights while preserving the advantages of the ensemble algorithm. Compared with the L1-MKL and Lp-MKL algorithms, it obtains a classifier with higher classification accuracy in fewer iterations.
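    The mixed constraint itself is easy to write down; a tiny sketch of the elastic-net penalty on the kernel weights, where `lam`, `rho` and `p` are illustrative knobs (`rho=1` recovers pure L1, `rho=0` the pure Lp term):

```python
import numpy as np

def elastic_net_penalty(w, lam, rho, p=2):
    # Mixed L1 + Lp penalty on the kernel weight vector w:
    # rho balances sparsity (L1) against the smoother Lp term.
    w = np.asarray(w, float)
    return lam * (rho * np.abs(w).sum() + (1 - rho) * np.sum(np.abs(w) ** p))
```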
  • SUN Lian, LI Shuqin, LIU Bin
    Computer Engineering. 2019, 45(10): 196-202. https://doi.org/10.19678/j.issn.1000-3428.0052591
    To address the problems of uniform weight distribution and topic drift in the weighted LeaderRank algorithm, a user social network ranking algorithm is proposed. Integrating the GloVe model, the cosine similarity calculation method and Newton's law of cooling, it introduces the link-in/link-out factor, the topic relevance factor and the time attenuation factor into the weighted LeaderRank algorithm to remedy its shortcomings. Experimental results show that compared with the weighted LeaderRank algorithm, the precision, the click rate and the NDCG value of the proposed algorithm are increased by 7.80%, 6.73% and 4.75% respectively, so the ranking quality is improved effectively.
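    The edge re-weighting idea — topic relevance as cosine similarity of word vectors (e.g. GloVe) times a Newton-cooling freshness decay — can be sketched as follows; the cooling coefficient `alpha` is an illustrative assumption:

```python
import numpy as np

def cosine_similarity(u, v):
    # Topic relevance between two embedding vectors
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def time_decay(delta_t, alpha=0.1):
    # Newton's law of cooling: weight decays exponentially with elapsed time
    return float(np.exp(-alpha * delta_t))

def edge_weight(vec_u, vec_v, delta_t, alpha=0.1):
    # Combined weight for a weighted-LeaderRank edge:
    # topically related AND recent interactions count the most.
    return cosine_similarity(vec_u, vec_v) * time_decay(delta_t, alpha)
```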
  • ZHANG Menghan, WANG Hai, LIU Xin, BAO Lei
    Computer Engineering. 2019, 45(10): 203-207,214. https://doi.org/10.19678/j.issn.1000-3428.0052466
    Samples that do not satisfy the independent and identical distribution assumption lead to deviations of the gradient estimation in the iterative process, and the convergence bound of the optimal individual cannot be determined under the interference of noise. Therefore, a linear interpolation stochastic Dual Averaging (DA) optimization method is proposed, and a proof of its convergence is given. On the basis of the gradient estimation bias, the individual convergence bounds of the non-cumulative deviation of the linear interpolation stochastic DA optimization method are obtained, which guarantees the individual convergence precision of the optimization method for regularized loss function structures. Experimental results show that compared with the stochastic accelerated method, the proposed method has a faster individual convergence rate and higher convergence accuracy.
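    The driving idea of dual averaging is that the iterate is determined by the running sum of all past gradients rather than the last one. A minimal unconstrained sketch on a smooth toy objective (the paper's linear-interpolation variant and noise model are omitted):

```python
def dual_averaging(grad, x0, steps, gamma):
    # Unconstrained dual averaging: x_{t+1} = -(sum of gradients)/(gamma*sqrt(t)).
    # The sqrt(t) damping gives the classical O(1/sqrt(t)) convergence rate.
    x, g_sum = float(x0), 0.0
    for t in range(1, steps + 1):
        g_sum += grad(x)
        x = -g_sum / (gamma * t ** 0.5)
    return x
```

    On f(x) = (x - 1)^2 the iterates drift monotonically toward the minimizer x = 1.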
  • NIU Shuoshuo, CHAI Xiaoli, LI Deqi, XIE Bin
    Computer Engineering. 2019, 45(10): 208-214. https://doi.org/10.19678/j.issn.1000-3428.0054297
    The traditional Latent Dirichlet Allocation (LDA) topic model uses Gibbs Sampling to fit unknown parameters under known conditional distributions in text classification, making it difficult to balance classification accuracy and computational complexity. Therefore, based on the LDA topic model, a neural network is used to fit the word-topic probability distribution, and a text classification algorithm named NLDA is proposed. Experiments on the THUCNews corpus and the Fudan University corpus show that compared with the traditional LDA model, the average classification accuracy of the algorithm is increased by 5.53% and 4.67% respectively, and the average training time is reduced by 8% and 10%.
  • WANG Guang, JIANG Li, DONG Shuaihan, LI Feng
    Computer Engineering. 2019, 45(10): 215-220. https://doi.org/10.19678/j.issn.1000-3428.0052499
    When dealing with massive data, the traditional collaborative filtering recommendation algorithm suffers from data sparsity and the long-tail effect of items, resulting in low recommendation accuracy. Aiming at this problem, this paper proposes an improved collaborative filtering algorithm combining ontology semantics and user attributes. The item similarity matrix is constructed by using ontology to calculate the semantic similarity between items, and user attributes are introduced to calculate the user similarity matrix. The user-item scoring matrix is formed by integrating ontology semantics and user attributes, and the prediction scores of the matrix are weighted to provide TOP-N recommendation results. Experimental results show that compared with the Pearson-similarity collaborative filtering algorithm, the collaborative filtering algorithm based on ontology semantics, and the collaborative filtering algorithm based on scoring matrix filling and user interest, the proposed algorithm has the lowest mean absolute error and the highest precision, and its integrity and novelty are superior.
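    The weighted score prediction at the core of any such collaborative filter can be sketched as follows; here `sim` stands in for the fused ontology/user-attribute similarity matrix, whose construction is the paper's contribution and is not reproduced:

```python
import numpy as np

def predict_rating(ratings, sim, user, item):
    # Similarity-weighted average of other users' ratings for `item`;
    # a rating of 0 means "not rated".
    num = den = 0.0
    for v in range(ratings.shape[0]):
        if v != user and ratings[v, item] > 0:
            num += sim[user, v] * ratings[v, item]
            den += abs(sim[user, v])
    return num / den if den else 0.0
```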
  • LIANG Yanhong, KAN Qixuan, SU Yi
    Computer Engineering. 2019, 45(10): 221-226. https://doi.org/10.19678/j.issn.1000-3428.0052033
    When classifying texts with fuzzy categories, the topic model only considers document- and topic-level information and ignores the implicit information of the underlying words, and the information in most topics is mixed, with no clear center. Therefore, an improved text classification method is proposed. Topics with a clear center are selected by using quantiles and are mapped into the word2vec space to enable classification of fuzzy texts, so as to obtain the text classification result. Experimental results show that compared with the C_LCD+KNN method, the proposed method has better classification performance and robustness.
  • CHEN Jicheng, CHEN Hongchang, YU Hongtao
    Computer Engineering. 2019, 45(10): 227-233. https://doi.org/10.19678/j.issn.1000-3428.0052570
    In order to realize rapid analysis of complex networks, an Improved Non-negative Matrix Factorization (INMF) algorithm based on Clustering Quality (CQ) is proposed and applied to dynamic community detection. From the perspective of theoretical analysis, the equivalence among evolutionary spectral clustering, INMF and module density optimization is proved. Based on this equivalence, a semi-supervised INMF algorithm is given by adding prior information to INMF without increasing the time complexity. Experimental results on artificial networks and real-world dynamic networks show that the algorithm achieves better community detection quality and efficiency than the QCA and MIEN algorithms.
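    The NMF core (without the paper's CQ-based improvements or semi-supervision) is the classical multiplicative-update factorization; a compact sketch:

```python
import numpy as np

def nmf(V, k, iters=500, seed=0):
    # Lee-Seung multiplicative updates minimizing ||V - W H||_F
    # with W, H kept element-wise non-negative throughout.
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, k)) + 0.1
    H = rng.random((k, m)) + 0.1
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + 1e-12)
        W *= (V @ H.T) / (W @ H @ H.T + 1e-12)
    return W, H
```

    For community detection, `V` would be an adjacency(-derived) matrix and the columns of `W` soft community memberships.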
  • LIU Chongyang, LIU Qinrang
    Computer Engineering. 2019, 45(10): 234-238. https://doi.org/10.19678/j.issn.1000-3428.0052165
    To address the over-fitting problem caused by the down-regulation of the Dropout rate in the pruning operation of a neural network model, a verification method for the generalization ability of the pruned model is proposed. By artificially occluding the dataset to simulate changes of the image range, the effects of different Dropout values and pruning ratios on the accuracy of the model are analyzed, and the reasons for the change of the generalization ability of the model after pruning are obtained. Experiments on the convolutional neural network model LeNet-5 show that the generalization ability of the pruned model is weakened because of the drop of the Dropout rate and the change in the amount of parameters during the pruning operation.
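    The Dropout rate's role is easiest to see in the inverted-dropout formulation whose rate the experiments vary (a framework-agnostic sketch):

```python
import numpy as np

def dropout(x, rate, rng):
    # Inverted dropout: zero a fraction `rate` of units and rescale the
    # survivors by 1/(1-rate) so the expected activation is unchanged.
    mask = rng.random(x.shape) >= rate
    return x * mask / (1.0 - rate)
```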
  • XIANG Wending, YANG Ping, XU Bing
    Computer Engineering. 2019, 45(10): 239-245. https://doi.org/10.19678/j.issn.1000-3428.0052600
    When the dark channel prior dehazing algorithm is used to process an image with a large bright white area, image distortion occurs. To this end, an improved single image dehazing algorithm is proposed. The atmospheric light is obtained by a quad-tree decomposition method based on the sub-block average gray value and standard deviation. The segmentation threshold is adaptively calculated from the histogram distribution of the image's dark channel, and the image is divided into a bright area and a non-bright area. The grayscale distribution of the image is used to calculate a weighting factor, which is used to blend the transmittance to make the edges smoother. On this basis, the haze-free image is restored by the atmospheric scattering model. Experimental results show that the algorithm effectively solves the problem of color distortion in the sky region, the visual effect is bright and natural, and the sudden change of depth at image boundaries is smoothed.
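    The prior itself is simple to state in code: the dark channel is a per-pixel minimum over color channels followed by a local minimum filter. A direct sketch (the paper's quad-tree atmospheric-light estimation and bright-area handling are omitted):

```python
import numpy as np

def dark_channel(img, patch=3):
    # img: H x W x 3 array in [0, 1]. Min over RGB, then min over a
    # patch x patch neighborhood around each pixel (edge-padded).
    mins = img.min(axis=2)
    h, w = mins.shape
    r = patch // 2
    padded = np.pad(mins, r, mode='edge')
    out = np.empty_like(mins)
    for i in range(h):
        for j in range(w):
            out[i, j] = padded[i:i + patch, j:j + patch].min()
    return out
```

    Haze-free outdoor patches tend to have a dark channel near zero; large bright white regions violate this, which is exactly the failure case the paper addresses.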
  • ZHOU Linyong, XIE Xiaoyao, LIU Zhijie, TAN Hongwei, YOU Shanping
    Computer Engineering. 2019, 45(10): 246-252,259. https://doi.org/10.19678/j.issn.1000-3428.0052774
    To address the problem that the image classification algorithm based on the Auxiliary Classifier Generative Adversarial Net (ACGAN) is unstable and classifies poorly, an improved image recognition algorithm named CP-ACGAN is proposed. In the network structure, the authenticity discrimination of output-layer samples is cancelled, the posterior estimation of the sample label is output, and a pooling layer is introduced. For the loss function, in addition to the cross-entropy loss of real samples, the cross-entropy loss between the conditional control label of the generated sample and its posterior estimation is added to the discriminant network, and the loss functions of the generator and discriminator are reconstructed based on the cross-entropy loss and the true/false attributes of samples. Experiments on the MNIST, CIFAR10 and CIFAR100 datasets show that compared with the ACGAN algorithm and the CNN algorithm, the proposed algorithm has better classification performance and stability, with classification accuracies of 99.62%, 79.07% and 48.03% respectively.
  • ZHU Chenchen, QI Lin, TIE Yun
    Computer Engineering. 2019, 45(10): 253-259. https://doi.org/10.19678/j.issn.1000-3428.0052880
    The computation of Fast Point Feature Histogram (FPFH) features requires human intervention to choose the neighborhood radius, which adds to the complexity and decreases the efficiency of the computation. To address this problem, an automatic neighborhood radius identification algorithm for FPFH feature extraction based on arc length density is proposed. The calculation method of point cloud arc length density is given, and the neighborhood radii of multiple pairs of point clouds are estimated according to the arc length density to extract FPFH features and complete Sample Consensus Initial Alignment (SAC-IA). The optimal values of the radius and arc length density are determined. On this basis, the least squares method is used to fit a functional expression between the neighborhood radius and the arc length density, which is combined with the FPFH feature extraction algorithm to form an FPFH feature extraction algorithm with automatic neighborhood radius identification. Experimental results show that the algorithm can automatically identify an appropriate neighborhood radius according to the point cloud arc length density, and the computation speed is fast.
  • WANG Junqiang, LI Jiansheng, ZHOU Huachun, ZHANG Xu
    Computer Engineering. 2019, 45(10): 260-265,271. https://doi.org/10.19678/j.issn.1000-3428.0053359
    In order to extract the typical elements (buildings and roads) of high-resolution remote sensing images, an extraction method based on deep learning that combines semantic segmentation and a fully connected Conditional Random Field (CRF) is proposed. Using Deeplabv3+ as the semantic segmentation model, relatively complete image segmentation information is extracted and used as the input of the unary potential function of the fully connected CRF, and the mean field approximation method is used to optimize the boundaries of the segmentation information. The robustness of the Deeplabv3+ model is verified by analyzing its training performance on a noisy sample set, and a large-scale remote sensing training sample set intelligent collection system is designed based on public image and vector data sources. Experiments on 2 000 square kilometers of remote sensing images of Rhode Island and the corresponding typical element label data show that the MIoU value of the proposed method reaches 80.23%, and after morphological processing the element boundaries are obviously clearer than in the initial segmentation result.
  • ZHENG Mingming, LIN Zhiyi
    Computer Engineering. 2019, 45(10): 266-271. https://doi.org/10.19678/j.issn.1000-3428.0052309
    Based on the isometry invariance of biharmonic distance, a similarity measure method for three-dimensional shapes is proposed. The definition, formal expression and discrete computation of biharmonic distance are given, and the biharmonic distance matrix of a shape is decomposed by singular value decomposition. The eigenvalues of the biharmonic distance matrix are extracted as shape descriptors, and the cosine distance between a pair of shapes' eigenvalues is taken as the shape similarity. Experimental results on the TOSCA2010 database show that compared with the FMPS and SHED methods, the proposed method achieves a better balance between time consumption and shape matching accuracy.
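    The descriptor pipeline — SVD of the distance matrix, truncated spectrum as signature, cosine comparison — can be sketched as follows (the biharmonic distances themselves, which require the mesh Laplacian, are assumed given):

```python
import numpy as np

def shape_descriptor(dist_matrix, k=10):
    # Singular values of the (symmetric) distance matrix, truncated to
    # the k largest: invariant under vertex relabeling, hence robust
    # to isometric re-posing of the shape.
    s = np.linalg.svd(dist_matrix, compute_uv=False)
    return s[:k]

def shape_similarity(desc_a, desc_b):
    # Cosine similarity between descriptor vectors: 1 = identical spectra
    return float(np.dot(desc_a, desc_b) /
                 (np.linalg.norm(desc_a) * np.linalg.norm(desc_b)))
```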
  • XU Zhigang, MA Qiang, ZHU Honglei, ZHANG Moyi
    Computer Engineering. 2019, 45(10): 272-276,282. https://doi.org/10.19678/j.issn.1000-3428.0052065
    Color image super-resolution reconstruction methods based on the sparse representation model usually adopt a block-based sparse coding process, which easily leads to instability of the sparse representation, as well as detail blurring and color artifacts in the reconstructed color images. Therefore, a reconstruction algorithm combining non-local sparse representation with color channel constraints is proposed. The low-resolution color image is converted into the YCbCr color space, and the brightness information of the low-resolution color image is reconstructed by the non-local sparse model. Then the reconstructed image is converted back to the RGB color space, and the color artifacts are removed by using color channel constraints, thus ensuring the quality of detail reconstruction and improving the ability to remove color artifacts. Experimental results show that compared with the Bicubic Interpolation (BI) algorithm, the ScSR algorithm and others, the image reconstructed by the proposed algorithm has a higher Peak Signal-to-Noise Ratio (PSNR) and Structural Similarity Index Measurement (SSIM).
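    The color-space round trip at the start of the pipeline is the standard BT.601 transform; sketching the forward direction (full-range coefficients, 8-bit scale):

```python
import numpy as np

def rgb_to_ycbcr(img):
    # img: H x W x 3 RGB in [0, 255]. ITU-R BT.601 full-range conversion:
    # Y carries the brightness that the sparse model reconstructs,
    # Cb/Cr carry the chroma that the channel constraints regularize.
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b + 128
    cr =  0.5 * r - 0.418688 * g - 0.081312 * b + 128
    return np.stack([y, cb, cr], axis=-1)
```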
  • QIN Jie, JI Zexuan, CAO Guo
    Computer Engineering. 2019, 45(10): 277-282. https://doi.org/10.19678/j.issn.1000-3428.0052358
    Aiming at the low registration accuracy between Synthetic Aperture Radar (SAR) images and optical images, a point feature descriptor HHIHC based on corner amplitude and orientation is proposed to improve the Harris corner detector. The detector's response function is extended from a first-order term to a second-order term to improve its robustness to SAR image noise. At the same time, a weight function combining pixel space and gray information is constructed to improve the accuracy of corner detection. Experimental results show that compared with the conventional Harris corner detector, the improved detector can effectively reduce the influence of SAR image noise and achieves higher registration accuracy.
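    For reference, the first-order Harris response that the paper extends looks like this (a plain numpy sketch with a 3x3 box window; the second-order extension and the HHIHC descriptor are the paper's contribution and are not reproduced here):

```python
import numpy as np

def harris_response(img, k=0.04):
    # Classical Harris response R = det(M) - k * trace(M)^2, with M the
    # locally averaged structure tensor of the image gradients.
    Iy, Ix = np.gradient(img.astype(float))
    Ixx, Iyy, Ixy = Ix * Ix, Iy * Iy, Ix * Iy

    def box(a):
        # 3x3 box average (edge-padded) of a tensor entry
        p = np.pad(a, 1, mode='edge')
        return sum(p[i:i + a.shape[0], j:j + a.shape[1]]
                   for i in range(3) for j in range(3)) / 9.0

    Sxx, Syy, Sxy = box(Ixx), box(Iyy), box(Ixy)
    return (Sxx * Syy - Sxy ** 2) - k * (Sxx + Syy) ** 2
```

    Corners give a large positive response, edges a negative one, and flat regions roughly zero.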
  • ZHOU Fuxing, CHEN Xiuzhen, MA Jin, LI Shenghong
    Computer Engineering. 2019, 45(10): 283-287. https://doi.org/10.19678/j.issn.1000-3428.0052535
    Due to the short length of microblog texts, directly using the Latent Dirichlet Allocation (LDA) model leads to high-dimensional sparse feature vectors. Thus, a hot topic mining method integrating tag semantics is proposed. The common block algorithm is used to calculate the similarity of microblog tags, and microblog texts with high tag similarity are merged. The merged text is modeled by the LDA model, and hot microblog topics are mined by the K-means clustering algorithm. Experimental results show that compared with the method of modeling each microblog text individually and the method of directly merging texts with the same tag, the proposed method obtains a lower perplexity and a higher accuracy in mining topics.
  • SAIMAITI Maimaitimin, ESMAEL Abdurehim
    Computer Engineering. 2019, 45(10): 288-292,300. https://doi.org/10.19678/j.issn.1000-3428.0052123
    Text information retrieval systems usually filter out stop words as noise to improve the efficiency of information processing, which affects the effect of text processing. Aiming at this problem, a stop word extraction method for the Uyghur language is proposed. On the basis of analyzing the characteristics of Uyghur stop words, statistics are computed over a large corpus by means of Document Frequency (DF), Term Frequency (TF) and Entropy (EN), and the part-of-speech distribution of candidate stop words is analyzed. The stop word threshold is determined by text classification experiments. Experimental results show that after filtering stop words with the proposed method, the computational complexity of text classification is reduced, and the classification precision reaches 80.8%.
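    The three statistics can be computed together in a few lines (a language-agnostic sketch; the actual thresholds come from the paper's classification experiments):

```python
import math
from collections import Counter

def stopword_stats(docs):
    # docs: list of token lists. Returns per-term TF, DF, and the entropy
    # of the term's count distribution across documents. Stop words score
    # high on all three: frequent, ubiquitous, and evenly spread.
    tf = Counter(t for d in docs for t in d)
    df = Counter(t for d in docs for t in set(d))
    entropy = {}
    for term in tf:
        counts = [d.count(term) for d in docs if term in d]
        total = sum(counts)
        entropy[term] = -sum(c / total * math.log2(c / total) for c in counts)
    return tf, df, entropy
```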
  • ZHANG Lu
    Computer Engineering. 2019, 45(10): 293-300. https://doi.org/10.19678/j.issn.1000-3428.0052714
    Driven by economic interests, a large number of malicious users publish fake reviews containing untrue content to influence users' purchasing decisions, thereby promoting the sale of their own products or suppressing competitors, and seriously disrupting the order of e-commerce. Therefore, this paper surveys research results on fake product reviews, covering the identification of fake reviews, their publishers, and groups of fake reviewers; analyzes and compares the features and detection methods used; and gives the evaluation methods and indicators for fake review recognition. On this basis, future research on fake review recognition is discussed and prospected.
  • LIU Yuling, HOU Jin, ZHANG Xiaoyu, CHEN Zeng
    Computer Engineering. 2019, 45(10): 301-307. https://doi.org/10.19678/j.issn.1000-3428.0052812
    Aiming at the problem of same-frequency interference, which is rather prominent in radio interference, deep learning is applied to interference signal detection, and a new algorithm for detecting same-frequency interference in frequency modulation broadcasts is proposed. The frequency modulation broadcast signals are transformed into wavelet time-frequency images that reflect the signal characteristics, and these images are used as the training data of a Convolutional Neural Network (CNN). After the CNN is trained to learn the time-frequency features of the signal, the detection model is obtained. Experimental results show that, compared with traditional machine learning algorithms, the proposed algorithm can more accurately detect whether there is a same-frequency interference signal in the broadcast signal, and the accuracy reaches 95.0%.
  • LIU Ximei, PAN Lijun
    Computer Engineering. 2019, 45(10): 308-313. https://doi.org/10.19678/j.issn.1000-3428.0054037
    The Bike-sharing Rebalancing Problem (BRP) is an extension of the one-commodity Pickup-and-Delivery Traveling Salesman Problem (1-PDTSP) and is NP-hard. To address the slow solution speed of existing algorithms, which is not conducive to real-time scheduling optimization, a non-intergenerational genetic algorithm for solving BRP is proposed. Excellent individuals are retained based on an individual search mechanism, and a line crossover operator and a k-point destroy-and-repair mutation operator are designed. A destroy-and-repair mechanism is introduced: when the convergence of the algorithm slows down, new individuals are automatically generated into the population to avoid falling into a local optimum. Computational results on benchmark problems show that the algorithm can find the optimal solution for small-scale benchmarks with an average CPU time of 3.8 s. For medium- and large-scale benchmarks, the algorithm finds the best solutions for 9 benchmark problems, and its speed is improved by over 77% compared with the B&C and D&R algorithms.
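    A k-point destroy-and-repair mutation of the kind described can be sketched on a plain distance matrix (the BRP bike-load capacity constraints are omitted for brevity, and the greedy cheapest-insertion repair is one common choice, not necessarily the paper's):

```python
import random

def route_length(route, dist):
    return sum(dist[route[i]][route[i + 1]] for i in range(len(route) - 1))

def destroy_and_repair(route, k, dist, rng):
    # Remove k interior stations, then greedily reinsert each at the
    # position that increases the route length the least. Endpoints
    # (depot / final station) are kept fixed.
    route = route[:]
    removed = rng.sample(route[1:-1], k)
    for s in removed:
        route.remove(s)
    for s in removed:
        best = min(range(1, len(route)),
                   key=lambda i: dist[route[i - 1]][s] + dist[s][route[i]]
                                 - dist[route[i - 1]][route[i]])
        route.insert(best, s)
    return route
```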
  • XU Shuo, TANG Zuoqi, WANG Xin
    Computer Engineering. 2019, 45(10): 314-320. https://doi.org/10.19678/j.issn.1000-3428.0052813
    In the emergency management assessment process, differences in the experience and language descriptions of experts cause uncertainty and ambiguity in the assessment information, directly affecting the assessment results. Aiming at this problem, an assessment method for emergency management capability is proposed based on the Analytic Hierarchy Process extended by D-numbers preference relations (D-AHP) and the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS). According to the actual situation of emergency management, an assessment index system is established. By constructing the D-AHP hierarchical structure model, the impact weight of each assessment index is solved. Combined with the assessment results of the experts, the TOPSIS method is used to rank the emergency management capability levels of the assessment objects. Assessment results for four districts and counties of a certain city show that the method can effectively identify the key links in the emergency management process, visually display the actual level of each assessment object and help give specific suggestions for improving emergency management and control construction.
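    The TOPSIS ranking step is compact enough to sketch in full (a generic version; in the paper's pipeline the `weights` would come from the D-AHP model):

```python
import numpy as np

def topsis(matrix, weights, benefit):
    # matrix: alternatives x criteria; benefit[j] True if larger is better.
    M = matrix / np.linalg.norm(matrix, axis=0)    # vector-normalize columns
    V = M * weights                                # weighted normalized matrix
    ideal = np.where(benefit, V.max(0), V.min(0))  # positive ideal solution
    anti = np.where(benefit, V.min(0), V.max(0))   # negative ideal solution
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    return d_neg / (d_pos + d_neg)                 # closeness: higher = better
```

    An alternative that dominates on every criterion coincides with the positive ideal and gets closeness 1; one dominated everywhere gets 0.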