
15 March 2018, Volume 44 Issue 3
    

  • YANG Guang, XIE Rui, XUE Guangtao
    Computer Engineering. 2018, 44(3): 1-7. https://doi.org/10.3969/j.issn.1000-3428.2018.03.001
    In an enterprise-level private cloud, thousands of virtual machine instances are deployed across data centers nationwide, generating massive raw data that the monitoring system must persist and process. This places significant computing, storage, and network pressure on a monitoring system that must provide real-time monitoring and statistical reports. Aiming at this problem, this paper designs a monitoring system for large-scale private clouds that distributes the workload across a full big-data stack to meet the challenge above. Using the collected monitoring data, it also applies a live migration mechanism to reduce the waste of physical resources caused by uneven distribution. Experimental results show that the system satisfies both real-time monitoring and offline statistics requirements and raises physical resource utilization by more than 13%.
  • ZHANG Yue,YU Jia
    Computer Engineering. 2018, 44(3): 8-12,18. https://doi.org/10.3969/j.issn.1000-3428.2018.03.002
    In most existing cloud storage integrity detection schemes, users must perform many public key computations and complicated public key certificate management operations. To solve this problem, this paper proposes an ID-based cloud storage integrity detection scheme. It introduces identity-based cryptography, which frees users from public key certificate management. Moreover, a user can authorize a Third-Party Auditor (TPA) to complete all time-consuming operations, including data uploading, data authenticator generation, and data integrity detection, which minimizes the burden on the user side. Security and performance analysis shows that the proposed scheme reduces computational overhead and produces lightweight authenticators for users while preserving data privacy.
  • CHEN Yuan,ZHANG Changhong,FU Wei,ZHAO Huarong
    Computer Engineering. 2018, 44(3): 13-18. https://doi.org/10.3969/j.issn.1000-3428.2018.03.003
    To address the inability of SRQSAE, a ciphertext interval search scheme based on a single assertion, to resist ciphertext-only statistical analysis attacks, a new secure ciphertext interval retrieval scheme based on cloud storage is proposed. By improving the structure of the key matrix and introducing random numbers, the scheme hides keyword magnitudes and ensures the confidentiality of keyword indexes and interval trapdoors, meeting the security requirements of the arrangement and merging features. Comparative results on complexity, storage space, operation time, and data transmission show that, compared with the SRQSAE scheme, the proposed scheme achieves substantial improvements on these metrics while ensuring security.

  • WU Xiuguo
    Computer Engineering. 2018, 44(3): 19-26,36. https://doi.org/10.3969/j.issn.1000-3428.2018.03.004
    Most traditional replica distribution strategies assume that data items are independent and incur no management cost, ignoring the effect on replica cost and the generation of intermediate data. For these reasons, a two-phase data replica distribution and generation strategy that considers cost and storage space is proposed. In the replica distribution phase, it selects appropriate replica storage locations with a Genetic Algorithm (GA) by comparing data transfer cost against storage cost, and it determines the storage or generation mode at each data center with the Dijkstra algorithm by comparing data storage cost against generation cost. Experimental results show that the strategy is feasible and effective for reliable data access while reducing data management cost and storage space, thereby improving cloud storage performance.
  • DU Yuanzhi,DU Xuehui,YANG Zhi
    Computer Engineering. 2018, 44(3): 27-36. https://doi.org/10.3969/j.issn.1000-3428.2018.03.005
    Traditional Information Flow Control (IFC) technology has mostly been studied in stand-alone environments, so it can hardly protect data security in cloud computing effectively. Therefore, this paper proposes an information flow control mechanism based on attribute encryption, which combines Attribute-Based Encryption (ABE) with IFC. By redesigning the user private key and the access tree generation method, it simplifies the access mechanism, enabling effective information flow control over cloud data and eliminating potential security problems. Performance test results show that the mechanism can effectively resist shared-channel attacks and protect the security of sensitive data in static virtual domains.
  • ZHU Wei,WANG Jun,ZHOU Xunzhao
    Computer Engineering. 2018, 44(3): 37-41,54. https://doi.org/10.3969/j.issn.1000-3428.2018.03.006
    Hospital cloud computing systems suffer from demand uncertainty and node heterogeneity, which cause load imbalance. Therefore, a load-balancing resource scheduling scheme for hospital cloud computing systems is proposed. The scheme is based on the shuffled frog leaping algorithm; to counter that algorithm's tendency to fall into local optima, a scheduling scheme based on a discussion-mechanism frog leaping algorithm is proposed, in which an adaptive number of discussion rounds improves the search capability of the algorithm. Simulation results show that the proposed scheme balances load better than traditional load balancing schemes and overcomes the shuffled frog leaping algorithm's tendency to fall into local optima.

  • HE Zhongqiao,XU Yun
    Computer Engineering. 2018, 44(3): 42-46. https://doi.org/10.3969/j.issn.1000-3428.2018.03.007
    Current multi-genome alignment algorithms require large time and memory overheads. The Multi-genome Index (MuGI) alignment algorithm is faster but fails to exploit the duplicated information across multiple genomes. Therefore, an improved MuGI-based alignment algorithm is proposed, which uses a dynamic seed extension algorithm with Single Nucleotide Polymorphism (SNP) pruning and exploits the repeated information of multiple genomes to improve alignment speed. It also uses an on-demand index memory management strategy to improve space efficiency. Experimental results show that the improved algorithm needs only 6 GB of memory to align reads against 1 092 human genomes, and with 5 mismatches it is about 3 times faster than the MuGI algorithm.
  • WEI Jianjun,CHEN Liangyu
    Computer Engineering. 2018, 44(3): 47-54. https://doi.org/10.3969/j.issn.1000-3428.2018.03.008
    Conventional serial determinant computation suffers from frequent elementary transformations and the large scale of high-order numerical determinants. Therefore, a fast and exact determinant computation method based on General-Purpose Graphics Processing Units (GPGPU) is proposed. It uses the GPGPU and modular methods to compute integer matrix determinants in parallel across all cores, which accelerates the computation, avoids floating-point errors, and recovers the exact result with the Chinese Remainder Theorem. Experimental results show that, compared with commonly used software such as Maple and NTL, the method computes faster and consumes less memory without intermediate memory expansion, and the advantage grows for higher-order integer matrix determinants.
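    The modular approach can be illustrated on the CPU: compute the determinant modulo several primes (independent jobs that the paper maps to GPU cores) and combine the residues with the Chinese Remainder Theorem. A minimal Python sketch under these assumptions:

```python
# CPU-only sketch of exact integer determinants via modular arithmetic
# and the Chinese Remainder Theorem (not the paper's GPGPU code).
from functools import reduce

def det_mod_p(a, p):
    """Determinant of an integer matrix modulo prime p (elimination over Z_p)."""
    a = [[x % p for x in row] for row in a]
    n, det = len(a), 1
    for col in range(n):
        pivot = next((r for r in range(col, n) if a[r][col]), None)
        if pivot is None:
            return 0
        if pivot != col:
            a[col], a[pivot] = a[pivot], a[col]
            det = -det % p                      # row swap flips the sign
        det = det * a[col][col] % p
        inv = pow(a[col][col], p - 2, p)        # Fermat inverse of the pivot
        for r in range(col + 1, n):
            f = a[r][col] * inv % p
            for c in range(col, n):
                a[r][c] = (a[r][c] - f * a[col][c]) % p
    return det % p

def crt(residues, moduli):
    """Combine residues under pairwise-coprime moduli into one residue."""
    m = reduce(lambda x, y: x * y, moduli)
    x = 0
    for r, p in zip(residues, moduli):
        mi = m // p
        x = (x + r * mi * pow(mi, -1, p)) % m
    return x, m

def exact_det(a, primes):
    # primes must be chosen so their product exceeds 2*|det|
    # (Hadamard's bound gives a safe estimate); each residue is an
    # independent job, one per core in the paper's setting.
    x, m = crt([det_mod_p(a, p) for p in primes], primes)
    return x - m if x > m // 2 else x           # map back to signed range

print(exact_det([[2, 3], [5, 7]], [10007, 10009]))  # -> -1
```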
  • SHE Yuxuan,XIONG Yun
    Computer Engineering. 2018, 44(3): 55-59. https://doi.org/10.3969/j.issn.1000-3428.2018.03.009
    At present, most research on storyline mining focuses on the similarity of news documents and events while ignoring the structured expression of stories and the timeliness of news, so the development of different news topics is hard to see intuitively from model results. Therefore, an unsupervised storyline mining algorithm based on Bayesian networks is proposed, which treats a storyline as the joint probability distribution over date, time, organization, person, location, topic, and keywords, with the timeliness of news built in. Experiments and evaluations on multiple news datasets show that the model has higher mining potential than the K-means and LSA algorithms.
  • DENG Kaixuan,CHEN Hongchang,HUANG Ruiyang
    Computer Engineering. 2018, 44(3): 60-64. https://doi.org/10.3969/j.issn.1000-3428.2018.03.010
    As networks grow, the Label Propagation Algorithm (LPA) has a clear advantage in time complexity, but the uncertainty and randomness of its internal mechanism lead to inaccurate and unstable community discovery results. This paper presents an improved LPA. It constructs a new node importance measure based on the K-shell decomposition algorithm, uses node importance to analyze label propagation ability, and then combines node importance and label propagation ability into a new label update strategy that yields the final results. Experimental results on artificial and real networks show that the algorithm has high accuracy and stability.
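    As an illustration (not the authors' exact formulas), the sketch below derives a node importance score from the K-shell index via networkx and uses it to weight neighbor votes during label propagation; the mixing parameter alpha is a hypothetical choice:

```python
# Sketch: K-shell-based node importance feeding a weighted label update.
import networkx as nx
from collections import Counter

def node_importance(g, alpha=0.5):
    ks = nx.core_number(g)                      # K-shell index of each node
    deg = dict(g.degree())
    kmax, dmax = max(ks.values()), max(deg.values())
    return {v: alpha * ks[v] / kmax + (1 - alpha) * deg[v] / dmax for v in g}

def weighted_lpa(g, rounds=20):
    imp = node_importance(g)
    label = {v: v for v in g}                   # each node starts with its own label
    for _ in range(rounds):
        for v in g:
            votes = Counter()
            for u in g[v]:
                votes[label[u]] += imp[u]       # neighbors vote, weighted by importance
            if votes:
                label[v] = votes.most_common(1)[0][0]
    return label

g = nx.karate_club_graph()
print(len(set(weighted_lpa(g).values())), "communities")
```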
  • XIA Xiufeng,ZHANG Liuchang,LIU Xiangyu
    Computer Engineering. 2018, 44(3): 65-72. https://doi.org/10.3969/j.issn.1000-3428.2018.03.011
    Aiming at the long construction time, high storage cost, and long response time of reachability indexes for large graphs, a Distributed Reachability Index and Query (DRIQ) strategy is proposed. The large graph is partitioned into several small subgraphs without destroying node reachability, and a reachability index is created for each subgraph in a distributed, parallel manner to improve index creation efficiency. DRIQ includes methods that preserve node reachability both within and between subgraphs, which ensures the correctness of reachability queries based on DRIQ. Experimental results show that the strategy is efficient and scalable compared with traditional reachability query methods.
  • ZHOU Xianchun,XU Haojie,SHI Lanfang
    Computer Engineering. 2018, 44(3): 73-77. https://doi.org/10.3969/j.issn.1000-3428.2018.03.012
    The complex physical environment of indoor microenvironments is irregular, so actual wireless channel parameters differ from those of an ideal environment. To this end, a statistical channel model based on the scatterer distribution in an asymmetric indoor transmission environment is proposed. The angle- and time-related channel parameters caused by small-scale fading multipath effects are derived, including the joint TOA/AOA probability density function of the signal and the probability density function of the angle of arrival. Simulation results show that the channel parameters derived from this model fit the actual channel environment better than those of the uniform-scatterer, symmetric model.
  • LIU Yajun,LI Shibao,LIU Jianhang,CHEN Haihua
    Computer Engineering. 2018, 44(3): 78-81. https://doi.org/10.3969/j.issn.1000-3428.2018.03.013
    To further reduce decoding delay, a low-latency adaptive Successive Cancellation List (SCL) decoding algorithm based on path reuse is proposed. A CRC-based repeated-path replication scheme exploits the duplicate paths that occur between SCL decoders with different list sizes. Simulation results show that, compared with the traditional CA-SCL and AD-SCL algorithms, the proposed algorithm maintains high decoding performance with lower decoding delay in low signal-to-noise-ratio channels.

  • SHI Yao,LI Hui,DU Wencai,LI Fabin
    Computer Engineering. 2018, 44(3): 82-86. https://doi.org/10.3969/j.issn.1000-3428.2018.03.014
    Aiming at the complex and changeable marine communication environment and the lack of wireless infrastructure, a routing algorithm with anchor-node forwarding time limitation is proposed. Applying Delay Tolerant Network (DTN) technology to the marine environment, vessels store, carry, and forward messages to overcome delivery failures caused by frequently broken communication links. In the MATLAB environment, random trajectories of fishing vessels in an area of the South China Sea are modeled and simulated, with packet generation in the heterogeneous network following a Poisson distribution. On this basis, a forwarding mechanism with limited survival time is introduced, the communication performance of the DTN is analyzed, and the effects of survival time, fishing vessel count, and wireless network coverage on packet delivery are compared. Simulation results show that the algorithm reduces network transmission delay and improves the performance of maritime wireless communication networks.

  • YU Le,MO Lufeng,YI Xiaomei
    Computer Engineering. 2018, 44(3): 87-92,98. https://doi.org/10.3969/j.issn.1000-3428.2018.03.015
    The complexity of the forest environment leads to large Received Signal Strength Indication (RSSI) localization errors in sensor networks, and current RSSI path loss models cannot meet the localization requirements of sensor nodes in forests. Aiming at this problem, this paper proposes a Wireless Sensor Network (WSN) localization algorithm for forests. The localization area is divided into regions according to the dispersion coefficient of RSSI, and an RSSI path loss model is established for each region; a model that better fits the actual environment is built by combining the log-distance path loss model with piecewise fitting. Localization error is further reduced by sub-region ranging and the K-means clustering algorithm. Experimental results show that the proposed algorithm effectively improves positioning accuracy.
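    For reference, the log-distance path loss model underlying the ranging step is RSSI(d) = A - 10n*log10(d), with A the received power at 1 m and n the path loss exponent. A minimal per-region least-squares fit (an illustration, not the paper's piecewise procedure; the calibration values are made up):

```python
# Fit A and n of the log-distance model from (distance, RSSI) pairs,
# then invert the model to estimate distance from a new RSSI reading.
import numpy as np

def fit_path_loss(distances, rssi):
    x = -10 * np.log10(distances)
    n, A = np.polyfit(x, rssi, 1)               # rssi ~ n*x + A
    return A, n

def rssi_to_distance(rssi, A, n):
    return 10 ** ((A - rssi) / (10 * n))

d = np.array([1, 2, 5, 10, 20], dtype=float)            # calibration distances (m)
r = np.array([-40, -47, -55, -62, -70], dtype=float)    # sample RSSI readings (dBm)
A, n = fit_path_loss(d, r)
print(f"A={A:.1f} dBm, n={n:.2f}, "
      f"distance at -60 dBm: {rssi_to_distance(-60, A, n):.1f} m")
```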
  • BAO Hui,YAO Yaqing
    Computer Engineering. 2018, 44(3): 93-98. https://doi.org/10.3969/j.issn.1000-3428.2018.03.016
    2D precoding can only steer the beam in the horizontal direction through data processing, which causes serious inter-cell interference in multi-user systems, especially for cell-edge users. Aiming at this problem, this paper uses stochastic geometry to study the effect of feedback overhead on system performance under a limited feedback mechanism in a frequency division duplex 3D MIMO heterogeneous network. With a uniform planar array antenna deployed at the base station, a 3D MIMO multiuser codebook design based on the horizontal emission angle and the vertical dip angle is proposed, and the codebook scheme is simulated and analyzed. Results show that the feedback overhead and micro base station density affect system throughput and even the bit error rate of the 3D precoding scheme. Compared with 2D and 3D discrete Fourier transform codebook precoding schemes, the proposed scheme greatly reduces the bit error rate, improves throughput, and optimizes overall system performance.
  • CHEN Fatang,LONG Yunbo,WANG Yufan
    Computer Engineering. 2018, 44(3): 99-102,108. https://doi.org/10.3969/j.issn.1000-3428.2018.03.017
    At the receiver, Spatial Modulation (SM) typically uses Maximum Likelihood (ML) detection to obtain the transmit antenna index and the modulated signal and thereby restore the transmitted information bits. However, the complexity of the ML algorithm increases exponentially with the number of antennas and the modulation order, which is impractical. To solve this problem, this paper proposes a low-complexity sub-optimal detection algorithm that sets a reasonable decision threshold to combine Signal Vector Detection (SVD) with the Hard-Limited Maximum Likelihood (HL-ML) algorithm. Monte Carlo simulation results show that the Bit Error Rate (BER) of the proposed algorithm is closer to that of ML detection than SVD is, while its complexity is 85% lower than that of the ML algorithm.
  • YU Ke,WANG Huifeng
    Computer Engineering. 2018, 44(3): 103-108. https://doi.org/10.3969/j.issn.1000-3428.2018.03.018
    In the non-storing mode of the RPL routing protocol, the accumulation of a large number of Destination Advertisement Option (DAO) messages causes congestion, and intermediate nodes need larger buffers to hold DAOs before relaying them to their parents. Regarding this issue, an improved algorithm for RPL is proposed, which adjusts the DelayDAO timers by sending back-pressure messages to child nodes, thereby controlling the generation and relaying of DAO messages. At the same time, exploiting the multi-frequency capability of many existing nodes, control messages and data packets are physically isolated. Experiments and analysis show that, compared with the traditional non-storing mode of RPL, the improved protocol reduces buffer occupancy as well as network congestion and packet loss during data transmission.
  • REN Xiuli,JI Pengshuo
    Computer Engineering. 2018, 44(3): 109-113,118. https://doi.org/10.3969/j.issn.1000-3428.2018.03.019
    To ensure the accuracy and timeliness of data collection and transmission, a cluster-based Fuzzy-Weighted Algorithm for Data Fusion (FWADF) is proposed. A fuzzy logic controller analyzes the reliability of node data within a cluster to ensure data credibility, and data priorities are introduced to reduce network delay. Within the cluster, a fuzzy-weighted matrix method improves data accuracy. Experimental results with the NS-2 simulator show that, under the same data traffic, the delay to the base station is the shortest; when nodes collect the same amount of data, the average accuracy of the data obtained at the base station is 5.0%, 16.1%, and 9.5% higher than that of the Proposed DF, VWFFA, and FIM algorithms, respectively.
  • LIU Yun,CHEN Qian
    Computer Engineering. 2018, 44(3): 114-118. https://doi.org/10.3969/j.issn.1000-3428.2018.03.020
    To solve the problem of effective data estimation in distributed measurement for Wireless Sensor Networks (WSN), a distributed compression estimation algorithm with measurement matrix optimization (MMDCE) is proposed. It performs distributed estimation of the unknown parameter vector in a compressed dimension and updates the measurement matrix by adaptive stochastic gradient recursion. By combining distributed compression estimation with measurement matrix optimization, both the convergence rate and the estimation error accuracy are improved. Simulation results show that, compared with the dNLMS and DCE algorithms, the proposed algorithm converges faster and achieves higher estimation accuracy.
  • LI Xinran,ZHOU Jinhe
    Computer Engineering. 2018, 44(3): 119-126. https://doi.org/10.3969/j.issn.1000-3428.2018.03.021
    Current studies of energy consumption in Content Delivery Networks (CDN) focus on request redirection, cache scheduling, and link consumption control. However, there is no efficient remedy for the unbalanced resource distribution, low server utilization, energy inefficiency, and poor service quality caused by the absence of a reasonable cache server deployment mechanism. This paper proposes a deployment scheme based on the node properties of complex networks in a CDN. The algorithm first divides the network into communities: it selects and converges an initial community, iteratively expands it according to a similarity function, and applies a threshold to delimit the exact community. Cache servers can then be deployed in the proper places to balance network load, reduce the resources wasted by idle cache servers, and improve server utilization. Experimental results show that the algorithm has lower computational complexity and finer granularity than the spectral averaging method, the GN algorithm, and others.
  • WANG Yan,LIU Jiayong,LIU Liang,JIA Peng,LIU Luping
    Computer Engineering. 2018, 44(3): 127-131. https://doi.org/10.3969/j.issn.1000-3428.2018.03.022
    At present, existing exploitation platforms support few binary vulnerabilities, and their flexibility and development efficiency are low; there are few tool development and generation systems aimed specifically at binary vulnerabilities. Therefore, this paper presents an automatic generation framework for binary vulnerability exploitation tools. The framework modularizes the exploitation process, so exploitation tools can be developed and generated quickly and flexibly through various module combinations. According to the characteristics of each module, different designs are used to achieve shorter development cycles and higher development efficiency. Experimental results show that the framework is simple and easy to use, with high flexibility and extensibility.
  • JIA Junjie,CHEN Luting
    Computer Engineering. 2018, 44(3): 132-137. https://doi.org/10.3969/j.issn.1000-3428.2018.03.023
    Aiming at the problem that the current p-sensitive k-anonymity model disregards the semantic similarity of sensitive attribute values and is therefore susceptible to similarity attacks, this paper proposes a (p,k,d)-anonymity model that resists such attacks. The model uses a semantic hierarchy tree to analyze sensitive attribute values and compute the semantic dissimilarity between them, and requires each equivalence class, besides satisfying k-anonymity, to contain at least p sensitive attribute values that are pairwise d-different, thereby preventing similarity attacks. Considering data availability, the model partitions equivalence classes with distance-based measurement methods to reduce information loss. Experimental results show that, compared with the p-sensitive k-anonymity model, the proposed (p,k,d)-anonymity model not only reduces the probability of sensitive attribute leakage, protecting individual privacy more effectively, but also improves data usability.
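    To make the definition concrete, the toy check below tests one equivalence class against the (p,k,d) condition; the dissim() function stands in for the paper's semantic-hierarchy-tree measure, and the records are invented:

```python
# Toy check of (p,k,d): at least k records, and at least p sensitive
# values that are pairwise d-different under a semantic dissimilarity.
def satisfies_pkd(records, sensitive, dissim, p, k, d):
    if len(records) < k:
        return False
    chosen = []
    for value in {r[sensitive] for r in records}:
        if all(dissim(value, c) >= d for c in chosen):
            chosen.append(value)                # greedily keep d-different values
        if len(chosen) >= p:
            return True
    return False

# Stand-in dissimilarity from a flat taxonomy: 0.2 within a disease
# family, 1.0 across families.
dissim = lambda a, b: 0.2 if a.split("/")[0] == b.split("/")[0] else 1.0
recs = [{"zip": "476**", "disease": "resp/flu"},
        {"zip": "476**", "disease": "resp/asthma"},
        {"zip": "476**", "disease": "cardio/angina"}]
print(satisfies_pkd(recs, "disease", dissim, p=2, k=3, d=0.5))  # True
```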
  • RAO Zhihong,LIU Jie,CHEN Jianfeng
    Computer Engineering. 2018, 44(3): 138-143. https://doi.org/10.3969/j.issn.1000-3428.2018.03.024
    Efficient management of massive knowledge is a prerequisite for effective network monitoring and early warning. Aiming at this problem, a large-scale Resource Description Framework (RDF) data storage approach based on a graph database is proposed. Leveraging the graph characteristics of RDF data, it partitions the RDF dataset with a heuristic greedy strategy; partitioning comprises a subgraph generation stage and a subgraph partitioning stage and takes the dynamics of the data flow into account. Meanwhile, a dynamic mechanism for replicating and deleting hot-spot data achieves load balancing under dynamic data flow. The approach is compared with relational-database-based approaches on three different datasets, and the experimental results show that it is clearly better.
  • WANG Yixing,SUN Lianshan,SHI Libo
    Computer Engineering. 2018, 44(3): 144-150. https://doi.org/10.3969/j.issn.1000-3428.2018.03.025
    To improve the low provenance utility of existing provenance sanitization mechanisms, a data provenance sanitization mechanism is proposed. The PROV-DM model is extended to generalize dependencies into uncertain dependencies, and the soundness of recovering provenance utility by introducing uncertain dependencies is proved. A utility evaluation model is built to quantitatively evaluate sanitized views with uncertain dependencies. A novel "delete and recover" sanitization mechanism deletes sensitive nodes or edges and then recovers provenance utility by introducing uncertain dependencies into the sanitized view, under the premise that the result set of provenance tracing does not grow. Experimental results show that the proposed mechanism produces sanitized views with higher provenance utility than existing typical sanitization mechanisms.
  • FAN Zihua,CHANG Chaowen,HAN Peisheng,PAN Dongcun
    Computer Engineering. 2018, 44(3): 151-155,165. https://doi.org/10.3969/j.issn.1000-3428.2018.03.026
    Aiming at the limited network scale that existing attack graph generation methods can handle, and through analysis of the shortcomings of existing construction methods and the characteristics of the construction process, attack graph construction is transformed into pattern matching between threat action properties. The Rete algorithm is introduced into the construction process, and an attack graph construction method based on Rete is proposed. Experimental results show that the method has better construction efficiency and can be applied to attack graph construction in large-scale networks.
  • JIN Jiejing,ZHANG Yongbin,RAN Chongshan,SUN Lianshan
    Computer Engineering. 2018, 44(3): 156-165. https://doi.org/10.3969/j.issn.1000-3428.2018.03.027
    Typosquatting is a typical form of cybersquatting in which speculators register typo domains mainly for profit. As such registrations increase, the impact of typosquatting on Internet users grows more and more serious. This paper reviews the latest research on typosquatting, introducing existing research methods, research achievements, and relevant policies and regulations. On this basis, future challenges and research directions in the area of typosquatting are presented.
  • WANG Yong,ZHANG Yuhan,HONG Zhi,WEN Ru,FAN Chengyang,WANG Juan
    Computer Engineering. 2018, 44(3): 166-170,177. https://doi.org/10.3969/j.issn.1000-3428.2018.03.028
    Current integrity measurement technology cannot support the Trusted Platform Module (TPM) 2.0 specification. This paper therefore improves the Linux kernel Integrity Measurement Architecture (IMA) and designs a kernel integrity measurement framework based on TPM 2.0. At the same time, on a TPM 2.0 chip, a trusted Linux kernel supporting the TPM 2.0 specification is implemented. Test results show that IMA 2.0 can measure the integrity of key system files based on TPM 2.0 and resist tampering attacks on kernel files.
  • FAN Yourong,YANG Tao,WANG Yongjian,JIANG Guoqing
    Computer Engineering. 2018, 44(3): 171-177. https://doi.org/10.3969/j.issn.1000-3428.2018.03.029
    An identification method based on URL feature detection is proposed to identify illegal websites effectively. A website similarity model based on path similarity is designed from the hierarchical characteristics of user access paths in HTTP request lines, and distributed computation of the model is implemented in Python. Websites are clustered with the Fast Unfolding algorithm, URL features of illegal websites are extracted, and features with high accuracy and specific meaning are selected as effective illegal-website features. An unknown website is then identified as illegal by detecting whether it carries these URL features. Experimental results show that the method effectively measures the degree of association between similar websites and, with the Fast Unfolding algorithm, effectively distinguishes different types of websites. Compared with other identification methods based on URL morphological features, HTML, or semantic features, the proposed method achieves the best F-measure.
  • CHEN Zhenyu,YANG Yang,JI Sai,LIU Wenjie
    Computer Engineering. 2018, 44(3): 178-181,188. https://doi.org/10.3969/j.issn.1000-3428.2018.03.030
    To reduce the cost of quantum devices and perform quantum computation better, a multiparty semi-quantum key agreement protocol based on the delegated quantum computation model is proposed. The delegated quantum computation model is introduced, and complicated operations such as unitary operations and Bell measurements are entrusted to a quantum center; participants only need the simple abilities of accessing the quantum channel and preparing single photons. To prevent key information from being stolen by the quantum center or outside attackers, a confusion strategy that inserts decoy single photons among the target states keeps the target states safe. Analysis shows that, compared with previous quantum key agreement protocols, the quantum capabilities required of the participants decrease significantly, which improves the practical feasibility of the protocol.
  • FAN Yundong,WU Xiaoping
    Computer Engineering. 2018, 44(3): 182-188. https://doi.org/10.3969/j.issn.1000-3428.2018.03.031
    Most existing attribute-based encryption schemes with hidden policies target a single authority and do not consider the situation where users' attributes are managed by multiple authorities; they suffer from inefficient key generation, a single authority that is easy to compromise, and environmental security requirements inconsistent with cloud storage. Therefore, a multi-authority Attribute-Based Encryption (ABE) scheme is put forward. By improving the access structure, the scheme fully hides the access policy and thus protects user privacy. Users' private keys are generated jointly by the data owner and the attribute authorities, which improves key generation efficiency and resists collusion attacks by illegal users and authorities. Based on the Decisional Bilinear Diffie-Hellman (DBDH) assumption, the scheme is proved chosen-plaintext secure in the standard model. Experimental results indicate that the scheme improves the efficiency of key generation, encryption, and decryption.
  • YAN Rui,LI Shijun
    Computer Engineering. 2018, 44(3): 189-194. https://doi.org/10.3969/j.issn.1000-3428.2018.03.032
    Conventional search engines retrieve documents that merely contain the query keywords, without considering the true intent of the user. Aiming at this problem, and treating document retrieval as a personalized recommendation problem, this paper proposes a personalized retrieval algorithm based on query intent identification and topic models. First, the Latent Dirichlet Allocation (LDA) topic model is applied to the user's historical search data. When a new query arrives, its latent topic is recognized with the topic model of the user's search history, and appropriate documents are recommended by topic correlation. Finally, the KL distance between the query and the document set is calculated, and the documents returned to the user are ranked by this distance. Experimental results show that the proposed algorithm outperforms the methods based on collaborative similarity calculation and on user interest clustering in efficiency.
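    A minimal sketch of the LDA step with gensim (not the authors' code): train a topic model on a user's search history, then infer the topic mixture of a new query; the toy corpus is invented:

```python
# Train LDA on a user's search history and infer a new query's topics.
from gensim import corpora, models

history = [
    ["python", "pandas", "dataframe", "merge"],
    ["python", "numpy", "matrix", "inverse"],
    ["hiking", "trail", "boots", "waterproof"],
]
dictionary = corpora.Dictionary(history)
corpus = [dictionary.doc2bow(doc) for doc in history]
lda = models.LdaModel(corpus, num_topics=2, id2word=dictionary,
                      passes=20, random_state=0)

query = ["python", "matrix"]
query_topics = lda.get_document_topics(dictionary.doc2bow(query))
print(query_topics)   # e.g. [(0, 0.87), (1, 0.13)] -> recommend topic-0 documents
```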
  • LIU Yezheng,XIONG Qiang,JIANG Yuanchun
    Computer Engineering. 2018, 44(3): 195-200. https://doi.org/10.3969/j.issn.1000-3428.2018.03.033
    Sales on e-commerce platforms have a long-tail character, and niche products in the long tail are rarely included in the lists produced by recommendation methods that pursue precision alone. Aiming at this problem, this paper proposes a new recommendation method from the perspective of niche products. It calculates the rating similarity, preference similarity, and latent-feature similarity between users from rating information, attribute information, and latent feature information, respectively. Based on the three similarities, it then finds the potential users who are most similar to users who rate niche products highly, and provides niche products to those potential users. Experimental results show that the recommendation conversion rate of the proposed method is much higher than that of probabilistic matrix factorization and collaborative filtering for niche product recommendation, so it solves the niche product recommendation problem more effectively.
  • TIAN Xuedong,WANG Chen
    Computer Engineering. 2018, 44(3): 201-207. https://doi.org/10.3969/j.issn.1000-3428.2018.03.034
    Most existing mathematical expression retrieval models are designed for common mathematical expressions; when used to retrieve linear algebra expressions, they perform poorly because the features of such expressions are not considered. Therefore, a retrieval method for linear algebra expressions is designed. An improved formula description structure describes the features of linear algebra expressions in LaTeX format. Expressions are classified by type, corresponding expansion operations are defined, and index files are built. Four matching algorithms for linear algebra expressions are designed to provide a flexible retrieval mode and improve the relevance of search results. Experimental results show that the proposed method fits the retrieval features of linear algebra expressions, has a more reasonable index structure, and achieves higher matching efficiency.
  • ZHANG Yong,CHEN Feng
    Computer Engineering. 2018, 44(3): 208-213,219. https://doi.org/10.3969/j.issn.1000-3428.2018.03.035
    To overcome the slow convergence and low precision of the Whale Optimization Algorithm (WOA) while preserving the simplicity of the original algorithm, an improved WOA is proposed. First, to maintain the diversity of the initial population for global search, population positions are initialized with a chaotic sequence generated by a piecewise Logistic chaotic map. Second, considering the nonlinearity of the optimization process and the different states of individuals during search, a nonlinear adaptive weighting strategy is introduced into the basic algorithm to coordinate global exploration and local exploitation. Simulations compare the improved algorithm with the WOA on six typical benchmark functions. Experimental results show that the improved WOA preserves initial population diversity during optimization and achieves better convergence speed and precision.
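    The chaotic initialization idea can be sketched with the classic Logistic map x_{k+1} = mu*x_k*(1 - x_k); the paper's piecewise variant may differ, so treat the map and parameters below as assumptions:

```python
# Chaotic initialization of a population for a WOA-style optimizer.
import numpy as np

def chaotic_init(pop_size, dim, lower, upper, mu=4.0, seed=0.37):
    x = np.empty((pop_size, dim))
    v = seed
    for i in range(pop_size):
        for j in range(dim):
            v = mu * v * (1.0 - v)              # Logistic map iteration in (0, 1)
            x[i, j] = v
    return lower + x * (upper - lower)          # scale into the search bounds

pop = chaotic_init(pop_size=30, dim=10, lower=-100.0, upper=100.0)
print(pop.shape, pop.min().round(2), pop.max().round(2))
```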
  • XIAN Ying,YU Jiong,XUE Pengqiang
    Computer Engineering. 2018, 44(3): 214-219. https://doi.org/10.3969/j.issn.1000-3428.2018.03.036
    Context-aware recommendation systems make recommendations by introducing environmental information, so a user's private information can be directly or indirectly acquired by an attacker, causing privacy disclosure. To solve this problem, an improved anonymity model is introduced into the context-aware recommendation system. Combined with clustering methods, different sensitive attribute values are divided into different groups so that multiple sensitive attributes are anonymized; in terms of privacy protection, highly sensitive attribute information receives a higher degree of protection, and information of similar sensitivity receives similar protection. Finally, experiments on a real dataset show that the proposed method improves recommendation accuracy while protecting user privacy in the recommendation system.
  • FENG Xi,ZHU Fuxi,LIU Shichao
    Computer Engineering. 2018, 44(3): 220-225,232. https://doi.org/10.3969/j.issn.1000-3428.2018.03.037
    Aiming at the low accuracy of the traditional Label Propagation Algorithm (LPA), an improved label propagation algorithm based on the DeepWalk model is proposed. First, the algorithm takes the social network as the input of the DeepWalk model, samples nodes into random sequences by deep random walks, and trains the samples in a neural network with the SkipGram model. Second, it computes the core part of the SkipGram model with hierarchical Softmax, obtains the feature vector of each node, and then calculates the similarity between nodes. Finally, it uses node similarity as the weight during label propagation to obtain the community detection results. Experimental results on six real network datasets and synthetic datasets show that, compared with the traditional label propagation algorithm, the improved algorithm achieves higher accuracy; in particular, when the number of nodes exceeds 100 in the real network datasets, modularity Q rises by 10%.
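    The DeepWalk stage can be sketched as random walk sampling plus SkipGram training with hierarchical Softmax (gensim's Word2Vec with sg=1, hs=1); walk lengths and dimensions below are illustrative, not the paper's settings:

```python
# Random walks over the graph feed a SkipGram model; the resulting node
# vectors give the similarities used as label-propagation weights.
import random
import networkx as nx
from gensim.models import Word2Vec

def random_walks(g, num_walks=10, walk_len=20, seed=0):
    rng, walks = random.Random(seed), []
    for _ in range(num_walks):
        for start in g.nodes():
            walk = [start]
            while len(walk) < walk_len:
                nbrs = list(g[walk[-1]])
                if not nbrs:
                    break
                walk.append(rng.choice(nbrs))
            walks.append([str(v) for v in walk])
    return walks

g = nx.karate_club_graph()
model = Word2Vec(random_walks(g), vector_size=64, window=5,
                 sg=1, hs=1, min_count=0, epochs=5)  # sg=1: SkipGram, hs=1: hierarchical Softmax
print(model.wv.similarity("0", "1"))                 # node similarity for the LPA weight
```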
  • XIE Bin
    Computer Engineering. 2018, 44(3): 226-232. https://doi.org/10.3969/j.issn.1000-3428.2018.03.038
    Many public opinion classification systems analyze information without sufficient consideration of domain knowledge. To address this problem, a public opinion classification method based on semantic networks is proposed. The public knowledge map is expanded with Term Frequency-Inverse Document Frequency (TF-IDF), and public opinion information is modeled with a semantic network. The mapping values of concept nodes in the semantic network then represent the public opinion information: related concepts map to each other and generate gains, which highlights the topic of the information, and relevant concepts not explicit in the text are found through its explicit concepts, so as to reflect the overall situation of the public opinion information. Comparison experiments with mainstream classifiers show that the proposed semantic-network-based method classifies public opinion information better than the mainstream classification methods.
  • XUE Zhixiang,YU Xuchu,TAN Xiong,WEI Xiangpo
    Computer Engineering. 2018, 44(3): 233-240. https://doi.org/10.3969/j.issn.1000-3428.2018.03.039
    Aiming at the problem that current hyperspectral image tensor feature extraction methods cannot make full use of multiple spectral-spatial features, a new tensor feature extraction method based on the fusion of multiple spectral-spatial features is proposed. First, 3D Gabor wavelets extract multiple texture features with different directions and frequencies, and different morphological attribute filters yield multiple shape structural features; tensor features are constructed by combining the spatial features, the multiple texture features, and the multiple shape structural features. Then, local tensor discriminant analysis effectively increases the consistency of same-class tensors and the separation between classes, yielding lower-dimensional tensors that retain discriminative information and multiple spatial-spectral features. Experiments on the Pavia University and Salinas hyperspectral datasets indicate that the proposed method preserves spatial-spectral and discriminative information and achieves higher classification accuracy and classification maps with better spatial continuity.
  • ZOU Binyi,LIU Hui,SHANG Zhenhong,LI Runxin
    Computer Engineering. 2018, 44(3): 241-244. https://doi.org/10.3969/j.issn.1000-3428.2018.03.040
    Aiming at the inaccurate similarity between neighborhood blocks in non-local means denoising, a local Hu moment denoising algorithm based on a Krawtchouk polynomial weighting function is proposed. Combining the Krawtchouk polynomial weighting function with the image function yields a new geometric moment weighting function, from which new central moments are obtained. A feature vector is constructed from two moment invariants built on the second- and third-order central moments. The Euclidean distance measures the similarity of feature vectors within the neighborhood, and the new weights are obtained by combining this similarity with the neighborhood block weights. Experimental results under different noise intensities show that, compared with the original non-local means denoising algorithm, the peak signal-to-noise ratio and structural similarity are significantly improved.
  • LIU Lihui,XU Jun,GONG Lei
    Computer Engineering. 2018, 44(3): 245-250,258. https://doi.org/10.3969/j.issn.1000-3428.2018.03.041
    Aiming at the low image classification accuracy of unsupervised dictionary learning algorithms, a supervised dictionary learning classification algorithm that combines multiple image features is proposed. It uses a convolutional neural network to detect and segment cells and extract the texture of the cell structure, extracts multiple texture signatures for the cells in the pathological image, and then extracts SIFT and SURF features from the whole image. To reduce classification errors, dictionary learning and a binary classification function are trained jointly, the multi-feature representation replaces raw images as the dictionary learning input, and breast pathology images are classified. Comparisons on two breast pathology databases show that the multi-feature supervised dictionary learning algorithm reaches a classification accuracy of 92.15% and outperforms unsupervised dictionary learning.

  • WANG Ti,CHEN Jian,ZENG Lei,TONG Li,YAN Bin
    Computer Engineering. 2018, 44(3): 251-258. https://doi.org/10.3969/j.issn.1000-3428.2018.03.042
    Traditional level set methods for image segmentation cannot segment intensity-texture images efficiently because they rely on low-level data and ignore high-level semantic features. Aiming at this problem, this paper proposes a level set method that combines a shape prior for intensity-texture image segmentation. First, ASLVD filtering is used to obtain the texture term, while the shape prior term is obtained by localizing the filtered image. Then, an overall level set curve evolution energy function is constructed by combining the intensity term, regularization term, texture term, and shape prior term. Finally, the segmentation result is obtained by minimizing the energy function. Experimental results show that the proposed method segments overlapping intensity-texture images and objects with better performance.
  • LI Jun,CHENG Jian
    Computer Engineering. 2018, 44(3): 259-263. https://doi.org/10.3969/j.issn.1000-3428.2018.03.043
    Traditional circle detection methods have several limitations against complex backgrounds, such as low accuracy, high false detection rate, high missed detection rate, and low reliability. To solve these problems, a circle detection method applicable to complex backgrounds is proposed. According to the radius search range, it picks out candidate circles with a three-level screening method. It counts the edge points of each candidate circle, divides the count by the circle radius, and picks out real circles based on this ratio. It then sorts all circles and deletes interference circles according to minimum-distance and minimum-radius-difference thresholds between circles. Experimental results show that, compared with the traditional Hough gradient method, the proposed method detects circles more accurately and reliably even against complex backgrounds, with the false detection rate reduced by 24% and the missed detection rate reduced by 8%.
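    For reference, the Hough gradient baseline the method is compared against is available in OpenCV; the parameter values and input file below are placeholders, not the paper's settings:

```python
# Hough gradient circle detection baseline with OpenCV.
import cv2
import numpy as np

img = cv2.imread("coins.png")                   # placeholder input image
gray = cv2.medianBlur(cv2.cvtColor(img, cv2.COLOR_BGR2GRAY), 5)

circles = cv2.HoughCircles(
    gray, cv2.HOUGH_GRADIENT, dp=1, minDist=30,
    param1=100,      # upper Canny threshold used internally
    param2=40,       # accumulator threshold: lower -> more (possibly false) circles
    minRadius=10, maxRadius=80)

if circles is not None:
    for x, y, r in np.round(circles[0]).astype(int):
        cv2.circle(img, (x, y), r, (0, 255, 0), 2)
    cv2.imwrite("detected.png", img)
```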
  • DU Cui,ZHANG Qianli,LIU Jie
    Computer Engineering. 2018, 44(3): 264-269,274. https://doi.org/10.3969/j.issn.1000-3428.2018.03.044
    Aiming at the high timeliness and accuracy requirements of railway ballast Ground Penetrating Radar (GPR) detection, an image registration algorithm based on Histogram Curvature Analysis (HCA) and KAZE features is proposed. It extracts the high-energy region of the image with HCA threshold segmentation, saving the time otherwise spent registering useless regions. KAZE features are extracted from the GPR images, and feature matching is refined with the Fast Library for Approximate Nearest Neighbors (FLANN) and Random Sample Consensus (RANSAC) algorithms. Experimental results show that the proposed algorithm achieves good registration results for GPR images with defects, grayscale differences, and structural differences; it is more accurate than the original KAZE, ORB, and SIFT algorithms and more than 8% more efficient than the original KAZE algorithm.
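    A plain KAZE + FLANN + RANSAC matching pipeline (without the HCA step) can be sketched in OpenCV as below; the file names are placeholders, and the ratio-test and RANSAC thresholds are common defaults rather than the paper's tuned values:

```python
# KAZE feature matching with FLANN, outliers rejected by RANSAC.
import cv2
import numpy as np

img1 = cv2.imread("gpr_ref.png", cv2.IMREAD_GRAYSCALE)     # placeholder files
img2 = cv2.imread("gpr_moving.png", cv2.IMREAD_GRAYSCALE)

kaze = cv2.KAZE_create()
kp1, des1 = kaze.detectAndCompute(img1, None)
kp2, des2 = kaze.detectAndCompute(img2, None)

flann = cv2.FlannBasedMatcher({"algorithm": 1, "trees": 5}, {"checks": 50})
matches = flann.knnMatch(des1, des2, k=2)
good = [m for m, n in matches if m.distance < 0.7 * n.distance]  # Lowe's ratio test

src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
H, mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)  # RANSAC keeps inliers only
print(f"{int(mask.sum())}/{len(good)} inlier matches")
```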
  • WU Teng,ZHANG Zhili,ZHAO Junyang,ZHANG Haifeng
    Computer Engineering. 2018, 44(3): 270-274. https://doi.org/10.3969/j.issn.1000-3428.2018.03.045
    To reduce the heavy computational load of corner detection threshold selection, a new adaptive corner detection method is proposed. Nine basic statistical characteristics that reflect the gray distribution, contrast, and correlation of images are analyzed. These characteristics are extracted from 4 848 samples, and principal component analysis is used to compute 4 comprehensive indexes reflecting different image attributes. A multivariate nonlinear locally optimal threshold prediction model is established, its parameters are optimized and estimated on the training data, and the resulting prediction model guides adaptive threshold selection for corner detection. Experimental results show that the prediction model improves the quality of salient corner detection: the detection rate of salient corners in complex images improves by 45% on average over the original detection algorithm, and the false detection rate falls by 81% on average.

  • HE Tong,XIONG Fengguang,HAN Xie,ZHANG Yuan
    Computer Engineering. 2018, 44(3): 275-280,286. https://doi.org/10.3969/j.issn.1000-3428.2018.03.046
    To solve the problems of existing feature curve extraction algorithms, such as insensitivity to detailed model features, high time cost, and poor noise resistance, an algorithm for extracting feature curves from point clouds based on the covariance matrix and projection mapping is proposed. First, the eigenvalues of the covariance matrix drive region-growing cluster segmentation, clustering the point cloud into multiple band clusters. Then, key feature points are extracted along the principal direction within each cluster and projected onto the local surface around them, which is fitted with the Moving Least Squares (MLS) method. Finally, the feature curves are obtained. Experimental results show that the proposed algorithm has higher efficiency and stronger noise resistance while producing smooth feature curves.
  • ZENG Bi,HUANG Wen
    Computer Engineering. 2018, 44(3): 281-286. https://doi.org/10.3969/j.issn.1000-3428.2018.03.047
    Aiming at the imprecise segmentation and incomplete feature description of traditional point cloud segmentation algorithms in specific scenes, an Affinity Propagation (AP) clustering ensemble segmentation method that fuses 2D and 3D features is proposed. First, a set of descriptors representing different point cloud characteristics of complex indoor scenes, such as color image features, curvature, normal vectors, and spin images, is obtained from the point clouds. Second, according to the differences between them, clustering members are obtained by AP clustering on each class of features, and a cluster consensus matrix is established. Finally, the final segmentation result is obtained with the Ncut algorithm. Experimental results show that the proposed method distinguishes indoor 3D point cloud scenes better than traditional point cloud segmentation algorithms and is more stable.
  • SHAO Xiancheng,CAI Chao,WANG Houjun,LI Dongwu
    Computer Engineering. 2018, 44(3): 287-293. https://doi.org/10.3969/j.issn.1000-3428.2018.03.048
    Route planning is a multi-objective optimization problem with constraints. Commonly used algorithms transform it into a single-objective problem with the weighted coefficient method, but fixed weighting coefficients cannot adapt to changes in the battlefield environment or accommodate different experts' preferences over the optimization goals. To solve these problems, an optimized route planning method based on Type-2 Fuzzy Sets (T2FSs) reasoning is proposed. A hierarchical expression model of the aircraft's complex constraints is established, the improved Per-C method is adopted, and the experts' preferences for the optimization targets and the route constraint values are used to obtain the fuzzy cost of a route. Then, the multi-objective route planning method based on T2FSs is established by applying fuzzy inference to the A* search cost calculation. Experimental results show that the method effectively represents expert preferences over the optimization goals, with strong flexibility and versatility.

  • JIANG An,LI Xiangyang
    Computer Engineering. 2018, 44(3): 294-300. https://doi.org/10.3969/j.issn.1000-3428.2018.03.049
    Aiming at the excessive collision slots and low recognition efficiency of existing tree-based anti-collision algorithms, an anti-collision algorithm based on Information Bit Grouping (IBG) is proposed. Tags are grouped according to the number of '1' bits in their information bits. When a collision occurs, if multiple tags meet the prediction recognition condition, those tags can be fully identified; otherwise, invalid collision slots are skipped and new query prefixes are generated by the collision recovery mechanism. Simulation results show that, compared with the Query Tree (QT), Adjustive Hybrid Tree (AHT), and Enhanced Multi-Bit Identification (EnMBI) algorithms, the proposed algorithm reduces the reader's search count and the communication complexity of the system and effectively improves recognition efficiency, with the advantage growing as the number of tags increases.
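    The grouping step is easy to make concrete: partition tag IDs by the popcount of their information bits so each query round only contends with tags of one weight class. A toy sketch (the tag IDs are invented; the subsequent collision recovery logic is omitted):

```python
# Group tag IDs by the number of '1' bits in their information bits.
from collections import defaultdict

def group_by_ones(tag_ids, bits=8):
    groups = defaultdict(list)
    for tag in tag_ids:
        groups[bin(tag & ((1 << bits) - 1)).count("1")].append(tag)
    return groups

tags = [0b10110010, 0b00000111, 0b11110000, 0b10101010, 0b00010001]
for weight, members in sorted(group_by_ones(tags).items()):
    print(weight, [f"{t:08b}" for t in members])
```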
  • GAO Yi,LUO Jianxin,QIU Hangping,WU Bo
    Computer Engineering. 2018, 44(3): 301-306,314. https://doi.org/10.3969/j.issn.1000-3428.2018.03.050
    Boolean operations on arbitrary polygons are mostly based on CPU rasterization, and because of the CPU's serial execution, the rasterization process takes a long time. To solve this problem, an algorithm for polygon Boolean operations based on Graphics Processing Unit (GPU) rasterization is proposed. The time-consuming 2D rasterization is moved from the CPU to the GPU, the inner and outer contour fragments are extracted, and a raster data structure in the GPU environment together with a space-mapped vertex data structure in the CPU environment is constructed. On this basis, the inner and outer contours are visited alternately so that vertex tracing and contour fragment compression are performed by the CPU and GPU in a coordinated manner, and the correct Boolean result polygon is finally obtained. Experimental results show that, compared with existing polygon Boolean operation algorithms, the proposed algorithm effectively controls precision and executes more efficiently.
  • GUO Lei,WANG Xiaodong,WANG Jian,XU Bowen
    Computer Engineering. 2018, 44(3): 307-314. https://doi.org/10.3969/j.issn.1000-3428.2018.03.051
    To reduce the computational complexity of intra prediction in High Efficiency Video Coding (HEVC), a fast intra prediction algorithm based on the dominant direction strength of texture is proposed. According to the Coding Unit (CU) distribution features of each depth level and the dominant direction strength of each CU's texture, the algorithm decides whether a CU needs to be split. For CUs at depths 0 and 1, the texture directions of 4x4 blocks are counted to determine the dominant texture direction strength of the current CU and judge its texture complexity; for CUs at depths 2 and 3, pixel variance is combined and the dominant direction strength of the CU is calculated in pixel units. Appropriate thresholds are obtained by training on test sequences, so coding unit partitioning can be terminated early and adaptively, reducing the complexity of intra prediction coding. Experimental results show that, compared with the HM15.0 reference platform, the algorithm saves 51.1% of encoding time on average while keeping the same signal-to-noise ratio and bit rate.
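    One plausible reading of "dominant direction strength" (an illustration, not the paper's definition): histogram the gradient orientations of a block, weighted by gradient magnitude, and measure how concentrated the histogram is; a concentrated histogram suggests simple texture and an early termination of CU splitting:

```python
# Magnitude-weighted orientation histogram as a texture-direction measure.
import numpy as np

def dominant_direction_strength(block, nbins=8):
    gy, gx = np.gradient(block.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.arctan2(gy, gx) % np.pi            # orientation, direction-agnostic
    hist, _ = np.histogram(ang, bins=nbins, range=(0, np.pi), weights=mag)
    return hist.max() / (hist.sum() + 1e-9)     # 1.0 = a single dominant direction

ramp = np.tile(np.arange(16), (16, 1)).astype(float)   # pure horizontal gradient
print(dominant_direction_strength(ramp))   # close to 1 -> simple texture, skip split
```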
  • ZHANG Dong,PENG Jianyun,YU Chunyan
    Computer Engineering. 2018, 44(3): 315-321. https://doi.org/10.3969/j.issn.1000-3428.2018.03.052
    Traditional pitch modification methods fail to consider the tone and the edge tone, which degrades sound quality and causes distortion. To solve this problem, this paper proposes an improved modification method based on linear prediction. According to the perceptual characteristics of the edge tone in musical sounds, it uses harmonic-percussive separation to split an input audio signal into a harmonic component and a percussive component. Based on the linear prediction model, the harmonic component is separated into a resonator filter and a glottal excitation signal; the glottal excitation signal is resampled for pitch-scale modification, and frame-by-frame overlap-add synthesis improves the continuity of the spliced segments. Finally, the musical signal is reconstructed by superimposing the harmonic and percussive components in the frequency domain. Experimental results show that, compared with the traditional linear prediction modification method, the proposed method not only modifies the pitch of musical sounds while keeping the tone relatively stable and undistorted, but also greatly improves sound quality.