15 August 2014, Volume 40 Issue 8

  • XIANG Yi-hong,ZHU Yan-min
    Computer Engineering. 2014, 40(8): 1-5. https://doi.org/10.3969/j.issn.1000-3428.2014.08.001
    Measuring the performance of nodes in a Wireless Sensor Network (WSN) that suffer from interference is valuable for protocols such as congestion control and link scheduling. Recent studies suggest that the physical model (the PRR-SINR model) is significantly more accurate than existing interference models. This paper proposes a centralized algorithm and a distributed algorithm to build the PRR-SINR model for every node in a WSN. In the centralized algorithm, one node sends commands telling the other nodes when to receive or broadcast measurement packets, and each node builds its PRR-SINR model according to these commands. In the distributed algorithm, each node builds the model entirely by itself. Both algorithms are evaluated on a network of 17 TelosB nodes. Experimental results show that the models built by both algorithms achieve high accuracy at a significantly low overhead.
  • ZHANG Shi-yue,WU Jian-de,WANG Xiao-dong,FAN Yu-gang,LENG Ting-ting
    Computer Engineering. 2014, 40(8): 6-9. https://doi.org/10.3969/j.issn.1000-3428.2014.08.002
    To address problems of typical clustering routing protocols, such as non-uniform cluster sizes and unbalanced energy consumption, an Energy-balanced Clustering Routing Algorithm Based on Energy and Distance (ECRED) is proposed. The cluster head selection threshold formula is improved with energy and distance factors to prolong the working life of the selected cluster heads, and an alternate cluster head is chosen to reduce the energy cost of re-elections. A waiting time before broadcasting election information is added, and nodes select their cluster head based on communication cost. Finally, the algorithm establishes optimal routing paths among clusters and transmits information using a combination of single-hop and multi-hop forwarding. Simulation results show that, compared with the EECS protocol, ECRED saves about 8% energy, balances node energy consumption effectively, and prolongs the network lifetime. A hypothetical sketch of the kind of threshold involved follows below.
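    The abstract does not reproduce the improved threshold formula. As a rough, hypothetical illustration of the idea (a LEACH-style election threshold scaled by residual energy and distance to the base station; the 0.5/0.5 weighting and the exact factors are assumptions, not ECRED's published formula):

```python
# Hypothetical sketch of an energy- and distance-weighted cluster-head
# election threshold; ECRED's actual formula is defined in the paper.
import random

def election_threshold(p, r, e_res, e_init, d_bs, d_max):
    """p: desired cluster-head fraction; r: current round;
    e_res/e_init: residual vs. initial energy;
    d_bs/d_max: distance to base station vs. network maximum."""
    base = p / (1 - p * (r % round(1 / p)))  # classic LEACH threshold
    energy_factor = e_res / e_init           # favour energy-rich nodes
    distance_factor = 1 - d_bs / d_max       # favour nodes nearer the sink
    return base * (0.5 * energy_factor + 0.5 * distance_factor)

if random.random() < election_threshold(p=0.1, r=3, e_res=0.8, e_init=1.0,
                                        d_bs=40.0, d_max=100.0):
    print("node elects itself cluster head this round")
```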
  • CHEN Shu,HAN Jin,JIANG Wei
    Computer Engineering. 2014, 40(8): 10-14. https://doi.org/10.3969/j.issn.1000-3428.2014.08.003
    Existing clustering routing algorithms cannot effectively solve premature node death and unbalanced cluster-head energy consumption in low-redundancy Wireless Sensor Networks (WSNs). This paper proposes a new uneven clustering routing algorithm based on the Particle Swarm Optimization (PSO) algorithm and a Shortest Routing Tree (SRT). The algorithm uses PSO to optimize the uneven clustering process, and then builds the SRT to search for optimal multi-hop transmission paths, realizing efficient data transmission from sensor nodes to the base station. Simulation results demonstrate that the proposed algorithm outperforms the EEUC and EECS algorithms: it effectively mitigates unbalanced energy consumption and greatly prolongs the network lifetime.
  • QIU Feng-mei,LI Huai-zhong
    Computer Engineering. 2014, 40(8): 15-20,26. https://doi.org/10.3969/j.issn.1000-3428.2014.08.004
    In the traditional DV-Hop localization algorithm, an unknown node receives only the average hop distance broadcast by its nearest anchor node, so the large estimation error of the average hop distance accumulates into localization error. To address this problem, this paper proposes an improved DV-Hop algorithm. The improved average hop distance is a weighted average of the hop-distance estimates broadcast by multiple anchor nodes (see the sketch below). The localized coordinates of the unknown node are then refined to further improve accuracy, and located unknown nodes are upgraded to anchor nodes to assist in localizing the remaining unknown nodes. Matlab simulations show that the localization accuracy of the proposed algorithm is 10.26%~15.38% higher than the traditional DV-Hop algorithm and 2.0%~3.78% higher than the improved algorithm of Feng Jiang et al. (Computer Engineering, 2012, No. 19). Its coverage rate is 8.6%~12.7% higher than the traditional DV-Hop algorithm and about 1.3% higher than the improved algorithm of Zhang Jing et al. (Journal of Computer Applications, 2011, No. 7).
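    To make the hop-distance fusion concrete, here is a minimal sketch under one plausible weighting (inverse hop count; the paper's actual weights may differ):

```python
# Fuse average hop-distance estimates from several anchors, weighting
# each anchor inversely by its hop count to the unknown node.
def weighted_hop_distance(anchor_estimates):
    """anchor_estimates: list of (avg_hop_dist, hops_to_unknown_node)."""
    weights = [1.0 / hops for _, hops in anchor_estimates]
    fused = sum(w * d for (d, _), w in zip(anchor_estimates, weights))
    return fused / sum(weights)

# Three anchors report estimates; nearer anchors count more.
print(weighted_hop_distance([(12.4, 2), (11.1, 3), (14.0, 5)]))
```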
  • GAO Lei,CAO Jian-zhong,HUANG Jin-qiu
    Computer Engineering. 2014, 40(8): 21-26. https://doi.org/10.3969/j.issn.1000-3428.2014.08.005
    To determine the suitability of ZigBee for data-intensive Body Sensor Networks (BSNs), the Delivery Ratio (DR) and end-to-end delay are evaluated for BSNs with star and tree topologies under contention. The effect of device clock drift and hidden nodes on the reliability of the star network is modeled and validated through experimental tests. The reliability of a ZigBee star network without hidden nodes is very good (DR close to 100%). Performance in a tree topology declines, due to router overload and activation of the route maintenance protocol triggered by periods of high traffic load; the worst-case DR drops to 13% in non-acknowledged mode. Therefore, to apply the ZigBee protocol to BSNs, a mechanism that distributes the nodes' traffic over time is required to avoid performance degradation.
  • ZENG Ping,ZHANG Li,YANG Ya-tao,CHU Xu,LIU Yu-xin
    Computer Engineering. 2014, 40(8): 27-32,37. https://doi.org/10.3969/j.issn.1000-3428.2014.08.006
    To address security problems in the Internet of Things (IoT) such as key management and privacy protection, a key management scheme based on Homomorphic Encryption and the Chinese Remainder Theorem (HECRT) is proposed, in which a layered network model based on node location is used to deploy the network infrastructure. A double key pool is introduced for key management and distribution, saving network overhead and node resource consumption. Homomorphic encryption of node privacy information protects users' privacy and keeps data processing secure. Simulation results show that the scheme achieves good connectivity and security.
  • LIU Xi,LIU Kai-hua,MA Yong-tao,YU Jie-xiao
    Computer Engineering. 2014, 40(8): 33-37. https://doi.org/10.3969/j.issn.1000-3428.2014.08.007
    With the development of Internet of Things (IoT) technologies, passive UHF RFID localization is widely applied in many fields. However, in complicated indoor multipath channel environments, localization accuracy drops considerably due to multipath interference. To solve this problem, this paper analyzes passive UHF RFID localization errors in multipath environments and proposes a localization algorithm based on Multidimensional Scaling (MDS). The algorithm extracts the Phase Difference of Arrival (PDOA) of the tag to be located and of the reference tags, constructs a distance matrix from the PDOA information, and obtains the tag's location by MDS. Simulation results indicate that, in a multipath environment with a relatively strong line-of-sight path, the algorithm effectively reduces multipath-induced localization error using only a few reference tags.
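    The MDS step itself is standard; a minimal classical-MDS sketch that recovers 2-D coordinates (up to rotation and translation) from a pairwise distance matrix, such as one built from PDOA-derived distances, looks like this (generic MDS only; no phase unwrapping or multipath handling):

```python
# Classical MDS: distance matrix -> coordinates via the double-centered
# Gram matrix and its top eigenpairs.
import numpy as np

def classical_mds(D, dim=2):
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                # Gram matrix
    eigvals, eigvecs = np.linalg.eigh(B)
    idx = np.argsort(eigvals)[::-1][:dim]      # largest eigenvalues first
    scale = np.sqrt(np.maximum(eigvals[idx], 0))
    return eigvecs[:, idx] * scale             # n x dim coordinates

pts = np.array([[0, 0], [1, 0], [0, 1], [1, 1]], dtype=float)
D = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
print(classical_mds(D))                        # congruent to pts
```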
  • GAO Quan-li,GAO Ling,YANG Jian-feng,WANG Hai,REN Jie,ZHANG Yang
    Computer Engineering. 2014, 40(8): 38-42. https://doi.org/10.3969/j.issn.1000-3428.2014.08.008
    Because of dataset sparsity, traditional collaborative filtering algorithms may produce very small differences among the similarities computed for users or items, which is a major reason they fail to find truly similar users and items. To solve this problem, this paper modifies the similarity computation by introducing an impact factor based on the number of common ratings between users or between items. It also presents user-based and item-based prediction algorithms for this new similarity, taking the common evaluations between users and items as a weighting factor; a weighted prediction algorithm based on the common evaluations computes the final ratings used for Top-N recommendation. Experiments on a real dataset show that the proposed algorithm achieves a mean absolute error below 0.78 across different neighborhood sizes and better recommendation quality.
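    A hedged sketch of the impact-factor idea: damp a raw similarity by a factor that grows with the number of co-rated items, so users (or items) sharing only a couple of ratings cannot look spuriously similar. The min()-based form and threshold below are assumptions, not necessarily the paper's exact factor:

```python
# Impact factor: similarities backed by few common ratings are damped.
def adjusted_similarity(raw_sim, n_common, threshold=5):
    impact = min(n_common, threshold) / threshold
    return impact * raw_sim

print(adjusted_similarity(0.95, 2))   # 0.38: few co-ratings, damped
print(adjusted_similarity(0.80, 12))  # 0.80: enough co-ratings, kept
```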
  • LI Ting,XU Yun,NIE Peng-yu,PAN Wei-hua
    Computer Engineering. 2014, 40(8): 43-47. https://doi.org/10.3969/j.issn.1000-3428.2014.08.009
    The complexity of parallel programming and the portability problems caused by the diversity of parallel computing platforms restrict the popularization of parallel computing. To address these problems, this paper designs and implements a cross-platform hierarchical parallel programming framework, Open Cross-platform Hierarchical (OpenCH). With a two-level parallel library design and a hierarchical API, the framework hides parallelization details from upper-layer applications, and by providing common interfaces for library functions on different platforms, it makes platform changes invisible to those applications. A block-filling programming method, supported by a task-scheduling system, is designed for developing the lower-level parallel libraries. OpenCH is tested on a remote-sensing image classification problem. Experimental results show that parallel programs based on OpenCH run on multiple parallel computing platforms and achieve near-ideal parallel speedup, while the framework's time overhead is no more than 15%.
  • ZHAO Fa-xin
    Computer Engineering. 2014, 40(8): 48-51,57. https://doi.org/10.3969/j.issn.1000-3428.2014.08.010
    Because of fuzzy information, a fuzzy database has a canonical interpretation as a set of regular databases. Building on existing research, a vague join operation based on the vague relational data model is discussed, and a formula for the vague foreign-key join under constraint conditions is given. The key to its efficiency is that the formula does not require explicit computation over all possible states but works directly on the vague relational database, and the query results satisfy rep(q(T)) = q(rep(T)). Compared with other query methods based on possible states, the method's query results are valid and its execution efficiency is high.
  • LI Song,ZHANG Li-ping,LIU Yan,HAO Xiao-hong,YANG He-yu
    Computer Engineering. 2014, 40(8): 52-57. https://doi.org/10.3969/j.issn.1000-3428.2014.08.011
    The Simple Continuous Near Neighbor Chain (SCNNC) query is important in spatial data querying, spatial data mining, Web search, and other areas. Because existing query methods cannot handle SCNNC queries over dynamic datasets with obstacles, the influence of point insertion and deletion on the SCNNC is analyzed. The OB_DYNSCNNC_ADD and OB_DYNSCNNC_DET algorithms are given for dynamically growing and dynamically shrinking datasets respectively, based on a judging circle and filtering methods. The performance of the methods is analyzed and compared experimentally. Theoretical study and experimental results show that the algorithms have clear advantages for SCNNC queries over dynamic datasets with obstacles.
  • YANG Ya-jun,ZHANG Kun-long,YANG Xiao-ke
    Computer Engineering. 2014, 40(8): 58-63,69. https://doi.org/10.3969/j.issn.1000-3428.2014.08.012
    DBSCAN cannot find clusters of varied densities and is sensitive to its parameters. This paper proposes a self-adaptive spatial clustering method based on varied density. The algorithm uses the rate of density change to find boundaries between clusters of different densities and self-adjusts its parameter values. Specifically, a point's density is defined as the distance from the point to its k Nearest Neighbor (kNN). If the density change rate between a point and one of its nearest neighbors is less than a user-given threshold, that neighbor is called a similar neighbor, and a core point is redefined as a point with at least k similar neighbors among its nearest neighbors (a sketch of this test follows below). Based on these modifications, the method performs a DBSCAN-style breadth-first search and marks connected core points, together with their nearest neighbors, as the same cluster. The algorithm also automatically adjusts its parameter values at runtime according to the average densities and density change rates of the marked core points. Experimental results show that the improved method finds clusters of arbitrary shape, size, and density and eliminates outliers; thanks to the self-adaptation, its parameters are easier to set than those of comparable algorithms.
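    A small sketch of the "similar neighbor" test as described (assumed form: density is the distance to the k-th nearest neighbor, and the change rate is the relative difference; the threshold name tau is illustrative):

```python
# kNN-distance density and the relative density-change-rate test.
import numpy as np

def knn_density(X, k):
    """Distance from each point to its k-th nearest neighbour."""
    D = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
    return np.sort(D, axis=1)[:, k]   # column 0 is the point itself

def is_similar_neighbor(dens_i, dens_j, tau=0.3):
    change_rate = abs(dens_i - dens_j) / max(dens_i, dens_j)
    return change_rate <= tau

X = np.random.rand(50, 2)
dens = knn_density(X, k=4)
print(is_similar_neighbor(dens[0], dens[1]))
```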
  • XU Gu-cheng,CUI Bin-ge
    Computer Engineering. 2014, 40(8): 64-69. https://doi.org/10.3969/j.issn.1000-3428.2014.08.013
    When querying remote sensing images in a Web environment, page responses often take too long because of massive data volumes and numerous data sources, and the response time of traditional paging query methods grows rapidly as query volumes increase. An optimization strategy based on a new page transformation algorithm is proposed for paging queries over multi-source, massive remote sensing data. The total number of records satisfying the query criteria is obtained by invoking Web services; the page requested by the user is divided into several subpages by the page transformation algorithm, and the subpage for each data source is queried dynamically and merged by an intelligent agent. Experiments over different query volumes show that, as data volume increases, the response time of traditional query methods grows linearly, while the response time of the optimized paging query method remains essentially unchanged.
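    One way to picture the page transformation (an assumed illustration: sources are merged in a fixed order, so a global page maps to a record range that is split into per-source offset/limit requests):

```python
# Map a global (page_no, page_size) request onto per-source subpages.
def split_page(page_no, page_size, source_counts):
    """source_counts: record counts per data source, in merge order.
    Returns {source_index: (local_offset, limit)} covering the page."""
    start = (page_no - 1) * page_size
    end = start + page_size
    plan, base = {}, 0
    for i, n in enumerate(source_counts):
        lo, hi = max(start, base), min(end, base + n)
        if lo < hi:
            plan[i] = (lo - base, hi - lo)
        base += n
    return plan

# Page 3 of size 10 over sources holding 12, 25 and 40 records:
print(split_page(3, 10, [12, 25, 40]))   # {1: (8, 10)}
```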
  • ZHANG Bin,LE Jia-jin
    Computer Engineering. 2014, 40(8): 70-75,85. https://doi.org/10.3969/j.issn.1000-3428.2014.08.014
    Big data is characterized by large scale, depth, velocity, commodity hardware, and open source software. Aiming at the inefficiency and poor scalability of traditional relational databases in big data analysis, this paper presents a parallel join algorithm in a MapReduce environment based on column stores. A distributed computing model oriented toward large data is proposed. The design includes a MapReduce column-store file format, optimized through a cooperative localization strategy, together with partition aggregation and a heuristic optimization strategy that realize the parallel join. Experimental results show that the algorithm achieves high performance and scalability in execution time and load capacity.
  • LI Jian-bo,YOU Lei,JIANG Shan,DAI Chen-qu,XU Ji-xing
    Computer Engineering. 2014, 40(8): 76-85. https://doi.org/10.3969/j.issn.1000-3428.2014.08.015
    To handle network partitions and intermittent link connectivity in Delay Tolerant Networks (DTNs), this paper proposes a Location-based Controlled Epidemic (LC-Epidemic) routing protocol. LC-Epidemic assumes only that each node knows its own position, without relying on prior global topology knowledge, which makes it more practical and reliable than some existing routing schemes. The performance of different routing strategies is evaluated by simulation. The results indicate that LC-Epidemic achieves a delivery ratio close to that of the Epidemic protocol while introducing only 50% of its overhead when nodes move comparatively slowly and messages have a short Time to Live (TTL). When the TTL of each message is short, LC-Epidemic also outperforms the Binary Spray & Wait and First Contact protocols in average latency, as long as node buffer resources are not the bottleneck limiting routing performance.
  • LU Zhi-gang,JIANG Zheng-wei,LIU Bao-xu
    Computer Engineering. 2014, 40(8): 86-90,95. https://doi.org/10.3969/j.issn.1000-3428.2014.08.016
    In the cloud environment, the physical boundaries of security domains blur and multiple tenants across the Internet share the same hardware resource pool, so traditional network access control methods no longer satisfy the needs of virtual networks. This paper studies a cloud-oriented virtual network access control method that uses virtual network traffic isolation, IP multicast management, and virtual mapping based on the Virtual eXtensible Local Area Network (VXLAN) protocol. It provides tunnel access to achieve cross-data-center virtual machine communication and isolates virtual traffic within each security domain. Experimental results show that combining VXLAN Tunnel End Points (VTEPs) with a virtual gateway effectively controls and isolates access to the virtual network, with strong protocol-parsing effectiveness and operating efficiency.
  • WANG Guang-yu,LIU Chun-feng,ZHAO Zeng-hua,SHU Yan-tai
    Computer Engineering. 2014, 40(8): 91-95. https://doi.org/10.3969/j.issn.1000-3428.2014.08.017
    Due to the high mobility and non-uniform distribution of vehicles in Vehicular Ad Hoc Networks (VANETs), the network topology changes quickly and routing paths break frequently, which seriously degrades the performance of traditional routing protocols. This paper proposes a Kalman-prediction-based hybrid routing algorithm suitable for city scenarios. The algorithm uses a Kalman predictor to predict vehicles' real-time locations for routing computation. Besides greedy and perimeter modes like those of Greedy Perimeter Stateless Routing (GPSR), the algorithm makes full use of the store-carry-forward mechanism of Delay Tolerant Network (DTN) routing: packets with no appropriate forwarding node are stored and carried by vehicles until the network is well connected, then sent to an appropriate forwarding neighbor, which benefits delivery performance. Simulation results show that the algorithm achieves a better packet delivery ratio and lower delay than GPSR and GPSR with buffering.
  • LIU Xiao-hua,PENG Yong
    Computer Engineering. 2014, 40(8): 96-100,105. https://doi.org/10.3969/j.issn.1000-3428.2014.08.018
    In ZigBee networks, the AODVjr routing algorithm suffers from high node mortality and large energy consumption. Combining the storage structure, energy, and link quality characteristics of nodes, an improved algorithm, F-AODVjr, is proposed. The improved algorithm uses the neighbor list maintained by a routing node to find the destination node before starting route discovery, reducing the energy consumed by Route Request (RREQ) flooding to locate the destination. In the route discovery phase, the routing metric combines hop count, residual energy, and link quality to find the optimal path with minimum routing cost, improving on the shortest-path idea of AODVjr. Simulation results show that F-AODVjr effectively reduces network energy consumption and raises the node survival rate and the packet delivery rate.
  • LIU Xue-yan,LI Zhan-ming
    Computer Engineering. 2014, 40(8): 101-105. https://doi.org/10.3969/j.issn.1000-3428.2014.08.019
    The Domingo-Ferrer (DF) data aggregation scheme uses a single key for encryption and decryption, so it cannot resist known-plaintext attacks, capture attacks, and similar threats. To solve these problems, this paper proposes a data aggregation scheme based on a Privacy Homomorphism (PH) mechanism. The scheme adopts a one-time-pad double encryption mechanism in which multi-source nodes use different keys for encryption and decryption, so it effectively resists plaintext/ciphertext attacks, node-compromise attacks, and man-in-the-middle attacks. It does not decrypt during aggregation, avoiding extra decryption overhead while ensuring data confidentiality and user privacy. Analysis results show that the scheme offers strong forward security and lower storage cost than the SDAP and SEDA schemes.
  • WANG Hui-lin,YAN Xiang-tao
    Computer Engineering. 2014, 40(8): 106-111,115. https://doi.org/10.3969/j.issn.1000-3428.2014.08.020
    To support ciphertext search in untrusted cloud environments, this paper proposes a ciphertext-searchable public key encryption scheme without bilinear pairing computation, based on the Searchable Public Key Encryption with a Designated Tester (dPEKS) scheme. Drawing on the ideas of the RSA and ElGamal algorithms, a special cyclic group is constructed from a ring of modular residue classes, and the encryption, decryption, and keyword-search algorithms are built on this group. The scheme satisfies indistinguishability against adaptive chosen plaintext attack and resists off-line keyword-guessing attacks, provided the discrete logarithm assumption and the decisional Diffie-Hellman assumption hold in the chosen cyclic group. Analysis results show that it encrypts and decrypts data effectively and searches ciphertexts by keyword correctly; moreover, it is markedly more efficient than schemes with the same security.
  • BAO Si-gang,GU Hai-hua
    Computer Engineering. 2014, 40(8): 112-115. https://doi.org/10.3969/j.issn.1000-3428.2014.08.021
    BLS short signature is a digital signature scheme based on bilinear pairings; compared with the traditional ECDSA signature, its advantage is its short signature length. Recently, much research has addressed fault attacks on elliptic curve cryptography, but fault attacks on pairing-based cryptography are rarely studied. This paper studies the security of the BLS short signature scheme in a fault attack scenario. The main idea is to construct an invalid-curve attack for GF(3^l) by adapting the existing invalid-curve attack for GF(2^m), and to apply it to the BLS short signature scheme. Simulation results show that the BLS private key can be recovered with high probability by this method, and only one single-bit fault injection is required.
  • HU Jun-feng,CAO Jun
    Computer Engineering. 2014, 40(8): 116-122. https://doi.org/10.3969/j.issn.1000-3428.2014.08.022
    Aiming at Wireless Sensor Network (WSN) node localization in the presence of malicious nodes, a robust secure localization algorithm based on the Beta Reputation System (BRS) is proposed. A trust evaluation framework is established on the basis of BRS; sensor nodes compare the final trust values of anchor nodes within multi-hop communication range against a stored threshold, which reduces the impact of malicious attackers in the WSN. A weighted Taylor-series least squares method is employed to estimate the coordinates of sensor nodes, identifying malicious anchor nodes and improving localization accuracy. Simulation results show that, compared with the RMLA2, RMLA1, Bilateration, and t-TLS localization algorithms, the algorithm increases localization accuracy by 10%, 15%, 55%, and 110% respectively without malicious-node collusion, and by 15%, 20%, 65%, and 150% with collusion.
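    The Beta Reputation core is standard: an anchor's trust is the mean of a Beta(a+1, b+1) distribution over a positive and b negative interaction outcomes, and anchors below a threshold are filtered out before localization. A minimal sketch (the threshold value is illustrative):

```python
# Beta-reputation trust value and threshold filtering of anchors.
def beta_trust(positive, negative):
    return (positive + 1) / (positive + negative + 2)

def trusted_anchors(observations, threshold=0.6):
    """observations: {anchor_id: (positive, negative)}."""
    return [a for a, (p, n) in observations.items()
            if beta_trust(p, n) >= threshold]

obs = {"A1": (9, 1), "A2": (2, 8), "A3": (5, 5)}
print(trusted_anchors(obs))   # ['A1']
```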
  • ZENG Feng-lin,WEN Luo-sheng
    Computer Engineering. 2014, 40(8): 123-127. https://doi.org/10.3969/j.issn.1000-3428.2014.08.023
    To study epidemic spreading on computer networks, a bipartite scale-free network is proposed in which nodes are divided into servers and clients. Using the Susceptible-Infected-Susceptible (SIS) model on this network, a rate-equation approach is applied to study the state transitions and critical behavior of epidemics on bipartite scale-free networks, and the relation between the server-to-client and client-to-server infection rates is analyzed. Immunization strategies are then applied to the SIS model, on the same topology and spreading model, to analyze their effect. Numerical simulations support the analytical results: stochastic immunization, vertex-degree immunization, and acquaintance immunization can all defend against epidemic spreading, with effectiveness increasing in that order.
  • LI Ji-liang,LI Shun-dong,WU Chun-ying
    Computer Engineering. 2014, 40(8): 128-132. https://doi.org/10.3969/j.issn.1000-3428.2014.08.024
    Existing (n,t,n) secret sharing schemes guarantee strong t-consistency of the master shares, but they can neither verify the authenticity of subshares nor prevent fraud during secret reconstruction. This paper designs a publicly verifiable strong (n,t,n) secret sharing scheme without a trusted center, using the discrete logarithm assumption, public-key encryption algorithms, and key agreement, which removes the cost of establishing private channels. A participant needs only public information to finish verification, without interactive group communication, which resists the deceptions possible in other schemes. Performance analysis shows that, compared with existing (n,t,n) secret sharing schemes, the proposed scheme has lower computational overhead and communication cost while providing strong t-consistency and public verifiability.
  • WANG Qun,DAI Xiu-yue,YANG Li
    Computer Engineering. 2014, 40(8): 133-137. https://doi.org/10.3969/j.issn.1000-3428.2014.08.025
    Dynamic behavior description and information aggregation are key functions of trust models, but most current models cannot describe the dynamic variation of node behavior and ignore the fuzziness of trust as well as its behavior-context dependence. This paper proposes DIFTrust, a dynamic trust model based on intuitionistic fuzzy theory. In DIFTrust, the context-correlated, fuzzy, and dynamic properties of trust are characterized by an interest-domain partition mechanism, intuitionistic fuzzy theory, and time windows with adaptive weight vectors, providing a way to describe node trust relationships with dynamic and intuitionistic fuzzy features. Experimental results show that DIFTrust describes trust features more effectively and defends better against the strategic behavior changes of malicious peers than current models such as PeerTrust and DyTrust.
  • XU Feng,ZHANG Gui-zhu,ZHAO Fang,WU De-long
    Computer Engineering. 2014, 40(8): 138-142. https://doi.org/10.3969/j.issn.1000-3428.2014.08.026
    To address the premature convergence and low accuracy of the Shuffled Frog Leaping Algorithm (SFLA) on hard high-dimensional optimization problems, an adaptive alternating optimization algorithm combining differential evolution and shuffled frog leaping, ADE-SFLA, is presented. To improve the quality of the initial solutions, the algorithm uses Particle Swarm Optimization (PSO) to generate a group of initial solutions that satisfy the constraints. It then draws on the strong global search ability and population diversity of Differential Evolution (DE), designing an adaptive selection mechanism that dynamically alternates between SFLA and DE so that the two complement each other. Simulations on six classic functions show that the algorithm enriches particle diversity and improves both early- and late-stage optimization ability; its convergence rate, solution accuracy, and stability are better than SFLA, DE, and the differential SFLA used for comparison.
  • WU Jian-min,WANG Min-gang
    Computer Engineering. 2014, 40(8): 143-146. https://doi.org/10.3969/j.issn.1000-3428.2014.08.027
    Environment modeling and path planning are two key problems in wheeled robot vision navigation. To address the real-time shortcomings of environment modeling, this paper proposes a method based on edge extension. To overcome oscillation and local trapping in mobile robot path planning, it presents a contrarian potential field method with a new potential function and a generalized safety channel for the wheeled robot. To handle the nonholonomic constraints of the Wheeled Mobile Robot (WMR), it presents a curvature mapping method in which the WMR path is mapped into curvature space to create a smooth path. Finally, the path generation algorithm is given, and experiments validate the algorithm.
  • WANG Fu,ZHENG Ya-ping,LIU Tian-qi
    Computer Engineering. 2014, 40(8): 147-151. https://doi.org/10.3969/j.issn.1000-3428.2014.08.028
    To remedy the slow convergence and limited accuracy of Niche Chaotic Mutation Particle Swarm Optimization (NCPSO), a new algorithm using an adjusting factor is proposed. The factor evaluates the convergence speed of the swarm and changes the velocity of particles trapped in local optima, enhancing particle diversity so that both the accuracy and the convergence speed are improved. Experimental results show that, compared with PSO with inertia weight (PSO-ω) and NCPSO, the new algorithm, NCPSO-FLV, is faster and more accurate. NCPSO-FLV is also applied to a simulated engineering production-task problem, where it achieves high utilization in obtaining an accurate allocation of production tasks.
  • LU Xing-jia,GUO Lin,CHEN Zhi-rong,LIN Yong
    Computer Engineering. 2014, 40(8): 152-157. https://doi.org/10.3969/j.issn.1000-3428.2014.08.029
    Illumination change, occlusion, and weather conditions complicate multi-vehicle detection and tracking. This paper proposes multi-vehicle detection and tracking algorithms based on the Markov Chain Monte Carlo (MCMC) method and Multiple Hypothesis Data Association (MHDA). The algorithms combine Histogram of Oriented Gradients (HOG) feature template matching with MCMC motion-state estimation and build global data association. They improve the detection matching threshold and decrease motion estimation error to satisfy both accuracy and real-time requirements. Experimental results show high accuracy and precision in detecting and tracking four kinds of vehicles; under normal lighting conditions, detection and tracking accuracy are 90% and 85% respectively.
  • ZHAI Dong-hai,CUI Jing-jing,NIE Hong-yu,YU Lei,DU Jia
    Computer Engineering. 2014, 40(8): 158-162,167. https://doi.org/10.3969/j.issn.1000-3428.2014.08.030
    Sensitive topics often carry tendentious attitudes and prior knowledge, and how to use this prior knowledge effectively to determine the sensitivity of network text is a difficult and active problem in sensitive topic detection. Taking advantage of the strong knowledge-fitting capability of Conditional Random Fields (CRFs), this paper proposes a CRF-based sensitive topic detection model. By extracting feature items and combining them with sensitive terminology, the approach represents new documents and sensitive topic categories as the observation sequence and state sequence of a CRF. Feature functions built from prior knowledge of the sensitive topic categories connect the observation and state sequences. The credibility of the observation sequence is estimated with the Viterbi algorithm, so feature items in new documents are probabilistically labeled with items from the sensitive topic categories. Experimental results demonstrate good precision, recall, and F-measure.
  • ZANG Fei,YANG Qin-mei
    Computer Engineering. 2014, 40(8): 163-167. https://doi.org/10.3969/j.issn.1000-3428.2014.08.031
    Concerning the problem of jointly learning from unlabeled and single-labeled samples, this paper introduces the concept of a sample's sparse neighborhood and proposes the Sparsity Preserving Discriminant Analysis Based on Sparse Neighborhood (SNSPDA) algorithm. The sparse neighborhood makes full use of its discriminant properties, and SNSPDA reinforces the role of samples with large reconstruction coefficients. The algorithm not only captures local geometric structure but also maintains the sparse reconstruction relationships between samples, and it avoids overfitting during single-labeled-sample learning. Extensive experiments on single-labeled image samples demonstrate that this fusion algorithm discriminates better than fusion methods that reflect only a single data attribute; for instance, under significant illumination changes, the recognition rate of SNSPDA is 2.14% and 17.43% higher than the Sparsity Preserving Discriminant Analysis (SPDA) and Semi-supervised Discriminant Analysis (SDA) algorithms respectively.
  • LIU Fei,HAO Kuang-rong,DING Yong-sheng,LIU Huan
    Computer Engineering. 2014, 40(8): 168-172,178. https://doi.org/10.3969/j.issn.1000-3428.2014.08.032
    Aiming at the complex background and overlap problems in human action recognition, depth images are used to obtain 20 skeletal joints. On this basis, the joint-angle variation over an action's time series is proposed as the human action feature model. This paper also proposes an improved Dynamic Time Warping (DTW) algorithm that avoids the pathological alignments the original DTW algorithm can produce. The similarity between joint-angle variation series of different actions is then computed with the improved DTW for template-matching recognition. The method is evaluated on a self-collected action database and the MSR Action3D database; experimental results show recognition accuracy above 90%.
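    For reference, plain DTW between two 1-D joint-angle sequences looks as follows; the paper's improved variant adds constraints against pathological alignments, which this minimal version omits:

```python
# Textbook dynamic time warping distance between two sequences.
def dtw(a, b):
    INF = float("inf")
    n, m = len(a), len(b)
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]

print(dtw([30, 45, 90, 45], [30, 50, 88, 47, 40]))
```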
  • WANG Lei,PAN Feng
    Computer Engineering. 2014, 40(8): 173-178. https://doi.org/10.3969/j.issn.1000-3428.2014.08.033
    The Gravitational Search Algorithm (GSA) has weak local search ability and easily falls into premature convergence. A new GSA based on a Diversity and Local Optimization strategy (DLOGSA) is proposed. Ideas from the local optimal solution of particle swarm optimization and the chemo-repellents of bacterial chemotaxis are introduced into GSA, improving the local optimization ability of particles and the population diversity by helping particles approach the best position and flee the worst one. The validity of the improvement is confirmed on benchmark functions. Results show that the improved GSA balances global and local search ability, preserves population diversity to the utmost, and substantially improves search capability.
  • HE Tian-zhong,ZHOU Zhong-mei,HUANG Zai-xiang
    Computer Engineering. 2014, 40(8): 179-182,189. https://doi.org/10.3969/j.issn.1000-3428.2014.08.034
    Many rule-based classifiers use a single measure to select attribute-value pairs, so many pairs share the same measure value and it is difficult to distinguish which pair is best. Moreover, rule-based classifiers usually extract rules with 100% confidence, which takes a long time, and the support of such rules is very low. To address these problems, this paper proposes a new measure, called selectivity, which combines three measures and can therefore select the best attribute-value pair. It develops a new algorithm, LRSM, which extracts rules based on selectivity: when the number of negative instances falls below a threshold, LRSM stops extending the current rule and extracts the next one. Experimental results show that LRSM achieves high accuracy and reduces the time consumed.
  • ZHANG Da-bin,JIANG Hua,XU Liu-yi,ZHANG Wen-sheng
    Computer Engineering. 2014, 40(8): 183-189. https://doi.org/10.3969/j.issn.1000-3428.2014.08.035
    Differential Evolution Based on Two-stage Mutation and Crossing Strategy (TMCDE) aims to accelerate convergence and improve the accuracy of Differential Evolution (DE). TMCDE introduces an opposition-based chaotic initialization method and a stochastic diffusion search strategy, and divides the initial population into a better and a worse sub-group. The two stages successively improve the two sub-groups with different DE strategies; at a certain point the sub-groups merge back into one population, which enters the next stage according to fitness values. This improves population quality and overcomes the shortcomings of a single differential strategy. Benchmark function experiments show that TMCDE achieves better convergence speed and optimization capability than other DE algorithms, proving its effectiveness.
  • BAI Lu-ping,MA Li-hong,LI Qing-long
    Computer Engineering. 2014, 40(8): 190-193,200. https://doi.org/10.3969/j.issn.1000-3428.2014.08.036
    The DFT correlation detection algorithm for compressed sampling with a chirp matrix suffers from poor reconstruction accuracy and a limited range of recoverable signal sparsity. This paper proposes a reconstruction algorithm based on the Discrete Chirp-Fourier Transform (DCFT). It increases the number of measurements according to the signal sparsity k, so the sampling matrix can accurately reconstruct signals with large k, and it selects the atom indexes corresponding to the k largest DCFT amplitudes of the sampled signal to locate the nonzero positions, reducing the incorrect atom detection caused by cross interference in the DFT correlation algorithm. Least squares estimation of the nonzero amplitudes further reduces reconstruction error. Sampling and reconstruction experiments on one-dimensional signals of length N = 1 681 show that the DCFT algorithm accurately reconstructs signals whose sparsity k is up to 4 times that handled by the DFT correlation detection algorithm, at a comparable computational complexity of O(kN).
  • SHAO Chao,WAN Chun-hong,ZHAO Jing-yu
    Computer Engineering. 2014, 40(8): 194-200. https://doi.org/10.3969/j.issn.1000-3428.2014.08.037
    The success of manifold learning algorithms depends greatly on selecting a suitable neighborhood size parameter, yet how to do this efficiently remains an open problem. This paper proposes an efficient method to select a suitable neighborhood size incrementally. By the local Euclidean property of the manifold, a neighborhood size is considered suitable when all neighborhoods in the neighborhood graph are linear or almost linear, that is, when their linearity measures remain small and fall into one cluster; once the neighborhood size becomes unsuitable, some neighborhoods turn nonlinear and their linearity measures no longer fall into one cluster. The method therefore runs weighted Principal Component Analysis (PCA) on each neighborhood in the neighborhood graph to obtain its reconstruction error as its linearity measure, and computes the Bayesian Information Criterion (BIC) to detect the number of clusters among all the reconstruction errors, by which the neighborhood size can be selected incrementally. Experimental results show that the method requires no extra parameter and runs efficiently.
  • ZHOU Shao-wu,CHEN Wei,TANG Dong-cheng,ZHANG Hong-qiang,WANG Xi,ZHOU You
    Computer Engineering. 2014, 40(8): 201-204,216. https://doi.org/10.3969/j.issn.1000-3428.2014.08.038
    This paper puts forward PGSA, an improved Gravitational Search Algorithm (GSA) based on affinity, to improve convergence and search precision. The improvement changes the particles' gravitational force calculation formula: an affinity between particles, represented by the difference between their mass values, is suitably transformed and added to the resultant-force formula, which is then modified accordingly. The algorithm is verified in Matlab, and experiments show that the improved algorithm converges better and finds better solutions.
  • XIE Juan-ying,WANG Yan-e
    Computer Engineering. 2014, 40(8): 205-211,223. https://doi.org/10.3969/j.issn.1000-3428.2014.08.039
    Traditional K-means clustering depends on randomly chosen seeds, and improved K-means algorithms are unstable because their parameters are selected arbitrarily. To overcome these deficiencies, a novel K-means clustering algorithm is proposed. The new algorithm uses the pattern information of exemplars in the dataset and computes a deviation for each sample, based on the principle that a sample's deviation reflects how densely exemplars gather around it: the smaller the deviation, the more densely exemplars gather around the sample. The algorithm chooses as initial cluster centers the first K samples that have minimum deviation and are far away from each other, improving clustering performance (see the sketch below). The algorithm is tested on UCI datasets and on synthetic datasets with proportional noise; the experimental results demonstrate that it achieves promising, stable clustering and is immune to noise.
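    One plausible reading of the seeding rule, sketched below: take a sample's deviation to be its mean distance to its k nearest neighbors (low deviation means a dense region), then greedily keep low-deviation samples that are at least min_sep apart as the K initial centers. Both the deviation definition and min_sep are assumptions for illustration:

```python
# Density-aware, well-separated seeding of K initial cluster centers.
import numpy as np

def seed_centers(X, K, k=5, min_sep=1.0):
    D = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
    deviation = np.sort(D, axis=1)[:, 1:k + 1].mean(axis=1)
    centers = []
    for i in np.argsort(deviation):            # densest samples first
        if all(D[i, j] >= min_sep for j in centers):
            centers.append(i)
        if len(centers) == K:
            break
    return X[centers]

X = np.vstack([np.random.randn(50, 2), np.random.randn(50, 2) + 5])
print(seed_centers(X, K=2, min_sep=3.0))       # one seed per blob
```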
  • ZHOU Xun,GUO Min,MA Miao
    Computer Engineering. 2014, 40(8): 212-216. https://doi.org/10.3969/j.issn.1000-3428.2014.08.040
    This paper proposes a limited-speed discrete bee colony optimization algorithm to solve the normalized color image segmentation problem. According to the problem model, the position in the bee colony algorithm is redefined as a discrete position, and a speed definition for individual bees is added. To counter premature convergence, the paper introduces a speed-limiting process and designs a speed-limiting function to increase population diversity. Meanwhile, an adaptive weighting adjustment strategy is introduced to update the positions of individual bees, improving the stability and convergence speed of the algorithm. Experimental results show that the algorithm is superior to similar algorithms in convergence rate and efficiency, and its effectiveness on the normalized color image segmentation problem is verified.
  • ZENG Weibo,XING Yongkang,SHI Yang
    Computer Engineering. 2014, 40(8): 217-223. https://doi.org/10.3969/j.issn.1000-3428.2014.08.041
    For structural image restoration, a fast method based on grey theory and nibble control is proposed. It adopts a single-factor cloud grey forecasting model based on mean-value generated time series to mine the tendency of grayscale sequences in different directions and to predict the grayscale value of a single unknown pixel from the known information around the area to be restored. The restoration proceeds the way silkworms eat mulberry leaves, replacing the complex priority calculation while guaranteeing the quality of edge-structure restoration. Experimental results show that the algorithm extends the topology of edges and smooth areas, meets real-time requirements, and achieves a good visual effect.
  • SHEN Shi-wen,CAO Guo,SUN Quan-sen
    Computer Engineering. 2014, 40(8): 224-228. https://doi.org/10.3969/j.issn.1000-3428.2014.08.042
    Previous learning-based super-resolution reconstruction methods produce artifacts and cost much time. To address this, a reconstruction method is proposed that searches for similar patches across multiple scales and weights them adaptively. By modifying the learned high-resolution block and taking into account the mutual influence of surrounding pixels, reconstruction blur and jagged edges are reduced. Furthermore, this paper introduces randomized PatchMatch for similar-patch search, which costs less time than tree-structure methods. Experimental results show that the proposed algorithm rapidly reconstructs a high-resolution image from a single low-resolution image without any prior information; its reconstruction quality is validated with both reference-based and blind image quality assessment.
  • LIN Ya-zhong,LI Xin,ZHANG Hui-qi,LUAN Qin-bo,HU Yong-shi
    Computer Engineering. 2014, 40(8): 229-232. https://doi.org/10.3969/j.issn.1000-3428.2014.08.043
    Segmentation-correction models based on local features can correct intensity-inhomogeneous images and achieve good segmentation, but because the model relies on local features and uses multiphase segmentation, it is sensitive to the initial position of the active contour curve and segments slowly. To address these shortcomings, a new fast segmentation algorithm is proposed by introducing the Adaptive Distance Preserving Level Set (ADPLS) algorithm, combining the segmentation-correction model with the adaptive distance preserving level set method. Experimental results show that the improved algorithm removes the influence of the initial contour, avoids edge leakage and under-segmentation, and maintains fast segmentation.
  • ZHOU Gui,LIU Feng
    Computer Engineering. 2014, 40(8): 233-236,241. https://doi.org/10.3969/j.issn.1000-3428.2014.08.044
    In the railway wireless train dispatching system, the Session Initiation Protocol (SIP) proxy server provides call routing by parsing and forwarding all incoming SIP packets in an IP telephony network, and the efficiency of this procedure directly affects the safety and reliability of rail transportation. Through an analysis of the SIP proxy server architecture, this paper proposes an M/G/1 queuing model of the SIP proxy server in a WiMAX network environment and studies key performance benchmarks such as the average response time for processing SIP calls and the mean number of SIP calls in the system. Experimental results show that the average response time stays below 20 ms while the call rate stays below 400 calls per second, which meets the practical requirement; the model can be used to predict SIP proxy server performance and provides a useful reference for deploying and optimizing the wireless train dispatching system.
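    The mean response time of an M/G/1 queue is given by the Pollaczek-Khinchine formula, which is the kind of benchmark such a model yields; the parameter values below are illustrative, not the paper's measurements:

```python
# M/G/1 mean response time: queueing delay (P-K formula) + service time.
def mg1_response_time(lam, es, es2):
    """lam: arrival rate; es: mean service time E[S]; es2: E[S^2]."""
    rho = lam * es
    assert rho < 1, "server overloaded"
    wait = lam * es2 / (2 * (1 - rho))   # mean waiting time in queue
    return wait + es

# 400 calls/s with a 1 ms deterministic service time (E[S^2] = E[S]^2):
print(mg1_response_time(400, 0.001, 0.001 ** 2) * 1000, "ms")  # ~1.33 ms
```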
  • SUN Bao-yin,ZHOU Qiang,ZHU Jun-jie,NI Sa-hua,TAO Zhi,GU Ji-hua
    Computer Engineering. 2014, 40(8): 237-241. https://doi.org/10.3969/j.issn.1000-3428.2014.08.045
    Current Cochlear Implant (CI) coding techniques achieve a high speech recognition rate in quiet environments, but auditory perception performance decreases significantly in noisy conditions. To solve this problem, this paper proposes a CI speech enhancement method based on an improved gain function. Built on a combined coding algorithm, it uses a constrained-variance noise spectrum estimation algorithm to estimate the noise power spectrum, applies it to Signal-to-Noise Ratio (SNR) estimation, and combines it with the human ear's masking threshold to adaptively adjust the gain function in each subband. Speech enhancement in the CI is achieved by combining the improved gain function with channel selection. Experimental results show that, compared with front-end spectral-subtraction denoising and the traditional gain-function speech enhancement for CIs, the proposed algorithm keeps more voice information while removing much of the background noise; its average recognition rate is higher by 53% and 22% respectively.
  • LI Jingmei,WANG Xue,HAN Qilong
    Computer Engineering. 2014, 40(8): 242-245,252. https://doi.org/10.3969/j.issn.1000-3428.2014.08.046
    Existing task scheduling algorithms for heterogeneous multi-core processors (CMPs) suffer from poor priority mechanisms and low scheduling efficiency. This paper proposes a comprehensive heuristic task scheduling algorithm. It is a list scheduling algorithm whose weighted priority list is built from the characteristics of the heterogeneous platform and of the dependent tasks. On this basis, the algorithm uses multi-duplication to reduce communication between dependent tasks located on different cores, advancing the earliest start time of tasks, and its allocation strategy uses interval insertion, which improves resource utilization. A simulation experiment compares the new algorithm with two heuristic algorithms; the results show that it effectively enhances task scheduling performance.
  • QI Chan-ying,LI Zhan-huai,ZHANG Xiao,FENG Wen-xiong,ZHANG Rui-jie
    Computer Engineering. 2014, 40(8): 246-252. https://doi.org/10.3969/j.issn.1000-3428.2014.08.047
    Traditional storage provisioning often leaves configured capacity underutilized, and space occupied by already-allocated storage resources cannot be reused even after it is released. Thin Provisioning (TP) solves this waste by allocating space on demand. Because the free-space reclamation methods of general file systems require additional system resources, this paper puts forward a smart space reclamation method for the Storage Area Network (SAN) environment based on the NTFS file system's cluster bitmap. Experimental results demonstrate that the method reclaims free space in the NTFS file system effectively, with reclamation efficiency up to 90% when the TP page size is set in KB. The method also effectively delays capacity warnings and online expansion, greatly improving storage utilization.
  • LI Chang-zhi,FU Xiao-dong,TIAN Qiang,WANG Wei,XIA Yong-ying
    Computer Engineering. 2014, 40(8): 253-258,263. https://doi.org/10.3969/j.issn.1000-3428.2014.08.048
    By allocating the reliability constraint of a Web service composition to each component service in the design phase, a service composition with high reliability and minimum cost can be obtained. To this end, this paper analyzes the structural patterns of service composition, gives the corresponding reliability of each pattern, and analyzes the reliability of the whole composition. Based on the relationship between the reliability and the cost of component services, it designs a reliability allocation optimization model that distributes the reliability constraint reasonably among component services, and solves it with a genetic algorithm. The model satisfies the composition's reliability constraint at minimum cost. Extensive experiments show that the proposed method saves more cost than other reliability allocation methods and confirm its effectiveness, practicality, and efficiency.
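    The reliability of the two most common structural patterns is standard and worth stating; branch and loop patterns would add execution probabilities the abstract does not detail:

```python
# Series/parallel reliability of composition patterns.
from functools import reduce

def seq_reliability(rs):
    """Sequence pattern: every component service must succeed."""
    return reduce(lambda acc, r: acc * r, rs, 1.0)

def par_reliability(rs):
    """Parallel (redundant) pattern: at least one branch succeeds."""
    return 1.0 - reduce(lambda acc, r: acc * (1.0 - r), rs, 1.0)

print(seq_reliability([0.99, 0.98, 0.97]))   # ~0.941
print(par_reliability([0.90, 0.90]))         # 0.99
```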
  • MA Ye-chen,LI Xing-fei
    Computer Engineering. 2014, 40(8): 259-263. https://doi.org/10.3969/j.issn.1000-3428.2014.08.049
    In a Strapdown Inertial Navigation System (SINS), floating-point matrix multiplication is complicated and time-consuming, especially when computed serially, so this calculation constrains the real-time performance of SINS. To solve the problem, a parallel method based on FPGA/SOPC (System on a Programmable Chip) is put forward. Its core is a high-performance matrix multiplication cell structured as a systolic array and optimized by loop tiling, data-space dividing, and iteration-space combining. Together with the speed advantage of Direct Memory Access (DMA) for bulk data exchange, the matrix multiplication rate is boosted further. An accelerator built on this principle is implemented. Test results show that the accelerator carries out the computation accurately and quickly, with especially prominent speed: its cycle count is reduced by more than 71.3% and 78% compared with the serial method and a counterpart method, respectively. The accelerator provides a new way to improve the real-time performance of navigation systems.
  • HU Deng-gui,YUAN Xiu-jiu,YE Ying,YANG Rong,ZHAO Xue-jun
    Computer Engineering. 2014, 40(8): 264-267,272. https://doi.org/10.3969/j.issn.1000-3428.2014.08.050
    To study the impact of non-standard meteorological factors on missile motion characteristics, a parabolic trajectory model under non-standard weather conditions is established by combining the basic equations of atmospheric motion with the classical trajectory model. An easy-to-program numerical solution of the basic equations of atmospheric motion is presented; with it, the missile's impact point coordinates (20 565.8, 0, -1 389) under the non-standard atmosphere are obtained by solving the kinetic and kinematic equations using a variable-step fourth-order Runge-Kutta numerical integration method. The results demonstrate that the missile's flight trajectory changes under different meteorological conditions. A visual simulation of the air-to-ground missile flight attack is realized with the three-dimensional visualization modules of the Satellite Tool Kit (STK), which exhibits good performance.
  • GUO Hong-jian,DONG Xiu-ze,GAO Xian-wei
    Computer Engineering. 2014, 40(8): 268-272. https://doi.org/10.3969/j.issn.1000-3428.2014.08.051
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    In order to implement the ZUC algorithm more efficiently on the Field Programmable Gate Array(FPGA) platform,this paper comes up with a new method.The method exploits the iterative nature and parallel features of the ZUC algorithm together with the properties of modular addition to reduce the number of adders:simple adders,which cost less hardware resource and incur less delay,are substituted for the carry save adders and mod(2^31-1) adders,which cost more hardware resource and incur more delay.In this way,it implements the mod(2^31-1) additions in the critical path of the ZUC algorithm.The method is simulated and tested on Quartus Ⅱ and ISE,and the simulation results are compared with other known methods and analyzed.Experimental results show that this method reaches a throughput of 5.322 Gb/s while occupying only 305 slices of hardware resource.Compared with the best published method,it gives about a 23% decrease in hardware resource usage and a 25.9% increase in throughput per unit area;it can implement the ZUC algorithm efficiently while reducing hardware resource usage.
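    The modular-addition property mentioned above can be illustrated in software: addition mod(2^31-1) reduces to an end-around carry, folding the carry-out bit back into the low 31 bits instead of performing a true division, which is what makes cheap adder structures possible. The sketch below shows only this arithmetic identity, not the paper's adder substitution or circuit.

```python
M31 = (1 << 31) - 1  # 2^31 - 1, the modulus used by ZUC's LFSR arithmetic

def add_mod_m31(a, b):
    """(a + b) mod (2^31 - 1) via end-around carry."""
    s = a + b
    s = (s & M31) + (s >> 31)   # fold the carry bit back in
    return s - M31 if s >= M31 else s

assert add_mod_m31(M31 - 1, M31 - 1) == (2 * (M31 - 1)) % M31
```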
  • CHU Yan-jie,WEI Qiang,LI Yu-zhao
    Computer Engineering. 2014, 40(8): 273-276,281. https://doi.org/10.3969/j.issn.1000-3428.2014.08.052
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    A new event detection algorithm is proposed to solve the problem of detecting events in texts connected by association relationships.The algorithm uses association relationship analysis together with semantic and scope extension of the keywords,so that all texts belonging to one event are regarded as a single text when matching keywords.The algorithm separates the keywords into two groups,then performs keyword semantic extension,keyword matching and association relationship analysis on the groups to collect all texts of the event,and extracts and confirms the important information.Experimental results indicate that when there are few keywords,the algorithm improves the probability of detecting the event through semantic extension of the keywords,and when there are many keywords,it improves the detection rate and reduces the false negative rate through scope extension of the keywords.
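    A toy sketch of the matching idea: texts linked by association relations are merged and matched as one text, and each keyword may match through its semantic extensions. The union-find grouping, the synonym table and all names are illustrative assumptions, not the paper's procedure.

```python
from collections import defaultdict

def connected_groups(links, text_ids):
    """Group text ids transitively linked by association relations
    (simple union-find with path halving)."""
    parent = {t: t for t in text_ids}
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for a, b in links:
        parent[find(a)] = find(b)
    groups = defaultdict(list)
    for t in text_ids:
        groups[find(t)].append(t)
    return list(groups.values())

def detect_event(texts, links, keywords, synonyms):
    """A cluster matches if every keyword, or one of its semantic
    extensions, occurs somewhere in the merged cluster text."""
    hits = []
    for group in connected_groups(links, texts):
        merged = " ".join(texts[t] for t in group)  # treat the cluster as one text
        if all(any(w in merged for w in [k] + synonyms.get(k, []))
               for k in keywords):
            hits.append(group)
    return hits

texts = {1: "bank intrusion reported", 2: "stolen credentials leaked", 3: "weather"}
print(detect_event(texts, [(1, 2)], ["breach", "credentials"],
                   {"breach": ["intrusion"]}))   # -> [[1, 2]]
```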
  • ZHOU Wen-le,ZHU Ming,CHEN Tian-hao
    Computer Engineering. 2014, 40(8): 277-281. https://doi.org/10.3969/j.issn.1000-3428.2014.08.053
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    To address the shortcomings of traditional personalized recommendation methods,such as sparsity,cold-start,over-specialization and low accuracy,this paper proposes a recommendation method based on Website aggregation and knowledge.It obtains the set of movies to be recommended by aggregating Websites with a Web crawler,builds an ontology-based film model,and on this basis proposes an algorithm for learning the weights of user preferences.It measures the similarity between movies using the SimRank method with a weighted average,and recommends movies to users according to their similarity level.Experimental results show that the accuracy of this method is about ten percent higher than that of existing methods in non-real-time recommendation,and the quality of recommendations is improved significantly in real-time recommendation.To some extent,the sparsity,cold-start and over-specialization problems can be alleviated.
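    For reference, a bare-bones iterative SimRank is sketched below; the toy graph, decay factor and iteration count are illustrative, and the paper's ontology-based weighting is not reproduced.

```python
import numpy as np

def simrank(in_nbrs, n, c=0.8, iters=10):
    """Basic iterative SimRank: two nodes are similar if their
    in-neighbors are similar."""
    s = np.eye(n)
    for _ in range(iters):
        nxt = np.eye(n)
        for a in range(n):
            for b in range(a + 1, n):
                ia, ib = in_nbrs.get(a, []), in_nbrs.get(b, [])
                if ia and ib:
                    total = sum(s[i, j] for i in ia for j in ib)
                    nxt[a, b] = nxt[b, a] = c * total / (len(ia) * len(ib))
        s = nxt
    return s

# Toy graph: movies 0 and 1 are both linked to (e.g. watched) by users 2 and 3.
sim = simrank({0: [2, 3], 1: [2, 3]}, n=4)
print(round(sim[0, 1], 3))   # similarity of the two movies
```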
  • YANG Jia-jia,JIANG La-lin,JIANG Lei,DAI Qiong,TAN Jian-long
    Computer Engineering. 2014, 40(8): 282-287,292. https://doi.org/10.3969/j.issn.1000-3428.2014.08.054
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    In order to solve the Deterministic Finite Automata(DFA) space explosion problem,a DFA compression algorithm based on clustering,named ClusterFA,has been proposed.However,it is difficult to choose an ideal value for the number of groups in ClusterFA,and the numbers in each line of the class center vector table(also named CommonTable) contain long runs of repeated values.In order to further improve the compression ratio of ClusterFA,this paper puts forward a new solution:extracting the identical head and tail sections shared between lines of the CommonTable into the index table,and using the Run-length Encoding(RLE) technique to code the continuously repeated numbers.The algorithm is tested on the Bro,Snort and L7-filter rule sets.Experimental results show that the compression ratio reaches 99% or more on all rule sets except L7_2 and L7_6,whose compression ratios increase to 96.1% and 98.1%.Compared with the ClusterFA algorithm,the compression ratio of En_ClusterFA improves by an average of 4%,which proves that En_ClusterFA can effectively improve the compression ratio of the DFA.
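    The run-length step itself is standard; a minimal sketch follows (the data values are made up, and the paper's head/tail index extraction is not shown).

```python
def rle_encode(row):
    """Run-length encode a sequence of state numbers: runs of a repeated
    value collapse to (value, count) pairs, shrinking a CommonTable-style row."""
    runs = []
    for v in row:
        if runs and runs[-1][0] == v:
            runs[-1][1] += 1
        else:
            runs.append([v, 1])
    return [(v, n) for v, n in runs]

def rle_decode(runs):
    return [v for v, n in runs for _ in range(n)]

row = [5, 5, 5, 7, 7, 5, 9, 9, 9, 9]
enc = rle_encode(row)
assert rle_decode(enc) == row
print(enc)   # [(5, 3), (7, 2), (5, 1), (9, 4)]
```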
  • DENG Zhi-chao,WANG Tong-qing
    Computer Engineering. 2014, 40(8): 288-292. https://doi.org/10.3969/j.issn.1000-3428.2014.08.055
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    Motion estimation is the key to video coding,and analysis of its principle shows that the matching criterion is the key link.Because traditional matching criteria offer limited block matching accuracy and thus leave residual information redundancy,this paper proposes a new matching criterion for motion estimation that applies sparse processing to the Quadratic Renyi’s Entropy(QRE).When calculating the Renyi entropy,the matching criterion introduces a statistical histogram to estimate the probability density function,and,according to gradient-based image quality assessment theory and the center-biased characteristics of motion vectors,it makes the statistical intervals of the histogram sparse.Experimental results show that this matching criterion simplifies the calculation of the probability density function,reducing multiplication computation by more than 80% with sparse histogram statistics,and that for video sequences with dramatic motion the proposed criterion is superior to the traditional sum of absolute differences in peak signal-to-noise ratio and image quality.
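    As a sketch of the entropy computation, the quadratic Renyi entropy of a residual block can be estimated from a coarse (sparse) histogram as below. The residual-based cost and the bin count are assumptions; the paper's exact criterion and sparsification rule are not reproduced here.

```python
import numpy as np

def quadratic_renyi_entropy(block, bins=16, lo=-255, hi=256):
    """H2 = -log(sum_i p_i^2), with p_i estimated from a histogram;
    fewer bins gives the 'sparse' statistics that cut multiplications."""
    hist, _ = np.histogram(block, bins=bins, range=(lo, hi))
    p = hist / hist.sum()
    return -np.log(np.sum(p ** 2))

def match_cost(cur_block, ref_block, bins=16):
    """Assumed matching cost: entropy of the displaced block difference;
    a well-matched candidate yields a concentrated, low-entropy residual."""
    residual = cur_block.astype(np.int16) - ref_block.astype(np.int16)
    return quadratic_renyi_entropy(residual, bins=bins)

cur = np.random.randint(0, 256, (16, 16))
good = cur + np.random.randint(-2, 3, cur.shape)   # near-perfect candidate
bad = np.random.randint(0, 256, cur.shape)         # unrelated candidate
assert match_cost(cur, good) < match_cost(cur, bad)
```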
  • WEI Chun-tao,BI Gui-hong,ZHANG Shou-ming
    Computer Engineering. 2014, 40(8): 293-301. https://doi.org/10.3969/j.issn.1000-3428.2014.08.056
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    According to the configuration model,a complex Agent-based dynamic hierarchical network model for the spread of HIV among Injecting Drug Users(IDUs) is developed to simulate HIV transmission in real needle-sharing networks.IDUs in the needle-sharing network are divided into a core layer,an inner layer and an outer layer,corresponding to high-intensity,medium-intensity and general-intensity drug users.Connections within a layer are stronger and connections between layers are weaker,which is controlled by an adjustable probability matrix.Over time the total number of nodes in the network changes,and edges between nodes can be dissolved and reformed,so the network has dynamic characteristics.Each individual Agent has an infection status,a course of disease progression,and rules for reacting to HIV prevention interventions,including free syringe exchange programs,voluntary counseling and HIV testing,and antiretroviral therapy,which act on individual behavior change,the course of disease and the spread of disease.The simulation results show that the model correctly reflects the characteristics of HIV transmission in needle-sharing networks of IDUs with different drug-use intensities.
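    A sketch of the layered network construction with an adjustable connection-probability matrix follows; the layer sizes and probabilities are invented for illustration and are not the paper's calibrated parameters (nor does this show the dynamic rewiring or the epidemic rules).

```python
import random

LAYERS = {"core": 30, "inner": 80, "outer": 190}     # node counts (assumed)
# Adjustable probability matrix: strong within a layer, weak across layers.
P = {("core", "core"): 0.30, ("inner", "inner"): 0.10, ("outer", "outer"): 0.03,
     ("core", "inner"): 0.02, ("core", "outer"): 0.005, ("inner", "outer"): 0.01}

def build_network():
    nodes = [(layer, i) for layer, n in LAYERS.items() for i in range(n)]
    edges = set()
    for a in range(len(nodes)):
        for b in range(a + 1, len(nodes)):
            la, lb = nodes[a][0], nodes[b][0]
            p = P.get((la, lb)) or P.get((lb, la))   # matrix is symmetric
            if random.random() < p:
                edges.add((a, b))
    return nodes, edges

nodes, edges = build_network()
print(len(nodes), "IDU agents,", len(edges), "needle-sharing ties")
```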
  • MA Li,ZHONG Yong,HUO Yin-yu
    Computer Engineering. 2014, 40(8): 302-309. https://doi.org/10.3969/j.issn.1000-3428.2014.08.057
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    Realtime behavior requirements of hard realtime system cannot be changed,however,the realtime behavior requirements of soft realtime systems can be delayed,replaced or compensated.Most current formal specification languages focus on the researches of the hard realtime system,and cannot describe the characteristics of soft realtime system perfectly.To resolve above problems,this paper tries to use a kind of extended language based on ObjectZ to describe soft realtime system,which expresses obligation policies by extended objectZ history invariants,and the language can precisely describe the default policies,the compensation policies and the other soft realtime policies in soft realtime system.The example of a meeting system shows,the method can preferably specify soft realtime behaviors formally and has a good applicability.
  • QIAN Qin,DONG Bu-yun,TANG Zhe,FU Xiao,MAO Bing
    Computer Engineering. 2014, 40(8): 310. https://doi.org/10.3969/j.issn.1000-3428.2014.08.058
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    Traditional methods of forensic acquisition focus on the persistent data on the disks of attacked computers.However,with the growing use of encryption routines and the rapidly increasing storage capacity of hard drives,it is very difficult to obtain data in time with methods designed for persistent data.Therefore,in the field of computer forensics,people have started to change the data source and focus on the volatile information in RAM.This paper describes the prevailing methods of memory acquisition and analysis and the process of memory forensics in detail.It explains the characteristics of each method and gives its advantages and disadvantages.In the end,it summarizes all these methods and offers some suggestions on the future of computer forensics.
  • REN Wei,ZENG Yi-cheng,CHEN Li,YANG Dan
    Computer Engineering. 2014, 40(8): 318-320. https://doi.org/10.3969/j.issn.1000-3428.2014.08.059
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    An improved search strategy based on the Free Search Algorithm(FSA) is proposed to design Infinite Impulse Response(IIR) digital filters,because the existing FSA suffers from low search efficiency in its later stage.By dynamically adjusting each individual’s search radius and periodically searching along the axes of the parameter space,the method improves the search ability in a multidimensional space.A suitable constraint on the parameter space is presented,and the optimization model is established under the optimality criteria of minimizing the mean square error and minimizing the ripple magnitudes of both passband and stopband;an IIR digital lowpass filter is designed based on the proposed method.Experimental results on IIR digital filters,compared with other algorithms,demonstrate the effectiveness and superiority of the introduced method.
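    For illustration, the two optimality criteria can be combined into a single cost over a candidate coefficient set as below. The band edges, the ripple weighting and the sample filter are assumptions; a search procedure such as the improved FSA would minimize this cost subject to a stability constraint, which the sketch does not check.

```python
import numpy as np

def magnitude_response(b, a, n_pts=256):
    """|H(e^{jw})| of an IIR filter with numerator b and denominator a,
    evaluated directly from the difference-equation coefficients."""
    w = np.linspace(0.0, np.pi, n_pts)
    k = np.arange(max(len(b), len(a)))
    z = np.exp(-1j * np.outer(w, k))          # e^{-jwk} for each frequency
    return w, np.abs((z[:, :len(b)] @ b) / (z[:, :len(a)] @ a))

def lowpass_cost(b, a, wp=0.3 * np.pi, ws=0.4 * np.pi):
    """Mean square error against the ideal low-pass response plus a
    (hypothetically weighted) passband/stopband ripple term."""
    w, mag = magnitude_response(b, a)
    ideal = (w <= wp).astype(float)
    band = (w <= wp) | (w >= ws)              # transition band is unconstrained
    mse = np.mean((mag[band] - ideal[band]) ** 2)
    ripple = np.ptp(mag[w <= wp]) + np.max(mag[w >= ws])
    return mse + 0.5 * ripple

# Evaluate one candidate (a second-order low-pass guess).
b = np.array([0.0675, 0.1349, 0.0675])
a = np.array([1.0, -1.1430, 0.4128])
print(round(lowpass_cost(b, a), 4))
```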