
15 October 2015, Volume 41 Issue 10
    

  • SHEN Yizhou,JIANG Rongxin
    Computer Engineering. 2015, 41(10): 1-5. https://doi.org/10.3969/j.issn.1000-3428.2015.10.001
    In current network video surveillance systems, manually binding cameras to gateways easily overloads some gateways under large-scale real-time transcoding. To address this problem, an adaptive binding mode based on a load balancing strategy is proposed. The mode consists of two mechanisms: device sleeping and device migration. A gateway reduces its own load by putting devices that have been idle beyond a timeout into the sleeping state. The gateway dispatcher collects every gateway's load in a timely manner and advances the overload judgment by introducing an alarm region and a double exponential smoothing model to predict load. An overloaded gateway migrates devices on a last-call-finish-first principle weighted by device activity, realizing indirect load transfer. Under a test environment with 4 gateways and 36 patrol groups, experimental results indicate that, compared with the manual-binding mode, the proposed mode shortens the average live-start response time by 42.9%.
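As a hedged illustration of the double exponential smoothing (Holt's linear method) that the dispatcher relies on to predict gateway load, the sketch below forecasts one step ahead; the smoothing parameters and the load history are illustrative assumptions, not values from the paper.

```python
# Double exponential smoothing (Holt's linear method) for load
# prediction. alpha/beta and the history are illustrative assumptions.

def holt_forecast(loads, alpha=0.5, beta=0.5, horizon=1):
    """Predict a future load value from a history of load samples."""
    level, trend = loads[0], loads[1] - loads[0]
    for x in loads[1:]:
        last_level = level
        # Blend the new observation with the previous level-plus-trend.
        level = alpha * x + (1 - alpha) * (level + trend)
        # Blend the observed level change with the previous trend.
        trend = beta * (level - last_level) + (1 - beta) * trend
    return level + horizon * trend

# A steadily rising load history yields a forecast continuing the ramp.
history = [10, 12, 14, 16, 18]
print(holt_forecast(history))  # 20.0 for this linear ramp
```

Predicting the next sample ahead of time is what lets the dispatcher advance its overload judgment instead of reacting after a gateway is already saturated.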
  • KANG Lei,ZHANG Shuben,YANG Jian
    Computer Engineering. 2015, 41(10): 6-9. https://doi.org/10.3969/j.issn.1000-3428.2015.10.002

    An ordinary PC equipped with a single graphics card has limited power and performance and cannot meet the needs of large-scale Compute Unified Device Architecture(CUDA) parallel computing. To address this problem, compute-intensive user tasks are transferred to a GPU cluster, and a GPU cluster management system is designed and implemented based on the B/S mode. Users submit CUDA code through the Web and get results back from the GPU cluster management system. Test results show that compute-intensive tasks can be run from any browser-capable device, which brings convenience to users, accelerates program execution, saves users' time and greatly improves their efficiency.

  • ZHOU Dongxu,JIA Yueling,GUO Jianxin,ZHENG Hang
    Computer Engineering. 2015, 41(10): 10-13,19. https://doi.org/10.3969/j.issn.1000-3428.2015.10.003

    The high Peak to Average Power Ratio(PAPR) is an obstacle to hardware implementation and seriously restricts the practicality of Non-contiguous Orthogonal Frequency Division Multiplexing(NC-OFDM). To solve this problem, an improved PAPR reduction algorithm based on the Tone Reservation(TR) technique is proposed. According to the results of spectrum sensing, the bands are divided into two categories depending on whether primary users are present. Secondary users choose different numbers and amplitudes of reserved subcarriers in the two kinds of bands. In this way, secondary users obtain a substantial PAPR reduction while avoiding interference to primary users and increasing spectrum utilization. Theoretical analysis and simulation results show that the proposed algorithm gives good PAPR reduction performance and supports efficient use of spectrum as well.

  • LI Jun,ZHANG Xiaomeng,HUANG Kai,YAN Xiaolang
    Computer Engineering. 2015, 41(10): 14-19. https://doi.org/10.3969/j.issn.1000-3428.2015.10.004

    Aiming at the design features of System on Chip(SoC), namely low power consumption, low cost and a variety of interface protocols, multiple commonly used peripheral interface protocols are analyzed, and an architecture for a reconfigurable peripheral controller flexibly supporting multiple communication protocols is proposed. The design supports hardware reconfiguration and software configuration, and multiple standard communication protocols are supported through a configurable finite state machine. The controller can replace multiple peripheral interfaces in an SoC design, which results in less area and power consumption and reduces the overall cost of the chip. Experimental results show that the controller architecture is compatible with commonly used peripheral interfaces such as I2C, SPI and UART, and achieves 65.2% area savings and 67.8% power savings compared with the combination of separate I2C, SPI and UART blocks.

  • PEI Yuanyuan,SHI Runhua,ZHONG Hong,ZHANG Shun
    Computer Engineering. 2015, 41(10): 20-25. https://doi.org/10.3969/j.issn.1000-3428.2015.10.005
    With the development of sensor technology and mobile communication equipment, Location-based Service(LBS) has appeared and is widely used. However, the privacy issues that arise while enjoying these services are becoming a research focus. For privacy protection in LBS, this paper proposes a distributed model with user cooperation and designs a new privacy protection scheme. When constructing the anonymous area, the scheme uses Bayesian Nash equilibrium and secure multiparty summation to ensure the privacy of user information. When processing query results, it introduces the Voronoi diagram method to increase query efficiency. Analysis and experimental results show that the proposed scheme gives full consideration to the selfishness and unreliability of users. Hence it not only protects user privacy but also improves service performance.
  • SHEN Xiajiong,WU Xiaoyang,HAN Daojun
    Computer Engineering. 2015, 41(10): 26-30. https://doi.org/10.3969/j.issn.1000-3428.2015.10.006
    Researchers have made related studies on the over-segmentation and noise sensitivity problems in the pre-processing and post-processing of traditional watershed segmentation algorithms. This paper first gives a detailed introduction to two classical algorithms: the top-down simulated rainfall algorithm and the bottom-up simulated flooding algorithm. Three kinds of processing are then surveyed: reconstruction of the input gradient image before the traditional watershed segmentation algorithm, merging of the regions partitioned after it, and combined processing before and after it. The paper summarizes and analyzes the effect of these improvements in pre-processing, post-processing and their combination. Finally, it draws conclusions and brings up open research directions together with basic ideas for addressing them.
  • GE Xiaoyan,ZHANG Ning
    Computer Engineering. 2015, 41(10): 31-36. https://doi.org/10.3969/j.issn.1000-3428.2015.10.007
    In order to research the influence of information characteristics on information spreading in social networks, this paper proposes an information spreading model that combines information characteristics with memory effects. The information characteristics are measured by two attributes: information amount and information adhesion. The model is run on regular networks, small-world networks, random networks and BA scale-free networks. Results indicate that, for both the speed and the scope of information spreading, the information amount attribute is more influential than the information adhesion attribute. Under a given information characteristic, there exists a fixed value at which individuals choose to spread information. In particular, on the regular network, the probability of spreading information reaches its maximum when individuals receive the information for the second time.
  • WANG Yuzhong,FAN Lei,LI Jianhua
    Computer Engineering. 2015, 41(10): 37-41. https://doi.org/10.3969/j.issn.1000-3428.2015.10.008
    Local community detection is a recent hot topic in network topology research. This paper proposes a local community detection algorithm based on a given node. The algorithm starts from the original node, finds the maximal connected node set relevant to it, uses Breadth-first Search(BFS) based on node similarity to find the local community, and then prunes the found community to obtain the entire community containing the original node. Experimental results show that the algorithm reduces the time complexity to O(kd^3) while keeping high accuracy.
  • YE Xijun,GONG Yue
    Computer Engineering. 2015, 41(10): 42-46,52. https://doi.org/10.3969/j.issn.1000-3428.2015.10.009
    Diversity has become an important index for evaluating the quality of a recommendation system. To improve the individual diversity of the traditional collaborative filtering recommendation algorithm, an improved algorithm is proposed on the basis of the item-based collaborative filtering algorithm: it adds item category information and defines a contribution function to optimize the prediction score formula. It raises the scores of items whose categories do not exactly match that of the objective item, and thereby achieves better item recommendation. Experimental results prove that the improved algorithm strengthens the individual diversity of the recommendation system while keeping high precision. As a result, it has higher recommendation quality.
  • XIAO Yupeng,HE Yunbin,WAN Jing,LI Song
    Computer Engineering. 2015, 41(10): 47-52. https://doi.org/10.3969/j.issn.1000-3428.2015.10.010
    Aiming at the uncertainty of sample objects in the real world and the fuzzy boundaries between them, this paper proposes an Uncertain Fuzzy C-Means(UFCM) algorithm. Because expected distance computation involves a lot of complex integral calculation, the UFCM algorithm is inefficient. Therefore, an improved algorithm called I_UFCM is proposed, in which spatial uncertain objects are transformed into traditional certain objects for clustering. Besides, a new similarity formula is introduced instead of the traditional Euclidean norm to evaluate the distance between objects. The quality of clustering results is improved by reducing the amount of expected distance computation. Experimental results demonstrate that the clustering performance of the I_UFCM algorithm is more effective than that of the UFCM and UK-Means algorithms, and its CPU time is reduced by 90%.
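For reference, the standard fuzzy C-means updates that UFCM builds on can be sketched as follows for 1-D points; the fuzzifier m, the data and the number of clusters are illustrative assumptions for demonstration, not the paper's uncertain-object formulation.

```python
# One fuzzy C-means iteration on 1-D points: update the membership
# matrix from the current centers, then recompute the centroids.
def fcm_step(points, centers, m=2.0):
    u = []
    for x in points:
        d = [abs(x - c) or 1e-12 for c in centers]  # avoid divide-by-zero
        # Standard FCM membership: inversely weighted distance ratios.
        row = [1.0 / sum((d[i] / d[j]) ** (2.0 / (m - 1.0))
                         for j in range(len(centers)))
               for i in range(len(centers))]
        u.append(row)
    # Centroids are membership-weighted means of the points.
    new_centers = [
        sum(u[k][i] ** m * points[k] for k in range(len(points))) /
        sum(u[k][i] ** m for k in range(len(points)))
        for i in range(len(centers))
    ]
    return u, new_centers

pts = [0.0, 0.1, 0.9, 1.0]
_, centers = fcm_step(pts, [0.2, 0.8])
print([round(c, 2) for c in centers])  # two centroids, one near each group
```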
  • HU Feihu,TIAN Chaohui,LI Wei,HAN Xin
    Computer Engineering. 2015, 41(10): 53-58. https://doi.org/10.3969/j.issn.1000-3428.2015.10.011
    Aiming at the hierarchical scheduling problem of multiple vehicle types and multiple supplies, this paper proposes a hierarchical scheduling scheme. A two-layer scheduling example is demonstrated for the problem and decoupled into two single-layer problems. Taking the minimization of the system scheduling task completion time as the objective function, a genetic algorithm is used to obtain a scheduling scheme that describes, for each type of vehicle in sequence, the specific type of carried cargo, the source and the destination. In this hierarchical scheduling scheme, the output of each warehouse stays below its storage, and the required quantity of supplies at each emergency point is met based on real-time statistics of each warehouse and the calculation of the cargo carried in each task. The solving process of this scheme is simple and fast.
  • ZHAO Gang,HE Feng,XU Yajun,LI Qiao
    Computer Engineering. 2015, 41(10): 59-65. https://doi.org/10.3969/j.issn.1000-3428.2015.10.012
    According to the requirements of information transmission in the aviation and aerospace integrated electronic system, flow types are analyzed based on the architecture of the time-triggered bus and the Time-triggered Protocol(TTP). An information scheduling mechanism based on a flow conversion strategy is proposed, which includes a periodic Time-triggered(TT) message scheduling algorithm and a non-periodic Event-triggered(ET) message micro-reordering scheduling algorithm. The transmission delay of TT messages under the scheduling algorithm is analyzed by modeling message transmission on the TTP bus. Moreover, combining the communication degrading strategy with the network calculus method, the service curve, arrival curve and delay bound for non-periodic ET messages are obtained. A TTP simulation case is constructed. The results show that the maximal delay obtained in the experiment is consistent with the worst-case delay from the theoretical analysis, and the proposed flow conversion strategy can realize real-time transmission for different flow types.
  • JIANG Lei,CHEN Peng,JIN Feng,HAN Libo
    Computer Engineering. 2015, 41(10): 66-70. https://doi.org/10.3969/j.issn.1000-3428.2015.10.013
    Doppler log testing, which mainly relies on sea and pool trials, has the disadvantages of long cycles and high cost. According to the measuring principle of the Doppler log, the principle of depth and speed simulation is studied. This paper designs a new signal simulator based on Field Programmable Gate Array(FPGA) which can simulate the seabed echo signal. The seabed echo signal is received by the Doppler log, and the values of depth and speed are obtained by calculation. The measurement is evaluated by comparing these values with the set values. Experimental results indicate that the measuring error of this simulator is small.
  • HONG Lei,JI Baojian,WANG Yuguo
    Computer Engineering. 2015, 41(10): 71-75,82. https://doi.org/10.3969/j.issn.1000-3428.2015.10.014

    The Jack software cannot realize the dynamic solution of Virtual Human(VH) motion in the weightless state. Based on robotics and rigid body kinematics, a new concept for VH motion analysis software is presented in this paper to solve this problem. According to the basic structure of the human body and its dynamic characteristics, the VH is divided into a model of 15 rigid segments with even density, and a rigid kinematics model is established. The algorithms for VH kinematics and dynamics analysis are implemented in standard C based on the Denavit-Hartenberg method of robot kinematics, and the visual simulation interface is developed with OpenGL. The software's feasibility is verified by simulating the in-cabin floating action of a virtual human, and the platform shows good human-computer interaction.

  • LI Junyi,LI Shuang,ZHANG Yan,LI Renfa
    Computer Engineering. 2015, 41(10): 76-82. https://doi.org/10.3969/j.issn.1000-3428.2015.10.015
    Aiming at the low efficiency of static Worst-case Execution Time(WCET) analysis methods for embedded systems, this paper uses the Minimum Propagation Algorithm(MPA) to analyze the program flow and obtain the min-tree of each code block. It then derives stricter constraints through symbolic computation of the bounds of inner loop variables, and obtains a WCET expression from the min-tree constraints and the loop bounds. Finally, it uses a static prediction method to evaluate the instruction cycles of each basic block and calculates the final WCET value. Experimental results show that, compared with static execution time analysis based on the process control flow diagram, this method increases analysis efficiency while ensuring accuracy.
  • DIAO Ming,GAO Lu,GAO Hongyuan,FENG Pinghui
    Computer Engineering. 2015, 41(10): 83-87. https://doi.org/10.3969/j.issn.1000-3428.2015.10.016
    In order to improve the array utilization rate of Direction of Arrival(DOA) estimation using Compressive Sensing(CS), this paper proposes an Orthogonal Matching Pursuit(OMP) algorithm based on a Non-uniform Linear Array(NLA). In the proposed algorithm, the observation space is divided into several parts according to the rough range of the DOA, using uniform angle division and uniform sine division. The NLA receives the signal, and the angle-divided NLA manifold is used as the measurement matrix. The signal is projected and measured with the measurement matrix to obtain lower-dimensional observations, from which the sparse signal is reconstructed and the DOA is estimated. Simulation results show that, compared with the Multiple Signal Classification(MUSIC) algorithm, this algorithm needs fewer snapshots and achieves excellent anti-noise performance and a higher array utilization rate. It also achieves higher angular resolution and the ability to handle coherent sources compared with the OMP algorithm and the Spatial Smoothing(SS) based MUSIC algorithm under a Uniform Linear Array(ULA).
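A generic OMP iteration (greedy atom selection followed by a least-squares refit) can be sketched as below. This is not the paper's NLA-specific measurement matrix; the toy orthonormal dictionary at the end is only a sanity check.

```python
import numpy as np

def omp(A, y, k):
    """Recover a k-sparse x from y = A @ x by greedy atom selection."""
    residual = y.astype(float)
    support = []
    x = np.zeros(A.shape[1])
    for _ in range(k):
        # Pick the dictionary column most correlated with the residual.
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        # Least-squares fit on the chosen support, then update the residual.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x[support] = coef
    return x

# Toy sanity check with an orthonormal dictionary (A = I): OMP picks the
# two largest entries and recovers x exactly.
A = np.eye(8)
x_true = np.zeros(8)
x_true[[2, 5]] = [1.5, -2.0]
x_hat = omp(A, A @ x_true, k=2)
print(np.allclose(x_hat, x_true))  # True
```

In the paper's setting, A would be the angle-divided NLA manifold rather than an identity matrix, and y the low-dimensional array observation.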
  • MA Beilei,WANG Guizhu,ZHU Yanjuan,DING Anping
    Computer Engineering. 2015, 41(10): 88-93. https://doi.org/10.3969/j.issn.1000-3428.2015.10.017
    In the traditional Spray and Wait(SAW) routing algorithm in Delay Tolerant Network(DTN), the number of message copies is fixed, which makes the forwarding number somewhat blind, prevents the routing algorithm from adapting to the network environment and cuts down the delivery ratio. To deal with this issue, this paper derives the relationship between the final average buffer occupancy of nodes and the initial number of copies, and proposes a Spray and Wait routing concerned with Buffer Occupancy(SAW-BO) for DTN. The algorithm adjusts the initial number of copies dynamically based on the average buffer occupancy rate of nodes: it increases the initial number of copies to improve the delivery ratio when the average buffer occupancy is relatively low, and decreases it to avoid congestion when the average buffer occupancy is relatively high. Simulation results show that, compared with Binary Spray and Wait(BSW) routing, the proposed algorithm significantly improves the delivery ratio at the cost of increased average latency when the average buffer occupancy in the network is low, and enhances the delivery ratio while reducing the network overhead when the average buffer occupancy is high.
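The core adjustment idea, mapping average buffer occupancy to an initial copy count, can be sketched as below; the thresholds and copy counts are illustrative assumptions, not the paper's derived relationship.

```python
# Illustrative SAW-BO-style rule: spray more copies when node buffers
# are mostly free, fewer when they are nearly full. The base copy count
# and the lo/hi thresholds are assumptions for demonstration.
def initial_copies(avg_buffer_occupancy, base=8, lo=0.3, hi=0.7):
    """Choose the initial copy count from average buffer occupancy in [0, 1]."""
    if avg_buffer_occupancy < lo:   # light load: raise delivery ratio
        return base * 2
    if avg_buffer_occupancy > hi:   # heavy load: avoid congestion
        return max(1, base // 2)
    return base

print(initial_copies(0.1), initial_copies(0.5), initial_copies(0.9))  # 16 8 4
```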
  • KONG Fanfeng,OU Hongyu,LONG Linde,CHEN Xi
    Computer Engineering. 2015, 41(10): 94-98,104. https://doi.org/10.3969/j.issn.1000-3428.2015.10.018
    In Wireless Sensor Network(WSN), constructing a virtual backbone based on a connected dominating set helps to optimize multi-level hierarchical networks and prevents node death caused by the failure of data links. However, the minimum connected dominating set cannot balance energy consumption, which leads to premature node death. This paper presents an adaptive data gathering algorithm for WSN based on the connected dominating set. A connected set is built by selecting nodes with high energy and large degree from a dominating set, forming a higher-energy network backbone. Data are routed adaptively along this backbone until they reach the base station. Simulation results show that the proposed algorithm has good fault tolerance in smaller networks, reduces energy consumption and prolongs the network life cycle.
  • FENG Chenwei,ZHANG Lin
    Computer Engineering. 2015, 41(10): 99-104. https://doi.org/10.3969/j.issn.1000-3428.2015.10.019
    The next generation wireless network is a heterogeneous network in which a variety of wireless access technologies coexist. In order to make full use of the resources of all kinds of wireless networks, heterogeneous network integration is necessary, which raises the problem of call request access control. For a wireless heterogeneous network composed of Long Term Evolution(LTE), Wireless Local Area Network(WLAN) and Device-to-Device(D2D) communication, an algorithm is presented for heterogeneous wireless network selection. Based on Q-learning, the algorithm selects the appropriate network for access according to the traffic type, terminal mobility and network load status, using a return function composed of matching coefficients that reflect the network's contribution, the corresponding traffic and mobility. Simulation results show that the proposed algorithm has an efficient learning ability for autonomous radio resource management, which effectively improves spectrum utilization and reduces the blocking probability.
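The tabular Q-learning update underlying such a selection algorithm is standard; the sketch below uses illustrative states, actions and rewards rather than the paper's return function.

```python
# Standard tabular Q-learning update: move Q(s, a) toward the observed
# reward plus the discounted best value of the next state.
def q_update(Q, s, a, r, s_next, alpha=0.1, gamma=0.9):
    best_next = max(Q[s_next].values())
    Q[s][a] += alpha * (r + gamma * best_next - Q[s][a])

# Two candidate "networks" as actions in one load state; repeatedly
# rewarding WLAN access raises its value above LTE's.
Q = {"high_load": {"WLAN": 0.0, "LTE": 0.0}}
for _ in range(10):
    q_update(Q, "high_load", "WLAN", r=1.0, s_next="high_load")
print(Q["high_load"]["WLAN"] > Q["high_load"]["LTE"])  # True
```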
  • ZENG Xiliang,FENG Yan,GAO Haibo,PENG Hao
    Computer Engineering. 2015, 41(10): 105-110. https://doi.org/10.3969/j.issn.1000-3428.2015.10.020

    Most existing works on Device-to-Device(D2D) communications aim to maximize network throughput while ignoring the huge energy consumption caused by D2D link mode selection. To solve this problem, the D2D communication problem in an Orthogonal Frequency Division Multiple Access(OFDMA) wireless network is modeled as a nonlinear integer programming problem based on a practical link data rate model, whose objective is to minimize power consumption while meeting user data rate requirements. An effective polynomial-time algorithm is proposed to solve it, which jointly determines mode selection, channel allocation and power assignment. Simulation results show that the proposed algorithm achieves over 57% power savings compared with several baseline methods.

  • WANG Yong,ZHANG Yanyan
    Computer Engineering. 2015, 41(10): 111-116. https://doi.org/10.3969/j.issn.1000-3428.2015.10.021
    In Networked Control System(NCS), the network-induced delay is usually random, and packet dropout often occurs in both the backward and forward channels. Aiming at this problem, the stabilization of NCS with random time delay and asynchronous packet dropout is studied. An uncertain system matrix is used to handle the random time delay in both the Sensor-to-Controller(S-C) and Controller-to-Actuator(C-A) channels, and the NCS is then modeled as an asynchronous dynamical system with four rate constraints on events by considering packet dropout. An exponential stability condition under state feedback is derived for the closed-loop NCS, and a state feedback controller depending on the S-C packet dropout rate (from sensor to controller) and the C-A packet dropout rate (from controller to actuator) is designed. An illustrative example of a wireless NCS in the Internet of Things demonstrates the effectiveness of the proposed method.
  • CHEN Shu,XU Bo,XU Baoguo
    Computer Engineering. 2015, 41(10): 117-120,125. https://doi.org/10.3969/j.issn.1000-3428.2015.10.022
    For the communication link problems in Wireless Sensor Network(WSN), namely the unequal distribution of bandwidth and node energy as well as long delays and poor Quality of Service(QoS) adaptation, this paper puts forward a difference-elite ant colony algorithm. The algorithm takes advantage of the differential evolution algorithm for combinatorial optimization within the ant colony algorithm, adopts an elite preservation strategy and ant sorting to improve convergence speed, and sets the objective function according to the QoS service type. Simulation results show that, compared with the basic ant colony algorithm, the new algorithm can converge to the global optimal solution and attains minimal entropy.
  • TANG Yulong,FU Ming
    Computer Engineering. 2015, 41(10): 121-125. https://doi.org/10.3969/j.issn.1000-3428.2015.10.023
    In Dedicated Short Range Communication(DSRC) networks, the topology changes frequently and channel access is unfair. Aiming at this problem, a backoff algorithm for DSRC considering survival time is proposed at the medium access control layer. The algorithm adjusts the contention window of nodes according to the position and velocity relationships of vehicles, making the competition for channel access more reasonable. Experimental results show that, compared with the traditional binary exponential backoff algorithm, this algorithm achieves better channel access fairness and network performance.
  • XIANG Xinyin
    Computer Engineering. 2015, 41(10): 126-129. https://doi.org/10.3969/j.issn.1000-3428.2015.10.024

    The security of traditional identity-based signatures depends wholly on the security of secret keys, and exposure of a secret key requires reissuing all previously assigned signatures. To revoke users whose private keys leak, or malicious users, in a signature scheme, an adaptively secure revocable identity-based signature over lattices is proposed, which provides an efficient revocation mechanism to remove misbehaving or compromised users from the system. The scheme is proved to be strongly unforgeable against adaptive chosen-message attacks(sUF-CMA) under the Small Integer Solution(SIS) assumption. Security analysis shows that the proposed scheme not only meets the security requirements of revocable identity-based signature, but also resists quantum attacks.

  • JIANG Mengxia,JIANG Guohua
    Computer Engineering. 2015, 41(10): 130-138,143. https://doi.org/10.3969/j.issn.1000-3428.2015.10.025
    The behavior of safety-critical software directly affects life and property, so a quantitative evaluation model is indispensable to reflect its safety. Although safety and reliability are similar, traditional safety assessment methods usually evaluate the software as a whole through improved reliability models, ignoring the nature of failures and failing to evaluate behavior safety. Based on a study of the nature of software failures and safety-critical scenarios, a Software Interbehavior Model(SIBM) is proposed, together with a method for generating the Interaction Mode Dependency Graph(IMDG) from the relationships among software operation conditions. A safety evaluation model based on process behavior is then proposed: it identifies all process behaviors with their incidence rates and failure rates, assigns a risk index to every process behavior, and calculates the total risk index.
  • TANG Pengzhi,LIU Qiwen,ZUO Liming
    Computer Engineering. 2015, 41(10): 139-143. https://doi.org/10.3969/j.issn.1000-3428.2015.10.026
    The security of existing ID-based partial blind signature schemes is analyzed. Most schemes have the loophole that the publicly negotiated information can be tampered with: an adversary can remove the partial blindness of a signature without being detected by multiplying the blinded message by the inverse of the partial blind factor, and can forge the negotiated public information in the signature. To cope with the problem that the negotiated public information may be forged in some schemes, a modified ID-based partial blind signature scheme is presented. The partial blindness and unforgeability of the modified scheme are analyzed, and the new scheme is proved to be partially blind. The modified scheme is secure against adaptive chosen ID and message attacks in the random oracle model, and is more efficient than previous partial blind signature schemes.
  • ZHAO Fuxiang
    Computer Engineering. 2015, 41(10): 144-147,154. https://doi.org/10.3969/j.issn.1000-3428.2015.10.027
    To solve the difficulty of generating the dynamic factor and managing keys in practical applications of enciphering modes, a tweakable enciphering scheme with hardware support is presented, using a hybrid MPSoC implementation together with chaotic coding, dynamic keys and hardware key protection techniques. By adding a small amount of hardware, the computation of the tweak factor is performed in parallel with data encryption, improving the overall efficiency of the system. Techniques for resource-constrained applications and dynamic key management are applied in the scheme. Experimental results show that the scheme reduces the running time of the system and improves its overall efficiency.
  • NIU Leyuan,YANG Yitong,WANG Dejun,MENG Bo
    Computer Engineering. 2015, 41(10): 148-154. https://doi.org/10.3969/j.issn.1000-3428.2015.10.028
    The Secure Shell(SSH) protocol supports network file transfer between local and remote hosts, remote login, remote command execution and the safe execution of other applications, and plays an important role in protecting network security. This paper studies the security of the protocol, focusing on the automated analysis of the Secure Shell Version 2(SSHV2) protocol. It introduces the architecture of the SSH protocol and derives the message terms by analyzing the authentication message flow of SSHV2. Based on the computational model, the Blanchet calculus is applied to give a formal model of the SSHV2 protocol, and its authentication is analyzed with the automatic security protocol analysis tool CryptoVerif. The results show that the SSHV2 protocol provides authentication under the computational model.
  • LI Xiaocui,ZHANG Xinyu,LUO Qingyun,REN Chang’an
    Computer Engineering. 2015, 41(10): 155-159. https://doi.org/10.3969/j.issn.1000-3428.2015.10.029
    Traditional symbolic representation algorithms for time series based on statistical feature vectors cannot retain the timing characteristics well, nor do they support the symbolization of multidimensional time series. Aiming at this problem, this paper proposes an improved symbolic representation algorithm based on statistical feature vectors. For a single-dimensional time series, a special-point segmentation method is used to segment the series before symbolizing it. For a multidimensional time series, attribute-weighted Principal Component Analysis(PCA) is used to transform it into a single-dimensional series, which is then symbolized. Experimental results show that the improved algorithm has higher accuracy than the traditional algorithm, retains the timing characteristics, and is superior for multidimensional time series symbolization.
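The PCA step for collapsing a multidimensional series into a single series before symbolization can be sketched as below; the paper's attribute weighting is omitted and the data is illustrative.

```python
import numpy as np

def to_single_series(X):
    """Project an (n_samples, n_dims) series onto its first principal axis."""
    Xc = X - X.mean(axis=0)                 # center each attribute
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ vt[0]                       # scores along the first component

# A nearly collinear 2-D series collapses to one monotone 1-D series.
X = np.array([[1.0, 2.0], [2.0, 4.1], [3.0, 5.9], [4.0, 8.0]])
s = to_single_series(X)
print(s.shape)  # (4,)
```

The resulting 1-D series can then be segmented and symbolized with the same single-dimensional procedure.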
  • LIU Lubin,ZHU Yanmin
    Computer Engineering. 2015, 41(10): 160-164. https://doi.org/10.3969/j.issn.1000-3428.2015.10.030
    Building noise maps usually requires great human effort and time cost. This paper designs and implements a noise-mapping system based on participatory sensing, whose two key components are the mobile part and the server part. Smartphones in the mobile part measure the ambient noise level, and the noise data is uploaded to the server after calibration. The server collects the data, recovers the missing data and builds the noise map, which users can query on demand. Experimental results show that the calibration error is less than 3 dB and that the system builds a real-time, fine-grained noise map with low overhead.
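When aggregating several phone readings into one cell of such a map, sound levels must be averaged in the energy domain rather than arithmetically. A minimal sketch, with illustrative readings:

```python
import math

def average_db(readings):
    """Energy-domain average of sound levels given in dB."""
    energy = sum(10 ** (r / 10) for r in readings) / len(readings)
    return 10 * math.log10(energy)

# The energy average of 60 dB and 70 dB sits above their arithmetic mean,
# because the louder source dominates the total acoustic energy.
print(round(average_db([60.0, 70.0]), 1))  # 67.4
```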
  • ZHU Guangyu,HE Lijun
    Computer Engineering. 2015, 41(10): 165-170. https://doi.org/10.3969/j.issn.1000-3428.2015.10.031
    This paper establishes a multi-objective Flow Shop schedule optimization model under the environment of supply chain and optimized with the Grey Entropy Parallel Analysis(GEPA) method.Based on the grey relational analysis method,which expresses the similar degree between sequences,the information entropy theory is adopted to establish grey entropy parallel analysis method.The Grey Entropy Parallel Relational Degree(GEPRD) deduced by this method is used to measure the similar degree between multi-objective Pareto solutions and ideal solution and is used as the fitness to guide the evolution of the algorithm.By this way,the shortcoming that assignment the target weight directly in multi-objective optimization problem is overcome.The Genetic Algorithm based on Grey Entropy Parallel Analysis(GEPA_GA) is tablished.Experimental results show that GEPA_GA can solve high-dimensional multi-objective Flow Shop schedule problem under the environment of supply chain effectively.The multi-objective optimal solution and performance evaluation index of GEPA_GA are all superior to Genetic Algorithm based on Random Weighting(RW_GA).
  • HUANG Zhong,HU Min,LIU Juan
    Computer Engineering. 2015, 41(10): 171-176. https://doi.org/10.3969/j.issn.1000-3428.2015.10.032
    To exploit the complementary advantages of multi-source features and fuse the decisions of multiple classifiers, a multi-feature facial expression recognition method based on decision-level fusion is proposed. The Shape Feature (SF) of an expression is obtained by chain coding, and a deformation feature is built to describe facial geometric changes; meanwhile, a Gabor feature fusion diagram describes the local texture details of the expression. The posterior probabilities of the three kinds of features, each obtained by a Support Vector Machine (SVM) classifier, are combined for multi-classifier fusion at the decision level. To solve for the optimal fusion weights, a weight optimization strategy based on Particle Swarm Optimization (PSO) under supervised learning is put forward. Experimental results on the Cohn-Kanade database show that the proposed method achieves a better average recognition rate and robustness than single-classifier methods; compared with existing multi-classifier fusion methods, the weight optimization strategy has advantages in recognition rate and reliability.
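The decision-level fusion rule itself is simple; a minimal generic sketch is a weighted sum of per-classifier posterior vectors (the PSO search over the weight vector is omitted here, and the posteriors in the example are made up):

```python
def fuse_decision(posteriors, weights):
    """Decision-level fusion: weighted sum of per-classifier posterior
    probability vectors; returns the index of the winning class."""
    n_classes = len(posteriors[0])
    scores = [sum(w * p[c] for w, p in zip(weights, posteriors))
              for c in range(n_classes)]
    return max(range(n_classes), key=scores.__getitem__)
```

With two classifiers emitting posteriors `[0.6, 0.4]` and `[0.2, 0.8]`, equal weights pick class 1, while putting all weight on the first classifier picks class 0; a PSO would search this weight space to maximize recognition rate on labeled data.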
  • ZHAO Yongbin,CHEN Shuo,LIU Ming,CAO Peng
    Computer Engineering. 2015, 41(10): 177-180,185. https://doi.org/10.3969/j.issn.1000-3428.2015.10.033
    Imbalanced data classification is one of the main research topics of machine learning in real-world applications. To improve the classification performance of the Support Vector Machine (SVM), a kernel-space class-confidence cost-sensitive SVM is proposed. It improves the accuracy on the minority class by injecting misclassification costs into training; using an imbalanced-data evaluation metric as the objective function, the method optimizes the misclassification cost parameter. Moreover, the contribution weight of each instance to the decision is obtained by calculating its class confidence in the kernel space, which reduces the effect of noisy and outlier instances on the SVM. Experimental results show that the proposed algorithm is highly competitive with existing methods for imbalanced classification problems.
  • LU Mindi,ZHOU Yongquan,HUANG Kang
    Computer Engineering. 2015, 41(10): 181-185. https://doi.org/10.3969/j.issn.1000-3428.2015.10.034
    In view of the problems that the Fruit fly Optimization Algorithm (FOA) easily falls into local extrema and has low convergence precision, a Complex-encoded Fruit fly Optimization Algorithm (CFOA) is proposed. It introduces the idea of diploid complex encoding: the independent variables of the objective function are determined by the moduli and angles of their corresponding complex numbers. Comparative experiments on nine benchmark test functions show that, compared with the real-encoded fruit fly optimization algorithm, the complex-encoded algorithm expands the information content of individual genes and increases the diversity of the population.
  • WU Daqing,SHAO Ming,LI Quan,LI Kang
    Computer Engineering. 2015, 41(10): 186-191,198. https://doi.org/10.3969/j.issn.1000-3428.2015.10.035
    To improve the convergence and distribution of Multi-objective Evolutionary Algorithms (MOEA) on high-dimensional Multi-objective Optimization Problems (MOP), a multi-objective particle swarm optimization algorithm based on human disciplinary behavior is proposed. Strategies such as a promotion/punishment factor, an elite learning strategy, and a periodic dynamic-population topology restructuring strategy are introduced so that the algorithm has strong global search ability and good robustness. The algorithm is verified on typical multi-objective optimization functions, and simulation results show that, compared with other recent algorithms, it ensures good convergence while achieving a uniform distribution and wide coverage area.
  • YIN Guoliang,BAI Ruilin,WANG Yongjia,LI Xin
    Computer Engineering. 2015, 41(10): 192-198. https://doi.org/10.3969/j.issn.1000-3428.2015.10.036
    A time-optimal trajectory planning method is proposed to improve the movement speed of the Delta robot. The robot's working area is divided into regions, the center point of each region is selected as a reference, and the points are converted into joint space through inverse kinematics. Fifth-order B-spline interpolation is used to construct the joint-space curve, and a fractional-order particle swarm optimization method searches for the global optimum to plan the time-optimal motion curve. The operating speed is improved under the constraints that the joint angular velocity, angular acceleration, and angular jerk remain smooth and bounded. A fuzzy controller is used to distribute the time nodes. Experimental results show that the method is simple and practical: the laboratory Delta robot picks up an object from the work area and places it at the target position in 0.543 s to 0.735 s, overcoming the speed shortcomings of traditional trajectory planning methods.
  • XIANG Yan,HE Jianfeng,ZHANG Yunchun,CAI Li
    Computer Engineering. 2015, 41(10): 199-203. https://doi.org/10.3969/j.issn.1000-3428.2015.10.037
    The key strength of Cross Cumulative Residual Entropy (CCRE) over the popular Mutual Information (MI) method is its significantly larger noise immunity. However, CCRE with conventional Partial Volume (PV) interpolation produces local extrema at grid points, which may prevent the optimization algorithm from finding the correct transformation parameters. To solve this problem, three improved PV interpolation methods are studied: 3rd-order B-spline PV interpolation (BPV), Hanning-windowed sinc PV interpolation (HPV), and Blackman-Harris-windowed sinc PV interpolation (BHPV). Meanwhile, a new interpolation method is proposed that uses a flexible neighborhood center and lets the interpolation point distribute its joint-histogram weight over its 9 adjacent points; a Gaussian function is used as the PV interpolation kernel to avoid abrupt weight changes. Experimental results show that the proposed method registers images more accurately and faster than the BPV, HPV, and BHPV methods, making it more suitable for CCRE computation.
  • NIU Yirong,WANG Shitong
    Computer Engineering. 2015, 41(10): 204-209. https://doi.org/10.3969/j.issn.1000-3428.2015.10.038
    Traditional image segmentation methods cannot effectively handle images corrupted by heavy-tailed noise, and their segmentation results on such images are unsatisfactory. This paper presents an image segmentation method based on the Student-t distribution. The method calculates prior probabilities according to the spatial relationships between pixels and uses gradient descent to optimize the parameters so as to minimize the error function. The posterior probabilities of the pixels are then obtained from the optimal parameters, and segmentation is realized by labeling the pixels. Experimental results show that, compared with traditional K-means, Fuzzy C-Means (FCM), and other methods, the proposed method achieves a lower misclassification ratio and better performance on images contaminated by heavy-tailed noise.
  • YANG Yan,BAI Haiping
    Computer Engineering. 2015, 41(10): 210-215,220. https://doi.org/10.3969/j.issn.1000-3428.2015.10.039
    Aiming at the weak dehazing ability and slow processing speed of defogging based on the dark channel prior, this paper proposes a fast defogging algorithm that compensates the dark channel image. It uses median filtering to compensate regions of the dark channel image where the defogging ability is weak. The global minimum image is used as the guide image, and the dark channel image is filtered using the edge-preserving smoothing property of the guided filter, which removes black-spot artifacts and reduces the algorithmic complexity. Taking the maximum of the new dark channel image as a simple estimate of the atmospheric light intensity, clear images are restored with the atmospheric scattering model. Experimental results show that, compared with the median filtering algorithm, the fast defogging algorithm, and others, the defogged images obtained by this algorithm have better sharpness and color fidelity, and the computational speed is greatly improved.
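The dark channel that the algorithm compensates is the per-pixel minimum over color channels and a local patch; a minimal sketch of that standard computation (plain nested lists for clarity, patch size is an illustrative choice):

```python
def dark_channel(img, patch=3):
    """Dark channel prior: per pixel, the minimum colour-channel value
    over a local patch. `img` is an H x W list of (r, g, b) tuples."""
    h, w = len(img), len(img[0])
    r = patch // 2
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # minimum over all channels of all pixels in the clipped patch
            out[y][x] = min(min(img[j][i])
                            for j in range(max(0, y - r), min(h, y + r + 1))
                            for i in range(max(0, x - r), min(w, x + r + 1)))
    return out
```

In haze-free regions this value is close to zero, which is what the prior exploits; the paper's contribution is compensating the regions where it is not.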
  • HU Lichao,SHI Zaifeng,PANG Ke,LIU Jiangming,CAO Qingjie
    Computer Engineering. 2015, 41(10): 216-220. https://doi.org/10.3969/j.issn.1000-3428.2015.10.040
    The original Harris feature point detection algorithm enhances its robustness by Gaussian smoothing, but this also increases the complexity of the algorithm so that it cannot be applied in many image matching systems, and its positioning accuracy for T-type and diagonal T-type feature points is low. To solve these problems, a new feature point detection algorithm is proposed. Large numbers of non-feature points are excluded using the principle of Features from Accelerated Segment Test (FAST) detection; strong interference points are ruled out by a neighborhood pixel comparison method; and the final feature points are obtained by an improved, efficient non-maximum suppression algorithm. Experimental results demonstrate that the improved algorithm has better matching accuracy and higher detection speed, with a detection time only about 13.9% of that of the original Harris algorithm, making it well suited to real-time image matching systems.
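The FAST-style exclusion step can be sketched as a segment test: a pixel is a corner candidate only if enough contiguous ring pixels are all brighter or all darker than the center by a threshold. This is a simplified 8-neighbour version for illustration (real FAST uses a 16-pixel Bresenham circle of radius 3; threshold and arc length are illustrative):

```python
# offsets of the 8 surrounding pixels, in ring order
RING = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
        (1, 1), (1, 0), (1, -1), (0, -1)]

def is_corner(img, y, x, t=10, n=6):
    """Segment test: corner candidate if at least `n` contiguous ring
    pixels are all brighter than centre + t or all darker than centre - t."""
    c = img[y][x]
    flags = []
    for dy, dx in RING:
        v = img[y + dy][x + dx]
        flags.append(1 if v > c + t else (-1 if v < c - t else 0))
    best = run = 0
    prev = 0
    for f in flags + flags:          # doubled to handle wrap-around arcs
        run = run + 1 if f != 0 and f == prev else (1 if f != 0 else 0)
        prev = f
        best = max(best, run)
    return best >= n
```

Because most pixels fail this cheap test immediately, the expensive scoring only runs on a small fraction of the image, which is where the speedup comes from.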
  • FANG Luping,HONG Wenjie,PAN Qing,YAO Jialiang
    Computer Engineering. 2015, 41(10): 221-225,231. https://doi.org/10.3969/j.issn.1000-3428.2015.10.041
    Traditional gesture recognition methods cannot detect delicate finger movements, owing to the poor segmentation obtained from 2D images and the limited gesture templates produced by supervised classifier training. This paper proposes a method for measuring finger joint angles with a Particle Swarm Optimization (PSO) algorithm, introducing Kinect depth images to optimize feature extraction and improve accuracy. Through analysis of the hand's degrees of freedom, multiple constraints are introduced to reduce their number; PSO then searches for the best-fitting hand model and the measurements are analyzed, transforming the usual gesture classification problem into solving for the finger joint angles. Experimental results show that this method effectively improves detection accuracy and reduces detection failures.
  • DU Danlei,LUO Entao,TANG Yayuan,LEE Yenchun
    Computer Engineering. 2015, 41(10): 226-231. https://doi.org/10.3969/j.issn.1000-3428.2015.10.042
    To solve the problem that the classic Product Quantization (PQ) method relies on the independence of the data, a Cumulative Product Quantization (CPQ) method is proposed in this paper. Orthogonal decomposition is applied to the high-dimensional feature vectors to obtain independent sub-spaces; each sub-space is then decomposed again according to compression efficiency, yielding dependent sub-sub-spaces. A Cumulative Quantization (CQ) method quantizes the vectors of the sub-sub-spaces, while the PQ method quantizes the vectors of the sub-spaces. The new method reduces the impact of data dependence on quantization accuracy while maintaining compression efficiency. Experimental results show that, compared with the classical PQ and Cartesian K-means (CKM) methods, the new method has smaller coding error and a higher recall rate in image retrieval applications.
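For context, the classic PQ baseline that CPQ extends splits each vector into sub-vectors and learns a small codebook per sub-space; the code is the concatenation of nearest-centroid indices. A compact sketch (naive k-means, toy sizes):

```python
import random

def kmeans(vectors, k, iters=20, seed=0):
    """Plain k-means returning k centroids."""
    rnd = random.Random(seed)
    cents = rnd.sample(vectors, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for v in vectors:
            j = min(range(k), key=lambda c: sum((a - b) ** 2
                                                for a, b in zip(v, cents[c])))
            groups[j].append(v)
        for j, g in enumerate(groups):
            if g:
                cents[j] = [sum(col) / len(g) for col in zip(*g)]
    return cents

def pq_train(data, m, k):
    """Split each vector into m sub-vectors; learn one codebook per sub-space."""
    d = len(data[0]) // m
    subs = [[v[i * d:(i + 1) * d] for v in data] for i in range(m)]
    return [kmeans(s, k) for s in subs]

def pq_encode(v, books):
    """Code = index of the nearest centroid in each sub-space."""
    d = len(v) // len(books)
    code = []
    for i, book in enumerate(books):
        sub = v[i * d:(i + 1) * d]
        code.append(min(range(len(book)),
                        key=lambda j: sum((a - b) ** 2
                                          for a, b in zip(sub, book[j]))))
    return code
```

PQ implicitly treats the sub-spaces as independent when it quantizes them separately, which is exactly the assumption the paper's cumulative quantization relaxes.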
  • AN Weisheng,YU Rangming,WU Yuling
    Computer Engineering. 2015, 41(10): 232-235,239. https://doi.org/10.3969/j.issn.1000-3428.2015.10.043
    Since Scale Invariant Feature Transform (SIFT) and Speeded-Up Robust Features (SURF) need a long time for corner detection and feature point matching, an improved image registration algorithm is put forward. Gaussian scale pyramids of the reference image and the matching image are established, and feature points with different scale information are detected at each level of the pyramid, giving Features from Accelerated Segment Test (FAST) points at different scales. An orientation is assigned to every feature point, and feature vectors are calculated in the same way as SURF. Initial matches are determined as point pairs with minimum Euclidean distance, found by fast approximate nearest neighbor search; false matches are then excluded by the Random Sample Consensus (RANSAC) algorithm, and the transformation matrix is obtained. Experimental results show that the algorithm outperforms SURF and SIFT in feature detection speed and matching speed, with higher matching accuracy.
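The RANSAC step follows the standard skeleton: fit a model to a minimal random sample of matches, count inliers, keep the best. A generic sketch with a translation-only model for brevity (the paper estimates a full transformation matrix; tolerance and iteration count are illustrative):

```python
import random

def ransac_translation(src, dst, iters=100, tol=1.0, seed=0):
    """RANSAC skeleton: estimate a 2D translation between matched point
    lists, rejecting false matches as outliers."""
    rnd = random.Random(seed)
    best_t, best_inliers = (0, 0), -1
    for _ in range(iters):
        i = rnd.randrange(len(src))            # minimal sample: one match
        t = (dst[i][0] - src[i][0], dst[i][1] - src[i][1])
        inliers = sum(1 for (sx, sy), (dx, dy) in zip(src, dst)
                      if abs(sx + t[0] - dx) < tol and abs(sy + t[1] - dy) < tol)
        if inliers > best_inliers:
            best_t, best_inliers = t, inliers
    return best_t, best_inliers
```

A hypothesis drawn from a false match explains almost no other correspondences, so it never wins the inlier count; this is how RANSAC filters the mismatches before the final transformation is computed.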
  • MENG Dexin,WANG Minquan,HU Guowei
    Computer Engineering. 2015, 41(10): 236-239. https://doi.org/10.3969/j.issn.1000-3428.2015.10.044
    Aiming at the large data volume and high dimensionality of color path images and the time-consuming guide-marking detection in visual navigation, a fast marking detection algorithm for path images based on color feature clustering is proposed. After analyzing the common features of marking detection algorithms, it builds a color sparse matrix from the color image, detects suspected marking feature points on interlaced lines, calculates the neighbor coefficient between feature points and clusters them with a neighbor function method, identifies the target class with the most feature points as the marking, connects the path structure from the feature point set, and provides route navigation information. Experimental results indicate that, compared with conventional algorithms based on color space conversion or Hough-transform edge detection, the algorithm is fast and can meet real-time requirements.
  • FANG Sanyong,ZHOU Dake,CAO Yuanpeng,YANG Xin
    Computer Engineering. 2015, 41(10): 240-244,249. https://doi.org/10.3969/j.issn.1000-3428.2015.10.045
    To process face images in different poses, this paper proposes a frontal face image synthesis method based on pose estimation. The method follows the idea of statistical modeling to reconstruct the missing face shape and texture. Firstly, a 3D average model is applied to estimate the pose parameters of the test face image, and compressed sensing theory is used to filter the prototype samples so that a more accurate deformation model can be built. Secondly, the test face image is represented separately by a texture vector and a shape vector, and deformation model theory is used to reconstruct the frontal texture and shape. Finally, the synthesized texture is produced from the original and reconstructed textures. Experimental results show that this method effectively synthesizes natural frontal face images from non-frontal ones, with a higher recognition rate.
  • ZHANG Mingjie,KANG Baosheng
    Computer Engineering. 2015, 41(10): 245-249. https://doi.org/10.3969/j.issn.1000-3428.2015.10.046
    To solve the traditional Gaussian mixture model's problems of slow background modeling and high computational complexity, this paper puts forward a moving target detection method in two steps. First, the update process of the traditional Gaussian mixture model is improved to adaptively adjust the number of Gaussian distributions, and illumination change parameters are introduced to update the learning rate according to illumination variation; the image background and foreground are segmented with this model. Second, the detection results of the Gaussian mixture model are optimized through pixel-level computation. Experimental results show that the new method not only separates targets effectively and reliably but also obtains better detection results.
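The per-pixel update at the heart of such background models can be sketched with a single Gaussian per pixel (a reduced illustration of the mixture case; the learning rate `alpha` and match threshold `k` are illustrative, and the paper's contribution is precisely making these adaptive):

```python
import math

def update_pixel(mean, var, value, alpha=0.05, k=2.5):
    """Classify a pixel against its background Gaussian, then blend the
    statistics with learning rate `alpha` if it matched the background."""
    is_fg = abs(value - mean) > k * math.sqrt(var)
    if not is_fg:
        mean = (1 - alpha) * mean + alpha * value
        var = (1 - alpha) * var + alpha * (value - mean) ** 2
    return mean, var, is_fg
```

A pixel far from the model is flagged as foreground and leaves the background statistics untouched, while background pixels slowly refresh the model; tuning `alpha` per illumination change is what the abstract's learning-rate update targets.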
  • YAN Bin,CHEN Yaowu
    Computer Engineering. 2015, 41(10): 250-254. https://doi.org/10.3969/j.issn.1000-3428.2015.10.047
    Fragmentation in traditional file systems seriously affects storage performance. Aiming at this drawback, and exploiting the characteristics of content-based retrieval in video surveillance, a dedicated file system for intelligent video surveillance is proposed. It provides an extent-based logical volume index structure, retrieves feature content based on feature video segments, and improves index efficiency with B+ trees and bitmaps. Under the proposed index strategy, the oldest data is recycled by data cluster to realize circular storage, and the data block allocation policy is optimized by controlling the continuous storage time and evaluating the allocation time of fragments, ensuring continuous, sequential data storage. Test results show that, compared with a traditional file system, the impact of defragmentation on storage bandwidth is only 2.4%; storage efficiency increases by 24.5% at a typical storage bit rate of 1 Mb/s, while storage space utilization exceeds 99%.
  • ZHANG Pei,WANG Xiaochen,JIANG Lin,ZHANG Maosheng
    Computer Engineering. 2015, 41(10): 255-259. https://doi.org/10.3969/j.issn.1000-3428.2015.10.048
    The calculation of spatial parameter perceptual entropy is currently based on the Binaural Cue Physiological Processing Model (BCPPM), which consists of the frequency-to-place transform in the cochlea, a delay-attenuation network, and effective channel noise. However, the frequency-to-place transform and the delay-attenuation network are difficult to describe quantitatively. In addition, the quantization error of the spatial parameters and their quantization step are confused when computing spatial parameter perceptual entropy. This paper proposes a new Spatial Parameter Perceptual Model (SPPM) to address these problems: the delay-attenuation network and effective channel noise modules are replaced with a spatial parameter generation module and a Just Noticeable Difference (JND) module, and a perceptual amplitude compression module is added. The paper also analyzes the relationship between the maximum quantization error of the spatial parameters and their quantization step, and then gives a spatial parameter perceptual entropy formula based on SPPM. Since spatial parameter redundancy is fully considered, experimental results confirm that the spatial parameter perceptual entropy of the proposed method is smaller than that based on BCPPM.
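The error-versus-step distinction the abstract draws is the textbook property of uniform quantization: with step size q, the maximum quantization error is q/2, not q. A tiny sketch verifying this numerically:

```python
def quantize(x, step):
    """Uniform mid-tread quantizer: round to the nearest multiple of `step`."""
    return step * round(x / step)

# the maximum quantization error is step / 2, not the step itself
step = 0.5
errors = [abs(x / 100.0 - quantize(x / 100.0, step)) for x in range(-300, 300)]
assert max(errors) <= step / 2 + 1e-12
```

Conflating the two overestimates the distortion by a factor of two, which is why the paper re-derives the perceptual entropy formula with the q/2 bound.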
  • LEI Yingsi,YANG Yan
    Computer Engineering. 2015, 41(10): 260-264. https://doi.org/10.3969/j.issn.1000-3428.2015.10.049
    The Waveform Similarity Overlap-and-Add (WSOLA) algorithm neglects the perceptual characteristics of real speech signals and applies uniform time scaling to the entire signal; when the sampling rate is low or the scaling ratio is large, the scaled quality degrades. Aiming at these problems, an enhanced WSOLA algorithm is proposed based on an analysis of the predictive characteristics of the human auditory system. The method detects the turning points of the speech using a subband spectral entropy measure and leaves them intact to keep them undamaged, time-scaling only the remainder of the signal. A local compensation measure is further put forward to correct the overall scaling accuracy. Simulation results show that the new algorithm improves the naturalness of the synthesized speech while keeping the overall scaling ratio unchanged.
  • LV Yaping,GAO Ge,CHEN Yi,ZHANG Kang
    Computer Engineering. 2015, 41(10): 265-269. https://doi.org/10.3969/j.issn.1000-3428.2015.10.050
    Traditional perceptual audio coding schemes reduce the coding rate using psychoacoustic masking, but the channel-model-plus-signal-excitation approach has difficulty achieving high quality at low bit rates for both speech and audio signals. This paper proposes a perceptual-domain audio coding algorithm based on the Gaussian Mixture Model (GMM). The algorithm uses Gammatone filter banks to simulate the human auditory system, applies a masking model to reduce the number of pulse envelopes and facilitate structural model fitting, uses the Gauss-Newton algorithm to fit the GMM parameters, and represents the audio signal characteristics by the GMM parameters. The results show that, compared with an audio coding method based on envelope sparse reconstruction, subjective test scores are 0.5 to 0.8 points higher and objective test scores are 5 to 10 points higher; most speech and music signals can be decoded to nearly the quality of the original audio signal, achieving high-quality speech and audio coding at low bit rates.
  • YUAN Jing
    Computer Engineering. 2015, 41(10): 270-274. https://doi.org/10.3969/j.issn.1000-3428.2015.10.051
    In Magnetic Resonance Imaging (MRI) applications, it is common to solve the reconstruction problem by combining the L1 norm with a total variation operator. Because the resulting compound-regularizer model is complicated to solve, an operator splitting technique is used to lower the complexity of the solution, and an iteratively weighted reconstruction method is put forward. The observation matrix is optimized according to the prior statistical properties of the image under different transforms. Simulation results show that this image reconstruction algorithm not only enhances the reconstruction accuracy but also decreases the reconstruction time.
  • GUO Lin,ZENG Feng,CHEN Zhigang
    Computer Engineering. 2015, 41(10): 275-279,285. https://doi.org/10.3969/j.issn.1000-3428.2015.10.052
    When traditional genetic algorithms are used for cognitive radio spectrum allocation, interference between cognitive users is resolved only in the last step, so chromosomes carrying interference genes take part in the whole genetic process. To address this problem, this paper controls interference during the genetic process itself: it designs rules of gene expression in the chromosome and proposes a cognitive radio spectrum allocation algorithm with gene-selective inheritance. Dominant and recessive genes are marked by the gene expression rules; dominant genes are expressed and recessive genes suppressed in the next generation of chromosomes, thereby ensuring the health of the chromosomes and improving the efficiency of the algorithm. Simulation results show that the algorithm achieves better total benefit and a higher access ratio than the Genetic Algorithm (GA) and the Quantum Genetic Algorithm (QGA) when there are more cognitive users and fewer spectrum resources.
  • LIU Chunhui,HUANG Yu,SONG Qi
    Computer Engineering. 2015, 41(10): 280-285. https://doi.org/10.3969/j.issn.1000-3428.2015.10.053
    A time-efficient AC algorithm, AC_TE, is proposed for multiple-pattern string matching based on an analysis of AC and related algorithms. AC_TE constructs a string shift table and two hash tables: the shift table stores every pair of adjacent characters in the pattern tree together with their positions, while the two hash tables store the last two characters and the last character of the pattern tree, respectively. AC_TE applies multi-level skipping rules to these three tables, so that the pattern tree's shift distance can reach the shortest pattern length plus 3 without missing any match position. The performance of AC_TE is analyzed experimentally in three aspects: pattern tree shift counts, matching time, and the probability of different shift distances. Experimental results show that, compared with the AC algorithm, AC_TE achieves longer pattern tree shift distances and better time performance.
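For reference, the classic Aho-Corasick automaton that AC_TE builds upon (trie plus BFS failure links; this is the textbook baseline, not AC_TE's shift and hash tables):

```python
from collections import deque

def build_ac(patterns):
    """Build the Aho-Corasick automaton: a trie with failure links."""
    trie = [{"next": {}, "fail": 0, "out": []}]
    for p in patterns:
        s = 0
        for ch in p:
            if ch not in trie[s]["next"]:
                trie.append({"next": {}, "fail": 0, "out": []})
                trie[s]["next"][ch] = len(trie) - 1
            s = trie[s]["next"][ch]
        trie[s]["out"].append(p)
    q = deque(trie[0]["next"].values())     # depth-1 nodes keep fail = 0
    while q:
        s = q.popleft()
        for ch, t in trie[s]["next"].items():
            q.append(t)
            f = trie[s]["fail"]
            while f and ch not in trie[f]["next"]:
                f = trie[f]["fail"]
            cand = trie[f]["next"].get(ch, 0)
            trie[t]["fail"] = cand
            trie[t]["out"] = trie[t]["out"] + trie[cand]["out"]
    return trie

def search(trie, text):
    """Return (start, pattern) for every pattern occurrence in text."""
    hits, s = [], 0
    for i, ch in enumerate(text):
        while s and ch not in trie[s]["next"]:
            s = trie[s]["fail"]
        s = trie[s]["next"].get(ch, 0)
        for p in trie[s]["out"]:
            hits.append((i - len(p) + 1, p))
    return hits
```

The baseline advances one character at a time; AC_TE's contribution is skipping ahead by up to the shortest pattern length plus 3 when its tables prove no match can start in between.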

  • ZHU Zhijie,ZHAO Yao,WANG Zhong,PENG Fei
    Computer Engineering. 2015, 41(10): 286-289,294. https://doi.org/10.3969/j.issn.1000-3428.2015.10.054
    Aiming at the issue that the coordinates of the hull's points cannot be measured from a single station because inner structures obstruct the line of sight, this paper proposes a partition measurement method, which establishes a series of coordinate systems by moving the measurement station. On the basis of coordinate transformation and data calculation, the coordinates of all points of the hull are measured, and an application is programmed to process the measurement data. To validate the measurement error, a measuring test is carried out on a physical hull model. The results demonstrate that the proposed method satisfies the precision requirements of hull measurement in shipbuilding engineering and helps improve the precision and quality of hull building and the productivity of shipbuilding.
  • YAN Wenwu,PAN Feng
    Computer Engineering. 2015, 41(10): 290-294. https://doi.org/10.3969/j.issn.1000-3428.2015.10.055
    Multivariate statistical process monitoring based on Independent Component Analysis (ICA) is mainly used for fault detection but is not effective for fault classification. For this reason, a method called ICA-ELM, which combines ICA with the Extreme Learning Machine (ELM), is proposed for fault classification. ICA-ELM extracts the fault features with ICA and then trains the network with ELM so as to realize fault classification. ICA-ELM is tested on Tennessee Eastman (TE) process data and compared with the Probabilistic Neural Network (PNN) and the Support Vector Machine (SVM). Experimental results show that ICA-ELM achieves higher accuracy and faster training.
  • HU Xiaoxue,ZHAO Songzheng,WU Nan
    Computer Engineering. 2015, 41(10): 295-301,308. https://doi.org/10.3969/j.issn.1000-3428.2015.10.056
    Given the very large number of power customers, the presence of outliers in the data, and the limitations of the Partitioning Around Medoids (PAM) algorithm in handling large data volumes and requiring the number of clusters to be predefined, a new hybrid clustering algorithm called SOM-DB-PAM is proposed for fast clustering of large numbers of electricity customers. In the proposed algorithm, a Self-Organizing Map (SOM) neural network is trained on the input data to find prototype vectors that represent the patterns of the input data set while being far fewer in number; the prototype vectors are then clustered by the PAM algorithm, and, to ensure the validity of the clustering, the Davies-Bouldin (DB) index is calculated on the SOM prototype vectors to determine the optimal number of clusters. Experimental results show that, compared with traditional clustering algorithms, the classification accuracy is enhanced and large numbers of electricity customers can be clustered quickly and effectively; in addition, the blindness and subjectivity of manually predefining the number of clusters are reduced.
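The DB index used to pick the cluster count is the mean, over clusters, of the worst-case ratio of within-cluster scatter to between-centroid separation; lower is better. A minimal sketch on point lists:

```python
import math

def db_index(clusters):
    """Davies-Bouldin index for a list of clusters (lists of points);
    lower values indicate better-separated, more compact clusters."""
    def centroid(c):
        return [sum(p[i] for p in c) / len(c) for i in range(len(c[0]))]
    cents = [centroid(c) for c in clusters]
    # average distance of each cluster's points to its centroid
    scatter = [sum(math.dist(p, cents[i]) for p in c) / len(c)
               for i, c in enumerate(clusters)]
    k = len(clusters)
    total = 0.0
    for i in range(k):
        total += max((scatter[i] + scatter[j]) / math.dist(cents[i], cents[j])
                     for j in range(k) if j != i)
    return total / k
```

In SOM-DB-PAM this would be evaluated for each candidate number of clusters of the prototype vectors, and the count minimizing the index would be kept.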
  • CHEN Qiang,HUANG Dandan,LI Bin,LU Yuan
    Computer Engineering. 2015, 41(10): 302-308. https://doi.org/10.3969/j.issn.1000-3428.2015.10.057
    Although the step-by-step combination method is widely used by researchers, its theoretical rationality has not been established in the literature. This paper presents a general mathematical expression for step-by-step combination results and studies the theoretical rationality of the method, including the convergence of the combination results and the relationship between the final combination result and the order of the evidence. Several theorems and corollaries are presented and proved mathematically. A step-by-step combination algorithm mixing the evidence combination formula with the weighted average method is proposed for combining highly conflicting evidence. Simulation results show that the proposed algorithm has a simpler calculation process and better convergence than representative alternative rules.
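Step-by-step combination iterates a pairwise evidence combination rule; the classic instance is Dempster's rule, sketched here for orientation (the paper's algorithm mixes this kind of rule with weighted averaging, which is not shown):

```python
def dempster_combine(m1, m2):
    """Dempster's rule: combine two mass functions (dicts mapping
    frozenset hypotheses to mass), normalising out the conflict."""
    combined, conflict = {}, 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb          # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: evidence cannot be combined")
    return {h: w / (1.0 - conflict) for h, w in combined.items()}
```

Combining n pieces of evidence step by step means folding this rule over the list one mass function at a time; the paper's concern is whether the result converges and whether it depends on the folding order under high conflict.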
  • LIU Jing,WANG Tiancheng,WANG Jian,LI Huawei
    Computer Engineering. 2015, 41(10): 309-313. https://doi.org/10.3969/j.issn.1000-3428.2015.10.058
    With the increasing complexity of hardware designs, functional verification has become the bottleneck of the design flow. General-purpose processors are among the most complex integrated circuit designs, which poses a huge challenge for their functional verification. This paper proposes a constrained random instruction generation method for the simulation-based verification of an ARMv8 processor. The instruction generation is based on templates extracted from the instruction set, which guide the generation of valid ARMv8 instructions and support the verification of various functional scenarios by adjusting the constraints. With automatic comparison of the results produced in the verification environment, full verification of the processor is achieved and 58 design mistakes are found, providing a good foundation for subsequent FPGA hardware emulation. The verification results show that the method obtains a structural coverage of 90%.
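The template idea can be pictured as follows: each template names an instruction form and the legal range of each field, and constraints pin fields to steer a scenario. The field names and ranges below are invented for illustration and are not the paper's actual ARMv8 templates:

```python
import random

# illustrative templates: mnemonic plus legal value ranges per field
TEMPLATES = [
    {"mnemonic": "ADD",
     "fields": {"rd": range(31), "rn": range(31), "imm": range(4096)}},
    {"mnemonic": "LDR",
     "fields": {"rt": range(31), "rn": range(31), "offset": range(0, 256, 8)}},
]

def gen_instruction(rnd, constraints=None):
    """Pick a template and fill each field with a random legal value;
    `constraints` can pin fields per mnemonic to steer a scenario."""
    t = rnd.choice(TEMPLATES)
    fields = {k: rnd.choice(v) for k, v in t["fields"].items()}
    if constraints:
        fields.update(constraints.get(t["mnemonic"], {}))
    return t["mnemonic"], fields
```

Because every value is drawn from the template's legal range, every generated instruction is valid by construction, and tightening the constraint dictionary biases the stream toward a particular functional scenario.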
  • LIU Deliang,LIU Kaihua,YU Jiexiao,ZHANG Liang,ZHAO Yang
    Computer Engineering. 2015, 41(10): 314-317. https://doi.org/10.3969/j.issn.1000-3428.2015.10.059
    In indoor environments, Non-Line-of-Sight (NLOS) and multipath propagation of wireless signals dramatically reduce the accuracy of localization algorithms, especially geometric approaches based on Time of Arrival (TOA). To solve this problem, a Two-Step Weighted Least Squares (TSWLS) Virtual Sensor (VS) algorithm is proposed. With a known floor plan, a TOA model is built for the virtual sensors, considering the direct, reflection, diffraction, and penetration paths. The TSWLS-VS algorithm estimates a candidate position for every propagation condition, and the target location is decided according to constraints based on geometric principles. Simulation results demonstrate that, as the NLOS conditions become more severe, the proposed method maintains better accuracy than other methods.
  • XIAN Xiaodong,LV Jianzhong,FAN Yuxing
    Computer Engineering. 2015, 41(10): 318-321. https://doi.org/10.3969/j.issn.1000-3428.2015.10.060
    In speech recognition, Continuous Hidden Markov Model (CHMM) parameters are commonly initialized by segmental K-means, which can make the model parameters converge to local optima. A new CHMM parameter initialization approach based on density and distance is proposed. The density and distance of the data points are computed, and the initial cluster centers are selected as points with large distance and maximal density; K-means clustering is then carried out to obtain the final cluster centers, and the CHMM parameters are initialized from those centers. Experimental results show that the new approach achieves better recognition results than random initialization.
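The density-and-distance selection can be sketched as follows: density is the number of neighbours within a cutoff, and each point's distance score is the distance to its nearest higher-density point, so points that score high on both are natural initial centers (a generic sketch; the cutoff, the product score, and the example data are illustrative assumptions):

```python
import math

def pick_centers(points, k, cutoff):
    """Density-distance initial centers: density = neighbours within
    `cutoff`; delta = distance to the nearest higher-density point
    (or the farthest point, for the densest point)."""
    n = len(points)
    d = [[math.dist(p, q) for q in points] for p in points]
    rho = [sum(1 for j in range(n) if j != i and d[i][j] < cutoff)
           for i in range(n)]
    delta = []
    for i in range(n):
        higher = [d[i][j] for j in range(n) if rho[j] > rho[i]]
        delta.append(min(higher) if higher else max(d[i]))
    score = [rho[i] * delta[i] for i in range(n)]
    return sorted(range(n), key=score.__getitem__, reverse=True)[:k]
```

On two well-separated clusters this picks one center per cluster, whereas a random pick can land both initial centers in the same cluster, which is the local-optimum problem the initialization avoids.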