
15 August 2015, Volume 41 Issue 8
    

  • ZHANG Binlian,XU Hongzhi
    Computer Engineering. 2015, 41(8): 1-5. https://doi.org/10.3969/j.issn.1000-3428.2015.08.001
    Reliability and energy conservation are key properties of many real-time systems. Hence, this paper presents a scheduling algorithm with Reliability Constraint and Energy conservation based on Random Task(RCERT), which executes all tasks at the same voltage/frequency level in order to reduce energy consumption. The execution voltage/frequency is raised only when some tasks miss their deadlines. When the voltage/frequency is lowered, the algorithm reserves task recovery time to guarantee task reliability. Considering that transient errors occur infrequently at runtime, multiple tasks share the same recovery time to save further energy. Simulations comparing RCERT with the EDF and MEG algorithms on the TI OMAP5912 and Intel PXA270 processors show that RCERT is considerably more energy-efficient while guaranteeing system reliability.
  • XU Xiaofeng,HE Liang,YANG Jing
    Computer Engineering. 2015, 41(8): 6-12,17. https://doi.org/10.3969/j.issn.1000-3428.2015.08.002
    In studies of the popularity of films and TV dramas, methods based solely on social network or search data cannot accurately reflect viewers' demand at different times and often have low forecast accuracy. Therefore, based on the social network features that are significantly correlated with drama on-demand volume, microblog data before the premiere and search data after the premiere, a multiple regression model is used to forecast the average on-demand rank of a drama in a Video on Demand(VOD) system. Analysis results show that the method fusing pre-premiere social network data with search data from the three days after the premiere performs better than using social network or search data alone, and reflects viewers' demand more precisely. The Spearman correlation coefficients between the predicted ranks and the real ranks are high, about 0.82 on YouKu and 0.89 on IQiyi, so the method can help video operators make copyright purchase decisions.
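    As an illustration of the general workflow only (the paper's exact feature set and data are not reproduced), the sketch below fits a multiple linear regression on hypothetical pre-premiere microblog and post-premiere search features and evaluates the predicted ranking with the Spearman correlation coefficient.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from scipy.stats import spearmanr

# Hypothetical features per drama: [microblog mentions before premiere,
# search volume in the three days after premiere]; target: on-demand volume.
X = np.array([[1200, 8000], [300, 1500], [5000, 20000],
              [800, 4000], [2500, 9000]], dtype=float)
y = np.array([90000, 12000, 250000, 40000, 110000], dtype=float)

model = LinearRegression().fit(X, y)
pred = model.predict(X)

# Compare the predicted ranking with the real on-demand ranking.
rho, _ = spearmanr(pred, y)
print(f"Spearman correlation of predicted vs. real ranking: {rho:.2f}")
```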
  • SHEN Lianghao,WU Qingbo,YANG Shazhou
    Computer Engineering. 2015, 41(8): 13-17. https://doi.org/10.3969/j.issn.1000-3428.2015.08.003
    Distributed storage systems are widely used in data centers because of their high performance and scalability, yet most of them are not energy-efficient. Based on Ceph, this paper analyzes the disadvantages of its data layout for energy saving and proposes a power group partition algorithm to increase the energy-saving proportion, builds a multi-level power mode for Ceph and proposes a multi-level power management strategy, and, on the basis of these two points, designs and implements a power management framework to manage Ceph's power dynamically. Experimental results show that this framework can reduce the power consumption of Ceph effectively while quality of service and data availability are preserved.
  • REN Kankan,QIAN Xuezhong
    Computer Engineering. 2015, 41(8): 18-22,31. https://doi.org/10.3969/j.issn.1000-3428.2015.08.004
    The user similarity measures in traditional collaborative filtering algorithms calculate the similarity between users from their commonly rated items, and the data sparsity of the user-item rating matrix leads to low accuracy of the similarity calculation. In view of this problem, a novel user similarity measure is proposed. The method calculates user similarity with a Jaccard similarity coefficient improved by a correction formula, considers the relationship between the common items and all items rated by the users, and takes into account the impact of differences between the users' ratings on the common items, thereby obtaining a more accurate user similarity matrix. Experimental results show that this method improves prediction accuracy compared with the Cosine(COS) similarity method, the Adjusted Cosine(ACOS) similarity method, etc.
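    The following Python sketch illustrates the general idea of weighting a rating-based similarity by a Jaccard term over the users' rated item sets. The exact correction formula of the paper is not reproduced here; the particular combination below is only an assumption for illustration.

```python
import numpy as np

def jaccard_weighted_similarity(ratings_u, ratings_v):
    """Illustrative user similarity: a Jaccard factor over the rated item
    sets multiplied by a rating-agreement term on the common items.
    ratings_u, ratings_v: dict mapping item id -> rating."""
    items_u, items_v = set(ratings_u), set(ratings_v)
    common = items_u & items_v
    if not common:
        return 0.0
    # Jaccard coefficient: common items relative to all items of both users
    jaccard = len(common) / len(items_u | items_v)
    # Rating agreement on the common items (1 when ratings are identical)
    diffs = np.array([ratings_u[i] - ratings_v[i] for i in common], dtype=float)
    agreement = 1.0 / (1.0 + np.mean(np.abs(diffs)))
    return jaccard * agreement

u = {1: 5, 2: 3, 3: 4}
v = {2: 4, 3: 4, 5: 2}
print(jaccard_weighted_similarity(u, v))
```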
  • CHEN Liping,GUO Xin
    Computer Engineering. 2015, 41(8): 23-31. https://doi.org/10.3969/j.issn.1000-3428.2015.08.005
    To address the idle consumption and luxury consumption problems of current cloud computing platforms, and the fact that traditional graph mining methods cannot handle massive data, this paper puts forward a dynamic graph mining method for a minimum-energy-consumption optimization cloud. It gives a cloud computing energy measurement formula and theoretically analyzes the rationality of two types of task scheduling policy, considering system energy optimization and system operation efficiency at the same time. Under the condition that the system operates well, it converts the system energy optimization problem into system cost control and proposes a total-cost objective function, based on which an adaptive computing resource allocation algorithm and a minimum-energy-consumption optimization cloud model are designed. The paper changes the traditional serial execution mode of graph mining, proposes a large-scale dynamic graph mining method based on the MapReduce model, and applies this method to the minimum-energy-consumption optimization cloud to improve the overall utilization efficiency of the entire system. Experimental results show that the method is effective, feasible and efficient, and that the energy consumption of the whole mining system is obviously lower than before, especially for big graphs.
  • ZHANG Ying,YANG Maishun,ZHANG Xingjun,GUO Qingwei
    Computer Engineering. 2015, 41(8): 32-36. https://doi.org/10.3969/j.issn.1000-3428.2015.08.006
    This paper presents an Instruction-level Fault Injection Method for the Linux Kernel(IFIMK), which tests the reliability and robustness of a Linux operating system. The proposed method can inject faults into specific kernel instructions and their running environment, collect the resulting information, and summarize the fault models that lead to system failure, so as to test the reliability and robustness of a computer system; the faults can be designed by testers according to their specific needs. Fault injection tests on the Linux kernel of version 2.6.32 prove that IFIMK can inject faults into the kernel efficiently. With this method, sensitivity statistics of the Linux system for different faults can be obtained.
  • HU Yuexiang,WEI Yehua,GUAN Zhuan
    Computer Engineering. 2015, 41(8): 37-41. https://doi.org/10.3969/j.issn.1000-3428.2015.08.007
    In order to realize automatic control of coke oven locomotives, this paper combines laser correlation sensor technology with absolute value encoder technology and designs an address detector for coke oven locomotives. The detector identifies the code card by using the laser correlation tube to obtain the coke oven number and the absolute address, and gets the offset address from the output of the absolute value encoder. The error of the absolute value encoder is calibrated with the absolute address by the LPC2148 processor, and the final address is then transmitted to the Programmable Logic Controller(PLC) on the coke oven locomotive. The detector realizes real-time acquisition of the coke oven furnace number, dynamic acquisition of the locomotive address, and display and uploading of the locomotive address and furnace number information. Experimental results show that the address detector has a simple structure, low cost and strong anti-interference ability, and can meet the address detection and location requirements of coke oven automation.
  • LAI Xiaoping
    Computer Engineering. 2015, 41(8): 42-45. https://doi.org/10.3969/j.issn.1000-3428.2015.08.008
    Online advertising is one of the most important profit models of Internet companies; it aims to maximize profit through auctions and prediction of users' Click-through Rate(CTR) while serving user requests in a timely manner. To achieve this purpose, this paper proposes a Logistic-regression-based parallel model for online advertising. It uses Logistic regression to model the users' CTR. The model contains long-term factors, including linear and quadratic factors, and a short-term context factor, and the computation of its parameters is derived from the Bayesian posterior. Thompson sampling and pre-computation are used to improve the computational efficiency of the model. Experimental results show that, compared with related work, the proposed model has better prediction accuracy and quicker convergence, and thus better computational efficiency.
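    A minimal sketch of Thompson sampling with a Bayesian logistic regression CTR model is shown below, assuming an independent Gaussian posterior over the weights (a common Laplace-style approximation); the feature design, posterior update rule and pre-computation of the paper are not reproduced, and the ad features are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Assumed posterior over logistic regression weights: independent Gaussians.
mu = np.zeros(4)        # posterior means
sigma2 = np.ones(4)     # posterior variances

def thompson_select(candidate_features):
    """Sample one weight vector from the posterior and pick the ad whose
    predicted CTR is highest under that sample."""
    w = rng.normal(mu, np.sqrt(sigma2))
    ctrs = sigmoid(candidate_features @ w)
    return int(np.argmax(ctrs)), ctrs

# Three candidate ads, each described by 4 (hypothetical) features.
ads = np.array([[1.0, 0.2, 0.0, 1.0],
                [1.0, 0.9, 1.0, 0.0],
                [1.0, 0.5, 0.5, 0.5]])
chosen, ctrs = thompson_select(ads)
print("chosen ad:", chosen, "sampled CTRs:", np.round(ctrs, 3))
```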
  • TIAN Jinhua,WEI Changbao
    Computer Engineering. 2015, 41(8): 46-51. https://doi.org/10.3969/j.issn.1000-3428.2015.08.009
    In order to improve the efficiency of the cache resource pool and reduce the total chip area of multicore processors, a 3D multicore structure is introduced. This structure implements a runtime application-aware job allocation and cache sharing policy. It improves energy efficiency in the 3D multicore structure by providing flexible heterogeneity of cache resources and dynamically allocating jobs to the cores. It pairs applications with contrasting cache usage and partitions the cache resources based on the cache hungriness of the applications. Experimental results demonstrate that the proposed policy improves system performance and reduces energy consumption and chip area. Compared with a 3D multicore processor based on static cache, the proposed 3D structure improves the system Energy-delay Product(EDP) and Energy-delay-area Product(EDAP) by up to 36.9% and 57.2% respectively.
  • HUANG Guobing,LI Ruiling,LI Huali,WANG Qiong
    Computer Engineering. 2015, 41(8): 52-54,60. https://doi.org/10.3969/j.issn.1000-3428.2015.08.010
    In the μC/OS-II task scheduling algorithm, the current highest priority in the ready table is obtained by looking up OSUnMapTbl. According to the definitions of the ready group and ready list of μC/OS-II, the generating mechanism of OSUnMapTbl is analyzed and derived inversely, and a mathematical expression for generating OSUnMapTbl is obtained. The execution efficiency of the new task scheduling algorithm is improved by accessing the ready group and ready table with the internal bit-addressable area and bit operation instructions of the MCS-51, as well as its multi-branch case structure. Running tests of the assembly code of the improved algorithm in an electronic oil pressure controller verify the correctness of the improved algorithm and show that its execution efficiency is improved significantly.
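    For reference, the following Python sketch reproduces the standard generation rule of μC/OS-II's OSUnMapTbl (each entry is the index of the lowest-order set bit of the byte, with entry 0 defined as 0) and the usual two-step priority resolution over OSRdyGrp/OSRdyTbl. It models the published μC/OS-II mechanism only, not the MCS-51 assembly optimization described in the paper.

```python
def lowest_set_bit(x):
    """Index (0..7) of the lowest-order 1 bit of a byte; 0 for x == 0."""
    for bit in range(8):
        if x & (1 << bit):
            return bit
    return 0

# OSUnMapTbl: 256-entry lookup table used by the uC/OS-II scheduler.
OSUnMapTbl = [lowest_set_bit(i) for i in range(256)]

def highest_ready_priority(os_rdy_grp, os_rdy_tbl):
    """Classic uC/OS-II priority resolution: the group index comes from
    OSRdyGrp, the bit index from the corresponding OSRdyTbl entry."""
    y = OSUnMapTbl[os_rdy_grp]
    x = OSUnMapTbl[os_rdy_tbl[y]]
    return (y << 3) + x

# Example: tasks with priorities 10 and 35 ready -> highest priority is 10.
grp = (1 << 1) | (1 << 4)                # groups 1 and 4 have ready tasks
tbl = [0] * 8
tbl[1] = 1 << 2                          # priority 1*8 + 2 = 10
tbl[4] = 1 << 3                          # priority 4*8 + 3 = 35
print(highest_ready_priority(grp, tbl))  # -> 10
```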
  • TONG Haiqi,BAO Xiuguo,TUO Yupeng,YUAN Qingsheng,YE Jianwei
    Computer Engineering. 2015, 41(8): 55-60. https://doi.org/10.3969/j.issn.1000-3428.2015.08.011

    In high-speed network environments, fast and accurate packet classification is of great significance to novel networks. Based on the idea of dimension decomposition, and combining single-step mapping with hashing, this paper designs and implements a Hash algorithm based on Dimension Decomposition(HDD). The algorithm first ensures accurate packet classification, and then significantly accelerates searching and improves performance by introducing a hash table that maps between the rules and the data stream. Experimental results show that the average number of memory accesses of the HDD algorithm is lower than that of the Hierarchical Space Mapping(HSM) algorithm and the Recursive Flow Classification(RFC) algorithm by 86% and 60% respectively; moreover, HDD saves 8% of space usage compared with RFC when the number of rules exceeds 2 500.

  • ZHANG Min,LI Bin
    Computer Engineering. 2015, 41(8): 61-64,70. https://doi.org/10.3969/j.issn.1000-3428.2015.08.012
    In a Cognitive Radio Network(CRN), it must be ensured that the primary users' Quality of Service(QoS) is not affected, and in addition the secondary users' QoS for basic communication should be guaranteed. Based on call-level and packet-level policies, this paper proposes a cross-layer model and a performance analysis approach for cognitive radio networks built on queuing theory. Formulas for the call blocking probability and packet delay of secondary users are derived. Moreover, in order to evaluate the accuracy of the proposed analytical models, simulations of the cognitive radio networks are performed. Simulation results show that the proposed analytical models can accurately evaluate the performance of cognitive radio networks, and that the call blocking probability and packet delay increase with the arrival rates of primary and secondary users.
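    As a point of reference for the call-level analysis, the sketch below computes the classical Erlang-B blocking probability by its standard recursion. This is only the textbook single-class formula, not the cross-layer model with primary and secondary users derived in the paper; the arrival rate, holding time and channel count are arbitrary assumptions.

```python
def erlang_b(offered_load, channels):
    """Erlang-B blocking probability via the standard recursion
    B(0) = 1, B(k) = a*B(k-1) / (k + a*B(k-1))."""
    b = 1.0
    for k in range(1, channels + 1):
        b = offered_load * b / (k + offered_load * b)
    return b

# Example: arrival rate 4 calls/min, mean holding time 0.5 min, 5 channels.
a = 4 * 0.5                       # offered load in Erlangs
print(f"blocking probability: {erlang_b(a, 5):.4f}")
```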
  • DANG Xiaochao,WANG Hongmei,HAO Zhanjun
    Computer Engineering. 2015, 41(8): 65-70. https://doi.org/10.3969/j.issn.1000-3428.2015.08.013
    Aiming at the different coverage-degree demands of different regions of a target area, this paper combines areal density with virtual forces and presents a three-dimensional coverage algorithm for Wireless Sensor Network(WSN). It designs a density-dependent model in which density represents the importance of the different sub-regions, and calculates the average density of the entire region through communication with neighbor nodes. According to the density of nodes in a region, the virtual resultant forces are calculated and the deployment range is readjusted. Experimental results show that, compared with the self-organizing coverage algorithm based on virtual forces, the proposed algorithm can effectively improve the coverage degree of high-density sub-regions and the coverage efficiency of the entire region, and reduce the energy consumption of node deployment.
  • QI Xiaoxuan,GUO Tingting,JIA Zhiyong
    Computer Engineering. 2015, 41(8): 71-75. https://doi.org/10.3969/j.issn.1000-3428.2015.08.014
    Aiming at the cross-term problem in the Wigner-Ville distribution, an elimination approach based on Fast Independent Component Analysis(Fast-ICA) is proposed. Fast-ICA is applied to the mixed signal to extract a number of independent components, and Wigner-Ville analysis is then performed on each independent component. The analysis results of the components are linearly superposed to reconstruct the overall Wigner-Ville distribution of the original signal, in which there are fewer or even no cross-terms. The presented method provides another way of eliminating cross-terms, and simulation results show that it can eliminate the cross-terms in the Wigner-Ville distribution of the mixed signal, with the advantages of fast convergence, high real-time performance and good time-frequency concentration.
  • ZHOU Xiong,CHEN Guobin
    Computer Engineering. 2015, 41(8): 76-81. https://doi.org/10.3969/j.issn.1000-3428.2015.08.015
    Because the objective function of Maximum Likelihood(ML) estimation is non-convex and therefore has more than one local extremum, a convex relaxation localization algorithm for Wireless Sensor Network(WSN) is proposed. Second Order Cone Programming(SOCP) and Semi-Definite Programming(SDP) are used to relax the non-convexity of the maximum likelihood estimation, and the Cramer-Rao lower bound expression of the Root Mean Square Error(RMSE) is derived. For three different WSN scenarios, positioning schemes with different convex relaxations are proposed, which improves the robustness of the algorithm. Simulations and comparisons with existing schemes show that the proposed schemes improve the RMSE performance without greatly increasing the computational complexity of the algorithm.
  • SUN Li,TIAN Jinhua
    Computer Engineering. 2015, 41(8): 82-88. https://doi.org/10.3969/j.issn.1000-3428.2015.08.016
    Aiming at the high latency of existing XY routing algorithms in Network-on-Chip(NoC), a new fault-tolerant and congestion-aware adaptive routing algorithm for NoC is proposed. A distributed approach is employed to partition the regular NoC architecture into regions controlled by local monitoring units. Each local monitoring unit runs a shortest-path computation procedure to identify the best routing path, so that highly congested routers and faulty links are avoided while latency is improved. To react dynamically to continuously changing traffic conditions, a shortest-path computation method based on a ball-and-string model is proposed; together with the decentralized region-based routing approach, it leads to minimal hardware overhead. Experimental results based on an actual Verilog implementation demonstrate that the proposed adaptive routing algorithm significantly improves network throughput compared with traditional XY routing and the DyXY adaptive algorithm.
  • XIAO Heng,LV Shaohe
    Computer Engineering. 2015, 41(8): 89-94,99. https://doi.org/10.3969/j.issn.1000-3428.2015.08.017
    Existing wireless channel contention mechanisms have several shortcomings: the collision rate is high and the required time is long, so coordination efficiency is quite low. This paper proposes a Parallel Frequency-domain Contention(PFC) mechanism. By utilizing the large number of sub-carriers provided by orthogonal frequency division multiplexing, channel contention can be completed in the frequency domain. A node with multiple antennas can declare its own transmission needs and priorities in the same time slot in which it listens to the behavior of other nodes. Simulation results show that, compared with IEEE 802.11, when there is no synchronization error the collision probability is less than 1% and the contention time is reduced by 50%~80%; when there are hidden terminals or synchronization errors, the throughput of the proposed mechanism is increased by 10%~80%.
  • WANG Xinyan,YANG Bo
    Computer Engineering. 2015, 41(8): 95-99. https://doi.org/10.3969/j.issn.1000-3428.2015.08.018
    Focusing on the disadvantages of traditional tree-based algorithms, such as many query cycles, long response time and large communication overhead, a novel tag anti-collision algorithm called the Multi-ary Query Tree(MQT) scheme is proposed. The algorithm uses a mapping table to make the arbitration process feasible for multiple bits, overcomes the drawbacks of traditional single-bit arbitration and reduces the number of queries. An analytic model is developed for the response time needed to identify all tags, and the optimal multi-ary tree for the minimum average response time is derived. Theoretical analysis and simulation results show that MQT outperforms other tree-based protocols in terms of time complexity and communication overhead.
  • FENG Minjia,WANG Yulong
    Computer Engineering. 2015, 41(8): 100-104,109. https://doi.org/10.3969/j.issn.1000-3428.2015.08.019
    This paper is concerned with fault detection filter design for continuous-time Networked Control System(NCS) under network-induced delays, packet dropouts and communication constraints. The network-induced delays and packet dropouts in the sensor-to-controller channel and the communication constraints in the controller-to-actuator channel are considered simultaneously; an observer-based fault detection filter is adopted to generate the residual signal, and the fault detection model for the continuous-time NCS is then established. By defining an appropriate Lyapunov functional and adopting the convex analysis method, an observer-based fault detection filter design criterion is established. Theoretical analysis illustrates the reduced conservatism of the proposed design criterion. The designed fault detection filter guarantees the sensitivity of the residual signal to faults and the robustness of the considered systems to disturbances. Simulation results illustrate the effectiveness of the proposed detection method.
  • ZHOU Wenqian,MA Yan,LI Shunbao,ZHANG Xiangfen,ZHANG Yuping
    Computer Engineering. 2015, 41(8): 105-109. https://doi.org/10.3969/j.issn.1000-3428.2015.08.020
    The widespread use of Quick Response(QR) codes has greatly improved social convenience, so the security of their information becomes more significant. In order to protect QR codes, a new encryption method is proposed. To address the shortcomings of the traditional RC4 algorithm, a self-error-detection phase is added to resist fault induction attacks, and the randomness of the Pseudo-random Generation(PRG) phase of RC4 is improved by adding a chaos algorithm; these two improvements are combined into an improved RC4 algorithm, which is then applied to QR code information encryption to improve its security and reliability. The improved algorithm is tested with two kinds of methods, and the results show that the improved RC4 algorithm can well satisfy the security and reliability requirements of QR codes.
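    For context, a plain-Python sketch of standard RC4 (key scheduling plus the pseudo-random generation phase) is given below. The paper's self-error-detection step and chaos-based perturbation are not part of the standard algorithm and are only indicated by a comment, since their exact construction is not specified here.

```python
def rc4_keystream(key, length):
    """Standard RC4: KSA followed by the pseudo-random generation phase."""
    # Key Scheduling Algorithm (KSA)
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    # Pseudo-random generation phase; the improved scheme of the paper would
    # perturb this state with a chaotic sequence and add self-error detection
    # here (details omitted).
    i = j = 0
    out = []
    for _ in range(length):
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(S[(S[i] + S[j]) % 256])
    return bytes(out)

plaintext = b"QR code payload"
key = b"secret key"
cipher = bytes(p ^ k for p, k in zip(plaintext, rc4_keystream(key, len(plaintext))))
print(cipher.hex())
```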
  • FENG Bin,YUAN Qiongqiong,GUO Cheng,LI Mingchu,DI Yafeng
    Computer Engineering. 2015, 41(8): 110-114. https://doi.org/10.3969/j.issn.1000-3428.2015.08.021
    Existing data-hiding schemes that utilize the properties of DNA sequences have the disadvantages of expanding the length of the reference DNA and a high modification rate. Aiming at these problems, this paper proposes a new data-hiding scheme based on DNA sequences. It utilizes the complementary rule of nucleotides, designates an injective mapping between the complementary rule and the number of secret bits, and uses the interval between the two changed nucleotides to hide the secret data. The capacity of the proposed scheme depends on the secret data, but its average capacity is higher. At the same time, the scheme does not expand the length of the reference DNA sequence, and its modification rate is very low, so it allows the receiver to obtain a better-quality fake DNA sequence. Analysis and experimental results show that this scheme has higher security compared with the substitution method and the interpolation method.
  • HUANG Gan,LIU Tao,GUAN Yawen
    Computer Engineering. 2015, 41(8): 115-119. https://doi.org/10.3969/j.issn.1000-3428.2015.08.022

    To reduce the resource use of sensors and enhance the security of Wireless Sensor Network(WSN), a trust-based authentication scheme is proposed. It calculates node trust by introducing time slices, the coefficient of safe operations and the frequency of interaction. This makes it difficult for selfish nodes to masquerade as normal nodes, makes trust behavior closely related to the current node, and prevents nodes from achieving high trust through only a few interactions. Then, by combining identification, passwords and smart cards, a user authentication scheme is designed. Before a user authenticates with a sensor node, the gateway node queries the trust of nodes and finds a trusted node. The optimized authentication scheme is used for the interaction among nodes, gateway nodes and users, and users can change their passwords easily. Security analysis, performance analysis and simulation results show that, compared with previously proposed user authentication schemes, this scheme can resist replay attacks, inside attacks, masquerading, etc., while costing little time, so it is suitable for WSNs with high security and performance requirements.

  • QIU Bin,ZHANG Dan,WANG Zhida
    Computer Engineering. 2015, 41(8): 120-126. https://doi.org/10.3969/j.issn.1000-3428.2015.08.023

    Current algorithms for synchronous real-time encryption of multiple images have low efficiency and high complexity and cannot meet the requirements of real-time transmission. A selective lossless real-time encryption algorithm for multiple images based on a significant-pixel composite matrix is therefore proposed. All pixels of the plaintexts are permutated by introducing a Zigzag mechanism; the image pixels are divided into important and unimportant pixels by defining a pixels-of-interest selection mechanism, producing several important-pixel matrices; and the composite matrix is obtained by designing an iterative complex model. Singular value decomposition is used to obtain the key matrix, and the cipher is formed by constructing a diffusion function to diffuse the significant-pixel composite matrix. The algorithm encrypts only the significant pixels of the images, avoiding the diffusion of non-significant pixels, which results in low complexity. Simulation results show that this algorithm has high security and is lossless; compared with other multi-image encryption mechanisms, its encryption/decryption efficiency is higher and it meets the requirement of real-time transmission.

  • ZHANG Xiaodan,LI Chunlai
    Computer Engineering. 2015, 41(8): 127-131. https://doi.org/10.3969/j.issn.1000-3428.2015.08.024
    Designing secure steganographic algorithms is still difficult. In order to obtain better image steganography results, a novel steganographic algorithm based on co-occurrence matrix features and the Discrete Cosine Transform(DCT) is proposed. A distortion function is designed by using the correlation of inter-block DCT coefficients, which can reflect the co-occurrence matrix feature; message embedding is completed with syndrome trellis codes using a ±1 changing mode, and simulation experiments are carried out to test the performance of the algorithm. Simulation results show that, compared with other steganographic algorithms, the proposed algorithm improves security, achieves outstanding performance against common detection methods, obviously improves speed, and produces stego images of better quality.
  • LI Xiangjun,ZHANG Huawei,ZHENG Siwei,HUO Yanli,ZHANG Xinping
    Computer Engineering. 2015, 41(8): 132-139. https://doi.org/10.3969/j.issn.1000-3428.2015.08.025
    In order to further improve the accuracy of measuring the outlier degree of data samples in Network Anomaly Detection(NAD), and to reduce the impact of noisy data on detection accuracy in complex network environments, this paper proposes a network anomaly detection algorithm based on relative neighborhood entropy, called Transductive Confidence Machines for Relative Neighborhood Entropy(TCM-RNE). The algorithm redefines the outlier degree and uses the neighborhood entropy as a new tool to measure it, improving the detection accuracy and noise immunity of the algorithm. Experimental results on the KDD Cup dataset show that, compared with the TCM-KNN algorithm, TCM-RNE considerably reduces the False Positive(FP) rate while maintaining a good True Positive(TP) rate. In addition, when the training dataset is contaminated by noisy data, the proposed algorithm still maintains very good detection performance.
  • WANG Daxing,TENG Jikai
    Computer Engineering. 2015, 41(8): 140-143,155. https://doi.org/10.3969/j.issn.1000-3428.2015.08.026
    Aggregate signature technology compresses multiple users' signatures into one signature, which is useful in areas where signatures on many different messages generated by many different users need to be aggregated, and it improves the efficiency of signature verification and transmission. However, current aggregate signature schemes have problems in computational efficiency, communication cost and security. A sequential aggregate signature scheme based on the Camenisch-Lysyanskaya(CL) signature is proposed, which is provably secure under the LRSW assumption without the random oracle model. Furthermore, compared with existing solutions, the new scheme has a short public key and short signatures, and it improves the computational efficiency of the signature verification algorithm.
  • GUO Hui,BAI Sen,YANG Yi,SONG Bin,LI Shuyun
    Computer Engineering. 2015, 41(8): 144-149,161. https://doi.org/10.3969/j.issn.1000-3428.2015.08.027
    Because of their good pseudo-randomness and security features, high-order M sequences are applied in the field of information security, so how to generate high-order M sequences quickly and efficiently is a research focus. A new recursive upgrade construction method for M sequences based on the de Bruijn graph is presented. Under the condition that a Hamiltonian cycle of the binary nth-order de Bruijn graph is known, and according to the theory that a Hamiltonian cycle of an nth-order de Bruijn graph can construct an nth-order M sequence while an Euler cycle can construct a higher-order M sequence, the method converts the M sequence into a Hamiltonian cycle in the de Bruijn graph and determines the complementary cycles of this Hamiltonian cycle. The complementary cycles consist of circles and loopbacks of diverse lengths; from them an Euler cycle that can construct an (n+1)th-order M sequence is obtained, from which a higher-order M sequence is generated by successive recursion. The generated high-order M sequences are tested with the NIST SP 800-22 random number test suite, and the results show that they have rather good randomness.
  • SHEN Junxin,GUO Xiaojun,WANG Wenhao,YANG Xu
    Computer Engineering. 2015, 41(8): 150-155. https://doi.org/10.3969/j.issn.1000-3428.2015.08.028
    In order to solve the problem of the high complexity of multi-point interface communication strategies under the traditional MapReduce framework, a k-means based, protocol group-reduced, secondary parallel clustering algorithm is proposed. A group membership management protocol is defined to manage the operational group members; through broadcast, delete and add operations on the group reference list pID, it realizes group-reduce based synchronous operation, which reduces the time complexity of the algorithm. The number of intermediate buffered clusters is defined and combined with the k-means algorithm to reduce the input data of the secondary parallel clustering and thus the amount of group operations, further reducing the time complexity of the algorithm. Simulation experiments on the test data sets show that, while guaranteeing sufficient clustering precision, the strategy greatly improves the efficiency of the algorithm.
  • TIAN Ran,SUN Linfu,WANG Nan,LI Binyong
    Computer Engineering. 2015, 41(8): 156-161. https://doi.org/10.3969/j.issn.1000-3428.2015.08.029
    For the vehicle-mounted bin packing problem with a variety of goods, multiple unloading points and multiple carriers, a vehicle-mounted bin packing model for multiple unloading points based on packing constraints is proposed. Based on finding the optimal packing order and the optimal path similarity, a multi-pheromone ant colony algorithm for solving the multi-unloading-point vehicle-mounted bin packing problem is designed. The algorithm improves the convergence speed by reducing the scale of ant routing based on two kinds of neighborhood, and further improves the convergence speed while avoiding local optima by using weights to control the proportion of local and global pheromone. Experimental results show that this algorithm is better than the greedy algorithm in volume utilization rate, capacity utilization rate and the number of vehicles, and converges faster than the single-pheromone ant colony algorithm.
  • KANG Jihua,ZHANG Qi
    Computer Engineering. 2015, 41(8): 162-167. https://doi.org/10.3969/j.issn.1000-3428.2015.08.030

    As users gain more freedom in formulating queries, existing semi-structured data retrieval methods can no longer meet users' requirements. A novel semi-structured retrieval model based on the factor graph model is proposed to solve this problem. The framework incorporates term weighting, Bayesian attribute mapping and edit-distance-based string similarity metrics to improve retrieval performance. A number of queries are randomly selected from the logs of a commercial search engine and manually labeled for analysis and evaluation. Experimental results show that this model can effectively improve the retrieval performance on semi-structured data compared with the Hierarchical Language Model(HLM), the Probability Retrieval Model for Semi-structured Data(PRMS), etc.

  • LING Haifeng,LIU Chaochao
    Computer Engineering. 2015, 41(8): 168-173. https://doi.org/10.3969/j.issn.1000-3428.2015.08.031
    Traditional clustering algorithms face a number of problems, such as memory limitations, a lack of parallelism, and the inability to handle distributed datasets. To deal with these problems, this paper proposes a parallel Ant Colony Optimization Clustering(ACOC) algorithm. The proposed algorithm handles big data by drawing on the ideas of the search-space replication approach and the search-space partition approach, and it reads the pheromone and the dataset line by line to avoid running out of memory when dealing with large datasets. Experimental results demonstrate that the algorithm has good scalability and high speedup when dealing with large-scale data.
  • XIA Zhuoqun,OU Hui,WU Zhiwei,FAN Kaiqin
    Computer Engineering. 2015, 41(8): 174-179. https://doi.org/10.3969/j.issn.1000-3428.2015.08.032
    The traditional k-means algorithm selects initial clustering centers randomly, and methods that use the manifold distance as the similarity measure lack global consistency on large-scale data. To address these problems, a hierarchical clustering algorithm based on attribute partitioning and curve distance is proposed. Based on the attribute partitioning idea of granular computing, the max-min distance method is used to select initial cluster centers, and a crude clustering is performed with k-means to obtain early-stage exemplars. According to a new distance measure, the curve distance, and a criterion function seeking large within-class similarity and small between-class similarity, the clusters are merged to obtain the expected exemplars, and each data point is assigned the label of its corresponding representative exemplar. Experimental results show that the algorithm has good global consistency on the data sets and reduced running time.
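    The max-min distance rule for picking initial k-means centers mentioned above is a standard technique; a minimal sketch is shown below. The attribute-partitioning and curve-distance steps of the paper are not reproduced, and the sample data are synthetic.

```python
import numpy as np

def max_min_init_centers(X, k, rng=None):
    """Max-min distance initialization: start from a random point, then
    repeatedly pick the point whose distance to its nearest chosen center
    is largest."""
    rng = np.random.default_rng(rng)
    centers = [X[rng.integers(len(X))]]
    for _ in range(k - 1):
        # distance of every point to its nearest already-chosen center
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centers], axis=0)
        centers.append(X[int(np.argmax(d))])
    return np.array(centers)

X = np.vstack([np.random.randn(50, 2), np.random.randn(50, 2) + 5])
print(max_min_init_centers(X, 2))
```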
  • XIONG Xiangguang,WEI Li
    Computer Engineering. 2015, 41(8): 180-185. https://doi.org/10.3969/j.issn.1000-3428.2015.08.033
    In order to improve the transparency and embedding capacity of traditional reversible watermarking algorithms based on histogram shifting, a novel reversible grayscale image watermarking algorithm based on a double-embedding inverse embedding method is proposed. The overflow/underflow problem that may occur when shifting pixels and embedding the watermark is solved by the histogram shifting method. The difference of every two adjacent pixels is calculated and a difference histogram is constructed from all the differences; the algorithm selects the maximum peak point of the difference histogram to embed the watermark signal. During the second embedding, the inverse method is used to offset some of the pixel expansion, which ensures that the embedding capacity increases while image distortion is reduced. Experimental results show that, compared with other similar algorithms, the proposed algorithm has better transparency and improved embedding capacity.
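    A minimal single-pass sketch of difference-histogram shifting on a 1-D pixel sequence follows, to show the basic embed step the paper builds on (peak selection, shifting, ±1 embedding). Boundary and overflow handling, the extraction step, and the paper's double-embedding inverse method are omitted.

```python
import numpy as np

def embed_diff_histogram(pixels, bits):
    """Embed bits by shifting the difference histogram at its peak value.
    pixels: 1-D integer sequence; bits: iterable of 0/1."""
    p = np.asarray(pixels, dtype=int)
    d = np.diff(p)                                        # adjacent differences
    peak = np.bincount(d - d.min()).argmax() + d.min()    # most frequent diff
    bits = list(bits)
    out_d = d.copy()
    bi = 0
    for i, v in enumerate(d):
        if v > peak:
            out_d[i] = v + 1              # shift the right side of the histogram
        elif v == peak and bi < len(bits):
            out_d[i] = v + bits[bi]       # embed one bit at the peak bin
            bi += 1
    # rebuild pixels from the first pixel plus the modified differences
    marked = np.concatenate(([p[0]], p[0] + np.cumsum(out_d)))
    return marked, peak

marked, peak = embed_diff_histogram([100, 100, 101, 103, 100, 100], [1, 0])
print(marked, "peak difference:", peak)
```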
  • HU Yan,WANG Huiqin,HUANG Dongyu,MA Zongfang
    Computer Engineering. 2015, 41(8): 186-189. https://doi.org/10.3969/j.issn.1000-3428.2015.08.034
    Aiming at the low generalization ability and high false alarm rate of existing pattern recognition algorithms that use fixed flame image features, an algorithm for adaptive selection of flame image features is proposed. According to the two basic principles of feature reduction, genetic optimization is introduced into the attribute reduction of Rough Set(RS) theory. The crossover and mutation ratios change with individual fitness so as to protect good individuals and eliminate bad ones; similar individuals are dynamically clipped and new individuals are added, increasing the diversity of the population and improving the global optimization ability of the Genetic Algorithm(GA). Experimental results show that the algorithm can reduce the dimension of the feature space, and the average recognition rate of flames is increased by 16% compared with an image fire detection algorithm based on Support Vector Machine(SVM).
  • CAO Miaoke,FANG Faming,XU Yingying,SHEN Chaomin
    Computer Engineering. 2015, 41(8): 190-195. https://doi.org/10.3969/j.issn.1000-3428.2015.08.035
    To overcome the defect that classical active contour models cannot accurately segment objects with fine details, this paper proposes a hybrid approach to natural image segmentation using region-based active contours and matting. Since localized contour methods can segment heterogeneous objects, which is a difficult task for a global method, the approach combines matting with the Chan-Vese model and adopts local-region segmentation. The closed-form matting result is used as a trimap in the localized Chan-Vese model, and the energy functional is built. The model is implemented with variational methods, and the optimal solution is obtained by iteration. Comparison results illustrate that the model can extract objects accurately; compared with the Chan-Vese model, it is notably robust and insensitive to the initial curve.
  • PENG Cheng,ZOU Changchun
    Computer Engineering. 2015, 41(8): 196-201. https://doi.org/10.3969/j.issn.1000-3428.2015.08.036
    In order to effectively acquire geological information from imaging logs, a method for automatic crack extraction based on an improved ant colony algorithm is proposed. The method generates an accumulator image from the movement of artificial ants on the images and segments the gray image according to the accumulator, realizing the acquisition of crack pixels. Combined with the Hough transform, it extracts the crack parameters. Images containing non-fracture information from ultrasonic imaging logs are processed with this method, and automatic crack extraction on actual images is realized. Results show that the improved ant colony algorithm can eliminate vertically extended non-fracture information and segment low-angle and horizontal cracks, improving the identification precision of automatic crack extraction.
  • WANG Lanzhong,ZHAO Peng,LI Chenglong,ZHONG Fan
    Computer Engineering. 2015, 41(8): 202-206. https://doi.org/10.3969/j.issn.1000-3428.2015.08.037
    For the problem of face recognition with noise, illumination changes and occlusion, Class Dependent Kernel Sparse Representation based Classification(CDKSRC) is proposed. The basic idea is that the redundant dictionary is composed of many sub-dictionaries and kernel techniques are used to improve the face recognition rate. The CDKSRC model is constructed from each class's sub-dictionary and an error matrix. Using the basic idea of Orthogonal Matching Pursuit(OMP), a Class-dependent Kernel Regularized Orthogonal Matching Pursuit(KROMP) technique is proposed to solve this model and obtain the sparse representation coefficients. The reconstruction error associated with each class is calculated from the sparse coefficients and the class sub-dictionary to classify the test sample. Compared with state-of-the-art methods, the proposed algorithm achieves a higher recognition rate and is robust to noise, illumination changes, occlusion, etc. Experimental results validate the effectiveness of the proposed algorithm.
  • SHI Yi,WANG Zhongyuan,HU Jinhui,YANG Cheng
    Computer Engineering. 2015, 41(8): 207-211,217. https://doi.org/10.3969/j.issn.1000-3428.2015.08.038
    With a single thread, the Average Bit Rate(ABR) control algorithm of the X264 encoder can use the bit deviation of all frames to adjust the Quantization Parameter(QP). With multiple threads, however, some frames before the current one have not yet been encoded, so only the deviation of the already encoded frames can be used to adjust the QP, which causes the real bit rate to deviate severely from the target. To solve this problem, this paper proposes a method to optimize rate control under multi-threaded conditions. The method estimates the deviation of the non-encoded frames from the actual deviation of the encoded frames, obtains a deviation of all frames as in the single-thread case, and then uses all frames' deviations to adjust the current frame's QP, thereby increasing the accuracy of rate control. Experimental results show that the actual bit rate produced by this method is closer to the target, the rate error is reduced by up to 2.27%, and the bit rate curve is steadier than before.
  • YANG Yan,LEI Yingsi,YUE Hui
    Computer Engineering. 2015, 41(8): 212-217. https://doi.org/10.3969/j.issn.1000-3428.2015.08.039
    The Synchronized Overlap-add(SOLA) algorithm for speech Time Scale Modification(TSM) neglects the natural characteristic of real speech signals that different kinds of speech segments change differently when the speech rate changes, and applies the same scaling factor to all speech segments; when the scaling proportion is large, the output speech signal is distorted. Aiming at this problem, a greedy adaptive algorithm is proposed. The algorithm applies different scaling factors to different speech segments and changes the scaling factors dynamically, which further reduces the deviation from the overall modification proportion. Experimental results show that, in comparison simulations under Matlab on speech from the TIMIT corpus, this algorithm improves the naturalness of the synthesized speech compared with existing algorithms such as the Waveform Similarity Synchronized Overlap-add(WSOLA) algorithm and the Time Domain Pitch Synchronized Overlap-add(TDPSOLA) algorithm, and the scaled time deviation of the greedy adaptive algorithm is small.
  • HUANG Jiechen,NI Ming
    Computer Engineering. 2015, 41(8): 218-222,226. https://doi.org/10.3969/j.issn.1000-3428.2015.08.040
    Video transcoding is one of the important technologies of Video On-demand(VOD) websites. In order to solve the problems of large overhead and long processing time when videos are transcoded on a single node, and based on a study of different types of video transcoding solutions, a distributed video transcoding solution is proposed. The whole transcoding process is divided into three parts: video splitting, video transcoding and video merging. Splitting and merging are done on a single node, while the split video segments are transcoded in parallel on several nodes by a Hadoop MapReduce program that invokes the video processing tool FFmpeg. Experimental results show that, compared with a single node, the proposed distributed solution running on 8 nodes saves about 65% of the transcoding time; they also show that the transcoding time changes with the size of the video segments.
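    As a rough illustration of the split/transcode/merge pipeline (not the paper's Hadoop implementation), the following Python sketch drives FFmpeg through subprocess; the segment length, codec settings and file names are arbitrary assumptions.

```python
import subprocess, glob

def split(src, seconds=60):
    """Cut the source into fixed-length segments without re-encoding."""
    subprocess.run(["ffmpeg", "-i", src, "-c", "copy", "-f", "segment",
                    "-segment_time", str(seconds), "seg%03d.mp4"], check=True)
    return sorted(glob.glob("seg*.mp4"))

def transcode(segment):
    """Transcode one segment; in the distributed setup this step would run
    as a map task on a worker node."""
    out = segment.replace("seg", "enc")
    subprocess.run(["ffmpeg", "-i", segment, "-c:v", "libx264",
                    "-crf", "23", out], check=True)
    return out

def merge(segments, dst="output.mp4"):
    """Concatenate transcoded segments with FFmpeg's concat demuxer."""
    with open("list.txt", "w") as f:
        f.writelines(f"file '{s}'\n" for s in segments)
    subprocess.run(["ffmpeg", "-f", "concat", "-safe", "0", "-i", "list.txt",
                    "-c", "copy", dst], check=True)

if __name__ == "__main__":
    merge([transcode(s) for s in split("input.mp4")])
```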
  • LI Ao,LI Yibing,LIN Yun
    Computer Engineering. 2015, 41(8): 223-226. https://doi.org/10.3969/j.issn.1000-3428.2015.08.041
    This paper proposes a novel image compressive reconstruction algorithm based on Approximate Message Passing(AMP). The algorithm derives the formulation in the wavelet domain under the AMP framework and shows that the objects the filter function acts on are the wavelet coefficients of the image. It uses the wavelet transform to increase the sparsity of the processed objects and incorporates the Wiener filter function to decrease the computational complexity. Experimental results demonstrate that, compared with the gradient projection reconstruction algorithm and the orthogonal matching pursuit algorithm, the proposed method is easy to implement and shows advantages in both reconstruction accuracy and visual quality.
  • LIN Xianghong,ZHANG Ning,CUI Wenbo,FENG Lixia
    Computer Engineering. 2015, 41(8): 227-232. https://doi.org/10.3969/j.issn.1000-3428.2015.08.042
    Using the Time-to-First-Spike(TTFS) coding strategy, a novel image segmentation method based on a spiking neural network model is proposed. The image pixel values are encoded into the spike times of the neurons in the input layer. The encoded results are delivered to the middle layer of the spiking neural network through different receptive fields, and spikes are triggered by a threshold condition. The spike times of the neurons in the output layer are divided into two categories by a segmentation threshold. Image segmentation experiments show that changes in the receptive field size, threshold potential and segmentation threshold have significant impacts on the segmentation result, which is evaluated by the maximum Shannon entropy. Compared with the maximum between-cluster variance method and the maximum-entropy-based pulse coupled neural network method, the proposed method is more robust to image noise in image segmentation.
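    A minimal sketch of time-to-first-spike encoding of pixel intensities is shown below, assuming a simple linear mapping onto an encoding window (brighter pixels fire earlier); the network layers, receptive fields and threshold dynamics of the paper are not modeled.

```python
import numpy as np

def ttfs_encode(image, t_max=100.0):
    """Map each pixel intensity (0..255) to a first-spike time in [0, t_max]:
    high intensity -> early spike, low intensity -> late spike."""
    img = np.asarray(image, dtype=float) / 255.0
    return t_max * (1.0 - img)

patch = np.array([[0, 128, 255],
                  [64, 192, 32]])
print(ttfs_encode(patch))   # spike times within the encoding window
```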
  • FENG Bao,QIN Chuanbo
    Computer Engineering. 2015, 41(8): 233-237. https://doi.org/10.3969/j.issn.1000-3428.2015.08.043
    Independent Component Analysis(ICA) is widely used in functional Magnetic Resonance Imaging(fMRI) data analysis. However, recent studies show that the independence assumption of ICA-based methods is sometimes violated in practice. In order to overcome this problem, and considering the characteristics of fMRI data, this paper proposes a new blind separation method for brain image data that exploits the sparsity and non-negativity of the sources. Compared with the independence assumption, the sparsity and non-negativity assumptions are considered more realistic for fMRI data. Based on these assumptions, the new method estimates the source components by finding the extreme points of the convex set constructed from the observed fMRI data. Numerical results show that the voxels selected by the proposed method are more related to the task function and more easily interpretable.
  • LI Xiaohua,WANG Yujie,YANG Li,NIE Juan,LIAN Shibin,YUAN Lei
    Computer Engineering. 2015, 41(8): 238-243,251. https://doi.org/10.3969/j.issn.1000-3428.2015.08.044

    Mountainous environmental factors are complicated and changeable, and the stability and reliability of radio frequency signal propagation are severely affected, so data collection strategies based on traditional radio propagation models may not deliver the expected performance or may even become unusable. In view of this, this paper proposes a Wireless Sensor Network(WSN) data acquisition system based on a multi-attribute evaluation model. The architecture and working mechanism of the system are introduced, the key implementation techniques of cluster head election and next routing hop selection are described, and the performance of the system is verified by simulation and analysis. The results show that the system can measure the comprehensive evaluation indexes and thus determine the cluster head and the next routing hop scientifically and reasonably, and it can better adapt to the data collection needs of precision management of mountainous orchards.

  • SU Mengmeng,XIA Yinshui,CHU Zhufei
    Computer Engineering. 2015, 41(8): 244-251. https://doi.org/10.3969/j.issn.1000-3428.2015.08.045
    In order to effectively improve the yield of CMOS/nanowire/MOLecular hybrid(CMOL) circuits, this paper proposes a defect-tolerant mapping algorithm based on segmented serpentine coding. According to the clustering characteristics of the defect distribution, a classification method for defect cluster distributions is presented to realize segmented serpentine coding. It avoids defective cells under the connectivity domain constraints of the nano-array, improves the success rate of mapping circuit edges, and obtains optimized initial mapping solutions. It sets constraint-violation penalties for mapping edges, establishes an objective function, and employs an adaptive genetic algorithm to search the solution space, realizing defect-tolerant circuit mapping. Tests on the ISCAS89 benchmark circuits show that, compared with existing defect-tolerant mapping algorithms, the proposed algorithm performs well in terms of run time, circuit scale and mapping success rate.
  • SUN Chen,ZHAO Yiqiang,LIU Qiang,LI Xu
    Computer Engineering. 2015, 41(8): 252-255,261. https://doi.org/10.3969/j.issn.1000-3428.2015.08.046

    The Cascaded Integrator Comb(CIC) filter is often used as a decimator or interpolator in broadband communication chips because of its simple structure and high efficiency. With the development of communication systems and very large scale integrated circuits, chip integration density becomes higher, so it is important to optimize the area of the CIC filter. This paper designs a CIC interpolation filter for a wireless broadband radio frequency chip. The design reduces the bit width of the internal nodes of the filter through bit-width optimization. In addition, for gain calibration, the design uses Canonic Signed Digit(CSD) code multiplication and truncates the output data's bit width, instead of two's complement multiplication. Experimental results show that the filter reduces the area of the multipliers by 58% compared with the pre-optimization CIC interpolation filter.

  • CHEN Yong,ZHANG Wei,HU Xiaohui
    Computer Engineering. 2015, 41(8): 256-261. https://doi.org/10.3969/j.issn.1000-3428.2015.08.047
    Aiming at the dynamically changing train tracking interval under moving-block conditions, an improved Event-B model of high-speed train tracking operation is built. The new method combines Agent theory with the Event-B Machine to improve its ability to describe dynamic properties, and the model realizes a formal definition of multiple high-speed trains following one another. Simulation studies the influence of speed changes during train following and of different interval times on line traffic, and the corresponding quantitative conclusions are obtained. Simulation results show that the proposed improved Event-B model can formally describe the complex train control system, helps to dynamically control train spacing, and is feasible and effective.
  • PENG Xindong,YANG Yong
    Computer Engineering. 2015, 41(8): 262-267,272. https://doi.org/10.3969/j.issn.1000-3428.2015.08.048
    In order to effectively portray the non-membership information of hesitant fuzzy soft set parameters in decision making, the concept of the dual hesitant fuzzy soft set is proposed, based on the non-membership characteristic of the dual hesitant fuzzy set and the parameterization of the soft set. On the basis of this definition, operations on dual hesitant fuzzy soft sets such as complement, union, intersection, extended intersection, restricted union, AND, OR, difference, average and geometric operations are defined, and the corresponding operation results are presented. The basic properties of the dual hesitant fuzzy soft set are also presented and discussed. Finally, a correlation coefficient algorithm based on the dual hesitant fuzzy soft set is developed, which can solve decision problems involving non-membership in dual hesitant fuzzy soft sets.
  • XI Liang,XIE Kefu
    Computer Engineering. 2015, 41(8): 268-272. https://doi.org/10.3969/j.issn.1000-3428.2015.08.049
    In order to fuse gray images focused on different objects into one completely clear image, this paper presents a reconfigurable decomposition method for digital images based on quantum mechanics theory and proposes a new multi-focus image fusion algorithm. The normalized gray image is expressed in the form of quantum bits, a quantum correlation system is established, and the image is decomposed into several characteristic sub-images. According to the different meanings of the characteristic sub-images, different fusion rules are used, and the fusion image is obtained by reconstruction. Experimental results show that the method obtains clearer fused images than the traditional weighted average method and the image fusion method based on the wavelet transform.
  • FENG Jinhai,YANG Lianhe,JIANG Xinlong
    Computer Engineering. 2015, 41(8): 273-278,285. https://doi.org/10.3969/j.issn.1000-3428.2015.08.050
    Positioning methods based on Wireless Local Area Network(WLAN) can provide relatively accurate indoor positioning information, but they cannot effectively utilize the hidden information in user trajectories. This paper presents a recommendation algorithm based on indoor user trajectory clustering. It combines WLAN positioning technology to extract Points of Interest(POI) from a user trajectory, uses the DBSCAN algorithm to find and extract user POI features, and employs a Decision Tree(DT) algorithm to realize user classification and personalized recommendation services. An indoor POI recommendation system based on the WeChat platform is designed and implemented to verify the validity of the proposed algorithm; it can provide personalized and content-based recommendation services for users.
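    For illustration, the sketch below clusters hypothetical indoor position fixes with scikit-learn's DBSCAN to obtain candidate POI centers (stay regions). The distance threshold, minimum sample count and the trajectory data are all assumptions, and the decision-tree classification step is not shown.

```python
import numpy as np
from sklearn.cluster import DBSCAN

# Hypothetical WLAN position fixes (x, y) in meters along one trajectory.
positions = np.array([[1.0, 1.1], [1.2, 0.9], [0.9, 1.0],      # stay region A
                      [10.1, 5.0], [10.3, 5.2], [9.9, 4.8],    # stay region B
                      [5.0, 20.0]])                             # isolated fix

labels = DBSCAN(eps=1.0, min_samples=3).fit_predict(positions)

# Each non-noise cluster centroid is treated as a candidate POI.
for c in set(labels) - {-1}:
    print("POI", c, "center:", positions[labels == c].mean(axis=0))
```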
  • MA Zhenyuan,LIANG Yubin,LI Jun
    Computer Engineering. 2015, 41(8): 279-285. https://doi.org/10.3969/j.issn.1000-3428.2015.08.051
    Aiming at the low accuracy of existing methods for the two-echelon vehicle routing problem, an Optimal Cutting Algorithm(OCA) and a full-path-matching crossover Memetic algorithm are proposed and combined. First, according to the coupling characteristics of the first- and second-echelon distribution, OCA is used to determine suboptimal capacities for the transfer stations, which serve as the basis for optimizing the distribution to customers. Second, in order to improve the efficiency of the algorithm, the full-path-matching crossover Memetic algorithm is proposed, and the hill-climbing method is used for local search. OCA and the improved Memetic algorithm are executed in sequence, realizing synchronized optimization of transfer station capacity and two-echelon customer distribution. Experimental results show that, compared with the Branch-and-Cut and Multi-start algorithms, the proposed optimization algorithm achieves better performance in both convergence precision and convergence speed.
  • YANG Benchen,WANG Cuiqin,WANG Xinrui
    Computer Engineering. 2015, 41(8): 286-290,295. https://doi.org/10.3969/j.issn.1000-3428.2015.08.052
    Most existing localization algorithms suffer from low accuracy, low coverage and high cost. This paper proposes a positioning mechanism based on improved Received Signal Strength Indication (RSSI) and a tetrahedral model. The mechanism determines the space tetrahedron in which an unknown node lies according to the spatially improved RSSI localization algorithm, and derives the position of the unknown node by converting its volume coordinates within the tetrahedron, so that the unknown node can be located. The positioning mechanism is then applied to underground coal mines. Simulation comparison with an underground localization algorithm based on the projection model shows that this algorithm achieves higher accuracy and positioning coverage, and also reduces network cost overhead by cutting node energy consumption.
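    Two building blocks implied above, converting RSSI to a range estimate and computing the volume (barycentric) coordinates of a point inside a tetrahedron, can be sketched as follows. The path loss parameters A and n are illustrative assumptions, and the paper's improved RSSI algorithm itself is not reproduced.
```python
import numpy as np

def rssi_to_distance(rssi, A=-45.0, n=2.5):
    """Log-distance path loss model: rssi = A - 10*n*log10(d)."""
    return 10 ** ((A - rssi) / (10 * n))

def volume_coordinates(p, verts):
    """Barycentric (volume) coordinates of p w.r.t. tetrahedron vertices."""
    v0, v1, v2, v3 = (np.asarray(v, float) for v in verts)
    T = np.column_stack((v0 - v3, v1 - v3, v2 - v3))
    lam = np.linalg.solve(T, np.asarray(p, float) - v3)
    return np.append(lam, 1.0 - lam.sum())   # four weights, sum to 1

print(round(rssi_to_distance(-65.0), 2))     # range estimate in metres
verts = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
print(volume_coordinates((0.25, 0.25, 0.25), verts))
```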
  • LIU Yuanhong,LIU Jianmin,FENG Fuzhou,JIANG Pengcheng
    Computer Engineering. 2015, 41(8): 291-295. https://doi.org/10.3969/j.issn.1000-3428.2015.08.053
    A construction method for fault diagnostic strategies based on the rollout algorithm and an information heuristic function is proposed. It is used to construct binary-value and multi-value test diagnostic strategies and aims to overcome shortcomings of traditional methods, such as poor universality. The main idea is to obtain a new strategy from a benchmark strategy by applying the rollout algorithm and the information heuristic function, and then to take the new strategy as the benchmark for the next update, so that the optimal strategy is approached gradually through iterative computation. The validity of the proposed method is verified on binary-value and multi-value test optimization cases, and its time complexity is analyzed. The results show that the proposed method is applicable to binary-value, multi-value and uncertain tests, and also has high diagnostic precision and moderate time complexity, which makes it suitable for diagnostic strategy optimization of complex systems.
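    The rollout idea can be sketched generically: score each candidate action by its immediate cost plus the cost of following the benchmark strategy afterwards, and pick the best. The toy fault-isolation model below (tests that split a candidate set, unit test costs, worst-case branching) is a hypothetical stand-in for the paper's diagnostic model.
```python
# Generic rollout over a benchmark (base) strategy.
def rollout_policy(state, actions, step, is_terminal, base_policy, cost):
    def cost_to_go(s):
        total = 0.0
        while not is_terminal(s):
            a = base_policy(s)              # follow the benchmark strategy
            total += cost(s, a)
            s = step(s, a)
        return total
    return min(actions(state),
               key=lambda a: cost(state, a) + cost_to_go(step(state, a)))

# Toy demo: isolate one fault among candidates using tests that split the
# candidate set; every test costs 1 and the worst-case branch is kept.
tests = {"t1": {"f1", "f2"}, "t2": {"f1", "f3"}, "t3": {"f1"}}

def actions(state):
    return [t for t, pos in tests.items() if 0 < len(state & pos) < len(state)]

def step(state, t):
    inside, outside = state & tests[t], state - tests[t]
    return frozenset(max(inside, outside, key=len))   # pessimistic branch

def is_terminal(state):
    return len(state) <= 1 or not actions(state)

def base_policy(state):
    return actions(state)[0]                # benchmark: first applicable test

def cost(state, action):
    return 1.0

print(rollout_policy(frozenset({"f1", "f2", "f3", "f4"}),
                     actions, step, is_terminal, base_policy, cost))
```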
  • XU Guangxian,WU Wei,ZHOU Jia
    Computer Engineering. 2015, 41(8): 296-300. https://doi.org/10.3969/j.issn.1000-3428.2015.08.054
    Network coding technology has great advantages in improving network throughput and transmission efficiency. However, it requires additional coding operations and increases coding overhead. To reduce the coding overhead by reducing the number of coding edges, this paper introduces a network coding optimization scheme based on a multi-objective niche genetic algorithm. The algorithm uses multi-objective optimization to construct the fitness function, which reduces the number of coding edges while taking network bandwidth utilization into account. In addition, adaptive crossover and mutation probabilities are used in the niche genetic operations to avoid invalid operations and improve efficiency. Experimental results show that the algorithm effectively reduces the coding overhead. Compared with the Simple Genetic Algorithm (SGA), it converges better and obtains fewer coding edges in a shorter time.
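    The adaptive crossover and mutation probabilities can be sketched in the style of the well-known Srinivas-Patnaik adaptive GA, where fitter-than-average individuals receive lower probabilities. The constants and the fitness model below are illustrative, and the paper's exact adaptation rule may differ.
```python
def adaptive_pc(f_better, f_avg, f_max, k1=0.9, k2=0.9):
    """Crossover probability for a pair whose better fitness is f_better."""
    if f_better < f_avg:
        return k1                                        # weak pair: full rate
    return k2 * (f_max - f_better) / (f_max - f_avg + 1e-12)

def adaptive_pm(f, f_avg, f_max, k3=0.1, k4=0.1):
    """Mutation probability for an individual with fitness f."""
    if f < f_avg:
        return k3
    return k4 * (f_max - f) / (f_max - f_avg + 1e-12)

# Example: higher fitness here stands for fewer coding edges.
print(adaptive_pc(f_better=0.8, f_avg=0.6, f_max=0.9))
print(adaptive_pm(f=0.9, f_avg=0.6, f_max=0.9))   # best individual: ~0 mutation
```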
  • CAI Xuan,WANG Changlin
    Computer Engineering. 2015, 41(8): 301-305. https://doi.org/10.3969/j.issn.1000-3428.2015.08.055
    The double 2-vote-2 redundant structure is the technological trend for on-board Automatic Train Protection (ATP) safety computers. For the dual-module synchronization problem, this paper achieves local clock synchronization of the two modules by using a public external clock to provide a reference clock signal and compensating the local clock drift based on a bounded clock drift rate model. On this basis, the system software uses task cycle scheduling control and dual-module communication to process the input, calculation and output tasks, synchronizing at each task point of the control cycle to achieve task synchronization. Software algorithms and inter-system communication are used to achieve cycle synchronization between the two 2-vote-2 subsystems. Experimental results show that the synchronization mechanism satisfies the requirements of the double 2-vote-2 safety computer platform.
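    The bounded-drift compensation can be sketched as follows: between reference pulses the local clock error grows at most at the drift rate bound, and each pulse cancels the accumulated offset. The drift bound and resync period are illustrative values, not the platform's figures.
```python
class DriftBoundedClock:
    def __init__(self, rho=50e-6):       # |drift rate| bound, e.g. 50 ppm
        self.rho = rho
        self.offset = 0.0                # accumulated offset vs. reference

    def local_time(self, reference_time):
        # Local reading drifts away from the reference at up to rho.
        return reference_time * (1.0 + self.rho) + self.offset

    def resync(self, reference_time):
        # On each external reference pulse, cancel the accumulated offset.
        self.offset -= self.local_time(reference_time) - reference_time

clock = DriftBoundedClock()
T = 0.1                                   # resync period in seconds
for k in range(1, 4):
    t = k * T
    error_before = clock.local_time(t) - t
    clock.resync(t)
    print(f"cycle {k}: error before resync = {error_before:.2e} s "
          f"(bound rho*T = {clock.rho * T:.2e} s)")
```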
  • ZHAO Dengbu,BAI Ruilin,SHEN Chenghui,LI Xin
    Computer Engineering. 2015, 41(8): 306-312. https://doi.org/10.3969/j.issn.1000-3428.2015.08.056
    For the trajectory of Point-to-Point (PTP) motion of a Selective Compliance Assembly Robot Arm (SCARA) robot, in order to make the trajectory smooth and near time-optimal during robot operation, this paper proposes a velocity trajectory planning method based on a time-delay exponential function. The time gain is obtained from the actuator limits, the displacement that each joint must travel from the initial position to the target position is obtained through the inverse kinematics of the robot, and the delay time is then derived from the joint-space displacement. After the exponential-function velocity trajectory expression is determined, it is compared with the S-curve velocity trajectory method through simulation. Results show that the formula of the proposed method requires little computation, the trajectory is smooth, and the motion time is approximately optimal.
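    As a loose illustration only, the sketch below builds a smooth velocity profile from a delayed exponential rise mirrored for braking; the paper's actual time-delay exponential expression and its time-gain derivation are not reproduced, and all parameters are hypothetical.
```python
import numpy as np

def delayed_exp_velocity(t, t_d=0.1, tau=0.2, v_max=1.0, t_total=2.0):
    """Rise exponentially after a delay t_d; mirror the shape for braking."""
    t = np.asarray(t, float)
    rise = np.clip(1.0 - np.exp(-(t - t_d) / tau), 0.0, None)
    fall = np.clip(1.0 - np.exp(-((t_total - t_d) - t) / tau), 0.0, None)
    return v_max * np.minimum(rise, fall)

t = np.linspace(0.0, 2.0, 9)
print(np.round(delayed_exp_velocity(t), 3))   # smooth accelerate/cruise/brake
```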
  • HE Xueming,MIAO Yannan,LUO Zailei
    Computer Engineering. 2015, 41(8): 313-316. https://doi.org/10.3969/j.issn.1000-3428.2015.08.057
    Tuning the parameters of a Proportional-Integral-Derivative (PID) controller means searching the three-parameter space for the values that give the system optimal control performance. Teaching-learning optimization is a new kind of swarm intelligence optimization algorithm; it is simple and easy to understand, has few parameters, and offers high solving speed, high precision and strong convergence ability. A new PID controller parameter tuning method based on the teaching-learning optimization algorithm is proposed: the PID parameters are optimized by teaching-learning optimization, and simulation examples are carried out in Matlab. Compared with PID parameter tuning methods based on the Particle Swarm Optimization (PSO) algorithm and the Genetic Algorithm (GA), the results show that this method is simple, achieves high precision, and can quickly and effectively realize self-tuning of the PID controller parameters.
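    The teacher and learner phases of teaching-learning optimization can be sketched for PID gain selection as below. The plant (a discrete first-order lag), the integral-absolute-error cost and the gain bounds are hypothetical stand-ins for the paper's Matlab examples.
```python
import numpy as np
rng = np.random.default_rng(0)

def iae_cost(gains, dt=0.01, steps=500):
    """Integral absolute error of a unit-step response of a first-order plant."""
    kp, ki, kd = gains
    y, integ, prev_e, cost = 0.0, 0.0, 1.0, 0.0
    for _ in range(steps):
        e = 1.0 - y
        integ += e * dt
        u = kp * e + ki * integ + kd * (e - prev_e) / dt
        prev_e = e
        y += dt * (-y + u)                        # plant: dy/dt = -y + u
        cost += abs(e) * dt
    return cost

lo, hi = np.array([0.0, 0.0, 0.0]), np.array([20.0, 20.0, 1.0])
pop = rng.uniform(lo, hi, size=(10, 3))
for _ in range(30):
    costs = np.array([iae_cost(p) for p in pop])
    teacher = pop[costs.argmin()].copy()
    mean = pop.mean(axis=0)
    for i in range(len(pop)):
        # Teacher phase: move toward the teacher, away from the class mean.
        tf = rng.integers(1, 3)                   # teaching factor 1 or 2
        cand = np.clip(pop[i] + rng.random(3) * (teacher - tf * mean), lo, hi)
        if iae_cost(cand) < costs[i]:
            pop[i], costs[i] = cand, iae_cost(cand)
        # Learner phase: learn from a randomly chosen classmate.
        j = rng.integers(len(pop))
        direction = pop[j] - pop[i] if costs[j] < costs[i] else pop[i] - pop[j]
        cand = np.clip(pop[i] + rng.random(3) * direction, lo, hi)
        if iae_cost(cand) < costs[i]:
            pop[i], costs[i] = cand, iae_cost(cand)
print("best gains (kp, ki, kd):", np.round(pop[costs.argmin()], 2))
```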
  • LIANG Bo,SONG Ying,WANG Bo,GUO Jian
    Computer Engineering. 2015, 41(8): 317-321. https://doi.org/10.3969/j.issn.1000-3428.2015.08.058
    This paper analyzes the current status of and challenges in temperature monitoring systems for the critical components of data centers. To address these challenges, a real-time, high-frequency, multi-point wireless temperature sampling system based on the nRF24L01 is designed, and its hardware circuit and software implementation are described. In the hardware circuit, the temperature sensors correspond to the I/O ports of the microcontroller unit, and these I/O ports are also used to identify the sensors, which improves the efficiency of wireless transmission. In the software implementation, the sampled data is compressed to reduce the amount of data transmitted, improve wireless transmission efficiency and reduce power consumption, and on this basis the transmission, display and storage of the temperature data are implemented. Measured results show that the system can simultaneously sample multiple points at a frequency of 1 Hz, with an error of less than 0.5 ℃ and a packet loss rate of less than 5%, while improving communication transmission efficiency.
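    The compression step is not specified in the abstract; as one plausible illustration, the sketch below delta-encodes 0.1 ℃ resolution samples so that each additional reading costs a single signed byte before radio transmission.
```python
def compress(samples_c):
    """Encode 0.1 degC-resolution samples as a base value plus deltas."""
    ticks = [round(s * 10) for s in samples_c]      # 0.1 degC units
    deltas = [b - a for a, b in zip(ticks, ticks[1:])]
    return ticks[0], deltas                         # deltas fit in int8 here

def decompress(base, deltas):
    ticks = [base]
    for d in deltas:
        ticks.append(ticks[-1] + d)
    return [t / 10.0 for t in ticks]

samples = [24.3, 24.4, 24.4, 24.6, 24.5]
base, deltas = compress(samples)
assert decompress(base, deltas) == samples
print(base, deltas)    # one full reading plus one byte per extra sample
```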