
15 June 2015, Volume 41 Issue 6
  • XIA Bin,WANG Guanghao,WU Yue
    Computer Engineering. 2015, 41(6): 1-5. https://doi.org/10.3969/j.issn.1000-3428.2015.06.001

    In vehicular networks, broadcast messages often fail to be delivered and received properly because of the vulnerability of wireless channels and the high mobility of vehicles. To solve this problem, this paper proposes an index-coding-based message broadcasting scheme that improves message transmission efficiency. Index coding is a variant of source coding that exploits the side information available at different receivers, and this paper focuses on applying index coding to message broadcasting in vehicular networks. It proposes a distributed, feedback-based side-information collection mechanism and an improved graph coloring algorithm to find the maximum clique, on which the index coding is then performed. Simulation results show that the scheme reduces the number of transmissions, thereby saving wireless channel bandwidth and improving broadcasting efficiency.
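    The abstract does not give the coding details; the minimal Python sketch below illustrates the core idea on assumed toy data: receivers whose side information mutually covers each other's wanted packets form a clique and can be served by one XOR-coded broadcast, and a greedy clique partition stands in for the graph-coloring step named above. All names and data are illustrative, not from the paper.

```python
# Hypothetical example: each receiver wants one packet and holds a set of
# side-information packets collected from earlier overheard broadcasts.
wants = {"r1": "p1", "r2": "p2", "r3": "p3"}
has = {"r1": {"p2", "p3"}, "r2": {"p1"}, "r3": {"p1", "p2"}}

def compatible(a, b):
    """Two requests can share one XOR-coded transmission iff each
    receiver already holds the packet the other one wants."""
    return wants[b] in has[a] and wants[a] in has[b]

def greedy_clique_partition(receivers):
    """Greedily group mutually compatible requests (cliques in the
    side-information graph); each group becomes one coded broadcast."""
    remaining = list(receivers)
    groups = []
    while remaining:
        clique = [remaining.pop(0)]
        for r in remaining[:]:
            if all(compatible(r, c) for c in clique):
                clique.append(r)
                remaining.remove(r)
        groups.append(clique)
    return groups

for group in greedy_clique_partition(wants):
    coded = " XOR ".join(wants[r] for r in group)
    print(f"broadcast {coded} -> serves {group}")
# Three uncoded transmissions shrink to two coded ones in this toy instance.
```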

  • ZOU Rong,CHEN Xiangxian,BIAN Jidong,TANG Zhifeng
    Computer Engineering. 2015, 41(6): 6-11. https://doi.org/10.3969/j.issn.1000-3428.2015.06.002

    Aiming at the currently low localization degree of China's urban rail transit Automatic Fare Collection (AFC) systems, a bill acceptor oriented to rail transit automatic fare collection is designed. The overall structure is highly reliable, which improves security and reduces banknote jams. On the hardware side, the design is modular, which eases product maintenance and upgrades; high-performance ARM processors in every independent module, a high-speed brushless DC motor, and an array of sensors ensure the processing speed of the system. On the software side, PCA dimensionality-reduction algorithms and Euclidean distance are used to build the recognition template. Test results show that the bill acceptor discriminates bills quickly and reliably and meets the engineering requirements on acceptance rate and jam rate. In addition, the acceptance speed is improved by more than 1 s, which addresses the problems of high service density and sensitive failure rate.

  • ZHANG Youpeng,WEI Lei,ZHAO Bin,ZHANG Fengxia
    Computer Engineering. 2015, 41(6): 12-17. https://doi.org/10.3969/j.issn.1000-3428.2015.06.003

    When the ballast environment is very poor, the working states of track circuits are difficult to differentiate in the spatial domain. In view of this, the time-domain finite difference method is used in this paper to obtain time-domain solutions of track circuits. The system of partial differential equations is discretized based on the theory of numerical solution of partial differential equations, and a difference formulation of track circuits is obtained. Time-domain responses of track circuits in the adjusted state are then analyzed according to the boundary conditions on voltage and current at the sending and receiving ends. Through an example, voltage changes at the receiving end in the adjusted state are simulated for different initial electrical parameters. Results show that the time-domain solutions conform to the transmission characteristics of track circuits, so this method can provide a theoretical basis for transient analysis of track circuits.

  • LIU Yingdong,NIU Huimin,WANG Jianqiang
    Computer Engineering. 2015, 41(6): 18-23. https://doi.org/10.3969/j.issn.1000-3428.2015.06.004

    On the basis of the NaSch (NS) model, a one-dimensional Cellular Automaton (CA) traffic flow model is proposed, in which the relationship between vehicle moving distance and vehicle speed is emphasized, the safety distance is considered, and the deceleration process and location update rules are redefined. The model determines the current vehicle speed from the moving distance and the previous vehicle speed. By computer simulation, the relationships among speed, density, and traffic volume are given to show the influence of moving distance on the traffic flow. The presence of the metastable state, phase separation, and hysteresis, which have been observed in real traffic, is revealed. Simulation results show that the model is reasonable and effective: vehicles on the road use road resources efficiently and the traffic flow is large.
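    For readers unfamiliar with the NaSch model that the abstract builds on, here is a minimal sketch of the base update rules (acceleration, gap-based braking, random slowdown, parallel movement); the paper's modified safety-distance and deceleration rules are not reproduced, and all parameters are illustrative.

```python
import random

def nasch_step(pos, vel, road_len, v_max=5, p_slow=0.3):
    """One parallel update of the classic NaSch cellular-automaton rules."""
    n = len(pos)
    order = sorted(range(n), key=lambda i: pos[i])
    new_pos, new_vel = pos[:], vel[:]
    for k, i in enumerate(order):
        ahead = order[(k + 1) % n]
        gap = (pos[ahead] - pos[i] - 1) % road_len      # empty cells ahead
        v = min(vel[i] + 1, v_max)                      # accelerate
        v = min(v, gap)                                 # brake to avoid collision
        if v > 0 and random.random() < p_slow:          # random slowdown
            v -= 1
        new_vel[i] = v
        new_pos[i] = (pos[i] + v) % road_len            # move
    return new_pos, new_vel

# Toy run: 10 cars on a 50-cell ring road.
random.seed(1)
pos = random.sample(range(50), 10)
vel = [0] * 10
for _ in range(100):
    pos, vel = nasch_step(pos, vel, 50)
print("mean speed:", sum(vel) / len(vel))
```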

  • WANG Zhipeng,LUO Xia
    Computer Engineering. 2015, 41(6): 24-27,32. https://doi.org/10.3969/j.issn.1000-3428.2015.06.005

    Current train energy-saving optimization considers only the coordination within an individual interval or a single train, which leads to poor applicability and weak guidance from the optimization results. Aiming at this problem, this paper takes the energy-saving optimization of a single line interval as the research object, establishes an interval train energy-saving optimization model, and solves the model with a genetic-annealing algorithm. It sets a time step length to realize the transition of the interval operation mode from time-saving to energy-saving, constructs a knapsack problem to realize the energy-saving optimization of the whole line by distributing the line reserve time while considering the time value of passengers, and solves it with an improved greedy algorithm. A simple line is taken as an example to illustrate the operability of the energy-saving optimization method. Experimental results show that the optimization method is effective: energy consumption is reduced by 41.54% compared with the time-saving mode, a good energy-saving effect.

  • XU Hongzhi,LI Renfa,ZENG Lining
    Computer Engineering. 2015, 41(6): 28-32. https://doi.org/10.3969/j.issn.1000-3428.2015.06.006

    Cyber Physical System (CPS), a new topic in computer science and technology research, is the fusion of computation, communication, and control. The Adaptive Cruise Control (ACC) system is regarded as a typical CPS with wide application prospects. This paper builds a mathematical model for the longitudinal motion of the car, and gives the architecture and the system state machine model of the ACC based on CPS theory. It designs models of the vehicle ahead, the own vehicle, and the distance between the two vehicles based on Ptolemy, and builds a hierarchical model of the system. In the submodels, it constructs the system behavior model by combining the time-based model and the state model through the modal model. Simulation results show that the method meets the requirements and ensures the security of the ACC system.

  • FENG Yan,CHEN Fuzan
    Computer Engineering. 2015, 41(6): 33-37,42. https://doi.org/10.3969/j.issn.1000-3428.2015.06.007

    The problem of Quality of Service (QoS)-based Web Service Composition (QWSC), i.e., selecting an optimal or satisfactory Service Composition Plan (SCP) from numerous candidate plans on the basis of QoS properties, is the most critical issue in service-oriented computing. In this paper, the QWSC problem is formulated as a Multi-Attribute Decision Making (MADM) problem. Furthermore, an intelligent evolutionary algorithm, the Genetic Algorithm based Compromise Ratio Method (GACRM), is developed to solve it. Combining the advantage of the Compromise Ratio Method (CRM) in ranking alternatives with the superiority of the Genetic Algorithm (GA) in global search, GACRM is capable of finding an approximately optimal solution in a massive search space. Experimental results show that GACRM is highly efficient and scalable for large-scale QWSC problems.

  • LI Xuezhu,CHEN Guolong
    Computer Engineering. 2015, 41(6): 38-42. https://doi.org/10.3969/j.issn.1000-3428.2015.06.008

    It is important to study big data analysis methods that find the relationships and rules in data in order to predict future trends. This paper presents a big data analysis and forecasting method based on the Support Vector Machine (SVM), which solves the problems of low precision and poor generalization in traditional prediction methods. A Probability Density Function (PDF) control based criterion for model parameter selection is proposed to make the modeling error track a target Gaussian PDF. A contraction Particle Swarm Optimization (PSO) algorithm is adopted to tune the parameters. The proposed modeling approach is validated on practical data, and the results show its efficiency. Compared with the Least Squares SVM (LSSVM), the accuracy of the proposed method is higher.

  • CAO Buqing,LIU Jianxun,TANG Mingdong,XIE Fenfang
    Computer Engineering. 2015, 41(6): 43-48,55. https://doi.org/10.3969/j.issn.1000-3428.2015.06.009

    With more and more Web API services released on the Internet, how to recommend Web APIs that developers are interested in and that have high reputation, so as to construct high-quality and trustworthy software service systems, becomes a challenging research problem. This paper presents a Web API service recommendation approach based on user usage history and reputation evaluation (WASR). It computes the similarity between user history records and Web API services to obtain the user interest degree. The service reputation degree is computed by considering the user score of the Web API, the score contributions of the Mashup services calling the Web API, and the traffic of the Web API based on statistical data from Alexa. Web API services are ranked and recommended according to their user interest degree and service reputation degree. Experimental results show that this approach recommends Web API services with a higher DCG of user interest degree than the SR-based approach, and a higher DCG of service reputation degree than the UI-based approach.

  • LEI Xiaofeng,LI Qiang,SUN Gongxing
    Computer Engineering. 2015, 41(6): 49-55. https://doi.org/10.3969/j.issn.1000-3428.2015.06.010

    A high-energy collider produces several billion events over its whole lifetime. Physical analysis selects thousands of meaningful events from them, a typical big data processing and data mining application. It is therefore important to design an efficient data structure and storage and access mechanism so that the meaningful events can be selected quickly. This paper introduces popular event data structures and storage and processing technologies, analyses the features of high-energy physics analysis, and proposes a new data storing and processing technology for high-energy physics. It utilizes HBase to store data, uses MapReduce to implement parallel processing, and selects ROOT and BEAN as the high-energy physics analysis framework. The paper also describes the specific design and implementation of the new platform. Test results show that, compared with the traditional data storage system of high-energy physics, the system processes data quickly and can use I/O and CPU resources effectively when reselection takes effect.

  • PENG Yuanhao,PAN Jiuhui
    Computer Engineering. 2015, 41(6): 56-60,65. https://doi.org/10.3969/j.issn.1000-3428.2015.06.011

    Change data can be captured by scanning log files in the database. However, the methods proposed in most existing studies have many limitations and can only be applied to certain types of DBMS. Moreover, they do not provide effective ways of eliminating redundant information. This paper, while analyzing these limitations, proposes a universal model based on log analysis for incremental data detection and its net effect processing, and describes the typical procedures for incremental detection: log extraction, log analysis, and net effect handling. Finally, experiments are conducted to analyze the net effect handling speed, redundant data compression ratio, network transmission speed, and other factors, which show that net effect handling helps to decrease the time of network transmission and data update, and improves operating efficiency.

  • REN Jianxin,ZHU Cuitao,LI Zhongjie,WANG Hanxin
    Computer Engineering. 2015, 41(6): 61-65. https://doi.org/10.3969/j.issn.1000-3428.2015.06.012

    To improve the performance of wideband spectrum sensing in doubly selective fading environments, this paper presents a fast, low-complexity linearized Bregman algorithm for wideband compressed spectrum detection based on cyclic spectrum estimation, and lists its specific implementation steps. The algorithm addresses the redundant iterative computation in the general linearized Bregman algorithm: it accelerates convergence by adding auxiliary variables that detect the iterations in which the residual stays almost constant, and updates the auxiliary variables to skip these redundant iterations, which also reduces the complexity of the algorithm. As a result, compared with the general linearized Bregman algorithm, the modified algorithm improves the compressive sampling reconstruction effect, the detection probability, and the convergence speed in doubly selective fading environments.

  • XU Wei,LIU Duanyang,BAO Zhanbing
    Computer Engineering. 2015, 41(6): 66-70. https://doi.org/10.3969/j.issn.1000-3428.2015.06.013

    Wireless Sensor Networks (WSNs) are widely used in military surveillance, environmental monitoring, and remote medical fields. However, because of size constraints, the monitoring environment, and other factors, wireless sensor nodes can carry only limited energy. Consequently, energy balancing becomes a hot topic in WSNs as a way to prolong network lifetime. Based on classic shortest-time divisible load scheduling, this paper optimizes the divisible load model of residual energy for star WSNs, and designs two load scheduling algorithms for star-topology, divisible-load WSNs: the Residual-energy-sorting Schedule (RESS) algorithm and the Virtual-Ability-Sorting Schedule (VASS) algorithm. Simulation results show that both RESS and VASS prolong the network lifetime effectively; moreover, VASS is more stable than RESS.

  • GE Weimin,ZHU Haiying,LI Juan
    Computer Engineering. 2015, 41(6): 71-75. https://doi.org/10.3969/j.issn.1000-3428.2015.06.014

    In order to improve the performance of the traditional Transmission Control Protocol (TCP) in wireless networks, this paper solves the throughput models of three TCP protocols, TCP Reno, TCP Vegas, and the network-coding-based TCP/NC, through Matlab, analyzes the theoretical values of the three models in different networks, and verifies whether the TCP/NC model can improve throughput. The three models are simulated in NS-2, and simulation results show the effectiveness of the TCP/NC protocol and its analysis model.

  • CAI Wenxue,QIU Zhucheng,HUANG Xiaoyu,XIAO Chaowu,CHEN Kang
    Computer Engineering. 2015, 41(6): 76-82. https://doi.org/10.3969/j.issn.1000-3428.2015.06.015

    As one of the most promising indoor positioning technologies, the WiFi-based fingerprinting model has attracted much research attention in recent years. However, most existing studies focus on single-point positioning, while actual applications often need to track the object trajectory. Hence, in this paper, a kernel-function-based Hidden Markov Model (HMM) is presented, built on the single-point positioning model. A Gaussian kernel function is used to calculate the likelihood between fingerprints. The transition probability is defined, and the search range is limited by a threshold value, which improves the efficiency of the algorithm. The HMM is then applied to trajectory positioning. Experimental results show that the proposed algorithm significantly outperforms the benchmark algorithms on both precision and distance measures.
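    A minimal sketch of the two ingredients named in the abstract, under assumed data and parameters: a Gaussian-kernel emission likelihood between RSS fingerprints, and a transition threshold that restricts the search to nearby reference points. A full Viterbi pass is replaced here by a greedy step for brevity.

```python
import numpy as np

def gaussian_kernel_likelihood(observed_rss, fingerprint_rss, sigma=4.0):
    """Emission likelihood between an observed RSS vector and a stored
    reference-point fingerprint, via a Gaussian kernel (sigma assumed)."""
    d2 = np.sum((np.asarray(observed_rss) - np.asarray(fingerprint_rss)) ** 2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def track(observations, fingerprints, coords, max_step=3.0):
    """Greedy HMM-style tracking: transitions are only allowed between
    reference points closer than max_step, the threshold that limits the
    search range and speeds the algorithm up."""
    path, prev = [], None
    for obs in observations:
        best, best_score = None, -1.0
        for j, fp in enumerate(fingerprints):
            if prev is not None and np.linalg.norm(coords[j] - coords[prev]) > max_step:
                continue                      # transition probability ~ 0
            score = gaussian_kernel_likelihood(obs, fp)
            if score > best_score:
                best, best_score = j, score
        path.append(coords[best])
        prev = best
    return path

# Toy reference points (RSS fingerprints and their 2D coordinates).
fps = np.array([[-50., -60.], [-55., -58.], [-70., -40.]])
xy = np.array([[0., 0.], [1., 0.], [5., 5.]])
print(track([[-51., -59.], [-54., -57.]], fps, xy))   # -> [0,0] then [1,0]
```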

  • LIU Wenfeng
    Computer Engineering. 2015, 41(6): 83-89. https://doi.org/10.3969/j.issn.1000-3428.2015.06.016

    Mobile phones with a rich set of embedded sensors enable sensing applications in various domains. This paper proposes to leverage cloud-assisted collaborative sensing to reduce sensing energy consumption in mobile phone sensing applications. It formally defines a minimum-energy sensing scheduling problem and presents a polynomial-time algorithm to obtain optimal solutions, which can be used to show the energy savings that collaborative sensing can potentially achieve in mobile phone sensing applications, and can also serve as a benchmark for performance evaluation. Under realistic assumptions, it presents two heuristic algorithms to find energy-efficient sensing schedules. Simulation results based on real energy consumption and location data show that collaborative sensing significantly reduces energy consumption compared with traditional approaches without collaboration, and the proposed heuristic algorithms perform well in terms of both total energy consumption and fairness.

  • WANG Qian,SI Hu,XIONG Yan,LI Mingxi
    Computer Engineering. 2015, 41(6): 90-95. https://doi.org/10.3969/j.issn.1000-3428.2015.06.017

    To address the low throughput of current ALOHA anti-collision algorithms, the Signal Strength Grouping Framed Slotted ALOHA (SSGFSA) algorithm is proposed. It exploits the energy transmission characteristics between the Radio Frequency Identification (RFID) reader and tags, improves the Framed Slotted ALOHA (FSA) algorithm by grouping tags, and distributes tags into frame slots according to received energy strength to reduce the probability of collision. Simulation results show that the maximum throughput of the algorithm reaches 50%, better than the theoretical values of the FSA algorithm and the Collision Grouping Algorithm (GCA), when the ratio of the tag number to the frame slot number is less than 1.8.
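    The grouping idea can be illustrated with a toy Monte-Carlo simulation of Framed Slotted ALOHA throughput: splitting one overcrowded frame into signal-strength groups whose sizes match the frame length raises throughput. The sketch uses only the classical FSA model; the 50% figure in the abstract comes from the paper's energy-based slot assignment, which is not reproduced here, and all numbers below are from this toy simulation.

```python
import random
from collections import Counter

def fsa_throughput(n_tags, n_slots, trials=2000):
    """Monte-Carlo throughput of Framed Slotted ALOHA: fraction of slots
    holding exactly one tag. Peaks near 1/e ~ 36.8% when n_tags ~ n_slots."""
    ok = 0
    for _ in range(trials):
        slots = Counter(random.randrange(n_slots) for _ in range(n_tags))
        ok += sum(1 for c in slots.values() if c == 1)
    return ok / (trials * n_slots)

# Grouping 128 tags by received signal strength into two groups of 64,
# each answering in its own 64-slot frame, beats one crowded frame.
random.seed(0)
print("plain FSA, 128 tags / 64 slots  :", fsa_throughput(128, 64))
print("two SS groups, 64 tags / 64 slots:", fsa_throughput(64, 64))
```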

  • CHEN Shuguang,ZHANG Yankang,ZHANG Zhen
    Computer Engineering. 2015, 41(6): 96-101. https://doi.org/10.3969/j.issn.1000-3428.2015.06.018

    Considering the throughput reduction caused by packet collisions in PRP-MAC, a novel Stream Control Multiple Access with Collision Avoidance (SCMA-CA) protocol for Multiple Input Multiple Output (MIMO) Ad Hoc networks is proposed in this paper. Based on the control packets defined in PRP-MAC, SCMA-CA introduces a new type of Clear to Send (CTS) packet, implements collision avoidance, and maximizes the stream number by employing MIMO technology. Simulations are conducted to evaluate the performance of the proposed protocol. The results show that SCMA-CA effectively eliminates collisions between control packets and data packets as the number of transmitters in the network increases. Beyond the throughput improvement, SCMA-CA can also be applied in antenna-heterogeneous environments, which enhances its applicability.

  • SUN Youwei,WANG Jinhai,FAN Huihui,ZHANG Huiqi
    Computer Engineering. 2015, 41(6): 102-109. https://doi.org/10.3969/j.issn.1000-3428.2015.06.019

    In order to solve the hot spot issue in Wireless Sensor Networks (WSNs) and to avoid premature network failure caused by local emergencies, an energy-efficient routing algorithm with a mobile node for WSNs is proposed in this paper. To save energy, the network is divided into several square virtual grids; a cluster head is selected by a weighted sum of the residual energy of a node and its distance to the grid center, and the sink node receives data from the cluster heads through a controllable moving scheduling strategy. Moreover, to extend the network lifetime, a mobile relay with unlimited energy is introduced to serve information transmission over large areas. The influence of two parameters, the movement speed of the sink node and the weighting coefficient, on the performance of the proposed algorithm is analyzed in detail by simulation. The results show that the network lifetime, the total energy consumption, and the total amount of data received by the sink node are better than those of the MSEERP and TTDD algorithms. The performance of the proposed algorithm is best when the movement speed of the sink node is 5 m/s and the weighting coefficient is 0.6.

  • LI Qinan,DONG Yijun,LI Jiao,OUYANG Zhifan
    Computer Engineering. 2015, 41(6): 110-115. https://doi.org/10.3969/j.issn.1000-3428.2015.06.020

    As anti-collusion parameter values increase, the coding efficiency of Cover Free Family (CFF) codes declines. To solve this problem, this paper presents a new anti-collusion digital fingerprint coding scheme and designs its code generation algorithm and detection algorithm. The former builds the new code with a small-parameter CFF code as the inner code and an I code as the outer code, so that its coding efficiency is equivalent to that of the embedded CFF code. The latter first locates the CFF inner code block in which the inspected code lies, then tracks the fingerprint characteristic codeword position to identify the traitor. Test results on an asymmetric digital fingerprint audio piracy tracking system show that the new code keeps the CFF code's resistance to a variety of collusion attacks, the success rate of traitor tracing is above 80%, no misjudgment occurs, and the scheme can provide non-repudiable legal evidence for convicting traitors.

  • JIN Ge,XUE Zhi,QI Kaiyue
    Computer Engineering. 2015, 41(6): 116-120. https://doi.org/10.3969/j.issn.1000-3428.2015.06.021

    Bootkits originate from Rootkits and can bypass most security software by loading malicious code during the Windows boot period. This paper uses a formal description to depict the procedure of malicious operation hiding and develops the notion of cooperative concealment. Because most Bootkits hide malicious PE files on disk, a PE matching algorithm is designed and implemented. The algorithm searches for byte sequences with specific patterns to find potentially hidden PE files, and experiments show that it achieves high detection accuracy on several Bootkit samples.

  • HE Fengying,ZHONG Shangping,XIAO Yulin
    Computer Engineering. 2015, 41(6): 121-125. https://doi.org/10.3969/j.issn.1000-3428.2015.06.022

    Aiming at the problems of high computational complexity and low classification accuracy on high-dimensional, large-sized data, this paper proposes a high-dimensional steganography blind detection method based on the Random Subspace Method and Principal Component Analysis (RSM-PCA) with a feature-weighted Support Vector Machine (SVM). The method randomly selects features from the original high-dimensional features to form feature subsets using the random subspace method; principal component analysis is applied to the feature subsets for feature extraction, and the chi-square statistic is used to measure the weights of the extracted features; the feature-weighted kernel function is used to train SVMs, and the final decision is made by majority vote. Experimental results on the HUGO steganographic algorithm show that this method effectively reduces the computational complexity of the SVM; compared with traditional algorithms, it improves the detection rate of steganalysis in JPEG images and classifies images faster.

  • QU Juan,PENG Yang,TAN Xiaoling,ZHANG Jianzhong
    Computer Engineering. 2015, 41(6): 126-129,135. https://doi.org/10.3969/j.issn.1000-3428.2015.06.023

    This paper analyzes a remote user authentication scheme based on biological features and quadratic residues, and points out that the scheme is vulnerable to impersonation attack, server spoofing attack, session key disclosure attack, and denial of service attack. To overcome these security flaws, the paper proposes a biological-features-based anonymous remote user authentication scheme with a smart card; the scheme mainly includes registration, login, authentication, and password update phases. Analysis results show that the proposed scheme not only solves the existing problems of the previous scheme, but also resists smart card loss attack and replay attack, and implements user anonymity.

  • ZHANG Wei,DU Weizhang
    Computer Engineering. 2015, 41(6): 130-135. https://doi.org/10.3969/j.issn.1000-3428.2015.06.024

    On the basis of existing secret sharing schemes based on the LUC cryptosystem, this paper proposes a new dynamic multi-secret sharing scheme. The scheme does not need a secure channel between the secret dealer and the members, and members and secrets can be dynamically added and deleted. In the secret recovery phase, cooperating members only need to submit shadow shares to the designated combiner, and the verifier can verify the validity of the shadow shares publicly. Thus the system does not need to renew secret shares when sharing multiple secrets or multi-group multi-secrets. The security of the scheme is proved in the random oracle model under the discrete logarithm assumption; the result shows that the scheme is semantically secure, and in terms of computation and security its overall performance is superior to traditional secret sharing schemes.

  • ZHAO Xiangmo,MIN Haigen,CHANG Zhiguo,XU Zhigang
    Computer Engineering. 2015, 41(6): 136-142. https://doi.org/10.3969/j.issn.1000-3428.2015.06.025

    In order to improve the intelligent management level of public transport systems, this paper proposes an automatic passenger flow statistics method. It uses a background extraction algorithm based on histogram statistics combined with multi-frame averaging to extract the video background, and uses a background edge removal algorithm to eliminate most of the background information. According to the circle-like feature of the passenger head contour, it employs a Hough transform based on gradient information and Camshift target tracking based on Kalman filtering prediction to complete passenger tracking and counting. Experimental results show that this method effectively eliminates background noise and background edges in the picture, identifies passenger targets to track and count them accurately, and improves the efficiency of urban public transportation.

  • SHEN Jian,JIANG Yun,ZOU Li,CHEN Na,HU Xuewei
    Computer Engineering. 2015, 41(6): 143-146. https://doi.org/10.3969/j.issn.1000-3428.2015.06.026

    The Directed Acyclic Graph Support Vector Machine (DAG-SVM) is a novel algorithm for multi-class classification. For an N-class problem, DAG-SVM constructs N(N-1)/2 SVM classifiers (one for each pair of classes), but it may perform poorly if its nodes are selected badly. To address this, a new method is proposed in which node selection establishes alternative sets of nodes for every layer and chooses, from the alternative sets, the node group with the highest training classification accuracy as the layer below the current layer, so as to optimize the topology structure of DAG-SVM. Experimental results show that, compared with methods such as standard DAG-SVM, 1-vs-1 SVM, and 1-vs-all SVM, the classification accuracy of this method is high.
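    A minimal sketch of how a DAG-SVM evaluates a sample: starting from the full candidate list, each pairwise classifier eliminates one class until a single class remains. The pairwise rule below is a toy stand-in for trained binary SVMs; the paper's contribution, selecting which classifier occupies each node, is not reproduced.

```python
def dag_svm_predict(x, classes, pairwise_predict):
    """pairwise_predict(x, a, b) returns the winning class, a or b."""
    candidates = list(classes)
    while len(candidates) > 1:
        a, b = candidates[0], candidates[-1]
        loser = b if pairwise_predict(x, a, b) == a else a
        candidates.remove(loser)          # one class eliminated per layer
    return candidates[0]

# Toy pairwise rule: the class whose 1D prototype is nearer wins.
protos = {0: -3.0, 1: -1.0, 2: 1.0, 3: 3.0}
pair = lambda x, a, b: a if abs(x - protos[a]) <= abs(x - protos[b]) else b
print(dag_svm_predict(0.7, [0, 1, 2, 3], pair))   # -> 2
```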

  • ZHOU Changxi,MAO Li,WU Bin,YANG Hong,XIAO Wei
    Computer Engineering. 2015, 41(6): 147-151. https://doi.org/10.3969/j.issn.1000-3428.2015.06.027

    An efficient modified Artificial Bee Colony (ABC) algorithm is proposed for function optimization problems to overcome the low computational accuracy and slow convergence of the conventional ABC algorithm. In this algorithm, to enhance local search capability and effectively avoid premature convergence, onlooker bees perform the local search around the current optimal solution, and the radius of the search around the current optimal solution for scout bees gradually decreases as the iterations increase. Simulation results on six standard functions show that, compared with the basic ABC algorithm, the modified ABC algorithm attains significant improvements in solution accuracy and convergence rate.
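    The modified local-search component described above can be sketched as follows; the full employed/onlooker/scout cycle of ABC is omitted, and the test function, radius schedule, and parameters are illustrative assumptions.

```python
import random

def sphere(x):                      # toy test function to minimize
    return sum(v * v for v in x)

def local_search_around_best(best, radius, f, tries=10):
    """Onlooker/scout step as described in the abstract: sample around
    the current best solution within a radius that shrinks over time."""
    improved = best
    for _ in range(tries):
        cand = [v + random.uniform(-radius, radius) for v in best]
        if f(cand) < f(improved):
            improved = cand
    return improved

random.seed(2)
best = [random.uniform(-5, 5) for _ in range(3)]
r0, iters = 2.0, 200
for t in range(iters):
    radius = r0 * (1 - t / iters)          # radius decays with iterations
    best = local_search_around_best(best, radius, sphere)
print("f(best) =", sphere(best))
```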

  • LU Zhigang,LIN Ka
    Computer Engineering. 2015, 41(6): 152-157. https://doi.org/10.3969/j.issn.1000-3428.2015.06.028

    In order to find the best combination of supply chain partners efficiently and accurately, a multi-objective programming model for supply chain partner selection is proposed in this paper. Maximizing each enterprise's reputation stability and fitness, and the reputation value of the combination of supply chain partners, are used as the objective functions; an elitist non-dominated sorting Genetic Algorithm (GA) is introduced to solve the problem, reducing the chance of weeding out excellent enterprises during the iterative process, so as to find the Pareto optimal solutions. Experimental results show that, compared with the independent decision model and the TOPSIS model, the model not only helps to choose stable partners but also maximizes the comprehensive utility of the supply chain.

  • YANG Changjian,DENG Zhaohong,JIANG Yizhang,WANG Shitong
    Computer Engineering. 2015, 41(6): 158-164. https://doi.org/10.3969/j.issn.1000-3428.2015.06.029

    In many practical epilepsy detection applications, the diversity of patients' health status and of the timing of Electroencephalogram (EEG) signal measurements leads to a mismatch between the source domain used for classifier training and the target domain used for testing, so classifiers usually do not perform well on the target domain. To overcome this shortcoming, an improved Principal Component Analysis (PCA) feature extraction method called Subspace Similarity Measure based Principal Component Analysis (SSM-PCA) is proposed, and a new classification method, SSM-PCA-LMPROJ, is proposed by integrating SSM-PCA with the classical Large Margin Projected Transductive Support Vector Machine (LMPROJ) classifier. Experimental results show that the proposed method has obvious advantages over traditional methods such as the combination of PCA and the K Nearest Neighbor (KNN) classifier.

  • XU Xiaolong,WANG Shitong,MEI Xiangdong
    Computer Engineering. 2015, 41(6): 165-171. https://doi.org/10.3969/j.issn.1000-3428.2015.06.030

    The traditional K-means clustering algorithm is sensitive to initialization, and spectral clustering operates on a similarity matrix that severely affects the clustering result. Clustering with local and global regularization does not take the distribution of the data set into consideration. To solve this problem, this paper introduces the dispersion matrix to improve clustering on the basis of local and global regularization. The proposed algorithm takes the distribution of the data set into consideration by combining local information with the dispersion matrix, also considers global optimality information, and arrives at a final optimization problem that can be solved by the eigenvalue decomposition of a sparse symmetric matrix. The mentioned algorithms are tested on UCI machine learning data sets and public data mining data sets. Experimental and comparison results show the superior performance of the proposed algorithm.

  • SHI Li,HUANG Ke,SUN Gang,WEN Bo
    Computer Engineering. 2015, 41(6): 172-177. https://doi.org/10.3969/j.issn.1000-3428.2015.06.031

    The consistency analysis of linguistic assessment matrices and the determination of objective expert weights are two important problems in linguistic multiple attribute group decision making, and they are studied as a whole in this paper. The consistency of a linguistic assessment matrix is defined based on the concept of the deviation of assessment matrices. Combining the optimization idea of group consistency, a weight optimization model based on the linguistic decision matrix is constructed, a numerical solution of the optimization model is presented, and a method of linguistic multiple attribute group decision making based on objective expert weights is put forward. A case study illustrates that the objective expert weights obtained by the proposed method achieve satisfactory consistency.

  • ZHU Qi,ZHANG Huifu,YANG Yubo,YANG Quanqing
    Computer Engineering. 2015, 41(6): 178-182,187. https://doi.org/10.3969/j.issn.1000-3428.2015.06.032

    Aiming at the enormous computational challenges facing the traditional Hierarchical Clustering (HC) algorithm, this paper proposes a fast clustering algorithm. The algorithm determines the initial cluster centers sequentially based on the data point density values. To remedy the drawback of HC that the distance matrix must be updated after every merge, it uses the minimum spanning tree algorithm to store the similarity distances between the initial cluster centers and find the optimal merging path, which reduces the computation and space complexity of updating the distance matrix, and it optimizes the convergence function. Experimental results on UCI datasets show that the algorithm is faster and more efficient than traditional clustering algorithms, and its advantage in time consumption becomes more obvious as the data grows.
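    A minimal sketch of the density-based initial-center selection step described above, under assumed data and an assumed density cutoff (the MST-based merging path is omitted):

```python
import numpy as np

def density_initial_centers(X, k, cutoff):
    """Pick k initial cluster centers in descending order of local
    density (number of neighbours within `cutoff`), skipping points
    already within `cutoff` of a chosen center, so centers spread
    across dense regions instead of piling up in one."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    density = (d < cutoff).sum(axis=1)
    centers = []
    for i in np.argsort(-density):
        if all(d[i, c] >= cutoff for c in centers):
            centers.append(i)
        if len(centers) == k:
            break
    return X[centers]

# Two toy Gaussian blobs; the two chosen centers land in dense regions.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, .5, (50, 2)), rng.normal(5, .5, (50, 2))])
print(density_initial_centers(X, 2, cutoff=2.0))
```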

  • HUANG Wei,LIN Jie,JIANG Yu’e,JIANG Binhua
    Computer Engineering. 2015, 41(6): 183-187. https://doi.org/10.3969/j.issn.1000-3428.2015.06.033

    Automatic classification of software bug reports saves a large amount of time and human resources. However, bug reports submitted by users are strongly subjective, with casual text descriptions, which makes classification ineffective. Two improved algorithms are proposed to reduce feature dimensionality when classifying bug reports from their text descriptions. The two algorithms are based on the traditional Term Frequency-Inverse Document Frequency (TF-IDF) algorithm, combined with the term frequency in documents and the distribution of the term within the same category and across different categories. A weighting step is applied after feature dimension reduction to obtain a better result. Experimental results indicate that the proposed algorithms outperform current algorithms in terms of precision, recall, F1 score, and accuracy.
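    The abstract does not give the exact weighting formula; the sketch below shows one plausible instantiation of the idea, classic TF-IDF multiplied by how concentrated a term's occurrences are in its dominant category. The documents, categories, and smoothing are toy assumptions, not the paper's data.

```python
import math
from collections import Counter

# Toy corpus of (text, category) bug reports.
docs = [("crash on save", "bug"), ("crash when loading", "bug"),
        ("add dark mode please", "feature"), ("crash dump attached", "bug")]

def weighted_tfidf(term):
    tf = sum(d.split().count(term) for d, _ in docs)
    df = sum(1 for d, _ in docs if term in d.split())
    idf = math.log((1 + len(docs)) / (1 + df)) + 1          # smoothed IDF
    # Category concentration: share of the term's occurrences falling in
    # its dominant category (1.0 means perfectly discriminative).
    per_cat = Counter(c for d, c in docs if term in d.split())
    concentration = max(per_cat.values()) / sum(per_cat.values())
    return tf * idf * concentration

for t in ("crash", "dark"):
    print(t, round(weighted_tfidf(t), 3))
```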

  • YI Tangtang,HUANG Lihong
    Computer Engineering. 2015, 41(6): 188-194,200. https://doi.org/10.3969/j.issn.1000-3428.2015.06.034

    A new algorithm that performs adaptive feature extraction and feature selection simultaneously is proposed to improve the performance of Content-based Image Retrieval (CBIR). The semantic gap between low-level visual features and high-level semantic information is reduced by synchronizing feature extraction and feature selection. A parameterized wavelet is used to improve the accuracy of image details. The mother wavelet function of the color histogram feature is optimized and the interval parameters are quantified using a multiple gravity search algorithm. Experimental results on 1,000 Corel images show that, compared with the most relevant algorithms, the fusion algorithm of the Gravitational Search Algorithm and Support Vector Machine (GSA-SVM) and the fusion algorithm of Fuzzy Color Histogram and Fuzzy String Matching (FCH-FSM), the retrieval accuracy is higher and the average time consumption is less.

  • DONG Que,WANG Jiangqing,SUN Yangguang
    Computer Engineering. 2015, 41(6): 195-200. https://doi.org/10.3969/j.issn.1000-3428.2015.06.035

    Traditional two-dimensional Otsu thresholding segmentation algorithms do not take human visual characteristics into account, so the segmentation result does not match the visual perception of the human eye. To solve this problem, an algorithm based on the two-dimensional Otsu algorithm and the lateral inhibition network is proposed. The algorithm makes full use of the lateral inhibition network of the human visual system, which enhances the center and inhibits the surroundings. The lateral inhibition network is used to process the original picture and obtain the lateral inhibition picture, a two-dimensional histogram based on the gray information and lateral inhibition information of pixels is established, and the maximum between-cluster variance is chosen as the criterion to select the optimal threshold. Experimental results show that this algorithm not only adapts well to contrast and illumination intensity, but also fits breaks better than the traditional Otsu algorithm and the two-dimensional Otsu algorithm. It improves robustness to image noise and obtains better segmentation results.

  • ZHANG Zijuan,KANG Baosheng
    Computer Engineering. 2015, 41(6): 201-205,210. https://doi.org/10.3969/j.issn.1000-3428.2015.06.036

    Seam carving algorithms of all kinds are widely applied to image scaling, but seam carving can distort the original image or delete content excessively. An improved adaptive image scaling algorithm is therefore proposed. Adjacent seams are merged using a low-pass filter; undeletable points are marked using a threshold technique so that new seams avoid passing through high-energy points in the next iteration; and a stopping criterion prevents seams from passing through the main target due to excessive iterations once seam carving has proceeded to a certain extent. Experimental results show that, compared with forward and backward seam carving algorithms, the improved algorithm avoids the distortion and excessive-deletion problems of seam carving without increasing time or space complexity: less distortion is produced and the scaling effects are better.
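    For context, the dynamic-programming seam search that such algorithms build on can be sketched as follows; the paper's merging, marking, and stopping steps would operate around this core, and the toy energy matrix is illustrative.

```python
import numpy as np

def min_vertical_seam(energy):
    """Dynamic-programming search for the minimum-energy vertical seam;
    marking pixels as undeletable (as the abstract proposes) would simply
    set their energy to +inf before this step."""
    h, w = energy.shape
    cost = energy.copy()
    for r in range(1, h):
        left = np.r_[np.inf, cost[r - 1, :-1]]
        right = np.r_[cost[r - 1, 1:], np.inf]
        cost[r] += np.minimum(np.minimum(left, cost[r - 1]), right)
    seam = [int(np.argmin(cost[-1]))]
    for r in range(h - 2, -1, -1):
        c = seam[-1]
        lo, hi = max(0, c - 1), min(w, c + 2)
        seam.append(lo + int(np.argmin(cost[r, lo:hi])))
    return seam[::-1]                      # column index per row, top-down

e = np.array([[1., 9, 1], [9, 1, 9], [1, 9, 1]])
print(min_vertical_seam(e))               # -> [0, 1, 0]
```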

  • QIAN Jie,YI Sanli,SHAO Dangguo,GUO Beibei,MIAO Ying
    Computer Engineering. 2015, 41(6): 206-210. https://doi.org/10.3969/j.issn.1000-3428.2015.06.037

    The probabilistic fiber tracking algorithm uses only the maximum probability to track fibers, ignores other orientations with large probabilities, and is slow. A fast threshold-based probabilistic fiber tracking algorithm is therefore proposed. This paper sets a threshold that allows more crossing and branching fibers to be found, and simplifies the calculation parameters to improve speed without affecting the fiber tracking effect. Experimental results show that this algorithm reflects the distribution of neural fibers in the cerebral white matter and reduces the calculation time compared with the probabilistic fiber tracking algorithm.

  • CAO Xiang,CHEN Xiuhong,PAN Ronghua
    Computer Engineering. 2015, 41(6): 211-215,220. https://doi.org/10.3969/j.issn.1000-3428.2015.06.038

    The traditional Super Resolution (SR) algorithm via over-complete sparse representation has several problems, such as too many training patches, long training and iteration times, and a fixed sparsity degree. In view of these disadvantages, a fast SR algorithm is proposed. Its core is to estimate the scale of the training patches by introducing Fast Kernel Density Estimation (FastKDE) to obtain a reasonable number of training patches in the dictionary learning stage, and to overcome the fixed sparsity degree of the greedy family of sparse representation algorithms and shorten the iteration time by using an improved Generalized Orthogonal Matching Pursuit (GOMP) algorithm in the sparse representation stage. Experimental results show that, compared with the traditional dictionary training algorithm, this algorithm improves the accuracy of SR reconstruction with less average iteration time.

  • YI Sanli,MIAO Ying,QIAN Jie,GUO Beibei,XIANG Yan
    Computer Engineering. 2015, 41(6): 216-220. https://doi.org/10.3969/j.issn.1000-3428.2015.06.039

    Common evaluation algorithms do not bring out the edge features that carry important information in an image. Although the Region of Interest (ROI) and dual-scale edge structure similarity algorithms take the importance of edge information into consideration, they are not ideal at identifying edge information. Based on the above, this paper proposes an edge structure similarity quality algorithm based on the region of interest. The algorithm divides the image into interesting and non-interesting regions, evaluates the two regions separately with the edge structure similarity quality algorithm, and combines the results with weights. Experimental results show that the algorithm has a stronger ability to identify edge information and is more sensitive to variations in image quality.

  • TANG Hao,LI Xiaoxia,ZHONG Ying
    Computer Engineering. 2015, 41(6): 221-226. https://doi.org/10.3969/j.issn.1000-3428.2015.06.040

    To address fast pedestrian detection in complex backgrounds, this paper proposes a two-stage cascade fast pedestrian detection algorithm. In the first stage, most non-pedestrian areas are excluded using the vertical edge symmetry feature and a weak classifier based on pedestrian prior knowledge. In the second stage, the remaining areas are detected accurately with Histogram of Oriented Gradient (HOG) features and a sparse representation classification algorithm based on LC-KSVD dictionary learning. Experimental results show that the algorithm ensures detection accuracy while shortening the pedestrian detection time, and it is also robust to occlusion. On the INRIA database, the average time required per image is only 69 ms and the log-average miss rate is 38%; compared with the CENTRIST + C4 algorithm and the HOG + SVM algorithm, the miss rates are lower and the detection speeds are higher.

  • WANG Min,ZHOU Zhaozhen,LI Changhua,WEI Mingfei,MAO Li
    Computer Engineering. 2015, 41(6): 227-230. https://doi.org/10.3969/j.issn.1000-3428.2015.06.041

    To address the issues of the Harris algorithm for feature point extraction in image processing, namely poor real-time performance, large computation, and sensitivity to noise, this paper proposes a Harris corner detection algorithm combined with pixel gray-level differences. In this algorithm, a point is tested by comparing it with the 16 adjacent pixels on the circumference of a circle of radius 3. The number of non-similar pixels is counted to judge whether the point is a candidate corner, and the Harris corner response function is then computed on the candidates to extract corners, combining the idea of the SUSAN algorithm to remove false corners. Experimental results show that the algorithm improves the original algorithm's real-time performance while increasing the number of extracted corners and effectively removing most false corners, and helps to improve the speed and accuracy of image corner extraction.
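    A minimal sketch of the circle pre-test described above, with assumed thresholds: a pixel passes to the (omitted) Harris response stage only if enough of the 16 pixels on a radius-3 circle differ from it, which cheaply eliminates flat and edge regions.

```python
import numpy as np

# 16 pixels on a Bresenham circle of radius 3, as in FAST/SUSAN-style tests.
CIRCLE = [(0,-3),(1,-3),(2,-2),(3,-1),(3,0),(3,1),(2,2),(1,3),
          (0,3),(-1,3),(-2,2),(-3,1),(-3,0),(-3,-1),(-2,-2),(-1,-3)]

def candidate_corners(img, t=20, min_diff=9):
    """Cheap pre-test: keep a pixel only if at least `min_diff` of the 16
    circle pixels differ from it by more than t; the full Harris response
    would then be computed for these candidates only."""
    h, w = img.shape
    cands = []
    for y in range(3, h - 3):
        for x in range(3, w - 3):
            c = int(img[y, x])
            n = sum(abs(int(img[y + dy, x + dx]) - c) > t
                    for dx, dy in CIRCLE)
            if n >= min_diff:
                cands.append((x, y))
    return cands

img = np.zeros((20, 20), dtype=np.uint8)
img[10:, 10:] = 255                         # a single bright corner block
print(candidate_corners(img))               # -> [(10, 10)]
```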

  • ZOU Yanni,LIU Xiaoping,LI Chunquan,HU Lingyan
    Computer Engineering. 2015, 41(6): 231-235. https://doi.org/10.3969/j.issn.1000-3428.2015.06.042

    In order to obtain a high-quality simulation effect with low time and space complexity and a stable system, an improved shape matching algorithm based on the Splat graphic element is presented. In the algorithm, a new Splat graphic element is adopted instead of the classical point element, and the surface of the object is seamlessly covered with the smallest number of Splats that ensures rendering quality, achieved by controlling the sampling density and automatically adjusting the radius of the circular Splats. The deformation of the Splats is calculated with the shape matching algorithm. Experimental results show that, for the same geometric model, the new algorithm reduces storage space by about 50% and improves computational efficiency by about a factor of two compared with the classical algorithm, and it is stable in dynamic simulation.

  • JIANG Xiaoliang,LI Bailin,DONG Yang,CHEN Shaojie,HE Biao,WANG Qiong
    Computer Engineering. 2015, 41(6): 236-239,253. https://doi.org/10.3969/j.issn.1000-3428.2015.06.043

    The region-based Local Binary Fitting (LBF) model has great advantages in segmenting images with intensity inhomogeneity, but because it considers only the statistical information of the original image, it cannot efficiently segment images with heavy noise. To overcome these problems, this paper proposes a novel local active contour model based on the original image and the difference image, combining difference image information with the intensity statistics of the original image. The energy function is constructed with a Gaussian function as the kernel function, and the contour is driven to the object boundaries using the gradient descent method. Experimental results show that the proposed model can handle images with noise and low signal-to-noise ratio, and is less sensitive to the initialization of the active contour than classical active contour models.

  • Yasen Aizezi,Aishan Wumaier
    Computer Engineering. 2015, 41(6): 240-246,257. https://doi.org/10.3969/j.issn.1000-3428.2015.06.044

    In moving target detection, to address the high complexity of existing algorithms, a background subtraction method based on a gray feature model is proposed. By extracting the gray features of the pixels in the video image, each pixel is represented by a set of gray features; the background/foreground state of the corresponding pixel is determined by computing the distance between the gray value of the pixel in the video image and the gray values in its feature set. Experimental results show that, with the same detection results, the gray-feature-model-based background subtraction method significantly increases the processing speed and reduces the time complexity of moving target detection.
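    A minimal per-pixel sketch of the gray-feature idea, with assumed thresholds and toy data: a pixel whose gray value is close to any stored feature is classified as background; otherwise it is foreground and its value may be absorbed into the feature set.

```python
import numpy as np

def update_and_classify(frame, model, dist=25, max_feats=5, learn=True):
    """Per-pixel gray-feature background model: background if the gray
    value is within `dist` of any stored feature, else foreground."""
    h, w = frame.shape
    fg = np.zeros((h, w), dtype=bool)
    for y in range(h):
        for x in range(w):
            feats = model[y][x]
            if any(abs(int(frame[y, x]) - f) <= dist for f in feats):
                continue                        # matches background
            fg[y, x] = True
            if learn and len(feats) < max_feats:
                feats.append(int(frame[y, x]))  # absorb slow changes
    return fg

h, w = 4, 4
model = [[[128] for _ in range(w)] for _ in range(h)]
frame = np.full((h, w), 130, dtype=np.uint8)
frame[1, 1] = 20                                # a "moving object" pixel
print(update_and_classify(frame, model).astype(int))
```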

  • ZHOU Yufeng,LI Zhi,ZHENG Bin
    Computer Engineering. 2015, 41(6): 247-253. https://doi.org/10.3969/j.issn.1000-3428.2015.06.045

    In order to improve the logistics system, a joint Location-Inventory Problem (LIP) model with lead time is built, considering disruption risks, stochastic demands, and facility capacity constraints. The goals are to minimize system cost and maximize customer satisfaction. A discrete nonlinear mixed integer programming model with two objectives is built to describe the problem, and an improved Non-dominated Sorting Genetic Algorithm (NSGA) based on niching technology is worked out to solve it. A numerical example and a control experiment indicate that the Pareto front solution set can be obtained and that the improved NSGA has obvious advantages over the standard NSGA. In practical applications, optimal decision schemes can be selected from the cluster of Pareto solutions according to the preferences and actual needs of decision makers.

  • XIONG Wanqiang,WANG Beili,SUN Xiaoguang
    Computer Engineering. 2015, 41(6): 254-257. https://doi.org/10.3969/j.issn.1000-3428.2015.06.046

    Vocabulary learning is the foundation of English learning. Traditional memory models advise the user to memorize vocabulary at fixed time intervals. These static models make review plans so complex that vocabulary memorization is not effective for the user. This paper proposes an intelligent vocabulary memory model. It adopts a power function to quantify the Ebbinghaus biological memory curve, depicts the memory effect of every word with the curve so as to remind the user to review each word in time before forgetting it, and dynamically adjusts the curve. Experiments show that the model makes precise review plans for the user, effectively saving about 37.04% of the time, and is more effective than traditional memory models.
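    The abstract names a power function but not its form; the sketch below shows one plausible power-law retention curve and the review schedule it induces. The threshold, decay exponent, and stability growth factor are all illustrative assumptions, not the paper's parameters.

```python
# Power-law retention sketch: R(t) = (1 + t/s)**(-d), where s is a
# per-word stability that grows with each successful review.
def retention(t_hours, stability, decay=0.7):
    return (1.0 + t_hours / stability) ** (-decay)

def next_review_time(stability, threshold=0.6, decay=0.7):
    """Review just before retention drops below the threshold:
    solve (1 + t/s)**(-d) = threshold for t."""
    return stability * (threshold ** (-1.0 / decay) - 1.0)

s = 1.0                       # new word: low stability
for review in range(1, 5):
    t = next_review_time(s)
    print(f"review {review}: after {t:5.1f} h (stability {s:.1f})")
    s *= 2.5                  # each successful review strengthens memory
```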

  • XU Ying,DUAN Fajie,JIANG Jiajia,XU Fei,LIANG Chunjiang,FENG Fan,BO En
    Computer Engineering. 2015, 41(6): 258-262,268. https://doi.org/10.3969/j.issn.1000-3428.2015.06.047

    An object under external force tends to suffer structural damage due to displacement and deformation. To handle this problem, a three-dimensional mathematical model of the pipe is established using the binocular vision principle. Through camera calibration, feature extraction, feature matching, and restoration of the 3D geometric information of the marking points, the displacement of the object can be obtained. Although the traditional camera calibration method is highly precise in conventional applications, once beyond the field range its measurement accuracy decreases rapidly, so it is not suitable for field measurement in engineering. This paper introduces a nonparametric camera calibration method for field measurement and establishes an imaging model based on the new method. Accuracy comparison experiments with the calibration methods in different environments show that the precision of nonparametric camera calibration is approximately 72.5% higher than that of the traditional method, with good stability, which meets the measurement requirements for displacement.

  • YANG Yao,LU Yongzhong,HUANG Jincai,LIU Zhong,BAO Weidong
    Computer Engineering. 2015, 41(6): 263-268. https://doi.org/10.3969/j.issn.1000-3428.2015.06.048

    Most network models pay attention only to topology structure and overlook the properties of nodes and edges themselves. To overcome this shortcoming, this paper establishes a heterogeneous network model including heterogeneous nodes and heterogeneous edges, and treats it as a nonlinear system. It also establishes a heterogeneous network topology structure model based on logical triggers, simulates network information flow, forms heterogeneous network system abilities, and conducts an experiment on a military system. The results show that logical-trigger-based heterogeneous network modeling works well in network analysis, and has good usability and scalability for heterogeneous network research.

  • LIU Niantang,WENG Yu,LIN Yu,ZHANG Wenrui,WEI Zhilei,SHAO Kun
    Computer Engineering. 2015, 41(6): 269-273,279. https://doi.org/10.3969/j.issn.1000-3428.2015.06.049

    In order to manage embedded systems effectively, and especially to reduce the power consumption of mobile terminals, this paper proposes a power management scheme built around a more refined dynamic power management design. Based on Application Program Interface (API) behavioral characteristics, it uses a BP neural network algorithm to predict the application type. Through effective prediction of application types, it can adjust the system state in advance without affecting system performance, effectively reducing power consumption and realizing real-time dynamic power management of embedded devices.

  • WANG Rui,MENG Lingkui,ZHANG Wen,LI Jiyuan
    Computer Engineering. 2015, 41(6): 274-279. https://doi.org/10.3969/j.issn.1000-3428.2015.06.050

    Motivated by the frequent logouts of P2P distributed system nodes caused by unsteady network communication during voyages, this study aims to optimize the fault-tolerant strategies of the P2P distributed structure and improve the system's ability to cope with disasters. It explores and designs a fault-tolerant mechanism for distributed systems in ocean environments, and specifies node selection methods better suited to the new fault-tolerant mechanism, which improves the reliability and stability of node selection. With the optimized P2P structure, the study designs and tests a prototype distributed data platform system for the ocean environment. Simulation experiments prove that the system has strong flexibility, independence, and stability.

  • TIAN Jianwei,QI Wenhui, LI Xi,LIU Xiaoxiao,XIAO Degui
    Computer Engineering. 2015, 41(6): 280-286,293. https://doi.org/10.3969/j.issn.1000-3428.2015.06.051

    Because of the difficulty of handling data concurrency in the supervisory and control system of charging facilities, this paper proposes a data concurrency control algorithm based on M/M/c queuing theory. The real-time performance and concurrency of data transmission are identified as the key to improving the performance of the supervisory and control system. According to stochastic process theory, the arrival of data frames at the communication server essentially follows a Poisson process, so the concurrency control model is based on queuing theory. The data concurrency control algorithm is implemented with multiple threads and multiple queues. Based on performance indicators, the paper quantitatively analyzes the relationship between costs and the optimal number of gateways, and obtains an optimization method for the optimal gateway number. The concurrency processing algorithm is applied to a large-scale charging station for analysis, and experimental results show that the algorithm meets the real-time and concurrency needs of data frame transmission, and that the gateway optimization method can accurately derive the optimal number of gateways.
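    The M/M/c machinery behind such a gateway-sizing analysis can be sketched with the standard Erlang C formula; the arrival and service rates and the delay target below are made up, and the paper's cost model is not reproduced.

```python
import math

def erlang_c(lam, mu, c):
    """Probability that an arriving data frame must queue in an M/M/c
    system (lam: arrival rate, mu: service rate per gateway, c: number
    of servers); requires rho = lam/(c*mu) < 1."""
    a = lam / mu                                    # offered load (Erlangs)
    rho = a / c
    s = sum(a ** k / math.factorial(k) for k in range(c))
    top = a ** c / (math.factorial(c) * (1 - rho))
    return top / (s + top)

def mean_wait(lam, mu, c):
    """Mean queueing delay W_q = C(c, a) / (c*mu - lam)."""
    return erlang_c(lam, mu, c) / (c * mu - lam)

# Sizing sketch: smallest number of gateways keeping mean wait under 5 ms.
lam, mu = 900.0, 250.0          # frames/s arriving, frames/s per gateway
c = math.ceil(lam / mu)
while mean_wait(lam, mu, c) > 0.005:
    c += 1
print("gateways needed:", c)    # -> 5 for these toy rates
```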

  • ZHANG Liang,WANG Tiancheng,WANG Jian,LI Huawei,GUO Jian
    Computer Engineering. 2015, 41(6): 287-293. https://doi.org/10.3969/j.issn.1000-3428.2015.06.052

    Intel's third generation of PCI Express bus technology satisfies the bandwidth requirements of computer system development at the architectural level, which has led to the vigorous development of PCIE-based IC design. Verification of PCIE has become an important part of SoC function verification. This paper designs and implements an automatic verification platform driven by both state graphs and coverage, which mainly includes mechanisms for test generation, automatic checking, and coverage analysis. The platform is used to verify the function of a PCIE-based protocol stack chip. Experimental results show that the verification platform has a good stimulus generation mechanism and can comprehensively verify the protocol stack chip design. In addition, the platform is reusable and expandable, and can verify systems of multiple interconnected protocol stacks.

  • CUI Wenshun,ZHANG Zhiyi,YUAN Lizhe,CUI Shuo,LI Jianling
    Computer Engineering. 2015, 41(6): 294-299,305. https://doi.org/10.3969/j.issn.1000-3428.2015.06.053

    Taking the construction of an intelligent sunlight greenhouse Internet of Things (IoT) as the goal, this paper puts forward a cloud-computing-based service platform for sunlight greenhouse group IoT. The service platform is divided into five layers, namely the perception and manipulation layer, the acquisition and control layer, the network transmission layer, the portal service layer, and the background cloud layer, and realizes greenhouse group data storage, management control, and cloud data analysis. It studies dynamic real-time cloud early-warning technology and a cloud analysis modeling system oriented to the sunlight greenhouse production process, promoting finer levels of production in sunlight greenhouses. Application results show that the service platform can expand the management scale of sunlight greenhouses, reduce the cost of constructing and operating the IoT, and improve the storage and data analysis abilities of the IoT. The platform has good scalability, security, and stability, and has a good application prospect in the information management and control fields of facility agriculture.

  • SHEN Yanxin,PAN Feng
    Computer Engineering. 2015, 41(6): 300-305. https://doi.org/10.3969/j.issn.1000-3428.2015.06.054

    Model-free control does not rely on a mathematical model and achieves good control effects in complex systems. The stability and rapidity of a large time-delay control system are a pair of conflicting requirements, and in the past more attention was paid to stability studies than to rapidity. Based on the basic model-free control method and drawing on the function combination control method, this paper obtains an improved model-free control method that introduces the speed factor v(k) into the fan model and improves the control method for the speed of a delay system, giving consideration to both stability and rapidity. The paper proves that the improved method is feasible by certifying the Bounded Input Bounded Output (BIBO) stability and convergence of the improved model-free control method. A Matlab simulation test also proves that the method improves the control speed of the large time-delay system.

  • XU Luqiang,XIAO Guangcan
    Computer Engineering. 2015, 41(6): 306-309,315. https://doi.org/10.3969/j.issn.1000-3428.2015.06.055

    Due to volume conduction, multi-channel Electroencephalogram (EEG) recordings give a rather blurred image of brain activity, and when three-channel motor imagery EEG is analyzed, the recognition result is not satisfactory. This paper studies multiple classifier fusion to improve motor imagery EEG classification accuracy: features are extracted from the power spectrum of the EEG, and classifiers are designed based on the well-known Linear Discriminant Analysis (LDA) method. The fusion of the individual classifiers is realized by means of the Choquet fuzzy integral. BCI Competition 2003 dataset III is used to validate the fusion method. The results demonstrate that the proposed method performs better than single techniques, showing its effectiveness for dealing with EEG.

  • RONG Quanbing,WANG Fengshan,ZHANG Hongjun
    Computer Engineering. 2015, 41(6): 310-315. https://doi.org/10.3969/j.issn.1000-3428.2015.06.056

    To make earthquake damage risk judgments for underground structures more scientific and intelligent, an earthquake damage risk information fusion method based on evidence theory is proposed. After describing the principle of intelligent risk monitoring for underground engineering structures, the earthquake damage risk fusion rule is proposed, and an earthquake damage risk fusion model for underground structures is erected on intelligent monitoring and expert judgments. Following the ideology and model rules of evidence theory, a uniform status space and risk probability expression mode are established, the fusion frame is built, the decision-making rule is determined, and the fusion model component for risk evidence about earthquake-damaged underground structures is designed. An example shows that the model effectively improves the decision-making credibility of earthquake damage information fusion for underground structures, fully demonstrating the effectiveness of the fusion model component.

  • FANG Shu,NI Yude,LIU Yi,CHANG Yingxin,WU Tiancheng
    Computer Engineering. 2015, 41(6): 316-321. https://doi.org/10.3969/j.issn.1000-3428.2015.06.057

    In order to identify spoofed targets in the Automatic Dependent Surveillance Broadcast (ADS-B) system, a new hybrid location method is proposed that combines Time Difference of Arrival (TDOA) and Time Sum of Arrival (TSOA) to obtain the real spatial location, and identifies the spoofed target by comparing it with the location data in the ADS-B message. To improve location accuracy, the least squares algorithm is used to obtain an initial location result, which serves as the initial value for Taylor series iteration to obtain accurate results. Simulation results show that this hybrid algorithm solves the convergence problem of the Taylor method. With the same arrangement of ground stations, the new hybrid method is more accurate and more stable than the traditional TDOA location method.
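    A minimal sketch of the Taylor-series refinement step described above, on a toy 2D geometry: the range-difference equations are linearized around the current estimate and the least-squares correction is solved repeatedly. In the paper, the initial value comes from the closed-form least-squares step and TSOA measurements are fused as well; only the TDOA part is shown, and all station positions and measurements are made up.

```python
import numpy as np

def tdoa_taylor(stations, tdoa_ranges, x0, iters=10):
    """Gauss-Newton / Taylor-series TDOA refinement: residuals are
    h_i = (|s_i - x| - |s_0 - x|) - rd_i; solve G dx = h each step."""
    x = np.array(x0, dtype=float)
    ref = stations[0]
    for _ in range(iters):
        d = np.linalg.norm(stations - x, axis=1)        # distance to each site
        h = (d[1:] - d[0]) - tdoa_ranges                # residuals
        G = (stations[1:] - x) / d[1:, None] - (ref - x) / d[0]
        dx, *_ = np.linalg.lstsq(G, h, rcond=None)
        x += dx
        if np.linalg.norm(dx) < 1e-6:
            break
    return x

sites = np.array([[0., 0.], [10., 0.], [0., 10.], [10., 10.]])
truth = np.array([3., 4.])
meas = np.linalg.norm(sites - truth, axis=1)
rd = meas[1:] - meas[0]                                 # range differences
print(tdoa_taylor(sites, rd, x0=[5., 5.]))              # -> ~[3, 4]
```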