
15 April 2017, Volume 43 Issue 4
    

  • CHEN Dong,SHAO Zengzhen,WEI Zhengzheng,LIU Yanmin
    Computer Engineering. 2017, 43(4): 1-7. https://doi.org/10.3969/j.issn.1000-3428.2017.04.001
    The trajectory data of moving objects contains a large amount of spatio-temporal information, and mining the periodic patterns hidden in this information is of great significance. This paper proposes a three-stage algorithm for detecting the periodic patterns of moving objects. By studying the temporal and spatial characteristics of trajectory points, it identifies and eliminates duplicate data. A density clustering algorithm is then used to find the dense regions of the trajectory and the periodic pattern of each moving object within them, which addresses the problems of repeated trajectory data, discontinuous sampling and determining the period of the periodic pattern. Experimental results based on 2003-2015 records from the China Birding Report (CBR) center and other public data show that the algorithm processes trajectory data effectively and accurately mines the regular periodic patterns of moving objects.
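    The density-clustering stage described above can be illustrated with a short sketch. This is not the authors' code: it simply deduplicates trajectory points and finds dense regions with scikit-learn's DBSCAN, and the eps/min_samples values and the synthetic points are placeholders.

```python
# Illustrative sketch (not the paper's algorithm): deduplicate trajectory points,
# then find dense regions with DBSCAN, as the abstract above describes.
import numpy as np
from sklearn.cluster import DBSCAN

def dense_regions(points, eps=0.05, min_samples=10):
    """points: array of shape (n, 2) holding (lat, lon) samples."""
    points = np.unique(np.round(points, 4), axis=0)   # drop duplicate samples
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(points)
    return {k: points[labels == k] for k in set(labels) if k != -1}

rng = np.random.default_rng(0)
pts = np.concatenate([rng.normal(loc, 0.01, size=(50, 2))
                      for loc in ((0.2, 0.2), (0.8, 0.7))])   # two synthetic stopover areas
print({k: len(v) for k, v in dense_regions(pts).items()})
```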
  • WU Kehe,ZHU Yayun,LI Haoyang,LI Quan
    Computer Engineering. 2017, 43(4): 8-14. https://doi.org/10.3969/j.issn.1000-3428.2017.04.002
    This paper studies the real-time prediction of power grid Time-series Data (TSD) and puts forward a prediction framework based on the Storm platform and the Autoregressive Integrated Moving Average (ARIMA) model. It analyzes the characteristics of different types of power grid TSD and presets fitting models to reduce the blindness of model building and shorten prediction time. Meanwhile, it designs a new HBase-based storage mode for TSD to accelerate data retrieval. It compares the influence of different data samples on prediction results and analyzes the prediction error in real time through concurrent prediction over massive TSD sources. Finally, prediction precision, computing speed and resource occupancy are chosen to verify the effectiveness and practicability of the proposed framework on real cases.
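    As a rough illustration of the ARIMA forecasting step only (the Storm parallelization, HBase storage scheme and preset model orders from the paper are not reproduced), the sketch below fits one series with statsmodels; the (2,1,2) order and the random-walk data are assumptions.

```python
# Illustrative sketch: fit an ARIMA model to one time series and forecast the
# next few points; the order here is a placeholder, not the paper's preset model.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
series = np.cumsum(rng.normal(size=500))      # stand-in for one grid TSD stream

model = ARIMA(series, order=(2, 1, 2)).fit()
forecast = model.forecast(steps=10)           # next 10 sampling intervals
print(forecast)
```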
  • JIANG Hang,LU Tun,GU Hansu,DING Xianghua,GU Ning
    Computer Engineering. 2017, 43(4): 15-20,27. https://doi.org/10.3969/j.issn.1000-3428.2017.04.003
    Static outlier detection methods for energy consumption are prone to misjudgment in the dynamic energy consumption environment of campus buildings. Therefore, an improved outlier detection method for campus building energy consumption is proposed. The method uses the SA-DBSCAN algorithm, based on the statistical characteristics of energy consumption data, to identify building energy consumption modes adaptively, and then uses the C4.5 algorithm to build an energy consumption pattern decision tree. After the category of the real-time energy consumption data is obtained from the decision tree, the LOF algorithm is used for outlier analysis and anomaly detection. The normalized energy consumption data is incrementally merged into the building energy consumption modes, and the anomaly detection model is dynamically adjusted according to the update results. Experimental results show that the method detects abnormal energy consumption data effectively and gradually adapts to changes in the campus building energy environment, which reduces misjudgments.
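    The final LOF stage named above can be sketched as follows; the clustering and decision-tree stages are omitted, and the synthetic readings, neighbor count and contamination rate are illustrative only.

```python
# Illustrative sketch: score hourly energy readings with the Local Outlier Factor,
# the last stage named in the abstract (earlier stages omitted).
import numpy as np
from sklearn.neighbors import LocalOutlierFactor

readings = np.random.default_rng(1).normal(50, 5, size=(500, 1))   # kWh per hour
readings[::100] += 40                                              # inject anomalies

lof = LocalOutlierFactor(n_neighbors=20, contamination=0.01)
flags = lof.fit_predict(readings)        # -1 marks suspected abnormal readings
print(np.where(flags == -1)[0])
```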
  • CHEN Guangsheng,CHENG Yiqun,JING Weipeng
    Computer Engineering. 2017, 43(4): 21-27. https://doi.org/10.3969/j.issn.1000-3428.2017.04.004
    The parallel RDD-DBSCAN algorithm accesses the data set repeatedly in its data partitioning and region query steps, which reduces its efficiency. Aiming at this problem, a parallel DBSCAN algorithm based on a data partitioning and fusion strategy (DBSCAN-PSM) is proposed. It uses a KD-tree to partition the data and merges the partitioning and region query steps, reducing the number of accesses to the data set and the influence of I/O on the algorithm. Data fusion is realized by judging the clustering characteristics of points on the spatial boundary, which avoids the time overhead of global marking. Experimental results show that DBSCAN-PSM runs about 18% faster than RDD-DBSCAN and deals with massive data clustering problems more effectively.
  • LU Feng,LI Hairong,HAN Yan
    Computer Engineering. 2017, 43(4): 28-33,38. https://doi.org/10.3969/j.issn.1000-3428.2017.04.005
    In the Web service recommendation process, the prediction accuracy of missing Web Quality of Service (QoS) values has an important impact on the rationality of service recommendation. Therefore, this paper puts forward a collaborative filtering recommendation algorithm for Web services based on QoS values and spatial-temporal similarity sensing. Firstly, the framework of the Web service recommendation system is designed from the perspective of collaborative Web QoS forecasting, and the relevant parameters are defined. Secondly, to address the dissimilarity between part of the services and the target service in the traditional top-K algorithm, spatial-temporal similarity perception with similarity weights is used to predict missing data, which improves prediction accuracy. The calculation process is then illustrated with a simple example. Finally, the effectiveness of the proposed algorithm is verified by experimental results.
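    The top-K similarity-weighted prediction the abstract builds on can be sketched with plain user-based collaborative filtering; the paper's spatial-temporal similarity weights are not reproduced, and the Pearson similarity and toy QoS matrix below are assumptions.

```python
# Illustrative sketch: predict a missing QoS value from the top-K most similar users.
import numpy as np

def predict_missing(qos, user, service, k=3):
    """qos: users x services matrix of response times; np.nan marks missing entries."""
    sims = []
    for u in np.where(~np.isnan(qos[:, service]))[0]:
        if u == user:
            continue
        common = ~np.isnan(qos[user]) & ~np.isnan(qos[u])
        if common.sum() < 2:
            continue
        s = np.corrcoef(qos[user, common], qos[u, common])[0, 1]
        if not np.isnan(s):
            sims.append((s, u))
    top = sorted(sims, reverse=True)[:k]
    den = sum(abs(s) for s, _ in top)
    return sum(s * qos[u, service] for s, u in top) / den if den else np.nan

qos = np.array([[0.3, np.nan, 0.8],
                [0.4, 0.6,    0.9],
                [0.2, 0.5,    0.7]])
print(predict_missing(qos, user=0, service=1))   # -> 0.55
```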
  • LI Limiao,CHEN Zhigang,LIU Zhixiong,YE Hui
    Computer Engineering. 2017, 43(4): 34-38. https://doi.org/10.3969/j.issn.1000-3428.2017.04.006
    Because individual behavior is diverse in the big data environment, general individual behavior trust evaluation models based only on local information are not comprehensive, which can cause a serious trust crisis. Therefore, this paper presents an improved trust evaluation model of individual behavior. It first uses multi-source data fusion to obtain trust evaluation results, and then fuses the mass functions of the relative trust evaluations of related individuals with these results by using Dempster-Shafer (D-S) evidence theory, obtaining the probability that an individual is distrusted. The distrust situation of each related individual is fused with the individual trust situation factor, yielding the fusion weight of each individual participating in the evaluation and the final individual behavior trust evaluation. Experimental results show that the proposed model has higher reliability and security than general trust evaluation models based on local information.
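    The D-S fusion step can be illustrated with Dempster's rule of combination over a two-element frame; the mass values below are invented for the example and do not come from the paper.

```python
# Illustrative sketch: Dempster's rule of combination for two mass functions over
# the frame {trust, distrust}, the D-S fusion step named in the abstract.
from itertools import product

def combine(m1, m2):
    """m1, m2: dicts mapping frozenset hypotheses to masses summing to 1."""
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y
    return {h: v / (1.0 - conflict) for h, v in combined.items()}

T, D = frozenset({"trust"}), frozenset({"distrust"})
either = T | D
m_behavior = {T: 0.6, D: 0.1, either: 0.3}     # evidence from behavior data (made up)
m_peers    = {T: 0.5, D: 0.2, either: 0.3}     # evidence from related individuals (made up)
print(combine(m_behavior, m_peers))
```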
  • QIU Xiaojie,AN Hong,CHEN Junshi,CHI Mengxian,JIN Xu
    Computer Engineering. 2017, 43(4): 39-45. https://doi.org/10.3969/j.issn.1000-3428.2017.04.007
    Keeping processor power consumption under a budget can reduce cooling cost and improve system reliability. Most existing energy efficiency optimization schemes are profile-based offline schemes, which limits their practicality, and centralized algorithms that exhaustively search for the optimal configuration may be too complex. In this paper, a power consumption optimization scheme named PPCM is proposed, which utilizes Dynamic Voltage and Frequency Scaling (DVFS) to keep CPU power consumption under budget and improve energy efficiency. PPCM decouples power consumption control from power consumption allocation to improve flexibility. It uses a dynamic linear model to estimate power consumption and manages it with feedback control. It uses the ratio of computation to memory access as an indicator to allocate power among applications, and then considers the features of multithreaded applications to allocate power among threads. Experimental results show that PPCM outperforms the Priority algorithm by 10.7% in speed and 5.1% in energy saving on average, and reduces the Energy-Delay Product (EDP) by 14.3% on average; it is also superior to PCMCA by 4.5% in speed and 5.0% in EDP on average.
  • JIANG Shengjian,HU Xiangdong,YANG Jianxin
    Computer Engineering. 2017, 43(4): 46-51. https://doi.org/10.3969/j.issn.1000-3428.2017.04.008
    In Simultaneous Multithreading (SMT) processors, different threads have different demands for floating-point and integer resources, and how to allocate shared resources among threads is the key to improving overall SMT performance. Aiming at this problem, this paper proposes an instruction fetch policy that allocates floating-point and integer resources separately and reasonably limits each thread's usage of them. Experimental results show that, compared with the ICOUNT and STALL policies, the proposed policy improves performance when measured by average IPC and harmonic mean IPC, and it also has advantages when processing workloads that mix floating-point and integer programs.
  • LI Ruixiang,LIN Zhitao,MA De
    Computer Engineering. 2017, 43(4): 52-59. https://doi.org/10.3969/j.issn.1000-3428.2017.04.009
    The system bus is the module for IP communication in a System on Chip (SoC). Existing approaches are usually static: they maintain the highest frequency required by the performance demand, which results in considerable power consumption. In this paper, a dynamic power management strategy is proposed and an adaptive frequency scaling system is realized. By monitoring the load status of the bus masters, the bus load in the next timeslot is predicted on the basis of the historical behavior of the running program, and the optimal bus frequency is then selected according to the predicted load, so that power consumption is minimized while the performance demand is met. The strategy quickly settles into a stable frequency scaling mode and therefore has good stability and reliability. Experimental results show that, compared with the original static management strategy, the proposed strategy reduces bus power consumption by 45.6%~50.7%.
  • ZHANG Jing,CHEN Yao,FAN Hongbo,SUN Jun
    Computer Engineering. 2017, 43(4): 60-66. https://doi.org/10.3969/j.issn.1000-3428.2017.04.010
    When the tasks of a Cyber-Physical System (CPS) are scheduled, the computational timing constraints and the hardware energy consumption threshold may conflict, so a correct execution sequence cannot be guaranteed. To solve this problem, a control strategy for task scheduling permissions is proposed. A super-dense time model is used to express the global time signal, and the remaining value density, execution enthusiasm and resource consumption of events are used to represent the value, deadline and energy consumption of tasks. Equipment and resources are allocated to real-time tasks, and the occurrence of thrashing is reduced. Simulation results show that this strategy improves the cumulative value of tasks, reduces energy consumption and execution time, and achieves higher overall system efficiency.
  • ZHAN Ling,FANG Xieyun,LI Daping,WAN Jiguang
    Computer Engineering. 2017, 43(4): 67-72,83. https://doi.org/10.3969/j.issn.1000-3428.2017.04.011
    In order to solve the problem of long response time for metadata writes in the Ceph file system, a metadata cache backup scheme is proposed. The scheme backs up metadata among multiple servers to guarantee its reliability, and designs a multi-queue cache organization structure based on metadata heat together with the corresponding cache prefetching and destage algorithms. Experimental results show that, compared with the original Ceph metadata management subsystem, the performance of the proposed scheme is improved by 15.7%.
  • FANG Guoqing,LI Wenming,YU Yang,ZHANG Yang,YE Xiaochun,AN Hong
    Computer Engineering. 2017, 43(4): 73-78,89. https://doi.org/10.3969/j.issn.1000-3428.2017.04.012
    The rapid development of high-throughput applications makes simulation speed an increasing bottleneck in large-scale many-core architecture research. To solve this problem, a series of simulation acceleration techniques is proposed on top of a high-throughput many-core architecture simulation platform. A lookup table method is used to accelerate instruction decoding; the parallel discrete event simulation framework is optimized in terms of the event scheduling algorithm, the time stepping algorithm and lock-free queues; and a memory pool policy is adopted to improve the efficiency of memory management. Experimental results show that the lookup table method, parallel discrete event simulation and memory pool policy each improve simulation speed at their corresponding stages compared with the non-optimized versions.
  • QI Zhiyuan,LI Zhifeng
    Computer Engineering. 2017, 43(4): 79-83. https://doi.org/10.3969/j.issn.1000-3428.2017.04.013
    To realize the flexible construction of a microgrid with distributed generation, ZigBee is used to transmit information in the microgrid monitoring network. A communication address composed of a dynamic short address and a virtual short address is proposed to uniquely identify each unit of the microgrid, and ZigBee network nodes preferentially connect to the channel with the highest peak energy to improve the reliability of the communication link. Experimental results show that the combination of dynamic and virtual short addresses correctly identifies the addresses of field devices. Compared with the 64-bit unique address assigned by the ZigBee manufacturer, the short address effectively reduces the data packet length during communication. Network tests verify that both the communication distance and the stability of data transmission meet the requirements of the microgrid monitoring network.
  • LI Jianhui,ZHANG Yongtang
    Computer Engineering. 2017, 43(4): 84-89. https://doi.org/10.3969/j.issn.1000-3428.2017.04.014
    A novel Offset Encoded Trie (OET) IP lookup algorithm is proposed. It uses an OET to represent a set of IP prefix rules, significantly reducing storage space requirements. Each OET node maintains only one next hop and a bitmap offset value, without child pointers or a pointer to the next hop table, thereby improving IP lookup performance. Actual IP prefix rule sets are used for experimental evaluation. Compared with the bitmap trie, the OET reduces storage overhead on actual IPv4 and IPv6 prefix rule sets by 60%~76% and 55%~63% respectively, so the OET is an efficient storage structure. The entire OET can be stored in on-chip memory to achieve high-speed IP address lookup, meeting the scalability requirements of virtual routers and software routers.
  • LIU Rui,LI Jun
    Computer Engineering. 2017, 43(4): 90-93,99. https://doi.org/10.3969/j.issn.1000-3428.2017.04.015
    The Leave Copy Everywhere cache decision strategy that Content-Centric Networking (CCN) uses by default suffers from cache redundancy, and although the cache decision strategy based on betweenness reduces this redundancy, its computational complexity is very high, making it unsuitable for deployment in the CCN network layer. Thus, this paper proposes a cache decision strategy based on K-core decomposition. It calculates the K-core value of every node in the current network; when content needs to be cached on the request path, a node with a higher K-core value is considered more important to information dissemination, and when K-core values are equal the decision is made according to the remaining cache space and the request hop count. Simulation results show that, compared with the betweenness-based strategy, the K-core based cache decision strategy converges faster, and when the network reaches a steady state it achieves a higher cache hit ratio and faster response.
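    The K-core ranking used for the caching decision can be sketched with NetworkX; the topology and request path below are made up, and the tie-breaking by remaining cache space and hop count is not reproduced.

```python
# Illustrative sketch: rank the nodes on a request path by K-core value, the
# criterion the abstract above uses to pick caching nodes.
import networkx as nx

g = nx.barabasi_albert_graph(100, 3, seed=0)   # stand-in for a CCN topology
core = nx.core_number(g)                       # K-core value of every node

request_path = [0, 5, 17, 42]                  # hypothetical path of one request
best = max(request_path, key=lambda n: core[n])
print(f"cache at node {best} (k-core {core[best]})")
```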
  • MA Lu,LU Gang,GUO Junxia
    Computer Engineering. 2017, 43(4): 94-99. https://doi.org/10.3969/j.issn.1000-3428.2017.04.016
    Time-varying behavior and the differentiation of node fitness are abstracted into a time-varying differential fitness, and an improved network evolution model based on this fitness is proposed. In the network, newly joined nodes tend to connect to nodes with larger degree or attraction, and during evolution nodes may connect to or disconnect from other nodes at any time. On this basis, a set of mechanisms including preferential attachment, random edge addition, random edge deletion and mutual fans of nodes is used to drive the evolution of the network. The influence of node time-varying behavior and differentiation on network evolution is analyzed separately. Simulation analysis shows that the degree distribution of the model follows a power law, the network exhibits the small-world phenomenon, and the model fits real networks well, which verifies its correctness and validity.
  • JING Zhen,XIE Zhijun,SHI Shoudong,JIANG Xianliang
    Computer Engineering. 2017, 43(4): 100-104,109. https://doi.org/10.3969/j.issn.1000-3428.2017.04.017
    Node transmission power in a Wireless Body Area Network (WBAN) needs to be adjusted dynamically according to the current link state; power control based on a fixed function model and on link quality reflected only by the Received Signal Strength Indicator (RSSI) consumes much energy when the body environment changes. To solve this problem, this paper designs a feedback algorithm that selects the transmission power level reasonably: it uses the mean Link Quality Indicator (LQI) to reflect link quality and a Proportional-Integral-Derivative (PID) controller for feedback control. Experimental results show that, while maintaining a 97.6% packet reception rate, the algorithm reduces average energy consumption by 17.3 mW and 13.7 mW and increases average network lifetime by 28.7% and 23.4% compared with the traditional Multiplicative Increase Additive Decrease (MIAD) and Dynamic Postural Position Inference (DPPI) methods, respectively.
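    The PID feedback loop named above can be sketched as follows; the gains, the target mean LQI and the radio power-level range are assumptions, not the paper's settings.

```python
# Illustrative sketch: a discrete PID loop that nudges the transmission power
# level toward a target mean LQI (the link model and gains are invented).
class PID:
    def __init__(self, kp, ki, kd, setpoint):
        self.kp, self.ki, self.kd, self.setpoint = kp, ki, kd, setpoint
        self.integral, self.prev_err = 0.0, 0.0

    def update(self, measurement, dt=1.0):
        err = self.setpoint - measurement
        self.integral += err * dt
        deriv = (err - self.prev_err) / dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

pid = PID(kp=0.05, ki=0.01, kd=0.02, setpoint=90)   # target mean LQI of 90
power_level = 3                                     # hypothetical radio level 0..7
for mean_lqi in [70, 75, 82, 88, 91, 93]:           # measured per feedback round
    power_level = min(7, max(0, round(power_level + pid.update(mean_lqi))))
    print(mean_lqi, "->", power_level)
```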
  • WANG Xin,HAN Yan,SUN Qiang,XU Chen
    Computer Engineering. 2017, 43(4): 105-109. https://doi.org/10.3969/j.issn.1000-3428.2017.04.018
    To minimize the power consumption of a downlink multi-user small-cell network, this paper builds a network power consumption optimization model that takes the quality of service requirements of all users and the per-antenna power constraint into account. It transforms the initial power minimization problem into a convex problem by semi-definite relaxation and obtains the optimal solution with a convex optimization toolbox. At the power consumption minimum, the number of selected antennas and the number of served users follow a 1∶1 relationship; using this characteristic, power consumption can be further reduced by performing antenna selection before power optimization. Simulation results demonstrate that, compared with the method that only optimizes power consumption and the one without power optimization or antenna selection, the proposed method achieves the lowest power consumption for the small-cell network.
  • LIU Qinghua,ZHOU Xiuqing
    Computer Engineering. 2017, 43(4): 110-115. https://doi.org/10.3969/j.issn.1000-3428.2017.04.019
    A sparse coprime array with a fixed number of elements can increase the array aperture. In order to make full use of the coprime relationship of the array to form a differential coprime synthetic array with more degrees of freedom, this paper exploits sparse reconstruction and proposes a Direction-of-Arrival (DOA) estimation algorithm based on an iteratively reweighted l1 norm. The aperture of the sparse coprime array is enlarged by vectorization. The l0 norm of the observation model under the related overcomplete basis is approximated, the l0 norm constraint is replaced by a weighted l1 norm constraint for reconstruction, and multiple iterations yield the optimal solution and the DOA estimate. Experimental results show that, compared with other algorithms, the proposed algorithm effectively utilizes the aperture of the sparse coprime array to improve direction-finding accuracy, and the iterations narrow the gap between the l1 and l0 norms, overcoming the shortcoming of conventional l1-norm constrained algorithms.
  • LI Gen,MA Maode,LIU Chunfeng,SHU Yantai
    Computer Engineering. 2017, 43(4): 116-121,125. https://doi.org/10.3969/j.issn.1000-3428.2017.04.020
    Most existing routing protocols in Vehicular Ad Hoc Networks (VANET) rely on a single specific characteristic of the vehicular network, which leads to a low delivery ratio or high delivery delay. To solve this problem, a multiple attribute decision making routing protocol that integrates both Dedicated Short Range Communication (DSRC) and Long Term Evolution (LTE) is proposed for heterogeneous vehicular networks. Integrating LTE and taking buses and bus stops as the backbone, the new protocol uses multiple attribute decision making to choose routes for message forwarding. Simulation results show that this protocol efficiently boosts the delivery ratio and reduces delivery delay compared with the geographic Greedy Perimeter Coordinator Routing (GPCR) protocol and a public-transport-based Distance Vector Routing (DVR) protocol.
  • ZHONG Xianfeng,TANG Yu,JIN Biao,WU Teng,LI Fengzhi,LIU Weiyue
    Computer Engineering. 2017, 43(4): 122-125. https://doi.org/10.3969/j.issn.1000-3428.2017.04.021
    Quantum key distribution applies fundamental laws of quantum physics to guarantee secure communication, and keeping the number of information exchanges small is the key issue in satellite-based quantum key distribution. According to the characteristics and requirements of data reconciliation in satellite quantum key distribution, this paper presents a new Turbo-code-based data reconciliation model for satellite quantum key distribution. The coding and decoding models of the Turbo codes are modified and designed so that the model reduces the number of information exchanges. Simulation results show that, after a number of iterations, the Turbo codes can complete key reconciliation under different bit error rates.

  • LIU Yuyang,ZHAO Yiming
    Computer Engineering. 2017, 43(4): 126-132,140. https://doi.org/10.3969/j.issn.1000-3428.2017.04.022
    Attribute-Based Signature (ABS) protocols have the property of anonymity, and their private keys are not bound to identities, so malicious users might leak their private keys for financial benefit. Therefore, this paper proposes a traceable signature scheme which allows the system to break anonymity and trace the identity behind a leaked private key. A cryptographic model of an attribute-based signature protocol with white-box traceability is presented; it achieves white-box traceability by embedding the Boneh-Boyen signature algorithm into private keys. Analysis results show that the scheme has provable unforgeability and perfect privacy, and that its time complexity differs by only a constant from the best existing traceable attribute-based signature scheme.
  • XU Fu
    Computer Engineering. 2017, 43(4): 133-140. https://doi.org/10.3969/j.issn.1000-3428.2017.04.023
    Aiming at the problem that the Greedy Perimeter Stateless Routing (GPSR) protocol is widely applied in Mobile Ad Hoc Networks (MANET) yet lacks a suitable intrusion detection technology, and based on an analysis of intrusion detection technologies for MANET, this paper proposes an intrusion detection method for GPSR according to the characteristics of the protocol and the typical attacks it faces. Five detection constraints are provided, from which the basic framework and running process of the intrusion detection algorithm are constructed. Experimental results show that the proposed method achieves a high detection rate and a low false alarm rate when detecting typical attacks.
  • CHEN Yanqin,ZHANG Wenying
    Computer Engineering. 2017, 43(4): 141-144,153. https://doi.org/10.3969/j.issn.1000-3428.2017.04.024
    This paper studies the capability of impossible differential cryptanalysis against the block cipher SIMECK32/64. An 11-round impossible differential path of SIMECK32/64 is constructed using the meet-in-the-middle technique, and an attack on 19-round SIMECK32/64 is analyzed by adding 4 rounds on the top and 4 rounds at the bottom. The analysis shows that only 29 bits of subkey need to be guessed, and it also verifies that the complexity of attacking SIMECK32/64 with impossible differential cryptanalysis is much lower than that of zero-correlation linear cryptanalysis.
  • DU Ruiying,HU Li,CHEN Jing,CHEN Jiong
    Computer Engineering. 2017, 43(4): 145-153. https://doi.org/10.3969/j.issn.1000-3428.2017.04.025
    To improve network control ability and security, researchers usually utilize the centralized control and flow table control features of Software Defined Networking (SDN) to develop security applications. However, these applications concentrate on single functions and have coarse protection granularity, so they cannot form comprehensive protection for the whole network. Aiming at this problem, this paper designs a multi-granularity traffic identification system. It manages the infrastructure layer in groups based on the idea of Group Based Policy (GBP), defines the notion of a module chain to turn hardware security detection into a software service, and defines security detection modules classified into three types: statistical detection, correlation matching and regular expression matching. GBP is used to generate the module chain, which then invokes different combinations of security detection modules to implement multi-granularity security detection. Experiments verify the usability of the system in an SDN environment and show that it offers fine granularity and good expansibility.
  • WANG Lei,GAO Maoting
    Computer Engineering. 2017, 43(4): 154-159. https://doi.org/10.3969/j.issn.1000-3428.2017.04.026
    Aiming at the unreasonable distribution of weights in the evaluation of steganalysis algorithms based on grey relational analysis, the Criteria Importance Through Intercriteria Correlation (CRITIC) method is used to determine the initial weight of each index. The weights of the actively modified indexes are increased or decreased and the difference is distributed equally to the passive indexes according to the actual decision, and the final weight of each index is obtained by this adjustment. Grey relational analysis is then used to calculate the weighted grey relational degree between each object to be evaluated and the optimal sequence, and the steganalysis algorithms are evaluated by the relational values. Simulation results show that the proposed method evaluates various kinds of steganalysis algorithms more effectively than other evaluation methods.
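    The weighted grey relational degree used for scoring can be sketched as below; the index values, weights and resolution coefficient rho are made-up numbers.

```python
# Illustrative sketch: weighted grey relational degree between each candidate and
# the optimal (reference) sequence, the scoring step described above.
import numpy as np

def grey_relational_degree(reference, candidates, weights, rho=0.5):
    """reference: (m,) ideal sequence; candidates: (n, m); weights sum to 1."""
    diff = np.abs(candidates - reference)            # Delta_i(k)
    d_min, d_max = diff.min(), diff.max()
    coeff = (d_min + rho * d_max) / (diff + rho * d_max)
    return coeff @ weights                           # one degree per candidate

reference = np.array([1.0, 1.0, 1.0])                # normalized ideal indexes
candidates = np.array([[0.9, 0.7, 0.8],
                       [0.6, 0.9, 0.7]])
weights = np.array([0.5, 0.3, 0.2])                  # e.g. produced by CRITIC
print(grey_relational_degree(reference, candidates, weights))
```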
  • SUN Kui,ZHANG Zhiyong,ZHAO Changwei
    Computer Engineering. 2017, 43(4): 160-165. https://doi.org/10.3969/j.issn.1000-3428.2017.04.027
    To improve the classification accuracy of released data under the same privacy preserving strength, an enhanced differential privacy data release algorithm named GiniDiff is proposed on the basis of the DiffGen algorithm. The algorithm fully generalizes the original dataset, selects a specialization scheme via the exponential mechanism in each iteration, places the specialized records into new equivalence classes in the manner of building a decision tree, adds Laplace noise to the counts of the equivalence classes, and generates the dataset for release. Because the algorithm uses Gini-index gain as the utility of candidate specialization schemes, allocates the privacy budget reasonably and computes budget consumption dynamically, the utility of the released dataset is effectively improved. Experimental results show that the algorithm outperforms DiffGen in classification accuracy, approaching the ideal level.
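    The two differential-privacy primitives the algorithm relies on, the exponential mechanism for choosing a specialization and Laplace noise for equivalence-class counts, can be sketched as follows; the scores, sensitivity and privacy budget are placeholders.

```python
# Illustrative sketch of the exponential mechanism and the Laplace mechanism.
import numpy as np

rng = np.random.default_rng(42)

def exponential_mechanism(scores, epsilon, sensitivity=1.0):
    """scores: utility (e.g. Gini-index gain) of each candidate specialization."""
    scores = np.asarray(scores, dtype=float)
    probs = np.exp(epsilon * (scores - scores.max()) / (2 * sensitivity))
    probs /= probs.sum()
    return rng.choice(len(scores), p=probs)

def noisy_count(count, epsilon):
    """Laplace mechanism for a counting query (sensitivity 1)."""
    return count + rng.laplace(scale=1.0 / epsilon)

chosen = exponential_mechanism([0.12, 0.40, 0.25], epsilon=0.5)
print(chosen, noisy_count(137, epsilon=0.5))
```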
  • YANG Xiaodong,LI Yanan,ZHOU Qixu,GAO Guojuan,WANG Caifen
    Computer Engineering. 2017, 43(4): 166-170,176. https://doi.org/10.3969/j.issn.1000-3428.2017.04.028
    Signature verification in existing ID-based proxy re-signature schemes involves time-consuming bilinear pairing operations, which makes them unsuitable for low-power devices. To improve the efficiency of signature verification, the concept of ID-based server-aided verification proxy re-signature is introduced by combining server-aided verification signatures with ID-based proxy re-signatures, and its security definition is presented. An ID-based server-aided verification proxy re-signature scheme is designed and proven secure against collusion attacks and adaptive chosen identity and message attacks. Analysis shows that most of the computation for signature verification in the proposed scheme can be offloaded to a server and the bilinear pairing computation is effectively reduced, so the scheme greatly lowers the computational complexity of signature verification and is more efficient than existing ID-based proxy re-signature schemes.
  • ZHANG Xiaoyi,LU Yan,ZHAI Huiliang
    Computer Engineering. 2017, 43(4): 171-176. https://doi.org/10.3969/j.issn.1000-3428.2017.04.029
    Taking Sina Microblog bot users as the object of study, this paper analyzes and extracts features of bot users and proposes a new microblog bot user identification method. Through the Analytic Hierarchy Process (AHP), it constructs an index system and quantitatively evaluates each index feature, and it uses a Support Vector Machine (SVM) to construct a bot-user identification model. Different kernel functions are tested for the importance they predict for each classification index, which is compared with the result of the quantitative evaluation, and their classification accuracy is also tested; according to the two results, the optimal classifier is selected. Experimental results show that the identification method detects bot users accurately.

  • SUN Yujie,QIN Yongbin
    Computer Engineering. 2017, 43(4): 177-182. https://doi.org/10.3969/j.issn.1000-3428.2017.04.030
    Using Latent Dirichlet Allocation (LDA) to mine the topics users are interested in is a popular topic-mining approach. To improve user experience and recommend fresh microblogs that users are interested in, this paper puts forward a multi-angle microblog recommendation algorithm based on the LDA model. It uses a microblog's publication time, forwards, comments and other features to calculate its importance, and uses the user-topic and topic-word matrices generated by the LDA model to calculate the user's interest in the microblog. Each microblog is then scored by comprehensively considering its importance and the user's interest in it, and microblogs are recommended according to the score. Experimental results show that the algorithm effectively improves the accuracy of microblog recommendation.
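    The LDA step that produces the user-topic and topic-word matrices can be sketched with scikit-learn; the toy corpus and topic count are placeholders, and the importance scoring from publication time, forwards and comments is not reproduced.

```python
# Illustrative sketch: fit LDA on a toy corpus and read off the document-topic and
# topic-word matrices the abstract builds its interest score on.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

corpus = [
    "traffic jam on the ring road again",
    "new phone camera review and photos",
    "marathon training run this morning",
    "phone battery life after the update",
]
bow = CountVectorizer().fit_transform(corpus)

lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(bow)
doc_topics = lda.transform(bow)          # document(user)-topic matrix
topic_words = lda.components_            # topic-word matrix
print(doc_topics.round(2))
```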
  • ZHU Pu,HUANG Zhangjin
    Computer Engineering. 2017, 43(4): 183-187,193. https://doi.org/10.3969/j.issn.1000-3428.2017.04.031
    Using the idea of sparse Bayesian inference, a Feature selection Probabilistic Classification Vector Machine (FPCVM) is designed that learns an optimal classifier and automatically selects the most relevant feature subset. FPCVM is an extension of the Probabilistic Classification Vector Machine (PCVM) that improves its performance on high-dimensional datasets. It uses zero-mean Gaussian distributions as priors to introduce sparseness both over kernel functions and in feature space; these priors act as regularization terms in the likelihood function, yielding a model with better generalization. Experimental results on high-dimensional and low-dimensional datasets show that the algorithm achieves better classification and feature selection.
  • ZHENG Huafei,ZHOU Xiangdong
    Computer Engineering. 2017, 43(4): 188-193. https://doi.org/10.3969/j.issn.1000-3428.2017.04.032
    The rapidly growing volume of product reviews and their varying quality make it time-consuming and tedious for individuals to find valuable reviews. Therefore, this paper proposes an approach to automatically assess the helpfulness of online product reviews based on word vectors. The approach introduces the word vector as a deep text feature and combines it with structural, sentiment and meta features to learn a regression model for automatic helpfulness assessment, followed by a helpfulness-based ranking procedure. Experimental results on an Amazon dataset show that, compared with the UGR+LEN+STR approach and the baseline, the proposed approach achieves promising performance in both regression and ranking. Furthermore, a domain-specific word vector model is explored, which improves the assessment on RMSE, NDCG and other evaluation indexes.
  • DONG Yadong,LI Zhengyu,WANG Yang
    Computer Engineering. 2017, 43(4): 194-199,206. https://doi.org/10.3969/j.issn.1000-3428.2017.04.033
    Classical machine learning methods are not sufficient for tree data because a tree contains not only node information but also structure information. Therefore, this paper proposes a tree echo state network approach for tree-structured data: a tree echo state network is used to model the data, a fixed-size model space is obtained, and the complex tree-structured data is converted into points in this model space. Based on the idea of the model space, the similarity between tree-structured data is measured by the distance between models, and the model is combined with kernel methods to improve classification performance. Experimental results show that, compared with traditional algorithms, the tree echo state network performs better on the related datasets.
  • YOU Feng,CAO Tianliang,LU Gang
    Computer Engineering. 2017, 43(4): 200-206. https://doi.org/10.3969/j.issn.1000-3428.2017.04.034
    Uniform ID sampling (UNI) of Online Social Networks (OSN) is usually used as the benchmark to evaluate other sampling methods, but its poor efficiency limits its application. In this paper, a sampling method called adaptive UNI is proposed. The whole user ID space is divided into intervals, and the sampling probability of each interval is adaptively adjusted according to its real hit rate. In this process, a threshold is set as a lower limit to solve the cold start problem, while the sampling rate of each interval is used to avoid local optima. The validity of the method is demonstrated by real sampling from Weibo. Experimental results show that the method improves sampling efficiency and hit rate.
  • LI Shu’ai,YANG Jing,GU Junzhong
    Computer Engineering. 2017, 43(4): 207-211,216. https://doi.org/10.3969/j.issn.1000-3428.2017.04.035
    In recent years, research on automatic summarization has mostly addressed multi-document and Web page summarization, with less attention to website summarization. A method that summarizes a website automatically based on the hierarchical structure of the website and Latent Dirichlet Allocation (LDA) is proposed. The method gathers and fuses information from the Web pages of the given website, calculates sentence weights according to the proposed sentence weighting formula, and selects the highest-weighted sentences as the website summary. An experiment on 20 commercial and academic websites with ROUGE evaluation shows that, compared with summaries generated using LDA alone, ROUGE-1 and ROUGE-L both increase by 0.32 without stop words, and ROUGE-1 increases by 0.39 and ROUGE-L by 0.38 with stop words; compared with summaries generated from the homepage alone, ROUGE-1 increases by 0.03 and ROUGE-L by 0.06 without stop words, and ROUGE-1 increases by 0.08 and ROUGE-L by 0.07 with stop words.

  • LING Xiao’e
    Computer Engineering. 2017, 43(4): 212-216. https://doi.org/10.3969/j.issn.1000-3428.2017.04.036
    Social tags contain rich information about commodities and reflect users' personal preferences, while the historical price curve of a commodity influences users' purchase tendency. Therefore, this paper proposes a personalized recommendation method based on social tags and historical price curves. It decomposes the user-tag-commodity tripartite graph into user-tag and tag-commodity bipartite graphs. In the user-tag bipartite graph, the weight derived from the historical commodity price curve is used to compute a price-based score, and positive interest is distinguished from negative interest; in the tag-commodity bipartite graph, the weight of each tag for a commodity is computed. The two are combined for commodity recommendation. Experimental results verify that the proposed method effectively improves the recall and diversity of recommendation.
  • WANG Hao,JI Donghong,HUANG Jiangping
    Computer Engineering. 2017, 43(4): 217-221,227. https://doi.org/10.3969/j.issn.1000-3428.2017.04.037
    Aiming at the fact that existing coordinate noun phrase identification cannot handle the implicit information in a phrase sequence, this paper proposes a new coordinate noun phrase identification method. The Conditional Random Field (CRF) model and the Latent Structured Perceptron (LSP) model are used to identify sequences of coordinate noun phrases as well as the conjunctions and punctuation that connect them. The task is first formulated for coordinate noun phrase sequences, a corpus is constructed, and typical recognition features of coordinate noun phrases are selected for the experiments. Experimental results show that, compared with the traditional CRF model, the LSP model with latent information performs better, reaching an F-score of 86.36%, which demonstrates that the model can be used for information-extraction-oriented coordinate noun phrase identification.
  • TIAN Weidong,MIAO Huijun
    Computer Engineering. 2017, 43(4): 222-227. https://doi.org/10.3969/j.issn.1000-3428.2017.04.038
    Most existing opinion target extraction methods are based on heuristic rules or on machine learning with features such as part of speech and morphology, but they do not use the deep association relationships that dependency syntax analysis can mine. To solve this problem, a novel opinion target extraction method for short Chinese review texts is proposed based on frequent tree patterns mined from a dependency tree bank. The method first labels initial opinion targets based on frequent subtree patterns, then trains an ordered rule set related to the combination of opinion targets within an error-driven TBL framework, and finally extracts opinion targets based on the ordered rule set. Experimental results show that the method has good stability and precision and outperforms a Support Vector Machine (SVM)-based method on indicators such as recall and F1-score.
  • LI Rui,SUN Fuming,TONG Yujun
    Computer Engineering. 2017, 43(4): 228-233. https://doi.org/10.3969/j.issn.1000-3428.2017.04.039
    To enable Fourth Party Logistics (4PL) to offer secure and efficient logistics services, a 4PL resilient network design model for the logistics distribution tasks of multiple supply-demand pairs is presented. An optimization model that minimizes total logistics cost is established; it considers disruption of service nodes and transportation routes, makes the network resilient by constructing backup paths, and takes the number of service nodes and transportation routes shared by main and backup paths as the resilience index. An artificial bee colony algorithm is designed to solve the model, and the important parameters of the model are analyzed. Simulation results show that the presented artificial bee colony algorithm solves the problem effectively.
  • YANG Xiujuan,DONG Jun,LI Huihui,YUAN Yanzhong,CHEN Xiaodan
    Computer Engineering. 2017, 43(4): 234-238. https://doi.org/10.3969/j.issn.1000-3428.2017.04.040
    When the PIV algorithm is used to build a Metric index, the distances between the convex hull vertices and all data points inside the convex hull must be calculated; when the data set is large, this wastes storage space and increases query cost. To solve this problem, this paper improves the Metric index so that only the distances between the convex hull vertices and part of the data points within the convex hull are stored, and puts forward a method that uses the distance between a point in the convex hull and the convex hull vertices to judge whether the point is a query result. Test results show that, compared with the PIV algorithm, the proposed method obtains correct reverse furthest neighbor query results, reduces storage space and query cost, and improves query efficiency.
  • YAO Qi,YIN Zhi,YI Yunfei,LI Yuanxiang
    Computer Engineering. 2017, 43(4): 239-243. https://doi.org/10.3969/j.issn.1000-3428.2017.04.041
    For a bi-objective optimization network model that exhibits Hub node aggregation behavior, the conclusion that the origin of fractality is associated with the aggregation behavior of Hub nodes is presented. The box-covering method is used to renormalize the three optimized networks of the model and to verify their fractal properties and scale invariance. The average shortest path of some real networks and of the optimized networks is further compared, and the critical condition of the skeleton structure is analyzed. Experimental results show that, as long as the structural equilibrium of the fractal network is satisfied, a certain proportion of Hub node aggregation and Hub node repulsion behavior is allowed.
  • ZHAI Liang,LI Yu,SU Yi
    Computer Engineering. 2017, 43(4): 244-250. https://doi.org/10.3969/j.issn.1000-3428.2017.04.042
    Aiming at the large censoring error of existing intelligent ship target detection algorithms for SAR images, a ship detection method based on Anomaly Detection (AD) and a two-layer censoring algorithm is presented. The anomaly detection theory of hyperspectral imagery is introduced into ship target detection in SAR imagery: the SAR image is transformed into hyperspectral form by image conversion, and an anomaly detection algorithm is used for detection preprocessing and to obtain the regions of interest containing ships. Through the two-layer censoring mechanism, accurate modeling of the background clutter and rapid detection of ship targets are realized. Experimental results show that the algorithm reduces the censoring error, effectively suppresses false alarms and side lobes, and obtains higher structural fidelity.
  • PENG Hao,ZHOU Xinzhi,LEI Yinjie
    Computer Engineering. 2017, 43(4): 251-256. https://doi.org/10.3969/j.issn.1000-3428.2017.04.043
    When the vector active contour C-V model is used to segment images with a high-brightness background, a low-brightness target region or discontinuous target color, the segmentation of the target region is not ideal. Aiming at this problem, this paper constructs a color image segmentation model that combines a prior color covariance with the C-V model. It designs a local Sigmoid function to process the internal and external region energies of the evolving curve to accelerate segmentation. To avoid the segmentation ambiguity caused by brightness, it proposes a secondary segmentation framework and a background region processing scheme to ensure fine segmentation of the color image. Experimental results show that the proposed model extracts the target region quickly and accurately, and the segmented regions have a higher degree of internal homogeneity.
  • JIANG Fan,LIU Hui,WANG Bin,SUN Xiaofeng,DAI Zhaokun
    Computer Engineering. 2017, 43(4): 257-262. https://doi.org/10.3969/j.issn.1000-3428.2017.04.044
    The Convolutional Neural Network (CNN) model has been widely used in image recognition, but there is still room to improve recognition accuracy. Aiming at this problem, a new model named CNN-GRNN is proposed. It uses a CNN to extract hierarchical features from the sample space, and then employs a General Regression Neural Network (GRNN) in place of the Back Propagation (BP) neural network to enhance generalization and robustness; the model is trained with the Root Mean Square (RMS) error and gradient descent. Experimental results on COIL-100 and a gesture database show that the proposed model improves the recognition rate by 42.2%, 13.43%, 3.99% and 1.86% compared with the Gray Level Co-occurrence Matrix (GLCM), Hu invariant moments, CNN and CNN-SVM models respectively, while meeting real-time requirements.
  • SONG Ruixia,WANG Meng,WANG Xiaochun
    Computer Engineering. 2017, 43(4): 263-268,276. https://doi.org/10.3969/j.issn.1000-3428.2017.04.045
    When the V-transform is used to represent an image, a coarse-to-fine multi-scale feature description can be obtained by increasing the number of basis functions of the V-system, but the V-system is not suitable for describing the directional features of images. Since the Contourlet transform has multi-directional characteristics, this paper proposes an image fusion algorithm that combines the V-transform with the Contourlet transform to obtain a multi-scale, multi-directional feature description, and applies it to multi-focus image fusion. In addition to the improved mathematical tools, a fusion scheme for the frequency-domain coefficients based on weighted local statistical characteristics is designed, with weights drawn from a Gaussian distribution that effectively depicts the correlation of the coefficients. Experimental results show that the fused image obtained by the proposed algorithm has clearer edge details and better visual effect, and the algorithm improves objective evaluation indexes such as amount of information, clarity and correlation with the original images, compared with multi-resolution fusion algorithms based on the wavelet, Contourlet and Contourlet-wavelet transforms.
  • XU Guangyu,LIN Yu’e,SHI Wenbing
    Computer Engineering. 2017, 43(4): 269-276. https://doi.org/10.3969/j.issn.1000-3428.2017.04.046
    The intensity similarity weight function adopted in the bilateral filter is affected by image noise, and the filter handles image details blindly. To overcome these drawbacks, this paper proposes a trilateral filter for Gaussian noise. It uses Singular Value Decomposition (SVD) to estimate the geometric structure of the image and constructs feature information that describes differences in image content. On this basis, it designs an intensity similarity weight function based on the image feature classification and incorporates the structure feature into the bilateral filter framework by introducing a structure similarity weight to preserve more image detail. The trilateral weighting provides a more reliable similarity measurement between the target pixel and its neighbors, and a locally adaptive method is used to choose the filtering parameters. Experimental results show that the proposed filter obtains better filtering results while preserving image edges and textures well.

  • LUO Xiaoping
    Computer Engineering. 2017, 43(4): 277-280,286. https://doi.org/10.3969/j.issn.1000-3428.2017.04.047
    Focusing on the complexity and cost caused by word or character segmentation in document rectification algorithms based on 2D image processing, this paper proposes an improved rectification algorithm. First, it applies a modified region growing method to find the space between text lines, which segments the lines and yields warped line images. Then it uses an improved two-pass scanning method to label the 4-connected components of characters on each curved line image, and fits a degree-three polynomial to the line base points with least squares. Finally, it straightens the polynomial curve and stitches the result into a whole image. OCR experiments and comparison with other methods show that the proposed algorithm improves the recognition rate and produces visually pleasing output.
  • ZHENG Yanmei,ZHANG Fangli,LU Bibo,KANG Rui
    Computer Engineering. 2017, 43(4): 281-286. https://doi.org/10.3969/j.issn.1000-3428.2017.04.048
    To solve the problem of accurately displaying High Dynamic Range (HDR) images on conventional display devices, this paper converts the uncertain visual attention mechanism into a number of concrete features and proposes an image-feature-based evaluation system to judge whether the image produced by a tone mapping method preserves those features. Drastic contrast adjustment is applied to the images, and the brightness of the scene is transformed into a displayable range while maintaining the details and color that are important for reproducing the original scene. Seven global tone mapping methods are compared and analyzed: LogarithmicTMO, ExponentialTMO, TumblinRushmeierTMO, SchlickTMO, FerwerdaTMO, WardHistAdjTMO and DragoTMO. The overall effect of WardHistAdjTMO tone mapping is found to be the best.
  • XIAO Dawei,ZHAI Junyong
    Computer Engineering. 2017, 43(4): 287-291. https://doi.org/10.3969/j.issn.1000-3428.2017.04.049
    To reduce the deviation caused by feature point matching and optical axis inclination, this paper proposes a method for measuring target distance for a wheeled mobile robot based on monocular vision, which extends the planar target case to the three-dimensional one and achieves high measurement accuracy without manual adjustment. Firstly, the internal and external camera parameters are obtained by camera calibration. Secondly, a pinhole imaging model is established to relate the world coordinate system to the pixel coordinate system. Then, to handle the singularity caused by the matrix transformation, the concept of area is introduced and the relationship between target distance and pixel area is derived for the specific singular case. Experimental results indicate that the comprehensive error ratios of the proposed method are all under 0.7%, which satisfies the real-time and reliability requirements of monocular distance measurement for wheeled mobile robots.
  • CHANG Tongli,LIU Xuezhe,GU Xincen,GUO Zhipeng
    Computer Engineering. 2017, 43(4): 292-297. https://doi.org/10.3969/j.issn.1000-3428.2017.04.050
    The leg structure designed for a bionic quadruped robot differs from the actual structure of a biological leg, and the rigid contact force between foot and ground adversely affects the stability and convergence of motion control. To address these problems, the bone structure of the German shepherd is analyzed and, through image processing and analysis, the motion law of each limb joint during a diagonal trot is obtained, based on which a quadruped robot is designed. The foot end of the robot is given a mechanism that converts the foot-ground contact from rigid to flexible. Based on forward and inverse kinematics, the workspace of the foot end of the model is analyzed. The force curve obtained by simulation is compared with the actual force curve, and the results show that the motion controller achieves smooth motion and that the flexible mechanism helps the stability of quadruped robot motion.
  • LI Yugang,YE Qingwei,ZHOU Yu,WANG Xiaodong
    Computer Engineering. 2017, 43(4): 298-303. https://doi.org/10.3969/j.issn.1000-3428.2017.04.051
    The pre-processing and system order determination of the Intrinsic Time-scale Decomposition (ITD) algorithm involve human factors, which introduces error into modal parameter extraction, and ITD is also sensitive to noise. To solve these problems, an improved ITD algorithm is proposed. Firstly, the data-driven stochastic subspace algorithm is used to process the original data, and the data obtained by orthogonal triangular decomposition serves as the input of the ITD method. The sparse optimization orthogonal matching pursuit algorithm is then used to find the eigenvalues of the feature matrix, from which the modal frequencies and damping ratios are calculated. The real modes are selected from the candidate modal parameters by a statistical method, which effectively avoids spurious modes. Compared with the original ITD algorithm, the improved algorithm reduces the influence of noise, solves the problem of system order determination and extracts modal parameters more precisely.
  • PENG Xiaoli,ZHENG Linjiang,WEI Hongchun,HOU Xiang
    Computer Engineering. 2017, 43(4): 304-309. https://doi.org/10.3969/j.issn.1000-3428.2017.04.052
    Because the original RFID data stream itself contains a large amount of uncertain data, it cannot be used directly by businesses and users. This paper discusses the generation and classification of RFID uncertain data, finds that such data is caused by duplicate reading, repeated reading and missed reading, and proposes a hierarchical processing model for practical applications that deletes, corrects and fills in these uncertain data. Experimental results show that the proposed model improves the recognition rate of tags and objects through hierarchical processing, and ultimately improves the reliability of the system.
  • LI Dai
    Computer Engineering. 2017, 43(4): 310-316. https://doi.org/10.3969/j.issn.1000-3428.2017.04.053
    Concerning the facts that many existing methods in Wireless Sensor Networks (WSN) cannot construct clusters of nodes with similar data and that the communication cost of data exchange is high, a method that uses data prediction to reduce clustering cost is proposed, which saves sensor node energy and prolongs the lifetime of the WSN. Minimum communication cost is used to build clusters that are uniformly distributed and contain similar data, intra-cluster communication is reduced with a prediction framework based on adaptive normalized least mean squares, and a three-fold compression method is used to compress the floating point numbers in the coded stream, making full use of the spatial and temporal correlation of the data during compression. Compared with the multiple Cluster Clustering Head (CCH) method, the Energy Efficient Data Collection (EEDC) method and the Collective Prediction exploiting Spatio-Temporal correlation (CoPeST) method, the proposed method achieves significant data reduction in both intra-cluster and inter-cluster communication while ensuring data accuracy, and greatly improves the energy balance and utilization efficiency of the network.
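    The adaptive normalized least mean squares prediction idea can be sketched as follows; the filter order, step size, tolerance and synthetic readings are assumptions, and the clustering and compression stages are omitted.

```python
# Illustrative sketch: an NLMS predictor of the next sensor reading from the last
# few readings; transmission could be skipped while the prediction error stays
# within a tolerance, the idea behind the prediction framework described above.
import numpy as np

def nlms_predict(samples, order=4, mu=0.5, eps=1e-6):
    w = np.zeros(order)
    preds = []
    for t in range(order, len(samples)):
        x = samples[t - order:t][::-1]
        y_hat = w @ x
        preds.append(y_hat)
        err = samples[t] - y_hat
        w += mu * err * x / (eps + x @ x)      # NLMS weight update
    return np.array(preds)

rng = np.random.default_rng(3)
readings = 20 + np.sin(np.linspace(0, 12, 200)) + rng.normal(0, 0.05, 200)
preds = nlms_predict(readings)
suppressed = np.abs(readings[4:] - preds) < 0.2   # True -> node could skip sending
print(f"{suppressed.mean():.0%} of samples could be suppressed")
```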
  • ZHAO Rui,ZHU Meiling,XU Yong
    Computer Engineering. 2017, 43(4): 317-321. https://doi.org/10.3969/j.issn.1000-3428.2017.04.054
    The tracking consensus problem for second-order multi-agent systems with nonlinear dynamics and a reference leader is studied. The network topology among the followers is assumed to be a directed graph. Since the followers cannot access their own velocity information, a distributed observer is designed for each follower to estimate its velocity. For the system under switching directed network topologies, an observer-based adaptive control protocol is proposed. Using Lyapunov stability theory and matrix theory, a sufficient condition that guarantees leader-follower tracking is obtained. Simulation results show that the followers can track the leader using only the local observers, the control protocol and the adaptive law.