
15 July 2015, Volume 41 Issue 7
    

  • XUE Yayong,GAO Xiaoguang,WEN Zengkui
    Computer Engineering. 2015, 41(7): 1-5. https://doi.org/10.3969/j.issn.1000-3428.2015.07.001
    Normal flight cannot tolerate the low efficiency and poor real-time performance of airborne fire control system fault diagnosis. To address this problem, this paper proposes a real-time fault detection method for airborne fire control systems based on a reversal reference mechanism. The method builds on fault tree analysis and establishes a client/server communication model based on double-buffer queues under the VxWorks operating system. The client program is compiled with VC6.0 and the server program with Tornado; tasks are allotted by priority, and counting semaphores are used to synchronize the different tasks. A simulation experiment successfully implements an efficient, real-time fault diagnosis mission, showing that the diagnostic efficiency of the airborne fire control system is improved and its real-time performance is ameliorated.
  • ZHANG Bei,YANG Chunling,ZHENG Bowei
    Computer Engineering. 2015, 41(7): 6-10. https://doi.org/10.3969/j.issn.1000-3428.2015.07.002
    In Distributed Video Coding (DVC), the quality of the side information generated at the decoder plays a critical role: accurate side information improves the system's Rate Distortion (RD) performance. Based on an analysis of existing interpolation- and extrapolation-based side information generation algorithms, this paper proposes a novel extrapolation-based algorithm. Hierarchical motion estimation and weighted median filtering are used to generate more accurate motion vectors, an adaptive search range guarantees the coherence of the spatial motion field, and motion projection captures the true motion trajectory to obtain an accurate side information frame. Experimental results show that the proposed algorithm significantly increases the accuracy of the side information and the RD performance of DVC, and that it is well suited to video sequences with violent motion.
  • DING Xiaobo,MA Zhong,DAI Xinfa
    Computer Engineering. 2015, 41(7): 11-16,24. https://doi.org/10.3969/j.issn.1000-3428.2015.07.003
    To decrease the I/O response time of a Virtual CPU (VCPU) running both I/O tasks and CPU-intensive tasks under the default Credit scheduling algorithm of the Xen Virtual Machine Manager (VMM), this paper presents SACredit, a dynamic time-slice self-adaptive algorithm based on Credit. It assesses the number of virtual machines in the VCPU queue and the number of I/O events by monitoring them at run time. The algorithm retains the BOOST mechanism, by which a VCPU can be woken up by an I/O event at any time, so the I/O response time is reduced markedly, while dynamic time-slice self-adaptation preserves the fairness and load balancing of the Credit scheduler. Experimental results show that SACredit has low I/O response delay and retains the advantages of the Credit scheduling algorithm in proportionally fair resource distribution and load balancing.
  • Computer Engineering. 2015, 41(7): 17-24. https://doi.org/10.3969/j.issn.1000-3428.2015.07.004
    To enhance the performance of Dynamic Differential Evolution (DDE) in solving high-dimensional optimization problems, an Orthogonal Dynamic Differential Evolution (ODDE) algorithm is proposed. ODDE is built on the framework of the DDE algorithm and therefore has powerful global search ability, while an orthogonal crossover operator based on orthogonal experimental design enhances its local search ability. Nine commonly used benchmark problems with dimensions 30, 100, 300 and 500 are used to evaluate ODDE against Differential Evolution (DE), DDE and the Orthogonal Crossover Differential Evolution (OXDE) algorithm. Numerical results show that ODDE is superior to the other algorithms in solution accuracy and convergence rate, so it can be widely applied to high-dimensional optimization problems in engineering.

  • WANG Yuwei,NIU Yun,WEI Ou
    Computer Engineering. 2015, 41(7): 25-30,35. https://doi.org/10.3969/j.issn.1000-3428.2015.07.005
    Current machine learning-based Protein-Protein Interaction (PPI) identification systems make predictions solely on evidence within a single sentence and suffer from small training sets. In this paper, a hybrid similarity model-based approach is proposed to address these issues. A basic Relational Similarity (RS) model is established to make initial predictions, word similarity matrices are constructed using a corpus-based approach, and a clustering algorithm groups words according to their similarity. The resulting word clusters are introduced into the basic RS model to build a hybrid model. Experimental results show that the basic RS model achieves high and well-balanced precision and recall, and that introducing the word similarity model further improves the F-score. The approach makes use of known PPI information and thus relieves the burden of manual annotation.
  • ZHAI Xiaofang,LIU Quanming,CHENG Yaodong,HU Qingbao,LI Haibo
    Computer Engineering. 2015, 41(7): 31-35. https://doi.org/10.3969/j.issn.1000-3428.2015.07.006
    Microblogging is a new type of news medium whose influence and propagation speed surpass those of traditional major media, so predicting the hotness of microblogs matters greatly for public opinion monitoring, government publicity, corporate marketing and the pushing of popular issues. By analyzing the microblog forwarding level, which combines the effects of the forwarding index and the forwarding depth and breadth indices, this paper gives a new definition for calculating the hotness index of a microblog, and based on this definition classifies the hotness index into five levels. The goal is to predict which level a microblog whose repost count exceeds 100 will reach. Using supervised machine learning, the static attributes and dynamic repost characteristics of the training samples are extracted in turn to train a hotness prediction model. The training samples come from Sina microblog and are collected with a self-developed BigData open crawler platform. Experimental results under 10-fold cross-validation show that, compared with a hotness prediction model based only on static attributes, the model with dynamic features effectively improves prediction performance, reaching an F1-measure of 76.9%.

  • PENG Min,GAO Binlong,HUANG Jimin,LIU Jiping
    Computer Engineering. 2015, 41(7): 36-42. https://doi.org/10.3969/j.issn.1000-3428.2015.07.007
    Automatic document summarization is an important approach to obtaining key information from microblog platforms. Most existing methods for automatic microblog summarization concentrate on extracting sentences or key phrases from a document set, but few effective, commonly used methods address redundancy and noise reduction, which degrades the content quality of the extracted microblog messages and directly affects summary performance. Taking the microblog platform as the research object, this paper proposes an information extraction method based on time-frequency transformation that extracts a series of high-quality microblogs highly related to one topic, with little redundancy and rich informativeness. The sentences in this high-quality set are scored according to sentence-feature weights, and the summary is generated by ranking and selecting sentences. Experimental results show that the method is effective in filtering the redundancy and noise of microblogs, and that its final summaries outperform those of other automatic summarization methods under both automatic and manual evaluation.
  • XIAO Jianqiong,GAO Jiangjin,ZHOU Xiaoqing
    Computer Engineering. 2015, 41(7): 43-47,54. https://doi.org/10.3969/j.issn.1000-3428.2015.07.008
    To solve the local extremum problem in cloud service selection, this paper designs a cloud service selection algorithm based on the Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS). The entropy value method is used to simplify the assignment of criteria weights, a decision matrix of the QoS characteristics in each period is built from the available cloud services, and fuzzy TOPSIS ranking is fused with time-varying weights to obtain high-quality cloud services, realizing a reasonable choice of cloud service. Simulation results show that the proposed algorithm outperforms the contrast algorithm in the success rate and robustness of cloud service selection; it effectively curbs interference from bad QoS data and strengthens trustworthy service sharing.
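    The entropy weighting and TOPSIS ranking steps described above can be sketched as follows. This is a minimal illustration of the two standard building blocks only; the paper's fuzzy extensions and time-varying weight fusion are omitted, and the example QoS data are hypothetical.

```python
import math

def entropy_weights(matrix):
    """Entropy weighting: criteria whose values are more dispersed across
    candidates receive larger weights. matrix[i][j] is the (positive) QoS
    value of service i on criterion j."""
    m, n = len(matrix), len(matrix[0])
    entropies = []
    for j in range(n):
        col = [matrix[i][j] for i in range(m)]
        total = sum(col)
        p = [v / total for v in col]
        entropies.append(-sum(v * math.log(v) for v in p if v > 0) / math.log(m))
    div = [1 - e for e in entropies]          # degree of diversification
    s = sum(div)
    return [d / s for d in div]

def topsis_rank(matrix, weights, benefit):
    """TOPSIS closeness scores; benefit[j] is True if larger values of
    criterion j are better (e.g. availability) and False for cost criteria
    (e.g. latency)."""
    m, n = len(matrix), len(matrix[0])
    # vector-normalize each column, then apply the criteria weights
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(m))) for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)] for i in range(m)]
    ideal = [max(v[i][j] for i in range(m)) if benefit[j]
             else min(v[i][j] for i in range(m)) for j in range(n)]
    worst = [min(v[i][j] for i in range(m)) if benefit[j]
             else max(v[i][j] for i in range(m)) for j in range(n)]
    scores = []
    for i in range(m):
        d_plus = math.sqrt(sum((v[i][j] - ideal[j]) ** 2 for j in range(n)))
        d_minus = math.sqrt(sum((v[i][j] - worst[j]) ** 2 for j in range(n)))
        scores.append(d_minus / (d_plus + d_minus))
    return scores
```

    A candidate with a score closer to 1 is nearer the ideal solution; the service with the highest score would be selected.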
  • ZHANG Sheng,HU Jiajing
    Computer Engineering. 2015, 41(7): 48-54. https://doi.org/10.3969/j.issn.1000-3428.2015.07.009
    With the rapid development of informatization, the arrival of the big data era has brought revolutionary changes, and the explosive growth and variety of data make big data search a challenge. MapReduce is commonly used for processing big data and shows great advantages there. Drawing on lattice theory, this paper uses Formal Concept Analysis (FCA) to discover the relationships among textual documents, expresses them as a lattice, and proposes a novel conceptual index structure that supports large-scale data retrieval. It also describes the algorithms for building the conceptual index. Compared with a Lucene index, the conceptual index answers queries more efficiently. Experimental results show that expressing document relationships with a lattice and indexing them conceptually can significantly improve the performance of large-scale document retrieval.
  • LI Xuezhu,CHEN Guolong
    Computer Engineering. 2015, 41(7): 55-59. https://doi.org/10.3969/j.issn.1000-3428.2015.07.010
    With the help of virtualization technology, cloud computing unifies the management of computing resources to improve their utilization, but combining virtualization with a cloud computing platform introduces a new usage mode in which the resource boundary of the physical server limits the scope for resource optimization. This paper improves the global optimization framework by adding a minimum boundary value for the idle memory inside each virtual machine. On this basis, virtual machine memory is handled separately for the low-utilization and high-utilization cases, two control algorithms are given for these cases, and the relationship between the two algorithms is described. The method reduces both the number of memory exchanges between virtual machines and the platform and the switching frequency, so the average utilization efficiency of memory resources is greatly improved.
  • GE Jingjun,KONG Fanzhi,ZHANG Ming,TENG Jianfeng,LIU Xin
    Computer Engineering. 2015, 41(7): 60-65. https://doi.org/10.3969/j.issn.1000-3428.2015.07.011
    This paper studies the key technologies of domain heterogeneous data integration from the perspectives of distributed heterogeneous data integration and mass data sharing. It proposes a semantic integration method for domain heterogeneous data and builds a Nested Object Model (NOM) for all kinds of heterogeneous data. Combining a virtual view with a Mashup framework, an integration method for heterogeneous data services is proposed to describe, organize and display data and data relationships. It realizes data interaction and maintains data synchronization through semantic mapping, providing dynamic data integration services. An example analysis shows that an oil and gas well production optimization decision and diagnosis platform based on the proposed data integration method can realize professional data integration and cross-professional application, supporting dynamic monitoring and diagnostic decisions for oil and gas well production.
  • WANG Pengcheng,XIAO Zheng,LIU Hui
    Computer Engineering. 2015, 41(7): 66-70. https://doi.org/10.3969/j.issn.1000-3428.2015.07.012
    Modeling the hotness trend of topics in online forums is one of the main tasks of public opinion analysis. Since existing methods ignore the influence of opinion tendency on topic hotness, a hotness trend modeling approach that accounts for opinion tendency is proposed. The sentimental tendency of a topic is obtained by sentiment classification, and this sentimental information is incorporated into the hotness computation so that the influence of sentimental tendency on topic hotness is reflected flexibly. A Gamma distribution is used to fit the hotness curve. Experimental results show that the proposed method fits the hotness trend more accurately than the Gaussian model.
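    Fitting a Gamma distribution to a hotness curve requires estimating its shape and scale parameters. A minimal method-of-moments sketch is shown below; the paper does not state its fitting procedure, so this is only one plausible choice (maximum likelihood is another).

```python
def gamma_mom_fit(samples):
    """Method-of-moments estimates for Gamma(shape k, scale theta), using
    mean = k * theta and variance = k * theta^2."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / n
    k = mean * mean / var     # shape
    theta = var / mean        # scale
    return k, theta
```

    Given observed hotness values, the returned `(k, theta)` pair parameterizes the fitted curve.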

  • ZHANG Tao,BAI Ruilin,ZOU Junyu
    Computer Engineering. 2015, 41(7): 71-74,81. https://doi.org/10.3969/j.issn.1000-3428.2015.07.013
    To address the low efficiency of Garbage Collection (GC) in embedded virtual machine environments, this paper proposes an improved generational GC algorithm based on lifespan prediction. By predicting each object's lifespan, objects predicted to be long-lived are allocated directly into the old generation, eliminating the need to copy them out of the young generation and thereby reducing GC execution time. In the young generation, a non-stop-the-world strategy is adopted in which object allocation and promotion proceed concurrently; in the old generation, a lazy-buddy algorithm combined with mark-sweep achieves fast allocation and reclamation. This not only avoids copy operations but also controls memory fragmentation. Experimental results show that with this algorithm GC time decreases by about 23.9% and program running time by about 17.2%, so overall system performance is significantly improved.

  • HU Rui,MA Peng,ZHANG Jianxiong
    Computer Engineering. 2015, 41(7): 75-81. https://doi.org/10.3969/j.issn.1000-3428.2015.07.014
    Cache, as an important part of the microprocessor, accounts for an undue proportion of chip area and power consumption, a problem that needs to be solved, especially its intensive energy cost. Against this background, and building on two microarchitecture-level low-power optimization techniques for set-associative Caches, the phased Cache and the way-predicting Cache, this paper proposes a low-power pre-access strategy for set-associative Caches. In this strategy, a Buffer is added to store the tag and data sub-array information of recent Cache hits. Before the tag array is accessed, the Buffer is consulted, and the tag is matched against both the Cache and the Buffer; depending on the matching result, either way-predicting or phased access is chosen. Experimental results on the MiBench benchmarks with SimpleScalar and Sim-Panalyzer show that this strategy reduces the Energy-delay Product (EDP) by 25.15%.
  • SHEN Jian,XIAO Tiejun,YU Jinhua
    Computer Engineering. 2015, 41(7): 82-85,90. https://doi.org/10.3969/j.issn.1000-3428.2015.07.015
    Color temperature correction is an important part of a Liquid Crystal Display (LCD) color correction system and directly affects the final display quality and the viewing experience. To improve LCD display quality and follow the trend toward high-definition video, a scheme for real-time LCD color temperature correction in RGB color space is researched and designed. Compared with traditional implementations, the scheme includes a luminance-based reference white point estimation method that simplifies the selection of the reference white point. To improve correction accuracy, DDR3 SDRAM is used as a frame buffer so that the gain correction for a frame is computed from that frame's own data. The scheme is verified on a Kintex-7 Field Programmable Gate Array (FPGA) development board whose core chip is a Xilinx XC7K325T-2FFG900. Results show that the scheme corrects the color temperature of video data within the FPGA and restores the user's preset color temperature with a look-up table, which further reduces computational complexity, demonstrating its feasibility.
  • WANG Shitao,ZHANG Ji,LI Jian,TANG Lisan
    Computer Engineering. 2015, 41(7): 86-90. https://doi.org/10.3969/j.issn.1000-3428.2015.07.016
    A study of the load balancing algorithms of the global run queue in the VxWorks operating system and of the scheduling-domain-based algorithm in Linux reveals that, in Linux, a high-priority task may be unable to preempt a lower-priority task running on another CPU. To solve this problem, the Linux load balancing algorithm is improved: in addition to the length of the run queue, task priority is taken into account, that is, task priority is folded into the load factor. Experimental results show that this method solves the problem. While still ensuring load balance, the real-time performance of the system is improved, enabling the system to run high-priority tasks as soon as possible.
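    The idea of folding task priority into the load factor can be sketched as below. The weight mapping is a hypothetical illustration (the paper does not give its exact formula); it only shows how a priority-aware load can differ from a plain queue-length load.

```python
def load_factor(run_queue, max_prio=139):
    """Priority-aware load of one CPU's run queue. Priorities follow the
    Linux convention: numerically smaller means higher priority (0..139),
    so higher-priority tasks contribute a larger weight."""
    return sum(max_prio + 1 - prio for prio in run_queue)

def pick_busiest(cpu_queues):
    """Return the index of the CPU whose queue has the largest
    priority-aware load, i.e. the balancing source."""
    return max(range(len(cpu_queues)), key=lambda c: load_factor(cpu_queues[c]))
```

    With plain queue length, a CPU holding three low-priority tasks would look busier than one holding a single real-time task; the priority-aware load reverses that ordering.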
  • HAO Meng,FAN Yixiang,XIA Xiang,LIU Long,WANG Peng,HUANG Ning
    Computer Engineering. 2015, 41(7): 91-94. https://doi.org/10.3969/j.issn.1000-3428.2015.07.017
    To make the inverse t-distribution usable in a practical code development project, a numerical algorithm based on the incomplete Beta function is analyzed and implemented. First, an initial approximation of the inverse of the incomplete Beta function is derived from the standard formula; then Halley's root-finding algorithm is employed to refine it to an exact solution, from which the value of the inverse t-distribution is obtained. The algorithm can be regarded as a supplement to that of Matlab and can be applied in projects that need to compute the inverse t-distribution in embedded code. It is implemented in C++. Compared with the results of Matlab and Excel, the relative error of the algorithm is less than 10^(-11), which demonstrates its correctness for engineering applications.
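    The Halley refinement step is a generic third-order root-finding iteration. A minimal sketch follows, demonstrated on a simple polynomial rather than the incomplete Beta function (whose evaluation the paper implements separately in C++).

```python
def halley(f, df, d2f, x0, tol=1e-12, max_iter=50):
    """Halley's method: x_{n+1} = x_n - 2 f f' / (2 f'^2 - f f'').
    Converges cubically near a simple root, which is why it needs a
    good initial approximation such as the one the paper derives."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        fp, fpp = df(x), d2f(x)
        x -= 2 * fx * fp / (2 * fp * fp - fx * fpp)
    return x

# Example: the real root of f(x) = x^3 - 2, i.e. the cube root of 2
root = halley(lambda x: x**3 - 2, lambda x: 3 * x * x, lambda x: 6 * x, 1.0)
```

    In the paper's setting, `f` would be the incomplete Beta function minus the target probability, with its first and second derivatives supplied analytically.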
  • QI Changcheng,YANG Yanxiang,ZHANG Ping,LIU Changwen
    Computer Engineering. 2015, 41(7): 95-99. https://doi.org/10.3969/j.issn.1000-3428.2015.07.018
    Aiming at the defects of traditional bootloaders for programming a vehicle Electronic Control Unit (ECU) online, a smart node with a two-level bootloader capability and its supporting software on the ECU are designed based on the CAN bus and the international standard CAN Calibration Protocol (CCP), using a USB disk to carry the target software code. A concrete scheme and its software project are presented using Freescale's MC9S12G128. Evaluation and test results show that the smart node is a convenient, feasible and highly reliable updating instrument; moreover, the defects of current software updating methods, such as insecure application code, high cost and the requirement for extra flash memory, are all avoided.
  • ZHOU Liucheng,ZHANG Ji,LI Jian,SUN Chenwei
    Computer Engineering. 2015, 41(7): 100-105. https://doi.org/10.3969/j.issn.1000-3428.2015.07.019
    XtratuM is a hypervisor designed to meet the requirements of safety-critical embedded applications. Suitably modified guest operating systems can run in the partitions of a XtratuM system, but until now μC/OS-II could not work as a XtratuM guest Operating System (OS). After studying the architecture of XtratuM, this paper proposes a solution for para-virtualizing μC/OS-II on XtratuM. It uses hypercalls to hide the hardware details from μC/OS-II, redesigns the μC/OS-II task stack frame and context-switching algorithm to avoid conflicting instructions, and mounts a virtual clock interrupt on the XtratuM system to guarantee task scheduling. Experimental results show that two instances of μC/OS-II can run independently under the established scheduling scheme on the same x86 hardware platform.
  • JIANG Chuan,HUANG Guoce,WANG Binghe,CHEN Yu
    Computer Engineering. 2015, 41(7): 106-110. https://doi.org/10.3969/j.issn.1000-3428.2015.07.020
    The current High Frequency (HF) gateway implementation based on TUN/TAP has several shortcomings, such as inflexible configuration, no filtering mechanism, low packet capture performance and a lack of Quality of Service (QoS). To solve these problems, this paper designs and implements a Libpcap-based HF IP gateway. The new gateway uses a web-based management tool to simplify configuration and management, uses BPF to define packet filtering rules and improve the accuracy of packet capture, and uses priority queues to provide QoS support. Field tests verify the functionality of the implementation and show that its throughput and latency satisfy the requirements of high-speed traffic capture on a 100 Mb/s network.
  • LU Yunbo,TANG Liang,HAO Lixin,YU Kai,BU Zhiyong
    Computer Engineering. 2015, 41(7): 111-114,119. https://doi.org/10.3969/j.issn.1000-3428.2015.07.021
    Cognitive Radio (CR) is a concept for improving the utilization of scarce wireless spectrum resources, and Orthogonal Frequency Division Multiplexing (OFDM) is regarded as the technology that best matches CR systems. This paper presents a new resource allocation algorithm for OFDM-based CR systems, built on the characteristics of such systems. The algorithm allocates subcarriers according to the proportional fairness principle and allocates power by water filling, solving the optimization problem of maximizing the capacity of the secondary users subject to two constraints: the total transmit power of the secondary users must stay below the transmit power limit, and the interference to the primary users must stay under the maximum tolerable interference level. Simulation results show that the proposed algorithm not only satisfies fairness but also improves system capacity with low complexity.
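    The water-filling power allocation step can be sketched as follows. This is the textbook form, solving for the water level by bisection over per-channel noise levels; the paper's additional interference constraint toward primary users is omitted here.

```python
def water_filling(noise, p_total, iters=100):
    """Allocate p_total across channels as p_i = max(0, mu - noise_i),
    where the water level mu is found by bisection so that the powers
    sum to the budget. noise[i] is the effective noise level of channel i."""
    lo, hi = 0.0, max(noise) + p_total
    mu = hi
    for _ in range(iters):
        mu = (lo + hi) / 2
        used = sum(max(0.0, mu - n) for n in noise)
        if used > p_total:
            hi = mu
        else:
            lo = mu
    return [max(0.0, mu - n) for n in noise]
```

    Channels with noise above the water level receive zero power, so poor subcarriers are skipped automatically.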
  • SHEN Xueli,CHEN Guang
    Computer Engineering. 2015, 41(7): 115-119. https://doi.org/10.3969/j.issn.1000-3428.2015.07.022
    To address the low localization accuracy and high anchor node cost of existing DV-Hop algorithms in Wireless Sensor Networks (WSNs), the concept of minimum deviation is proposed and an improved algorithm, AADV-Hop, is designed around it. The minimum deviation is sought to reduce the overall average position error of the localization algorithm: the true distances between anchor nodes are used to correct the estimated distance from an unknown node to an anchor node, replacing the distance obtained from hop count and average hop distance in the traditional DV-Hop algorithm. Simulation results show that when the number of anchor nodes in the network is small, AADV-Hop effectively improves positioning accuracy, and the accuracy is highest when the minimum deviation value is 0.03.
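    For reference, the classical DV-Hop distance estimate that AADV-Hop corrects can be sketched as below: each anchor derives an average hop size from the true distances and hop counts to the other anchors, and an unknown node estimates its distance to an anchor as hop size times hop count. The minimum-deviation correction itself is not reproduced here.

```python
import math

def avg_hop_distance(anchors, hop_counts):
    """Per-anchor average hop size in classical DV-Hop.
    anchors: list of (x, y) anchor positions.
    hop_counts[i][j]: hop count between anchors i and j."""
    sizes = []
    for i, (xi, yi) in enumerate(anchors):
        d = sum(math.hypot(xi - xj, yi - yj)
                for j, (xj, yj) in enumerate(anchors) if j != i)
        h = sum(hop_counts[i][j] for j in range(len(anchors)) if j != i)
        sizes.append(d / h)
    return sizes

def dvhop_estimate(hop_size, hops_to_anchors):
    """Unknown node's estimated distance to each anchor."""
    return [hop_size * h for h in hops_to_anchors]
```

    The estimate's error grows with irregular topologies, which is exactly what the true inter-anchor distances are used to correct in AADV-Hop.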
  • TIAN Xinji,JIANG Limin
    Computer Engineering. 2015, 41(7): 120-123. https://doi.org/10.3969/j.issn.1000-3428.2015.07.023
    An interference cancellation method based on space-time coding is proposed for the 3-user Multiple-Input Multiple-Output (MIMO) interference channel. Each user employs a rate-2 space-time block code, and two-layer precoding is performed for each codeword. One precoding layer aligns the interference at the second receiver, and likewise at the third receiver; the multi-user interference at these two receivers is then eliminated by the other precoding layer, while the multi-user interference at the first receiver is eliminated via a unidirectional cooperative link. Simulation results show that, in the same scenario, transmission efficiency is improved compared with the existing interference cancellation method based on Alamouti coding, while good reliability is maintained.
  • FEI Huan,LI Guanghui
    Computer Engineering. 2015, 41(7): 124-128. https://doi.org/10.3969/j.issn.1000-3428.2015.07.024
    To improve the reliability of Wireless Sensor Network (WSN) application systems, abnormal data must be detected in the sensed environmental data set. This paper proposes an abnormal data detection algorithm based on clustering from data mining, which adopts K-means clustering while taking the characteristics of WSN data into account. The algorithm uses Euclidean distance to compare the similarity of data for cluster partitioning and identifies abnormal data according to the distance between a data point and its cluster center. Experimental results show that when there are more than 1 000 data points, the detection accuracy of this algorithm is higher and its false positive rate lower than those of the algorithm based on Density-Based Spatial Clustering of Applications with Noise (DBSCAN) under the same conditions.
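    The distance-to-center anomaly rule can be sketched as follows. For brevity the cluster centers are assumed to be already computed (the paper obtains them by running K-means first), and the threshold rule, a multiple of the cluster's mean distance, is a hypothetical simplification of the paper's criterion.

```python
import math

def detect_anomalies(points, centroids, factor=2.0):
    """Assign each point to its nearest centroid (Euclidean distance) and
    flag points whose distance exceeds factor times the mean distance of
    their cluster. Returns the sorted indices of flagged points."""
    assign, dists = [], []
    for p in points:
        best = min(range(len(centroids)),
                   key=lambda c: math.dist(p, centroids[c]))
        assign.append(best)
        dists.append(math.dist(p, centroids[best]))
    anomalies = []
    for c in range(len(centroids)):
        cluster_d = [d for a, d in zip(assign, dists) if a == c]
        if not cluster_d:
            continue
        mean_d = sum(cluster_d) / len(cluster_d)
        for idx, (a, d) in enumerate(zip(assign, dists)):
            if a == c and d > factor * mean_d:
                anomalies.append(idx)
    return sorted(anomalies)
```

    Sensor readings far from every cluster center are reported as abnormal; the `factor` parameter trades detection rate against false positives.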
  • HE Sheng,LIU Yijun,YE Feiyue,ZHAO Xiaorong,FENG Xinling
    Computer Engineering. 2015, 41(7): 129-132. https://doi.org/10.3969/j.issn.1000-3428.2015.07.025
    This paper analyzes the design of a network layout evaluation algorithm based on the F-measure. It calculates the recall and precision ratios that enter the F-measure, optimizes the objective function, obtains quantitative evaluation results for a network layout, and gives the implementation process of the evaluation algorithm. The grid layout algorithm is used to produce layouts at different scales and different iteration counts. Tests on the visAnt visualization platform show that the evaluation results of the algorithm are consistent with those of the grid layout algorithm, which demonstrates its effectiveness.
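    The combination of precision and recall into an F-measure follows the standard weighted harmonic mean; the layout-specific definitions of the two ratios are the paper's, but the combining formula itself is generic:

```python
def f_measure(precision, recall, beta=1.0):
    """Weighted harmonic mean of precision and recall. beta > 1 favors
    recall, beta < 1 favors precision; beta = 1 gives the usual F1."""
    if precision == 0 and recall == 0:
        return 0.0
    b2 = beta * beta
    return (1 + b2) * precision * recall / (b2 * precision + recall)
```

    The harmonic mean punishes imbalance, so a layout scoring well on only one of the two ratios still receives a low overall evaluation.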
  • YIN Xiangdong,XIAO Huijun
    Computer Engineering. 2015, 41(7): 133-137. https://doi.org/10.3969/j.issn.1000-3428.2015.07.026
    Traditional classification algorithms for heterogeneous networks map the original network into multiple homogeneous networks and neglect the correlation between nodes of different types. This paper represents the relationships between heterogeneous nodes as latent variables and proposes a labeling model, with a corresponding classification algorithm, for heterogeneous networks. It formulates the node labeling problem in homogeneous networks, analyzes the drawbacks of algorithms that map one heterogeneous network into multiple homogeneous networks, represents the nodes of a heterogeneous network as vectors, proposes a vector-based labeling model, applies stochastic gradient descent to solve the model, and analyzes the complexity of the algorithm. Experimental results show that the proposed node classification model for heterogeneous networks is more accurate than both the homogeneous-mapping model and the unsupervised latent space model.
  • HA Lin,MA Yongtao,LIU Kaihua,HUANG Jianyao
    Computer Engineering. 2015, 41(7): 138-141. https://doi.org/10.3969/j.issn.1000-3428.2015.07.027
    Cognitive Radio (CR) finds spectrum holes in a specific frequency range by sensing, learning from and adapting to the surrounding electromagnetic environment, becoming familiar with the characteristics of the wireless signals and using these results to communicate over idle spectrum without interfering with authorized users. The existing literature contains few studies that measure the reliability of spectrum. Therefore, on the basis of the mature and widely used energy detection method, this paper proposes an analysis method for the reliability of a frequency channel that jointly considers availability and stability as the two factors measuring reliability, and validates it by simulation. The results show that under low Signal-to-Noise Ratio (SNR) conditions, the number of test analyses can be increased appropriately to improve the accuracy of the spectrum analysis. The method enriches the features used to describe spectrum resources, can supplement spectrum analysis to make its content more comprehensive and its results more reliable, and contributes to improving spectrum utilization and communication quality.
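    The underlying energy detection step is simple: average the signal power over a window of samples and compare it with a decision threshold. A minimal sketch, with the threshold treated as a given parameter (in practice it is set from the noise floor and target false-alarm rate):

```python
def energy_detect(samples, threshold):
    """Classic energy detector: the channel is declared occupied when the
    average power over the N observed samples exceeds the threshold.
    Returns (occupied, measured_energy)."""
    energy = sum(s * s for s in samples) / len(samples)
    return energy > threshold, energy
```

    Repeating the test over many windows, as the paper suggests for low-SNR conditions, averages out noise fluctuations and improves the reliability estimate of the channel.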
  • LIU Wenfeng
    Computer Engineering. 2015, 41(7): 142-148,152. https://doi.org/10.3969/j.issn.1000-3428.2015.07.028
    Most existing history-based routing protocols for Delay Tolerant Networks (DTNs) make message forwarding decisions according to coarse-grained encounter information. However, such coarse-grained information cannot precisely express the contact patterns between nodes in the network, leading to inaccurate forwarding decisions. This paper presents FG-PRoPHET, a fine-grained probabilistic routing protocol using history of encounters and transitivity. It uses a slotted sliding-window mechanism to maintain fine-grained real-time statistics, controls the granularity of the historical data by adjusting the window size, and thereby describes the contact patterns between network nodes more precisely. From the fine-grained contact statistics and the overall duration of all contact events occurring within the sliding window of an encountering pair of nodes, it computes the contact probability of the nodes. Experimental results show that, compared with existing probabilistic routing protocols, FG-PRoPHET significantly improves the message delivery rate and the utilization of network caching resources with lower communication overhead.
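    The slotted sliding window can be sketched as below: per-slot contact durations are kept in a fixed-length window, and the contact probability of a node pair is the fraction of the window spent in contact. Class and method names are illustrative, not the paper's.

```python
from collections import deque

class SlottedContactStats:
    """Slotted sliding window of per-slot contact durations (seconds).
    Contact probability = contact time within the window / window span;
    shrinking slot_len or num_slots makes the statistic finer-grained."""
    def __init__(self, num_slots, slot_len):
        self.slot_len = slot_len
        self.slots = deque([0.0] * num_slots, maxlen=num_slots)

    def end_slot(self, contact_time):
        # record the contact time observed during the slot that just ended;
        # the deque's maxlen silently drops the oldest slot
        self.slots.append(contact_time)

    def probability(self):
        return sum(self.slots) / (len(self.slots) * self.slot_len)
```

    Old history ages out automatically as new slots are appended, so the estimate tracks the current contact pattern rather than the lifetime average.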
  • ZHANG Qiuming
    Computer Engineering. 2015, 41(7): 149-152. https://doi.org/10.3969/j.issn.1000-3428.2015.07.029
    Evaluating and predicting the calling behavior of users in a mobile phone communication network enables mobile network operators to understand consumer behavior patterns effectively. This paper proposes a new link prediction model for mobile phone communication networks. Taking common neighbors as the basic similarity metric between users and accounting for differences in key calling characteristics such as call duration and call duration ratio, a new empirical node similarity formula is proposed, and its three key parameters are obtained from part of a real mobile phone communication dataset. The new node similarity metric is applied to link prediction on a dataset generated by 50 000 mobile phone users of a communication operator. Performance analysis shows that the proposed node similarity model can accurately capture the consuming behavior patterns of mobile phone users.
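    A similarity of this family, common neighbors blended with call-duration evidence, can be sketched as follows. The blend below is hypothetical: the paper's empirical formula and its three fitted parameters are not reproduced, so `alpha` and the duration term here are illustrative assumptions only.

```python
def call_similarity(graph, duration, u, v, alpha=0.5):
    """Common-neighbor similarity between users u and v, weighted by call
    durations. graph[x] is the set of contacts of user x;
    duration[(x, w)] is the total call time (seconds) between x and w."""
    common = graph[u] & graph[v]
    if not common:
        return 0.0
    cn = len(common)
    # shared talk time: for each common contact, count only the duration
    # both endpoints actually spent with that contact
    dur = sum(min(duration.get((u, w), 0), duration.get((v, w), 0))
              for w in common)
    return alpha * cn + (1 - alpha) * dur / (1 + dur)
```

    Pairs with many common contacts and substantial shared talk time score highest and are the most likely future links.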
  • JI Shilong,GUO Hui,MA Changdong,ZHAO Shangqing,CHEN Pengpeng
    Computer Engineering. 2015, 41(7): 153-156. https://doi.org/10.3969/j.issn.1000-3428.2015.07.030
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    A relay selection communication system with direct transmission based on the Decode-and-Forward(DF) protocol and a best relay selection strategy is proposed, and its performance is analyzed in a spectrum sharing environment. In this dual-hop cognitive cooperative communication system, all terminals operate in half-duplex mode, the source and all relay nodes are equipped with a single antenna, and Selective Combining(SC) is adopted at the destination to combine signals from the direct and relaying links so as to improve system performance. Both exact and high Signal-to-Noise Ratio(SNR) asymptotic closed-form expressions for the Outage Probability(OP) are derived over non-identical Rayleigh fading channels, and the analysis is corroborated via Monte Carlo simulations. The results show that the derived expressions can be used to evaluate the impact of key system parameters on end-to-end performance, and the high-SNR asymptotics further confirm the correctness and validity of the method.
  • ZHANG Yulei,LI Chenyi,ZHOU Dongrui,WANG Caifen
    Computer Engineering. 2015, 41(7): 157-162. https://doi.org/10.3969/j.issn.1000-3428.2015.07.031
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    When a Security Mediator(SEM) is used to realize revocation of certificateless signatures, the SEM stores a large amount of users' secret information, which reduces the security and reliability of the certificateless signature scheme. To overcome this shortcoming, this paper proposes an efficient Revocable Certificateless Signature(RCLS) scheme based on the idea of a time update key. In the random oracle model, under the Computational Diffie-Hellman(CDH) assumption, the scheme is proved secure against three types of attacks: public key replacement attacks by users, attacks by the Key Generation Center(KGC), and attacks by revoked users. Analysis results show that the scheme has high computational efficiency, requiring only three bilinear pairing computations.
  • JIAO Jinping,LIU Guoyan
    Computer Engineering. 2015, 41(7): 163-170,176. https://doi.org/10.3969/j.issn.1000-3428.2015.07.032
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    In Attribute-based Encryption(ABE), users may share some common attributes, so a malicious user may leak the decryption ability corresponding to those attributes. Because the ABE authority can use the master secret key to generate any secret key, a traced user may argue that it was the authority that generated the leaked secret key. This paper presents an accountable and traceable ABE scheme: a traced user can run the accountability algorithm to check whether the malicious act came from the authority or from the user. In this scheme, the ABE authority and the identity management authority are mutually independent and cannot decrypt ciphertexts on their own. Analysis shows that the scheme not only realizes user traceability but also resolves the trust problem of the ABE authority through accountability.
  • HE Junjie,ZHANG Xuefeng,QI Chuanda
    Computer Engineering. 2015, 41(7): 171-176. https://doi.org/10.3969/j.issn.1000-3428.2015.07.033
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    In order to simplify certificate management in the traditional public key cryptosystem and eliminate the security vulnerability brought by key escrow in the identity-based public key cryptosystem, a new certificateless blind signature scheme without pairings is proposed. The scheme is proved existentially unforgeable against adaptive chosen message and identity attacks in the random oracle model, with security reduced to the hardness of the discrete logarithm problem. Analysis results show that, compared with the signing and verification algorithms of many other certificateless blind signature schemes, the proposed scheme has an obvious advantage in computational efficiency because it requires neither time-consuming bilinear pairing operations nor the inefficient MapToPoint hash function.
  • CAO Xiaomei,LI Jiageng,YIN Ying
    Computer Engineering. 2015, 41(7): 177-183. https://doi.org/10.3969/j.issn.1000-3428.2015.07.034
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    To make Ad Hoc network wormhole attack detection predictive, this paper presents a wormhole detection algorithm based on fuzzy prediction. When a node is attacked by a wormhole, its number of neighbor nodes increases abnormally. The algorithm has each node send probe packets to determine the number of its surrounding neighbors and stores the counts in a sliding window. From these preliminary neighbor statistics, fuzzy predictive theory is used to predict an upper threshold before the node moves to its next position. After the node actually moves, it counts its neighbors again and compares the count with the threshold; when the neighbor count exceeds the threshold, the node is considered to be under wormhole attack. Simulation results show that the detection rate is higher than that of the Statistical Wormhole Apprehension using Neighbors(SWAN) algorithm.
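    The sliding-window threshold test above can be sketched as follows. The paper derives the threshold with fuzzy predictive theory; as a simple, plainly-named stand-in this sketch predicts the upper bound as mean + k standard deviations over the window, which preserves the same detection pattern (an abnormal jump in neighbour count trips the alarm).

```python
from collections import deque
from statistics import mean, pstdev

class NeighborMonitor:
    """Sliding window of neighbour counts with an upper-bound alarm."""

    def __init__(self, window=8, k=3.0):
        self.counts = deque(maxlen=window)  # recent neighbour counts
        self.k = k

    def predict_upper(self):
        # Stand-in for the fuzzy-predicted threshold.
        if len(self.counts) < 2:
            return float("inf")
        return mean(self.counts) + self.k * pstdev(self.counts)

    def observe(self, count):
        # True -> abnormal jump in neighbours: possible wormhole.
        suspicious = count > self.predict_upper()
        self.counts.append(count)
        return suspicious
```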
  • JIN Ge,XUE Zhi,QI Kaiyue
    Computer Engineering. 2015, 41(7): 184-189. https://doi.org/10.3969/j.issn.1000-3428.2015.07.035
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    A Master Boot Record(MBR) Rootkit is a new kind of Rootkit that is much harder to detect. This paper analyzes the key techniques and workflow of MBR Rootkits and develops cooperative concealment models: two-level cooperative concealment and multiple-level cooperative concealment are proposed and applied to a formal description of the concealing mechanism of MBR Rootkits. A new static detection method is presented, and a corresponding MBR matching algorithm is designed and implemented based on analysis of the MBR data format; the static method focuses on searching for MBR backup data in hidden disk areas. Experimental results show that the method achieves high detection accuracy, and the recovered MBR backup data can be used to restore the system.
  • WU Shaohua,SUN Dan,HU Yong
    Computer Engineering. 2015, 41(7): 190-193,198. https://doi.org/10.3969/j.issn.1000-3428.2015.07.036
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    Current Web Server Identification(WSI) methods suffer from low identification rates and strong dependence on fingerprints. Based on the different responses of Web servers to 15 types of abnormal HTTP requests, a naive Bayes classification model identifies the Web server type by the maximum posterior probability over status-code attributes, and the model is applied again with selected version features to identify the server version. A system based on this method, the WSI system, is designed and implemented. Experimental results show that, compared with three existing identification tools, HMAP, Httprecon and Httprint, the system achieves higher accuracy, recall and F-measure, and its recognition performance improves as the number of training samples increases.
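    The maximum-posterior classification step can be sketched in a few lines. The 15 abnormal probes and real status-code fingerprints come from the paper; the training data below is illustrative, and the add-one smoothing with a simplified denominator is an assumption of the sketch.

```python
import math
from collections import defaultdict

def train(samples):
    """samples: list of (server_label, [status code per probe])."""
    priors = defaultdict(int)
    counts = defaultdict(lambda: defaultdict(int))
    for label, codes in samples:
        priors[label] += 1
        for i, code in enumerate(codes):
            counts[(label, i)][code] += 1  # per-probe code counts
    return priors, counts

def classify(priors, counts, codes):
    """Pick the label maximising log prior + log likelihoods."""
    total = sum(priors.values())
    best, best_lp = None, -math.inf
    for label, n in priors.items():
        lp = math.log(n / total)
        for i, code in enumerate(codes):
            seen = counts[(label, i)]
            # Add-one smoothing so unseen codes keep nonzero mass.
            lp += math.log((seen.get(code, 0) + 1) /
                           (sum(seen.values()) + 2))
        if lp > best_lp:
            best, best_lp = label, lp
    return best
```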
  • CHEN Chen,ZHENG Gang,DAI Min
    Computer Engineering. 2015, 41(7): 194-198. https://doi.org/10.3969/j.issn.1000-3428.2015.07.037
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    In order to explore the contribution of each feature in the initial feature set to identification, guided by recognition accuracy, this paper uses stepwise discriminant analysis to obtain and rank the contribution of each feature and to select a key feature subset for identification. Experiments are carried out on the Physikalisch-Technische Bundesanstalt(PTB) database and a lab-collected data set, from which 9 and 17 key features are selected respectively. Experimental results show that the two feature subsets overlap by 66.7% and the fiducial points they depend on overlap by 63.6%, so the initial Electrocardiogram(ECG) feature set contains features common to distinguishing individuals to a certain extent. Using the two key feature subsets, the recognition rates on the two data sets are 99.7% and 94.8% respectively.
  • ZHANG Buzhong,CHENG Yusheng,WANG Yibin
    Computer Engineering. 2015, 41(7): 199-203. https://doi.org/10.3969/j.issn.1000-3428.2015.07.038
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    As the number of queens grows large, the running time of the Genetic Algorithm(GA) becomes intolerable. To reduce the run time, a parallel GA is applied to the N-queens problem, building on an existing solution, with dynamic programming used in the local search. Based on the Simple Genetic Algorithm(SGA), a Coarse-grained Parallel Genetic Algorithm(CPGA) for the N-queens problem is implemented on a multi-core platform; unlike the traditional CPGA, population migration and message communication are avoided. However, after many generations the sub-populations become increasingly similar and the iteration slows. This paper therefore proposes a new Operator-oriented Parallel Genetic Algorithm(OOPGA) and applies it to the N-queens problem as well. Experimental results show that OOPGA outperforms CPGA in running time and speedup.
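    The core of any GA for N-queens is the fitness function counting attacking pairs. The sketch below pairs it with a deliberately minimal mutation-only serial loop; the paper's coarse-grained and operator-oriented variants distribute such a loop across cores, which is not reproduced here.

```python
import random

def conflicts(board):
    """Number of attacking queen pairs; board[i] is the row of the
    queen in column i, so rows and diagonals must be checked."""
    n = len(board)
    return sum(
        1
        for i in range(n)
        for j in range(i + 1, n)
        if board[i] == board[j] or abs(board[i] - board[j]) == j - i
    )

def evolve(n, generations=5000, seed=0):
    """Minimal mutation-only evolution: swap two columns, keep the
    child if it is no worse (illustrative serial core)."""
    rng = random.Random(seed)
    best = list(range(n))
    rng.shuffle(best)
    for _ in range(generations):
        child = best[:]
        i, j = rng.randrange(n), rng.randrange(n)
        child[i], child[j] = child[j], child[i]
        if conflicts(child) <= conflicts(best):
            best = child
        if conflicts(best) == 0:
            break
    return best
```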
  • LI Yuming,QIU Weidong,XU Saisai,GUO Yingkai
    Computer Engineering. 2015, 41(7): 204-209. https://doi.org/10.3969/j.issn.1000-3428.2015.07.039
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    Mining uncertain data has recently become a hotspot in data mining, yet few algorithms mine maximal frequent itemsets on such data. Based on the features of uncertain data, this paper proposes a new algorithm, U-GenMax, which improves and extends the maximal pattern mining algorithm GenMax from deterministic data to uncertain data. The algorithm extends the Tid set by adding a probability domain to the id domain and realizes conversion to vertical data format. When judging frequent itemsets, it adds two prior checks that prune infrequent itemsets, greatly lowering the amount of computation compared with calculating the confidence level directly. It also proposes a new multistep rollback pruning strategy, avoiding GenMax's limitation of rolling back only one step at a time. Experimental results show that U-GenMax performs well and suits sparse databases under all circumstances as well as dense databases under high support.
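    The basic quantity bounded by such prior pruning checks can be illustrated with expected support. Under the usual attribute-uncertainty model with independent item probabilities, the expected support of an itemset is the sum over transactions of the product of its items' probabilities; this sketch shows that simpler related measure, not the paper's exact confidence-level computation.

```python
def expected_support(db, itemset):
    """Expected support of itemset in an uncertain database.

    Each transaction maps item -> existence probability; assuming
    independence, the itemset's probability in a transaction is the
    product of its items' probabilities.
    """
    total = 0.0
    for tx in db:
        p = 1.0
        for item in itemset:
            p *= tx.get(item, 0.0)  # 0 if the item is absent
        total += p
    return total
```

    A cheap upper bound such as this lets infrequent itemsets be pruned before any exact probabilistic computation, which is the point of the prior checks.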
  • DUAN Feiteng,CUI Baotong
    Computer Engineering. 2015, 41(7): 210-214. https://doi.org/10.3969/j.issn.1000-3428.2015.07.040
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    This paper studies the global exponential stability of memristor-based recurrent neural networks with time delays. By employing homeomorphism mapping, Lyapunov functionals and differential theory, the existence and uniqueness of the equilibrium point of memristor-based neural networks are proved, the equilibrium is shown to be globally asymptotically stable, and some M-matrix-based conditions are obtained. The conditions improve previous M-matrix-based criteria and are robust to different time delays and activation functions. Moreover, the conditions are easy to verify from the physical parameters of the system itself. Numerical analysis and simulation results show the effectiveness of the new criterion.
  • LI Jinzhong,YANG Wei,XIA Jiewu,ZENG Xiaohui,SUN Lingyu
    Computer Engineering. 2015, 41(7): 215-218. https://doi.org/10.3969/j.issn.1000-3428.2015.07.041
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    Because information retrieval evaluation criteria are non-continuous and non-differentiable, traditional learning-to-rank methods cannot directly optimize ranking evaluation criteria on training data. This paper transforms the learning-to-rank problem into a linear combination optimization problem and proposes a new learning-to-rank method. The method employs Hooke &amp; Jeeves pattern search, which alternates exploratory search and pattern moves, to accelerate the convergence of learning to rank. Experimental results on 10 learning-to-rank datasets show that the method has lower time overhead and obtains better ranking quality than learning to rank based on coordinate ascent.
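    The alternation of exploratory search and pattern moves works as in the generic sketch below; the paper applies the same scheme to the weights of a linear ranking function, with an IR metric as the objective (here replaced by an arbitrary function to minimise).

```python
def hooke_jeeves(f, x0, step=0.5, shrink=0.5, tol=1e-6, max_iter=1000):
    """Minimise f by Hooke & Jeeves pattern search."""

    def explore(base, h):
        # Try +/- h along each coordinate, keeping improvements.
        x = list(base)
        for i in range(len(x)):
            for d in (h, -h):
                trial = x[:]
                trial[i] += d
                if f(trial) < f(x):
                    x = trial
                    break
        return x

    x = list(x0)
    while step > tol and max_iter > 0:
        max_iter -= 1
        nx = explore(x, step)
        if f(nx) < f(x):
            # Pattern move: jump further along the improving direction,
            # then explore around the landing point.
            pattern = [2 * b - a for a, b in zip(x, nx)]
            px = explore(pattern, step)
            x = px if f(px) < f(nx) else nx
        else:
            step *= shrink  # no improvement: refine the mesh
    return x
```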
  • ZHANG Jiaming,WANG Bo,TANG Haohao,LI Tiancai
    Computer Engineering. 2015, 41(7): 219-223,229. https://doi.org/10.3969/j.issn.1000-3428.2015.07.042
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    Sentiment orientation analysis of microblogs has become a research hotspot in current academic circles. Unsupervised methods based on traditional topic models fail to handle the feature sparsity of microblog corpora, which results in poor performance in microblog sentiment orientation analysis. To solve this problem, this paper presents an unsupervised method for microblog sentiment orientation analysis based on the Biterm Topic Model(BTM). The corpus is preprocessed and co-occurring word pairs are counted; BTM is used to mine the implicit topics in the documents; a sentiment dictionary is used to calculate the sentiment distribution of each topic; and the sentiment orientation of the whole microblog is obtained from these topic sentiment distributions. Experimental results on the NLP&amp;CC2012 corpus show that the proposed method identifies microblog sentiment orientation more effectively, with an average F1-measure 15% higher than that of traditional methods.
  • PENG Xindong,YANG Yong,SONG Juanping,JIANG Yun
    Computer Engineering. 2015, 41(7): 224-229. https://doi.org/10.3969/j.issn.1000-3428.2015.07.043
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    Intuitionistic fuzzy soft sets cannot deal with the situation where the sum of a parameter's membership and non-membership degrees is bigger than 1, which limits the decision process and narrows the range of application. To solve this problem, this paper combines the characteristics of Pythagorean fuzzy sets with the parameterization of soft sets and constructs Pythagorean fuzzy soft sets. Operations such as complement, union, intersection, AND, OR, addition, multiplication, necessity and possibility are defined, some corresponding results are presented, and the De Morgan's laws of Pythagorean fuzzy soft sets are discussed in detail. A decision-making algorithm based on a Pythagorean fuzzy aggregation operator is proposed; its computational complexity is analyzed and it is applied to stock investment. Experimental results show that the algorithm is effective.
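    A few of the defined operations can be sketched on Pythagorean fuzzy numbers, i.e. pairs (mu, nu) with mu^2 + nu^2 <= 1. The complement swaps the degrees, union takes (max mu, min nu) elementwise, and the standard score function mu^2 - nu^2 ranks alternatives. The sketch assumes both soft sets share the same parameter and object sets, a simplification of the general definitions.

```python
def pfn_complement(p):
    """Complement of a Pythagorean fuzzy number (mu, nu):
    swap membership and non-membership degrees."""
    mu, nu = p
    return (nu, mu)

def pfs_union(f, g):
    """Union of two Pythagorean fuzzy soft sets given as
    parameter -> {object: (mu, nu)} maps over shared domains."""
    return {
        e: {
            x: (max(f[e][x][0], g[e][x][0]),
                min(f[e][x][1], g[e][x][1]))
            for x in f[e]
        }
        for e in f
    }

def score(p):
    # Score function used to rank alternatives in decision making.
    mu, nu = p
    return mu * mu - nu * nu
```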
  • ZHENG Lianbin,YANG Lianhe
    Computer Engineering. 2015, 41(7): 230-233. https://doi.org/10.3969/j.issn.1000-3428.2015.07.044
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    Based on an analysis of the law of gravity and Newton's second law, and considering the relationship between displacement and acceleration, a new displacement-time function is derived. Building on this function, a novel heuristic optimization algorithm named the Gravitation Move Algorithm(GMA) is proposed. In this algorithm, search agents are distributed randomly in the search space, and in each search phase every agent moves according to the proposed displacement-time function. As the iterations proceed, all agents move toward the best solution found so far, until they converge on the global best solution. Experimental results on various benchmark functions show that GMA performs better than PSO and is stable.
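    The overall structure can be sketched as below. The paper's actual displacement-time function is not reproduced here; as a plausible stand-in, each agent is displaced by s = a/2 (i.e. a*t^2/2 with t = 1) under an acceleration proportional to its distance from the current best, plus a small Gaussian jitter so the swarm keeps exploring. All parameter values are illustrative.

```python
import random

def gma(f, dim, bounds, agents=20, iters=200, g=1.0, seed=0):
    """Gravitation-style moves of a random population toward the
    best-known solution (hypothetical displacement function)."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)]
           for _ in range(agents)]
    best = min(pop, key=f)[:]
    for _ in range(iters):
        for x in pop:
            for d in range(dim):
                a = g * (best[d] - x[d])      # pull toward the best
                x[d] += 0.5 * a + rng.gauss(0.0, 0.01 * (hi - lo))
                x[d] = max(lo, min(hi, x[d])) # stay inside bounds
        cand = min(pop, key=f)
        if f(cand) < f(best):
            best = cand[:]
    return best
```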
  • WANG Xinying,GU Fangming,PANG Huanli,WANG Xiaohu
    Computer Engineering. 2015, 41(7): 234-238,243. https://doi.org/10.3969/j.issn.1000-3428.2015.07.045
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    For the problem that content-based retrieval methods cannot fully express semantic information in 3D model retrieval, a 3D model retrieval method incorporating semantic classification information is proposed. Artificial classification information, limited semantic annotation information and other sources are used to build a heterogeneous semantic information network, which is converted into heterogeneous semantic features for 3D models. On this basis, a subject classification method incorporating model semantic features is used and applied to model retrieval. Experimental results show that, compared with the conventional content-based 3D model retrieval method, this method improves the accuracy of 3D model retrieval.
  • ZI Lingling,CONG Xin
    Computer Engineering. 2015, 41(7): 239-243. https://doi.org/10.3969/j.issn.1000-3428.2015.07.046
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    Aiming at the problem of blurred details in image sequence interpolation methods, this paper proposes a region-guided interpolation algorithm. An attention-region calculation method is proposed according to the spatial and temporal characteristics of image sequences. A region interpolation mode is constructed that maintains the details of the original image sequence, achieving high-quality interpolation of the obtained attention regions. A detail enhancement method based on the guided filter is proposed to further improve the definition of the attention regions, so that high-resolution image sequences in accord with visual perception are acquired. Experimental results show that, for different interpolation factors, the proposed algorithm achieves better visual effects and higher objective index values than the image signature detection algorithm and the global contrast detection algorithm.
  • WANG Bin,WENG Zhengkui,WANG Kun,LIU Hui
    Computer Engineering. 2015, 41(7): 244-249. https://doi.org/10.3969/j.issn.1000-3428.2015.07.047
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    In order to realize the detection and tracking of moving targets, a real-time eye feature point tracking method is proposed that combines the Viola-Jones detection algorithm and sub-pixel corner extraction with the Lucas-Kanade optical flow algorithm. The Viola-Jones algorithm is used to detect the rough eye region, and a Harris operator with a filtering mechanism is adopted to extract accurate sub-pixel eye feature points in the target area, which reduces locating time while retaining locating accuracy. The pyramid-based Lucas-Kanade optical flow algorithm is introduced to capture the moving feature points. To account for the tracking of large-scale, high-speed objects, a switchable location window method is presented and used in the last step. Experimental results show that this method locates and tracks eye feature points of fast-moving, large-scale targets with high precision, good real-time performance and strong robustness.
  • TANG Lin,LI Min,LIU Bo
    Computer Engineering. 2015, 41(7): 250-255,260. https://doi.org/10.3969/j.issn.1000-3428.2015.07.048
    Abstract ( ) Download PDF ( )   Knowledge map   Save

    For the large intra-class variance and inadequate real-time performance caused by differences in imaging scale in nighttime pedestrian detection, this paper designs a rapid nighttime pedestrian identification scheme based on entropy weighting and head calibration, optimized with a Fast Classification Support Vector Machine(FCSVM) under statistical learning principles. The scheme uses entropy weighting to improve the gradient histogram feature, introduces a three-branch-structure SVM to further identify the target, and uses the FCSVM to reduce the required computation and ensure real-time performance. Analyzing and assessing falsely detected targets through the head calibration method further improves the accuracy of image matching. Experimental results show that the scheme distinguishes far-infrared pedestrian targets effectively in night environments and, while fully ensuring real-time performance, achieves good recognition in urban, suburban and other application environments.

  • CHENG Tingting,GUO Lijun,HUANG Yuanjie
    Computer Engineering. 2015, 41(7): 261-268. https://doi.org/10.3969/j.issn.1000-3428.2015.07.050
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    This paper presents an automatic video segmentation method based on a robust higher-order Conditional Random Field(CRF), which alleviates the problems that interactive segmentation is time-consuming and labor-intensive, that unsupervised segmentation produces oversegmentation, and that simple pairwise-pixel segmentation cannot obtain accurate boundaries. It uses saliency-based segmentation of the first video frame as initial seeds instead of user labeling. A Gaussian mixture model and a strong JointBoost classifier are learned on color, texture and shape features, and combining both in the CRF improves segmentation accuracy. A higher-order potential based on supervoxels is added to overcome the oversmoothing of pairwise-pixel segmentation. Experimental results demonstrate that the method is more effective and efficient than state-of-the-art methods.
  • ZENG Ye,CAI Biye,SONG Yun,LI Xueyu
    Computer Engineering. 2015, 41(7): 269-273. https://doi.org/10.3969/j.issn.1000-3428.2015.07.051
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    To address the problem that the Hough transform segments the iris inaccurately when iris images are corrupted by eyelids, eyelashes and deformation, an improved iris segmentation method based on image alignment is presented. The proposed method uses thresholding and the Hough transform to locate the center of the pupil and applies the Harris corner detection algorithm to estimate the left corner of the eye. It then employs robust alignment by sparse and low-rank decomposition to process the images labeled with the two detected points, so that they acquire the low-rank property. Edge detection and the Hough transform are applied to the processed images to segment the iris accurately. Experimental results show that, compared with the Hough transform, this method effectively removes eyelid and eyelash occlusion and improves the accuracy of iris localization.
  • WEI Changbao,YAO Ruxian
    Computer Engineering. 2015, 41(7): 274-279,284. https://doi.org/10.3969/j.issn.1000-3428.2015.07.052
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    The Graph Partitioning Change Detection(GPCD) problem is important in that it leads to the discovery of important events that change network communities. Addressing the shortcoming that existing detection algorithms do not consider dynamic graph partitioning structures, this paper employs probabilistic trees to represent probabilistic models of graph partitioning structures, reduces GPCD to detecting changes of trees on the basis of the Minimum Description Length(MDL) principle, and proposes the Tree algorithm to solve the GPCD problem. Simulation results show that the algorithm achieves a significantly lower False Alarm Rate(FAR) for change detection than the baseline method GraphScope, and detects changes more accurately than GraphScope.
  • YANG Jie,ZHAO Min,LIN Liang,SU Hao
    Computer Engineering. 2015, 41(7): 280-284. https://doi.org/10.3969/j.issn.1000-3428.2015.07.053
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    Jet trajectory recognition identifies the running track of a fire-fighting jet by image processing and is a key link in an intelligent fire monitor system. However, the fire monitor's jet grows wider and branches, and in use the recognition is disturbed by highly lit and shaking backgrounds. To solve these problems, a new Multi-trajectory Vector Search Method(MTVSM) is proposed, together with its search mode, the setting of the search start point and search direction, the multi-trajectory vector search principle, the condition for a successful trajectory-point search and the condition for ending the search. The identification of the trajectory end, anti-interference processing and the subsequent image processing methods are also given. Experimental results show that the fire-fighting jet trajectory is identified efficiently under highly lit and shaking backgrounds, the recognition is less affected when the background changes, and the result is better than background subtraction and the one-way search method.
  • YI Qingming,ZENG Jielin,SHI Min
    Computer Engineering. 2015, 41(7): 285-288,293. https://doi.org/10.3969/j.issn.1000-3428.2015.07.054
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    To enhance the convergence speed of the existing Variable Step-size Frequency-domain Block Least Mean Square(VSFBLMS) algorithm, an Accelerating-Vector-based VSFBLMS(AV-VSFBLMS) algorithm is proposed. The algorithm judges the current convergence stage from the basic step parameters computed by VSFBLMS, and according to the convergence status selects a coefficient-updating formula with a larger iteration number in the early stage and a smaller one in the later stage. This improves the convergence rate early on while ensuring low misalignment later. Experimental results on an adaptive noise cancelling model show that the proposed algorithm outperforms other VSFBLMS algorithms in convergence speed with lower misalignment.
  • CHEN Tingwei,JIANG Yanan
    Computer Engineering. 2015, 41(7): 289-293. https://doi.org/10.3969/j.issn.1000-3428.2015.07.055
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    Virtual Machines(VM) are migrated transparently by pre-copy, which transfers memory pages iteratively, so some memory pages are copied again and again during the iterative process. Although probability prediction can reduce the number of transferred memory pages, its prediction time is still too long. For this problem, Fast Prediction Pre-copy(FPP) is proposed: it calculates the probability that pages will be modified and postpones their transmission accordingly, optimizing the live migration of the VM. Applying the method to Xen, experimental results show that it shortens the total migration time, reduces the number of transferred memory pages, and lowers the migration overhead.
  • WANG Weihong,LI Jun
    Computer Engineering. 2015, 41(7): 294-298,304. https://doi.org/10.3969/j.issn.1000-3428.2015.07.056
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    To address the low computational efficiency of the traditional algorithm in computing the similarity of two strings, an improved edit distance algorithm is proposed. It first finds the longest common prefix and the longest common suffix of the two strings, then computes the edit distance between the remainders of the two strings with the traditional algorithm. Proof by contradiction is used to show that this edit distance equals the one computed by the traditional algorithm. On this basis, the advantages of the improved algorithm are studied and it is applied to Web tamper detection. Experimental results show that, compared with the traditional algorithm, the improved edit distance algorithm computes the similarity between pages at the same URL more efficiently.
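    The prefix/suffix-stripping improvement can be sketched directly: strip the longest common prefix and suffix, then run the classic dynamic programme on the (usually much shorter) remainders, which gives the same distance as running it on the full strings.

```python
def edit_distance(a, b):
    """Levenshtein distance with common prefix/suffix stripping."""
    # Strip the longest common prefix.
    i = 0
    while i < len(a) and i < len(b) and a[i] == b[i]:
        i += 1
    a, b = a[i:], b[i:]
    # Strip the longest common suffix of the remainders.
    j = 0
    while j < len(a) and j < len(b) and a[-1 - j] == b[-1 - j]:
        j += 1
    if j:
        a, b = a[:-j], b[:-j]
    # Classic row-by-row dynamic programme on the remainders.
    prev = list(range(len(b) + 1))
    for r, ca in enumerate(a, 1):
        cur = [r]
        for c, cb in enumerate(b, 1):
            cur.append(min(prev[c] + 1,        # deletion
                           cur[-1] + 1,        # insertion
                           prev[c - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]
```

    For near-identical strings, such as two versions of the same Web page, almost everything is stripped before the quadratic DP runs, which is where the speedup comes from.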
  • LUO Youping,ZHOU Zhaomin,LI Lijuan,ZHANG Heng,QIN Damiao
    Computer Engineering. 2015, 41(7): 299-304. https://doi.org/10.3969/j.issn.1000-3428.2015.07.057
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    In order to excavate the interpersonal relationships in online social networks, this paper explores the complex rules of social networks. Based on the dynamics of human behavior, it samples the follower relations in the Sina micro-blog network and picks out the sub-networks among several entertainment celebrities for analysis and comparison. It specifically analyzes the networks' degree correlations, reciprocity, and the results under different network centrality indicators. The results illustrate that the networks follow a power-law distribution and possess the small-world characteristic. The conclusions provide a good empirical basis for modeling and mastering the laws of social networks, and a theoretical basis for the analysis, monitoring and control of public opinion on social networks.
  • CHEN Fenglin,LIU Yongbin,FANG Jian,XU Qiang
    Computer Engineering. 2015, 41(7): 305-309. https://doi.org/10.3969/j.issn.1000-3428.2015.07.058
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    According to the different frequency characteristics of vibration signals under different conditions, a feature extraction method for machinery fault diagnosis is proposed based on Empirical Mode Decomposition(EMD) and Joint Approximate Diagonalization of Eigen-matrices(JADE). Vibration signals are decomposed by EMD into stationary Intrinsic Mode Functions(IMFs) of different frequency components. The correlation coefficients between the IMFs and the original spectrum are calculated to construct a spectrum-correlation feature matrix, whose dimension is then reduced using JADE. Simulated experimental signals verify the effectiveness of the proposed method, and the extracted features are applied to machinery fault diagnosis. The features extracted from bearing signals under four conditions are classified by a Support Vector Machine(SVM), with a classification accuracy above 95%. The results show that the features extracted by the presented method can effectively characterize machine conditions.
  • WU Jiyun,CHEN Zhide,WANG Lei,WANG Meng
    Computer Engineering. 2015, 41(7): 310-316. https://doi.org/10.3969/j.issn.1000-3428.2015.07.059
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    There are millions of available keywords for each advertiser in keyword auctions, and setting a reasonable bid for the selected keywords under constraints such as a limited budget is the advertiser's hardest task. As many advertisers find it difficult to select keywords and set prices, a novel advertiser-oriented auction strategy model is proposed for these problems. The strategy comprises a keyword selection strategy and a bidding strategy. The keyword selection strategy presents a keyword correlation measure based on the Term Frequency-Inverse Document Frequency(TFIDF) algorithm; keywords selected this way not only improve the correlation with the promoted website and increase the conversion rate, but also avoid the extra competition cost caused by overusing common keywords. The bidding strategy uses an improved Particle Swarm Optimization(PSO) algorithm to properly adjust the bid of each keyword under the constraints so as to increase the advertiser's profit. Experimental results show that keywords selected through the auction strategy increase the conversion rate of the website and reduce the competition cost; moreover, its profit is higher than that of the traditional bidding method, rising continuously in the early-middle period and becoming stable in the late period.
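    The TFIDF-based correlation building block can be sketched as follows: TF-IDF vectors over a small corpus, with cosine similarity scoring how well a candidate keyword's context matches the promoted site. The paper's exact scoring and the PSO bidding step are not reproduced; this shows only the standard components they build on.

```python
import math
from collections import Counter

def tfidf(docs):
    """TF-IDF vectors: term frequency times log inverse document
    frequency, for a list of tokenised documents."""
    n = len(docs)
    df = Counter()
    for doc in docs:
        df.update(set(doc))  # document frequency per term
    vecs = []
    for doc in docs:
        tf = Counter(doc)
        vecs.append({t: (c / len(doc)) * math.log(n / df[t])
                     for t, c in tf.items()})
    return vecs

def cosine(u, v):
    """Cosine similarity between two sparse TF-IDF vectors."""
    dot = sum(u.get(t, 0.0) * w for t, w in v.items())
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0
```

    A candidate keyword whose surrounding text scores a higher cosine against the promoted website's text would be preferred by such a selection strategy.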
  • FU Hua,ZI Hai
    Computer Engineering. 2015, 41(7): 317-321. https://doi.org/10.3969/j.issn.1000-3428.2015.07.060
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    For the multifactor prediction of coal-mine gas emission quantity, the basic Glowworm Swarm Optimization(GSO) algorithm suffers from a slow convergence rate in the late stage and is prone to getting stuck in local optima. This paper introduces a luciferin factor to make the search step length self-adaptive. The proposed self-Adaptive Step Glowworm Swarm Optimization(ASGSO) algorithm is combined with a dynamic-feedback Elman Neural Network(ENN) to identify the nonlinear gas emission system. By globally searching the optimal network weights and thresholds in real time, the coupled algorithm establishes an ASGSO-ENN model to predict the absolute gas emission quantity. Experiments on the mine's historical monitoring data show a root mean square error of 0.103 4 and an average relative variance of 0.000 387. The proposed model is practically useful and outperforms other commonly used models in prediction accuracy and generalization ability.