
15 July 2019, Volume 45 Issue 7
    

  • JIAO Yiming, ZHOU Chuan, GUO Jian, CUI Yuwei
    Computer Engineering. 2019, 45(7): 1-5. https://doi.org/10.19678/j.issn.1000-3428.0051748
    With the development of computer technology and the continuous growth of user needs, the problem of multiple Directed Acyclic Graphs (DAGs) sharing the same group of heterogeneous computing resources has attracted wide attention. However, due to the complexity and variability of actual tasks, there are certain differences among the DAGs, which lead to fairness problems in multi-DAG scheduling strategies. Therefore, this paper proposes an improved heuristic fair scheduling algorithm, IFairness. In the phase of selecting the DAG to be scheduled, a new evaluation index, DAG completion degree, is adopted to replace the remaining Makespan in the original Fairness algorithm as the basis for DAG selection. In the phase of calculating the lag degree of each DAG, the principle of "looking forward one step" is adopted to solve the problem that some DAGs cannot be scheduled at the initial stage. Simulation results show that, compared with the original Fairness algorithm, the unfairness degree of the IFairness algorithm is reduced by 7.28% and the resource utilization rate is improved by 11.97%, which effectively improves the fairness and resource utilization of the scheduling algorithm.
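    The abstract above does not give the exact formula for the "DAG completion degree" index. The minimal Python sketch below only illustrates the selection step, under the assumption that the index is the fraction of a DAG's tasks that have already finished; the Dag class and numbers are illustrative, not the authors' implementation.

    from __future__ import annotations
    from dataclasses import dataclass

    @dataclass
    class Dag:
        name: str
        total_tasks: int
        finished_tasks: int = 0

        @property
        def completion_degree(self) -> float:
            # Assumed definition: share of tasks already completed.
            return self.finished_tasks / self.total_tasks

    def select_next_dag(dags: list[Dag]) -> Dag:
        # Fairness heuristic: favour the DAG that has progressed the least.
        return min(dags, key=lambda d: d.completion_degree)

    dags = [Dag("A", 10, 7), Dag("B", 8, 2), Dag("C", 5, 3)]
    print(select_next_dag(dags).name)  # -> "B", the least-complete DAG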
  • ZHANG Lu, ZHU Haiting
    Computer Engineering. 2019, 45(7): 6-12. https://doi.org/10.19678/j.issn.1000-3428.0052048
    To quickly process large-scale user data in real-world environments for e-commerce spammer group detection, a distributed detection algorithm for spammer groups is proposed. A candidate group extraction algorithm based on cosine pattern mining is designed, which measures the coupling between group members by cosine similarity, so as to extract candidate groups accurately and reduce the computational complexity of subsequent recognition. Combining group projection technology with the Spark computing framework, a distributed group extraction algorithm is proposed to further improve the speed of group detection. Results of experiments and case studies on real data sets show that the proposed algorithm guarantees detection accuracy while achieving high efficiency.
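    As a minimal sketch of the cosine-similarity coupling measure mentioned above (not the paper's implementation), assume each reviewer is represented by a rating vector over a common product set; the vectors below are made up.

    import numpy as np

    def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
        denom = np.linalg.norm(u) * np.linalg.norm(v)
        return float(u @ v / denom) if denom else 0.0

    # Two reviewers with near-identical rating patterns -> high coupling.
    r1 = np.array([5, 5, 0, 4, 0], dtype=float)
    r2 = np.array([5, 4, 0, 5, 0], dtype=float)
    print(round(cosine_similarity(r1, r2), 3))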
  • WANG Huijian, LIU Zheng, LI Yun, LI Tao
    Computer Engineering. 2019, 45(7): 13-19,25. https://doi.org/10.19678/j.issn.1000-3428.0052424
    For the prediction of time series data, most traditional methods predict one or more specific values by analyzing historical data, but the accuracy of such specific numerical predictions is low. Therefore, this paper proposes a new method for predicting short-term trends of time series. It discretizes the time series data, uses characters to represent the range of the data in each time period, and uses a Neural Network Language Model (NNLM) to predict the next character, which corresponds to the range of the next segment of data. Experimental results show that when the prediction result is divided into five intervals, the average prediction accuracy of the algorithm is 56.7%, indicating higher feasibility compared with support vector machines, recurrent neural networks, random forests and other algorithms. Moreover, because the character representation carries semantic information, the prediction results can reflect the trend of the data and the degree of change.
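    A minimal sketch of the discretization step described above, assuming five equal-width intervals labelled "a" to "e"; the NNLM that predicts the next character is not shown, and the series values are synthetic.

    import numpy as np

    def to_symbols(series: np.ndarray, n_bins: int = 5) -> str:
        edges = np.linspace(series.min(), series.max(), n_bins + 1)
        # np.digitize maps interior values to 1..n_bins-1; clip handles the maximum.
        bins = np.clip(np.digitize(series, edges[1:-1]), 0, n_bins - 1)
        return "".join(chr(ord("a") + b) for b in bins)

    series = np.array([1.0, 1.2, 2.5, 3.9, 4.8, 4.7, 3.1])
    print(to_symbols(series))  # -> "aabdeec"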
  • ZHAO Baoqi, LI Weidong, ZOU Jiaheng, LIN Tao, YAN Tian
    Computer Engineering. 2019, 45(7): 20-25. https://doi.org/10.19678/j.issn.1000-3428.0051077
    To simplify the offline data processing flow of the Jiangmen Underground Neutrino Observatory (JUNO) and reduce resource consumption, a general software system is proposed to process data in a distributed computing environment. Based on the Message Passing Interface (MPI), communication and data exchange between nodes are realized. A Master/Worker architecture is used to manage the life cycle of computing jobs, including job splitting, computing resource allocation, and task execution and monitoring. Test results show that the proposed system has good scalability, and the data it generates is consistent with the data generated by manually executing the job scripts step by step to run the simulation software.
  • CAI Ruichu, HOU Yongjie, HAO Zhifeng
    Computer Engineering. 2019, 45(7): 26-31. https://doi.org/10.19678/j.issn.1000-3428.0051856
    The application of genetic testing technology has accumulated a large amount of data from different platforms. To address the difficulty of migrating traditional data classification models across different platforms, this paper proposes a gene expression data classification algorithm, k-HRT, based on a Hierarchy Rule Tree (HRT). A strategy of data conversion and rule pre-screening is designed to enable fast mining and to handle the large-scale data problems caused by cross-platform characteristics. Experimental results on real gene expression datasets show that, compared with the k-TSP and SVM-RFE algorithms, the k-HRT algorithm effectively improves classification accuracy.
  • LU Zhigang, WU Lu
    Computer Engineering. 2019, 45(7): 32-40. https://doi.org/10.19678/j.issn.1000-3428.0052337
    Traditional local expansion methods for identifying overlapping community structures in Enterprise Social Networks (ESN) suffer from computational redundancy and incomplete community mining. Therefore, an overlapping community discovery algorithm, GFE, based on greedy factional expansion is proposed. The GFE algorithm searches for maximal factions in the original ESN, calculates their link strength according to the degree of association between factions, and converts the original network graph into a maximal faction graph. Under the condition of maximizing a fitness function, the seed factions in the maximal faction graph are greedily expanded for community discovery. On this basis, community differences are compared, and similar duplicated communities are merged to optimize the hierarchical structure of the overlapping communities. Experimental results show that the GFE algorithm can effectively discover overlapping community structures in ESN, and its operation efficiency is higher than that of CPM, LFM and other algorithms.
  • DODONOV Oleksandr, JIANG Bo, DODONOV Vadym
    Computer Engineering. 2019, 45(7): 41-45. https://doi.org/10.19678/j.issn.1000-3428.0053440
    The survivability of computer systems should be guaranteed in order to improve their operation efficiency, especially the efficiency of their critical functions. This paper proposes a decentralized mechanism based on a Software-Defined Architecture (SDA). The concepts of critical functions and critical states are defined, and the critical functional parameters of the target system are collected and analyzed. Based on the analysis results, experiments are performed to reconfigure the implementations of the whole system. A formal model is presented for analyzing and improving the survivability of the system, and the investigated problem is reduced to an optimization problem for increasing the system survival time.
  • CHEN Jinwen, YAO Zhen, YANG Jian, XI Hongsheng
    Computer Engineering. 2019, 45(7): 46-53. https://doi.org/10.19678/j.issn.1000-3428.0051225
    To solve the serious waste of network bandwidth caused by the redundant transmission of a large number of video clips, an improved software-defined video streaming system is designed. Based on Network Function Virtualization (NFV) technology, video caching and streaming functions can be implemented on network nodes. Through an abstracted programmable cache policy module, a Variable-Length Cache Window (VLCW) algorithm is deployed for real-time video streaming to reduce the server load. The length of the cached video clips is adaptively adjusted according to different users' access patterns so as to improve cache resource utilization. Experimental results show that under the optimization of the VLCW algorithm, the server load of the system drops by 50%, and the cache resource utilization is increased by 3 to 5 times.
  • YANG Tianhao, SUN Jin
    Computer Engineering. 2019, 45(7): 54-59. https://doi.org/10.19678/j.issn.1000-3428.0052204
    Network on Chip (NoC) routing units share input buffers and only allow sequential access to data, which limits the speed and efficiency of on-chip communication. To improve the parallelism of NoC, a routing unit architecture based on a virtual conflict array is proposed. Before data enters the routing unit pipeline, serial data requests are partially eliminated in the virtual conflict array to reduce the amount of data transmitted through the pipeline and improve the parallelism of the system. Experimental results show that, compared with traditional virtual channel routing units, routing units with a virtual conflict array can effectively shorten the routing delay.
  • FENG Guofu, SHU Yujuan, CHEN Ming, DONG Lifu
    Computer Engineering. 2019, 45(7): 60-65. https://doi.org/10.19678/j.issn.1000-3428.0052735
    To address the frequent changes of power consumption constraints in mobile computing systems and the energy loss caused by the inability of Dynamic Voltage and Frequency Scaling (DVFS) to effectively overcome static power consumption, this paper proposes a multi-objective self-adaptive power consumption control method. According to the real-time power consumption constraints, the method formulates a core adjustment strategy to determine the type and number of processor cores, and combines Operating System (OS) thread affinity, process migration and CPU hot-plugging to complete the opening and closing of cores and load management, achieving self-adaptive power consumption control. Experimental results on the typical multi-core applications of the MapReduce model Phoenix and the Deformable Parts Model (DPM) show that the proposed method can schedule cores of suitable type and quantity to complete calculation tasks on demand. Compared with a traditional constant-power system, the execution time and energy consumption are reduced by 60.91% and 48.54% on average, which means the proposed method effectively improves the energy efficiency of the target system.
  • WANG Chenxu, YU Dunhui, ZHANG Wanshan, ZHANG Xingsheng
    Computer Engineering. 2019, 45(7): 66-70. https://doi.org/10.19678/j.issn.1000-3428.0053768
    To increase the efficiency of crowdsourcing tasks, an improved software crowdsourcing module allocation algorithm is proposed. The complexity of the module to be developed is calculated according to its structural complexity and technical complexity. The cosine similarity algorithm is used to estimate the quality of the module, the importance of the module is calculated according to the critical path, and the module core degree is calculated from the complexity, quality and importance of the module. Software crowdsourcing module allocation is realized by sorting modules by core degree. Experimental results show that the fitness value of the algorithm is improved by at least 33.9, 27.7 and 27.8 compared with the CFA, QFA and KFA allocation algorithms, respectively.
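    A minimal sketch of ranking modules by a "core degree" combining complexity, quality and importance; the equal weighting and the module names below are illustrative assumptions, not the paper's formula.

    modules = {
        "login":   {"complexity": 0.7, "quality": 0.6, "importance": 0.9},
        "payment": {"complexity": 0.9, "quality": 0.8, "importance": 1.0},
        "report":  {"complexity": 0.4, "quality": 0.5, "importance": 0.3},
    }

    def core_degree(m: dict) -> float:
        # Assumed aggregation: simple average of the three factors.
        return (m["complexity"] + m["quality"] + m["importance"]) / 3

    ranked = sorted(modules, key=lambda name: core_degree(modules[name]), reverse=True)
    print(ranked)  # allocation order: most "core" modules first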
  • YANG Li, QI Yaowen, PAN Chengsheng
    Computer Engineering. 2019, 45(7): 71-77,85. https://doi.org/10.19678/j.issn.1000-3428.0052540
    In spatial information network environments with large spatial-temporal scale and sparse nodes, the heterogeneity of traditional networks decreases the efficiency of network transmission, and the inter-satellite communication process is easily interrupted. Existing heterogeneous nodes cannot be fully utilized, resulting in frequent switching of inter-satellite links, which increases the average transmission delay and decreases QoS. Therefore, this paper takes the spatial information network as the background and the Software Defined Network (SDN) architecture as the core to improve the Floodlight controller module, so that the controller can realize a conversion strategy between IP packets and ATM packets. A Network Effectiveness based Routing Algorithm (NERA) is proposed for this strategy. Simulation results show that, compared with the traditional spatial information network model architecture and satellite network architecture, the average network delay of the Software Defined Heterogeneous Satellite Network (SDHSN) architecture improved by the proposed algorithm is reduced by about 3.75% per unit time, and the throughput is increased by 7.22%~11.49%.
  • OUYANG Xiangzhen, ZHU Yian, LI Lian, SHI Xianchen
    Computer Engineering. 2019, 45(7): 78-85. https://doi.org/10.19678/j.issn.1000-3428.0050533
    Based on the study of the temporal and spatial isolation principles of the ARINC-653 standard, this paper designs and implements a safety-critical embedded real-time operating system. The micro-kernel design improves system reliability and configurability through Manifest-based task and partition management, partition protection with software and hardware, and software static verification. The embedded real-time operating system kernel prototype is implemented on the PowerPC platform, and the function and performance of the kernel prototype are tested. The results show that the operating system kernel is fully functional, meets the requirements of software and hardware partitioning, stack overflow protection and monitoring, drive fault protection, etc., and its comprehensive performance is better than that of similar operating system kernels.
  • XIONG Wen, YU Quanxi, WU Renbo, LUN Huiqin, KONG Haibin, TAN Junguang
    Computer Engineering. 2019, 45(7): 86-94. https://doi.org/10.19678/j.issn.1000-3428.0051836
    To address the fluctuation of task scheduling performance on the Platform as a Service (PaaS) flexible regulation and control platform of the power grid production control cloud, a task scheduling framework is constructed that includes core components such as the Node Perceptron (NP), Plat Resource Status Server (PRSS) and Task Scheduler (TS). In the model selection stage, a hybrid game method is adopted, and the execution nodes are arranged according to the preference of the tasks for different resources to complete the node load estimation. In the abrupt change stage of the model, the execution effects of the tasks are analyzed to adjust their resource allocation, obtain a task scheduling strategy with higher node scores, and guide the selection of game nodes for subsequent tasks. The task scheduling framework is tested and verified on a distributed Supervisory Control and Data Acquisition (SCADA) system, and task load balancing and disaster recovery processing of 7~25 fragments and 5 million measurement points are realized. The results show that the task scheduling strategy based on evolutionary game has more stable scheduling performance than the open source task scheduling tool.
  • MA Chao, ZHANG Limin
    Computer Engineering. 2019, 45(7): 95-102. https://doi.org/10.19678/j.issn.1000-3428.0050464
    Aiming at the problem of blind estimation of the spreading code and information sequence for long-code Direct Sequence Spread Spectrum (DSSS) signals, this paper proposes a joint estimation algorithm for the spreading code and information sequence based on overlapping-segment Markov Chain Monte Carlo-Unscented Kalman Filtering (MCMC-UKF). Under the Bayesian framework, combined with the idea of overlapping segmentation, the UKF algorithm is used to solve the nonlinear model and estimate the mean and variance of the posterior probabilities of each parameter. The segmented spreading sequences are obtained by the MCMC method and spliced to complete the estimation of the spreading sequence and the information sequence. Simulation results show that the algorithm can adapt to lower Signal-to-Noise Ratio (SNR) and is not limited by the type of spreading sequence.
  • LI Linxiao, CHENG Fang, LI Huimin
    Computer Engineering. 2019, 45(7): 103-107. https://doi.org/10.19678/j.issn.1000-3428.0050861
    Aiming at the high computational complexity and unstable detection performance of the fast frequency-domain correlation algorithm in LTE-A systems, a primary synchronization signal detection algorithm based on overlapping segmentation is proposed. Through a sequence looping method, the tail of the half-frame data is extended to ensure sequence integrity. On this basis, overlapping segmentation is performed to reduce the complexity of the correlation, and whether the synchronization sequence is successfully detected is determined according to the result of a threshold operation. Simulation results show that, compared with the fast frequency-domain correlation algorithm, the computational complexity of the proposed algorithm is reduced by 53.85%, and for different segmentation numbers, the correct detection rate approaches 1.0 when the Signal-to-Noise Ratio (SNR) is -7 dB, which shows stable detection performance.
  • LIN Feng, ZHOU Xiandong
    Computer Engineering. 2019, 45(7): 108-113,120. https://doi.org/10.19678/j.issn.1000-3428.0050968
    IEEE 802.11p/1609.4 is the Media Access Control (MAC) protocol designed exclusively for vehicular networking. To address the channel resource waste and contention-induced channel congestion caused by the multi-channel collaboration mechanism in the IEEE 802.11p/1609.4 standard, this paper proposes an improved multi-channel collaboration mechanism based on dividing the priorities of the Onboard Unit (OBU) and Road Side Unit (RSU). The control channel interval is divided into a Registration Interval (RI) and a Polling Interval (PI). During the RI, nodes use the Service Channel (SCH) to improve the resource utilization of the idle channel, and during the PI, safety messages are issued on the Control Channel (CCH) through a polling list to eliminate CCH contention. Verification results in a Matlab and OMNET++ simulation environment show that the improved mechanism performs obviously better in safety message transmission rate, average transmission delay, service channel throughput and message loss risk index, and it can still guarantee efficient message transmission under high node density.
  • WEI Debin, LIU Jian, PAN Chengsheng, ZOU Qijie
    Computer Engineering. 2019, 45(7): 114-120. https://doi.org/10.19678/j.issn.1000-3428.0051284
    Aiming at the problems of the ant colony algorithm in solving multi-objective optimization problems, such as slow convergence and a tendency to fall into local optima, a multi-constrained QoS routing algorithm for satellite networks is proposed. By improving the heuristic function of the ant colony algorithm, the link QoS information is used as an important basis for an ant to select the next-hop node, and the ordering idea and the max-min ant algorithm are combined to optimize the pheromone update rule, so as to obtain the optimal QoS path for the current service. Experimental results show that the algorithm has good convergence speed and optimization ability while satisfying the multi-QoS requirements of satellite network services.
  • XU Bin, HE Yucheng
    Computer Engineering. 2019, 45(7): 121-125,133. https://doi.org/10.19678/j.issn.1000-3428.0050741
    To address the low throughput and high storage resource consumption of Low-Density Parity-Check (LDPC) decoders, this paper proposes a layered decoding algorithm for QC-LDPC codes. The algorithm uses the receiving channel module to initialize the likelihood ratio information, then combines the stored check information and posterior information to form a node self-updating decoding scheme based on the layered min-sum algorithm, and makes decoding decisions based on the sign bit of the posterior information. Simulation results show that the storage resources of the improved decoder are saved by 20% compared with the traditional decoder, and when the number of iterations is 10, the throughput can reach 516.8 Mb/s.
  • YAN Xiaoyong, LI Qing, MO Youquan
    Computer Engineering. 2019, 45(7): 126-133. https://doi.org/10.19678/j.issn.1000-3428.0050589
    Since there is no one-to-one mapping between message format types and message status types in a communication protocol specification, it is difficult to separate messages with the same format type but different status types by clustering. Therefore, a state machine inference method for binary private protocols based on state-related fields is proposed. State-related fields are identified according to the Longest Common Subsequence Distance (LCSD) to obtain the logical similarity of protocol sessions. An initial state machine based on an adjacency table is constructed, abnormal sessions are removed and similar states are merged to reduce the size of the protocol state machine. Test results on TCP and SMB protocol datasets show that the proposed method can effectively infer the state machine of a binary private protocol with high accuracy and recall.
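    A minimal sketch of a Longest Common Subsequence (LCS) based distance between two byte strings, in the spirit of the LCSD step described above; the normalisation by the longer length is an illustrative assumption, and the message bytes are made up.

    def lcs_len(a: bytes, b: bytes) -> int:
        # Classic O(len(a) * len(b)) dynamic programme for the LCS length.
        dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
        for i, x in enumerate(a, 1):
            for j, y in enumerate(b, 1):
                dp[i][j] = dp[i - 1][j - 1] + 1 if x == y else max(dp[i - 1][j], dp[i][j - 1])
        return dp[len(a)][len(b)]

    def lcs_distance(a: bytes, b: bytes) -> float:
        return 1.0 - lcs_len(a, b) / max(len(a), len(b))

    print(lcs_distance(b"\x02\x00\x10SESSION", b"\x02\x00\x11SESSION"))  # small distance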
  • LIN Qiangqiang, TU Shanshan, LIU Meng, XIAO Chuangbai
    Computer Engineering. 2019, 45(7): 134-139. https://doi.org/10.19678/j.issn.1000-3428.0050671
    In large-scale deployment environments of Small Cells (SC), the existing Tracking Area (TA) planning processes in Heterogeneous Cellular Networks (HCN) exhibit phenomena such as uneven location update signaling and the ping-pong effect. To address this problem, this paper proposes a TA planning scheme based on the Newman fast community detection algorithm. TA planning is modeled as a community detection problem in a complex network, and the Newman algorithm is used to obtain the community partition structure of the network. The modularity concept from community detection is introduced to measure the performance of the proposed scheme. Experimental results show that, when the number of SCs is large and the Poisson distribution expectation of SCs is high, the modularity of the proposed scheme is increased by 0.107 on average compared with the game-based TA planning scheme, which indicates better planning performance.
  • DING Chengjun, LIU Qiang
    Computer Engineering. 2019, 45(7): 140-146,153. https://doi.org/10.19678/j.issn.1000-3428.0051341
    To improve the accuracy of node deployment in environmental monitoring applications, this paper proposes a Particle Swarm Optimization (PSO) algorithm based on spatial weight and fuzzy perception. Spatial weights are introduced to quantify the importance of regions, a fuzzy perception model is established to describe the perceived performance of nodes, and the weighted coverage rate is designed as the evaluation function of the algorithm. On this basis, the particle flight characteristics in the perception model are exploited, and the particle evolution equation is optimized by using weighted gravity to improve the optimization ability of the algorithm. Simulation results show that, compared with the PSO algorithm, the Virtual Force (VF) algorithm and the Extrapolation Artificial Bee Colony (EABC) algorithm, the proposed algorithm can increase the target coverage rate by up to 13% and reduce the number of nodes by up to 15%.
  • WANG Jingjing, LIU Wei, XIA Yu, LUO Rong, HU Shunren
    Computer Engineering. 2019, 45(7): 147-153. https://doi.org/10.19678/j.issn.1000-3428.0050869
    To address the lack of wireless channel models for semi-closed environments, a semi-closed corridor is taken as the research object, and the received signal strength of 2.4 GHz wireless signals is measured and analyzed. Single-slope and double-slope logarithmic-distance received power models are built using linear regression analysis, and the corresponding model parameters are obtained by fitting. By comparison with reference models, the influence of the semi-closed structure on wireless signal propagation is analyzed. The results show that the logarithmic-distance received power model is suitable for the semi-closed corridor environment, and the double-slope model fits better than the single-slope model. Compared with frequently-used theoretical models and empirical models of similar scenes, the proposed model describes the actual propagation characteristics of the target environment more realistically.
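    A minimal sketch of fitting the single-slope logarithmic-distance received power model, P(d) = P(d0) - 10*n*lg(d/d0), by linear regression as described above; the distances and RSS values below are synthetic, not the paper's measurements.

    import numpy as np

    d = np.array([1, 2, 4, 8, 16, 32], dtype=float)        # metres
    rss = np.array([-40, -46, -53, -59, -65, -72], float)  # dBm, synthetic

    x = 10 * np.log10(d)             # regressor: 10*log10(d/d0), with d0 = 1 m
    slope, intercept = np.polyfit(x, rss, 1)
    n = -slope                       # fitted path-loss exponent
    print(f"path-loss exponent n = {n:.2f}, P(d0) = {intercept:.1f} dBm")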
  • HE Xiangfu, FAN Zhihui, WANG Hui, ZHAI Zhibo, LI Jing
    Computer Engineering. 2019, 45(7): 154-158. https://doi.org/10.19678/j.issn.1000-3428.0051254
    Aiming at the problem that ZigBee wireless sensor networks and IPv6 networks cannot be directly interconnected, an embedded Internet gateway based on the Contiki operating system and the CC2530 hardware platform is proposed. The gateway architecture is deployed on the Sink node of the ZigBee network, and 6LoWPAN technology is used to realize interconnection and interworking between the ZigBee network and the IPv6 network. The header compression method of the 6LoWPAN adaptation layer is improved, and data exchange between the ZigBee protocol at the interface layer and the IPv6 protocol at the network layer is realized. Simulation results show that under heavy transmission load the proposed gateway has low delay and low power consumption, which verifies its effectiveness.
  • CHENG Zhiwei, CHEN Caisen, ZHU Lianjun, MO Weifeng, WANG Huiyu
    Computer Engineering. 2019, 45(7): 159-163. https://doi.org/10.19678/j.issn.1000-3428.0051139
    Aiming at the problem that the data collected by Cache timing template attacks is noisy, a method of establishing a timing template from the Cache hit rate of each accessed address is proposed, and the input value is judged by the Pearson correlation coefficient. The Flush+Reload attack method is used to attack computer keyboard input: the Cache hit rate of each address is obtained, the addresses with high Cache hit rates are converted into a template matrix, the Pearson correlation coefficient is calculated from this matrix, and the input value is then judged according to the magnitude of the coefficient. Experimental results show that this method improves the accuracy of judging input values compared with the mean square error method.
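    A minimal sketch of the Pearson-correlation matching step described above: an observed vector of per-address Cache hit rates is compared against per-key template rows, and the best-correlated key is chosen. The templates and observation are synthetic placeholders, not measured data.

    import numpy as np

    templates = {                      # hypothetical per-key hit-rate templates
        "key_a": np.array([0.9, 0.1, 0.8, 0.2]),
        "key_b": np.array([0.2, 0.9, 0.1, 0.8]),
    }
    observed = np.array([0.85, 0.15, 0.75, 0.30])

    def pearson(u: np.ndarray, v: np.ndarray) -> float:
        return float(np.corrcoef(u, v)[0, 1])

    best = max(templates, key=lambda k: pearson(templates[k], observed))
    print(best)  # -> "key_a"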
  • HUANG He, CHEN Jun, DENG Haojiang
    Computer Engineering. 2019, 45(7): 164-169. https://doi.org/10.19678/j.issn.1000-3428.0051190
    The protocol packets used for Modbus/TCP security vulnerability mining are often generated randomly, which tends to produce excessive invalid packets and reduces the efficiency of vulnerability mining. To deal with this problem, a structural fuzzing algorithm named Fuzzy-RNN is proposed based on the Recurrent Neural Network (RNN). It learns the probability distribution of each part of the protocol packet from a Modbus/TCP training set and takes corner cases into account, so as to realize targeted fuzzing generation. Experimental results show that, compared with the General Protocol Fuzzer (GPF), on a variety of simulation software such as Modbus Slave and xMasterSlave, the Fuzzy-RNN algorithm can generate legal protocol packets with a higher probability, reduce the test time by more than 50%, and obviously improve efficiency.
  • DU Junlong, JIN Junping, ZHOU Jiantao
    Computer Engineering. 2019, 45(7): 170-175. https://doi.org/10.19678/j.issn.1000-3428.0051549
    This paper proposes a system Data Disaster Tolerance Mechanism (DDTM) to address the problem that general Linux platforms suffer from abnormal attacks, damage, downtime and virus infection, which lead to system startup failure. The security context is used as the host object, and the configurable form covers the mount preset, the disaster tolerance granularity and the rewriting policy library. A dynamic rewriting chain is built by tracing file integrity. Based on the formal definition of DDTM, a fine-grained implementation algorithm is given. Experimental results show that the mechanism has high reliability and practicability.
  • ZHANG Qi, LI Jiawei, LIN Xijun, QU Haipeng
    Computer Engineering. 2019, 45(7): 176-180. https://doi.org/10.19678/j.issn.1000-3428.0051186
    The Identity-Based Encryption with Equality Test (IBEET) scheme can simplify the management of keys and certificates while ensuring data confidentiality, but it lacks control over the granularity of authorization, making it difficult to meet the management needs of different data granularities in practical applications. Therefore, random user level, random ciphertext level, specified user level and ciphertext-user level authorization mechanisms are introduced. Based on asymmetric bilinear mapping, an IBEET scheme supporting flexible authorization is constructed, and the related definitions and security models are given. The analysis results show that the scheme achieves OW-ID-CCA security and can realize user privacy protection.
  • LIAO Fangyuan, CHEN Jianfeng, GAN Zhiwang
    Computer Engineering. 2019, 45(7): 181-187,193. https://doi.org/10.19678/j.issn.1000-3428.0055155
    Critical Information Infrastructure (CII) is the nerve center of economic and social operations, an important guarantee of network security, and also a target that may be subject to key attacks. Network security protection based on artificial intelligence can build a more sensitive anomaly identification mechanism, a more automated event analysis engine and more accurate global operations and maintenance capabilities for CII defense. This paper investigates and analyzes the attack risks of CII, studies the morphological structure of existing security protection systems, and builds an intelligent CII security system based on the SMCRC ring idea. On this basis, this paper reviews and analyzes the latest research developments on the key points of CII defense, namely situational awareness and continuous monitoring driven by artificial intelligence, and CII defense guarantees based on trust mechanisms and threat information.
  • CHENG Jinxue, XU Lei, ZHAO Zemao, XUE Chunyang
    Computer Engineering. 2019, 45(7): 188-193. https://doi.org/10.19678/j.issn.1000-3428.0052000
    To securely retrieve ciphertext data in cloud storage, a searchable encryption scheme over insecure channels is proposed, and a general method for converting a dual-system encryption scheme into a dual-system searchable encryption scheme over insecure channels is given. Cloud servers retrieve files containing specific keywords from a large number of encrypted files through trapdoors corresponding to the keywords provided by users, without learning any information about the original files in the cloud. Analysis results show that, compared with traditional searchable encryption data retrieval schemes, the proposed scheme shortens the trapdoor and ciphertext lengths, and the verification process requires only two bilinear pairing operations.
  • XU Shuo, TANG Zuoqi, WANG Xin
    Computer Engineering. 2019, 45(7): 194-202. https://doi.org/10.19678/j.issn.1000-3428.0052209
    Fully considering the influence of the uncertainty of evaluation information on evaluation results, an information security risk assessment method based on the D-number Analytic Hierarchy Process (D-AHP) and grey theory is proposed. According to the relevant industry standards, the assets, threats, vulnerabilities and existing security measures of the information system are identified, the evaluation index system is constructed, and the hierarchical structure model is established. The D-AHP method is used to calculate the influence weight of each index to solve the uncertainty problem of the evaluation information. In view of the grey characteristics caused by insufficient information resources in the evaluation process, grey theory is used to solve the grey evaluation matrix. On this basis, the information security risk is assessed comprehensively and the assessment results are displayed intuitively. Analysis shows that this method can use uncertain information for risk assessment and provide a reference for formulating targeted risk management and control strategies.
  • MA Hao, YIN Baoqun, PENG Sifan
    Computer Engineering. 2019, 45(7): 203-207. https://doi.org/10.19678/j.issn.1000-3428.0050951
    Single-image crowd counting suffers a sharp decline in performance under severe crowd occlusion and scale changes. Therefore, this paper proposes a crowd counting algorithm and gives a Fully Convolutional Network (FCN) capable of processing images of any resolution. The scale change and occlusion problems in images are addressed by applying a feature pyramid network to crowd counting. The network model is trained and evaluated on the ShanghaiTech crowd counting database, and the results show that the algorithm has good robustness and accuracy compared with current mainstream crowd counting algorithms.
  • ZHUANG Lichun, ZHANG Zhengjun, ZHANG Naijin, LI Jundi
    Computer Engineering. 2019, 45(7): 208-211. https://doi.org/10.19678/j.issn.1000-3428.0050800
    To address the poor classification prediction accuracy of the linear Logistic model in the UDEED algorithm, an improved nonlinear Logistic model algorithm with a polynomial kernel is proposed based on Taylor expansion. The estimation method for the kernel function parameters of the nonlinear Logistic model is studied, and the calculation rules of the loss function are updated. The improved UDEED model is solved by the gradient descent method, and the data set is classified and predicted. Experimental results show that, compared with the UDEED algorithm, the improved algorithm improves the accuracy of classification prediction.
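    A minimal sketch of the underlying idea (a nonlinear, polynomial Logistic model outperforming a linear one), using explicit polynomial features from scikit-learn rather than the paper's Taylor-expansion/UDEED formulation, which is not reproduced here; the dataset is synthetic.

    from sklearn.datasets import make_moons
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import PolynomialFeatures

    X, y = make_moons(n_samples=400, noise=0.2, random_state=0)

    linear = LogisticRegression(max_iter=1000).fit(X, y)
    poly = make_pipeline(PolynomialFeatures(degree=3),
                         LogisticRegression(max_iter=1000)).fit(X, y)

    # The polynomial model fits the nonlinear class boundary noticeably better.
    print("linear accuracy:", round(linear.score(X, y), 3))
    print("polynomial accuracy:", round(poly.score(X, y), 3))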
  • DING Yong, WANG Xiang, JIANG Cuiqing
    Computer Engineering. 2019, 45(7): 212-216. https://doi.org/10.19678/j.issn.1000-3428.0051504
    In location recommendation applications, traditional collaborative filtering recommendation algorithms are not effective due to the sparseness of check-in data. To improve the recommendation effect and overcome the shortcoming that traditional collaborative filtering recommendation algorithms are affected by popular locations, this paper proposes a new location recommendation algorithm. Each check-in location is transformed into a vector, and the similarity between locations is calculated by the cosine similarity of the vectors. Locations with low check-in frequency are marked as unpopular locations, which are used to calculate the similarity of users based on their check-in locations. The user's recommendation list is generated in combination with the influence of geographical factors. Experimental results show that, compared with traditional collaborative filtering recommendation algorithms, the F1 value of the algorithm is improved by more than 0.009, giving a better recommendation effect.
  • FANG Yuling, CHEN Qingkui
    Computer Engineering. 2019, 45(7): 217-221,228. https://doi.org/10.19678/j.issn.1000-3428.0051507
    An efficient convolution calculation optimization method, MCFA, based on matrix transformation is proposed. The input matrix is divided into blocks according to the width of the output matrix and the convolution kernel size. The input matrix sub-blocks and the kernel matrix are transformed by the im2col method, and the matrix-matrix multiplication library encapsulated in the Compute Unified Device Architecture (CUDA) is used to speed up the convolution calculation. On this basis, the output sub-blocks are arranged in order, and the complete output matrix is finally obtained. Experimental results show that this method saves 61.25% of the computing space compared with the im2col method, improves the computing speed by 20.57% compared with the MEC method, and relieves the cache pressure caused by large input matrices through blocking, thus improving cache utilization.
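    A minimal sketch of im2col-style convolution as matrix multiplication, the core transformation behind the block-wise MCFA method described above; the blocking scheme and the CUDA GEMM library are omitted, and plain NumPy is used for illustration.

    import numpy as np

    def im2col(x: np.ndarray, k: int) -> np.ndarray:
        # Unfold every k x k patch of x into one column.
        h, w = x.shape
        out_h, out_w = h - k + 1, w - k + 1
        cols = np.empty((k * k, out_h * out_w))
        idx = 0
        for i in range(out_h):
            for j in range(out_w):
                cols[:, idx] = x[i:i + k, j:j + k].ravel()
                idx += 1
        return cols

    x = np.arange(16, dtype=float).reshape(4, 4)
    kernel = np.ones((3, 3))
    out = kernel.ravel() @ im2col(x, 3)      # matrix multiply replaces direct convolution
    print(out.reshape(2, 2))                 # [[45. 54.] [81. 90.]]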
  • YANG Jiali, LI Zhixu, XU Jiajie, ZHAO Pengpeng, ZHAO Lei, ZHOU Xiaofang
    Computer Engineering. 2019, 45(7): 222-228. https://doi.org/10.19678/j.issn.1000-3428.0051041
    To solve the low recommendation efficiency of collaborative filtering algorithms when processing large amounts of data, an adaptive hybrid collaborative recommendation algorithm is proposed. The algorithm adjusts the model weights based on the activity of the user to be recommended to and the freshness of target items. The similarity between items is calculated based on tensor decomposition, and the prediction result is generated by short-path enumeration superposition. Experimental results show that, compared with the CBCF algorithm, the proposed algorithm improves the recommendation accuracy by 28.6%.
  • YU Qingying, LI Qian, CHEN Chuanming, LIN Wenshi
    Computer Engineering. 2019, 45(7): 229-236,241. https://doi.org/10.19678/j.issn.1000-3428.0051574
    To effectively utilize the internal and external attributes of trajectories for anomaly detection, an abnormal trajectory recognition method based on a BP neural network is proposed. The original trajectory data is denoised and stored in the LBS cloud of Baidu Cloud, a trajectory visualization website based on Baidu Map is designed, and the trajectory attribute values are calculated by normalizing the data. The internal and external feature attributes of the trajectory are used as the input layer of the BP neural network, the trajectory similarity measure is used as the output layer, and the hidden layer coefficients are adjusted to obtain the training model, thereby identifying the user's abnormal trajectories. Simulation results on two user datasets show that the accuracy of abnormal trajectory identification of the proposed method is 92.3% and 100%, respectively.
  • XU Yingying, HUANG Hao
    Computer Engineering. 2019, 45(7): 237-241. https://doi.org/10.19678/j.issn.1000-3428.0051291
    Based on the Bidirectional Long Short-Term Memory (BiLSTM) network, this paper proposes a label splitting strategy for Spoken Language Understanding (SLU) and constructs a joint model. The model converts a classification over 127 labels into 3 independent classifications to balance the labels in the ATIS database. Because of the scarcity of ATIS data, this paper introduces external word embeddings to improve the classification performance of the model. Experimental results show that, compared with the traditional recurrent neural network and its variants, the proposed joint model obtains a significant improvement in F1 value, reaching up to 95.63%.
  • SHI Congwei, ZHAO Jieyu, CHANG Junsheng
    Computer Engineering. 2019, 45(7): 242-250. https://doi.org/10.19678/j.issn.1000-3428.0053445
    When skeleton features are extracted by the medial axis transformation algorithm, the result is sensitive to boundary noise and prone to burrs. To address this problem, an improved algorithm based on the medial axis transformation is proposed. The original medial axis is calculated from the Voronoi diagram, and the burrs are removed with an improved Quadratic Error Metric (QEM). Experimental results on 2D and 3D graphic datasets show that the proposed algorithm can extract a simple and accurate skeleton and is robust to boundary noise.
  • YANG Xiankang, PAN Maodong, TONG Weihua
    Computer Engineering. 2019, 45(7): 251-257,263. https://doi.org/10.19678/j.issn.1000-3428.0051288
    Based on the sparse distribution of feature lines on meshes, this paper proposes an optimized algorithm for feature line extraction. For a given mesh, a scalar or a vector is computed for each triangle as the input. Then, an optimization model is established over a measurement of the input so that the jumps across mesh edges are as few as possible and the changes before and after optimization are minimized. An alternating direction optimization algorithm based on the variable splitting technique and the penalty function approach is designed to solve the model. In addition, a heuristic strategy is introduced to improve the sparsity of the solution, thus achieving higher-quality feature lines. Experimental results demonstrate that this method can effectively extract feature lines. Compared with state-of-the-art methods such as the crest lines algorithm, the variational algorithm and others, the proposed algorithm improves the quality of feature line extraction and the robustness to noisy data.
  • ZHANG Jiemei, YANG Cihui
    Computer Engineering. 2019, 45(7): 258-263. https://doi.org/10.19678/j.issn.1000-3428.0052132
    Since the size and shape of the liver vary from person to person, and the grayscale contrast between the liver and its adjacent organs in CT images is low, it is difficult to accurately determine the boundary of the liver in an image. Aiming at these problems, this paper proposes an improved algorithm based on the Fully Convolutional Network (FCN). On the basis of the FCN, residual and VGG-16 networks are introduced to obtain the initial segmentation result of the liver, and Batch Normalization (BN) and PReLU activation functions are introduced to improve the generalization ability and convergence speed of the network. The Conditional Random Field (CRF) method is used to further optimize the segmentation result and improve segmentation accuracy. The 2D liver segmentation results are reconstructed into a 3D structure using VTK and ITK. The effectiveness and efficiency of the algorithm are verified by experimental results on the 3DIRCADb dataset.
  • XIE Xiaoyan, XIN Xiaofei, ZHU Yun, WANG Feilong, LIU Yang
    Computer Engineering. 2019, 45(7): 264-267. https://doi.org/10.19678/j.issn.1000-3428.0051505
    To address the problems that depth map characteristics are not fully exploited and that the algorithm has high complexity and low efficiency in 3D High Efficiency Video Coding (3D-HEVC) inter prediction, this paper proposes a motion estimation algorithm based on depth map edge detection. First, edge detection preprocessing is performed on the depth map. Then, according to the results, the full search algorithm and the hexagon search algorithm are applied to the edge region and the flat region, respectively. Fast search of flat areas reduces the computational complexity of the SAD calculation in motion estimation for inter prediction. Test results on the 3D-HEVC/HTM16.0 platform indicate that the proposed algorithm reduces the coding time of the depth map by 6.7%, while the BD-rate of the synthesized view increases by only 0.146%, and the coding efficiency is improved significantly.
  • YANG Gang, JIN Tao, WANG Dawei, CAO Jingjin, ZHANG Na, YAN Biwu, LI Tao, CHENG Yuan
    Computer Engineering. 2019, 45(7): 268-272,281. https://doi.org/10.19678/j.issn.1000-3428.0050790
    For the cost aggregation problem in stereo matching, an improved cost aggregation algorithm is proposed. The image is segmented into superpixels and a Minimum Spanning Tree (MST) is established. Then, a tree filter is used for cost aggregation to generate the superpixel cost, and the final pixel cost is obtained by weighting the initial pixel cost and the superpixel cost. Experimental results show that, compared with MST, Segment-Tree (ST) and other methods, the proposed algorithm has lower time complexity and can obtain disparity maps with good edge-preserving properties.
  • CHEN Yangyang, QIAN Pengjiang, ZHAO Kaifa, SU Kuanhao
    Computer Engineering. 2019, 45(7): 273-281. https://doi.org/10.19678/j.issn.1000-3428.0052534
    To generate Synthetic Computed Tomography (sCT) from Magnetic Resonance Imaging (MRI) data, this paper proposes the TFCM-SVM method for abdominal MRI data based on the mDixon sequence. With the historical knowledge provided by the MRI source data of patients, the method uses Transfer Fuzzy C-Means (TFCM) clustering to process abdominal MRI data, and a supervised learning method with a voting strategy performs the final segmentation on the clustering result. Finally, the corresponding CT values are assigned to the segmented tissues to generate the sCT. Experimental results show that the method reliably partitions abdominal MRI data into four classes, namely fat, bone, air and soft tissue, and robustly generates sCT. The Mean Absolute Prediction Deviation (MAPD) of the prediction is only 81 HU, which is significantly improved compared with the FCM method.
  • HAO Zhanjun, CAI Wenbo, DANG Xiaochao
    Computer Engineering. 2019, 45(7): 282-290. https://doi.org/10.19678/j.issn.1000-3428.0051122
    In non-line-of-sight transmission environments, the low characteristic dimension of the Received Signal Strength (RSS) in the coarse estimation phase results in poor positioning performance. To address this problem, an improved two-stage time-reversal indoor positioning method based on the Multi-Dimensional Scaling (MDS) algorithm is proposed. The RSS and the Channel Frequency Response (CFR) of specific reference points are collected, and linear time-domain filtering is adopted to narrow the dynamic range of the Channel State Information (CSI). The RSS and MDS algorithms are used for coarse location estimation to determine the location range of the points to be measured, and the fingerprint database is constructed. The Combined Time Reversed Resonating Strength (CTRRS) value is calculated using the pre-processed CFR and the CFR at each reference point in the fingerprint sub-library, and the reference point with the maximum CTRRS value is searched for precise positioning. Experimental results show that, compared with the time-reversal indoor positioning method, the positioning time of the improved method is improved by 56.5%.
  • ZHOU Mengni, NIU Yan, CAO Rui, YAN Pengfei, XIANG Jie
    Computer Engineering. 2019, 45(7): 291-295,302. https://doi.org/10.19678/j.issn.1000-3428.0051201
    To address the low efficiency of clinical manual diagnosis of epilepsy, this paper establishes an automatic epilepsy signal diagnosis model based on phase synchronization. First, the model uses the Phase Locking Value (PLV) to measure the degree of synchronization between brain regions in different states and constructs the corresponding brain functional network connection matrices. Then, two global attributes, the clustering coefficient and the characteristic path length, are extracted as training features and input into a Support Vector Machine (SVM). Finally, 6-fold cross-validation is used for the classification and identification of interictal and ictal signals. Experimental results show that the classification effect of the weighted network is better than that of the binary network, with an average accuracy of 83.4% for the weighted network. A single attribute is not enough to fully reflect the differences in functional network connections between the two epileptic states, and most patients achieve better classification results in the gamma and beta bands.
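    A minimal sketch of computing the Phase Locking Value (PLV) between two channels from instantaneous phases obtained with the Hilbert transform; the two synthetic signals below merely stand in for EEG channels.

    import numpy as np
    from scipy.signal import hilbert

    rng = np.random.default_rng(0)
    t = np.arange(0, 2, 1 / 256)                       # 2 s at 256 Hz
    x = np.sin(2 * np.pi * 10 * t)
    y = np.sin(2 * np.pi * 10 * t + 0.5) + 0.1 * rng.standard_normal(t.size)

    phase_x = np.angle(hilbert(x))
    phase_y = np.angle(hilbert(y))
    plv = np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))
    print(f"PLV = {plv:.3f}")   # close to 1 for strongly phase-locked channels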
  • LI Lian, YANG Haotian
    Computer Engineering. 2019, 45(7): 296-302. https://doi.org/10.19678/j.issn.1000-3428.0051134
    Aiming at the reliability modeling and analysis of lock-step self-monitoring processor systems, a lock-step self-monitoring processor system with error self-checking, fault location and failure repair functions is constructed on the basis of a two-machine synchronous system. This paper describes the architecture and working principle of the system, and establishes a system reliability model based on Generalized Stochastic Petri Nets (GSPN) according to the instantiated library set and transition set of the system features. Analysis and comparison with a single-processor system reliability model show that the system is highly reliable, and the parameter comparison experiments provide theoretical support and technical methods for subsequent lock-step system design.
  • LIU Yue, ZHAI Donghai, REN Qingning
    Computer Engineering. 2019, 45(7): 303-308,314. https://doi.org/10.19678/j.issn.1000-3428.0051312
    Combining the Convolutional Neural Network (CNN) and Nested Long Short-Term Memory (NLSTM) models, this paper proposes a CNLSTM model for text representation and classification based on the attention mechanism. The model uses a CNN to extract features of phrase sequences and then uses the NLSTM to learn the representation of text features. By introducing the attention mechanism, key phrases are highlighted to optimize feature extraction. Experiments on three public news datasets demonstrate that the classification accuracies of the model are 96.87%, 95.43% and 97.58%, respectively, and its performance is significantly improved compared with the baseline methods.
  • GAO Xiue, CHEN Xiaoshuang, WANG Yunming, CHEN Bo
    Computer Engineering. 2019, 45(7): 309-314. https://doi.org/10.19678/j.issn.1000-3428.0051086
    Bridge edges are pivots of network structure connectivity and play a crucial role in the information flow of command and control networks. Accurately identifying and protecting bridge edges can greatly improve the destruction resistance of a command and control network. To this end, a bridge-edge identification method based on a bridging coefficient is proposed. The method calculates the sum of the numbers of reachable two-hop and three-hop paths between the two endpoints of an edge, computes the ratio between this sum and the product of the augmented degrees of the endpoints to define the bridging coefficient, and thereby determines the criticality of the edge. Simulation results show that the similarity between the bridge edges identified by the method and those identified by the betweenness method exceeds 50%, and 73% of the identified key edges can also be identified by the Jaccard coefficient and betweenness methods. Using the edge deletion method to verify the criticality of the bridge edges, the recognition precision of the method is higher than that of the degree product, Jaccard coefficient and betweenness methods.
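    A minimal sketch in the spirit of the bridging coefficient described above: count the simple two-hop and three-hop paths between an edge's endpoints and normalise by the product of the endpoint degrees. The paper's exact definition (it uses an augmented degree) may differ; this is only illustrative.

    import networkx as nx

    def bridging_coefficient(G: nx.Graph, u, v) -> float:
        paths = nx.all_simple_paths(G, u, v, cutoff=3)
        n_alt = sum(1 for p in paths if len(p) in (3, 4))  # 2-hop and 3-hop detours
        return n_alt / (G.degree[u] * G.degree[v])

    G = nx.barbell_graph(4, 0)                 # two 4-cliques joined by edge (3, 4)
    print(bridging_coefficient(G, 3, 4))       # bridge edge: no short detours -> 0.0
    print(bridging_coefficient(G, 0, 1))       # intra-clique edge -> 4/9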
  • LI Tingshun, WANG Wei, LIU Zesan
    Computer Engineering. 2019, 45(7): 315-320. https://doi.org/10.19678/j.issn.1000-3428.0053148
    To improve the reliability of load forecasting in the power market, this paper proposes a hybrid probabilistic power load forecasting method that combines the Generalized Extreme Learning Machine (GELM), Wavelet Neural Network (WNN) and sampling-based model construction technology. Considering the uncertainty of the prediction model and data noise, the wavelet function is used to divide the information into sub-sequences with different frequency attributes and analyze them at similar resolution scales. The GELM is used to perform fast training of the WNN, and an iterative adaptive sampling technique is used to evaluate the uncertainty of the model. The power load prediction is output in the form of probability intervals, and the peak load of the power system is predicted 24 h in advance. The results show that the Mean Absolute Percentage Error (MAPE) of the proposed method is lower than 1.1%, which is better than that of the grey value prediction model and the ratio estimation method.
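    A minimal sketch of the Mean Absolute Percentage Error (MAPE) used above to evaluate the forecasts; the load values below are made up.

    import numpy as np

    def mape(actual: np.ndarray, forecast: np.ndarray) -> float:
        # Mean absolute percentage error, in percent.
        return float(np.mean(np.abs((actual - forecast) / actual)) * 100)

    actual = np.array([980.0, 1010.0, 1050.0, 995.0])     # MW, synthetic
    forecast = np.array([990.0, 1000.0, 1042.0, 1001.0])
    print(f"MAPE = {mape(actual, forecast):.2f}%")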