
15 March 2020, Volume 46 Issue 3
    

  • Hot Topics and Reviews
  • FU Yiwenjin, CHEN Huahui, QIAN Jiangbo, DONG Yihong
    Computer Engineering. 2020, 46(3): 1-10. https://doi.org/10.19678/j.issn.1000-3428.0056025
    Spatiotemporal data, which includes both time and space dimensions, is widely used in fields such as logistics and supply chain management. Although the traditional centralized storage method offers certain convenience, it cannot meet the storage and query needs of spatiotemporal data. In contrast, blockchain technology applies a decentralized distributed storage mechanism and uses consensus protocols to guarantee data security. This paper studies the architectures and performance characteristics of the current blockchain 1.0, blockchain 2.0 and blockchain 3.0 represented by Block-DAG, and analyzes their performance support, optimization modes and limitations for spatiotemporal data. Furthermore, the application prospects, existing problems and future research directions of blockchain technology in the field of spatiotemporal data are discussed.
  • WANG Zhihui, WANG Xiaodong
    Computer Engineering. 2020, 46(3): 11-17. https://doi.org/10.19678/j.issn.1000-3428.0053748
    Large-scale text analysis is an important means of understanding big data and discovering its value. Text classification, as a classical natural language processing problem, has therefore received wide attention from researchers, and its main research direction is the artificial neural network because of its excellent performance in text analysis. This paper introduces the history of the Convolutional Neural Network (CNN), the Recurrent Neural Network (RNN), recursive neural network structures, and the pretraining models applied to text classification. It then compares the classification performance of different models on common datasets, demonstrating that artificial neural network structures can reduce manual feature engineering by automatically learning text features, and thus improve text classification results. On this basis, the paper discusses future research directions of text classification.
  • DONG Siqi, WU Jiahui, LI Hailong, QU Yuben, HU Lei
    Computer Engineering. 2020, 46(3): 18-23. https://doi.org/10.19678/j.issn.1000-3428.0054490
    In mobile edge computing, most existing resource allocation methods allocate computing resources according to the time at which tasks request computation offloading, without considering task priority in practical applications. To meet computing requirements in such cases, this paper proposes a resource allocation method for priority tasks. The method assigns a priority to each task based on its average processing value, and implements weighted allocation of computing resources for tasks of different priorities. It ensures that high-priority tasks obtain sufficient computing resources while the total time and energy consumption for completing all tasks are reduced, thus improving Quality of Service (QoS). Simulation results show that, compared with the method of evenly allocating computing resources, the method of allocating resources according to the amount of task data, and the method of executing all tasks on mobile terminals, the computation delay of the proposed method is reduced by 83.76%, 15.05% and 99.42% respectively, and the energy consumption is reduced by 84.78%, 17.37% and 87.69% respectively.
  • FU Lin, SHAO Peinan, YING Fei, XIE Wei
    Computer Engineering. 2020, 46(3): 24-33. https://doi.org/10.19678/j.issn.1000-3428.0055890
    For information system security, this paper proposes a Mimic Common Operating Environment (MCOE) framework for the Dynamic Heterogeneous Redundancy (DHR) architecture. The objects of the framework are heterogeneous redundant information system applications that are functionally equivalent to the system before mimic transformation, together with the running environment facilities of heterogeneous information systems. For N heterogeneous executive bodies, the framework constructs a supportive environment that automates resource allocation, distribution, execution, voting, security threat cleaning and recovery, and management of service requests. The framework provides uniform integrated interface standards for the distribution and voting of mimic products. In this framework, the N heterogeneous executors, driven by the primary key of each service request, execute collaboratively with five servers, including the MCOE distribution, internal voting, external voting and management servers. Simulation results show that the design can effectively defend against network attacks exploiting hardware and software backdoors and vulnerabilities.
  • Lü Jia, CAO Suzhen, KOU Bangyan, ZHANG Zhiqiang, HAN Longbo
    Computer Engineering. 2020, 46(3): 34-39. https://doi.org/10.19678/j.issn.1000-3428.0056456
    To solve the problems of user identity information leakage and privacy protection under 5G networks, this paper proposes a certificateless identity-hiding signcryption scheme. The scheme uses a cryptographic hash function to bind the identity information of the signer to the public key, and generates part of the user's private key accordingly to prevent public key substitution attacks. In unsigncryption, the signer identity is not included in the input information but is output for verification, so the signcryption process is anonymous. Experimental results show that, in the random oracle model, the security of the proposed scheme can be reduced to the hardness of the Decisional Diffie-Hellman (DDH) problem, and that the scheme has high communication efficiency and low computation cost.
  • Artificial Intelligence and Pattern Recognition
  • YANG Ping, XIE Zhipeng
    Computer Engineering. 2020, 46(3): 40-45. https://doi.org/10.19678/j.issn.1000-3428.0053393
    Definition extraction is the task of automatically identifying definition sentences in unstructured text. The problem can be modeled as sequence labeling of a sentence's terms and their corresponding definitions, and the extraction task is accomplished by using the labeling results. To address the shortcoming that traditional definition extraction methods, which rely on hand-crafted features, easily cause error propagation, this paper proposes a sequence labeling neural network model based on Bidirectional Long Short-Term Memory (BiLSTM) to perform definition extraction automatically on input text. The model feeds the raw data into the BiLSTM network to obtain feature representations of the input sentences, and then uses an LSTM-based decoder to produce the labeling results. Experimental results on the Wikipedia English dataset show that the precision, recall and F1 value of the proposed method are 94.21%, 90.10% and 92.11% respectively, and that it effectively improves on the benchmark model.
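    The sequence-labeling formulation described in this abstract can be illustrated with a small BiLSTM tagger. This is a minimal sketch under assumed dimensions and an assumed three-tag scheme, not the authors' implementation; the paper's LSTM-based decoder is simplified here to a linear projection.

```python
import torch
import torch.nn as nn

class BiLSTMTagger(nn.Module):
    """Minimal BiLSTM sequence labeler: token ids -> per-token tag scores."""
    def __init__(self, vocab_size, num_tags, emb_dim=100, hidden=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.bilstm = nn.LSTM(emb_dim, hidden, batch_first=True,
                              bidirectional=True)
        self.proj = nn.Linear(2 * hidden, num_tags)  # decoder simplified to a linear layer

    def forward(self, token_ids):                 # (batch, seq_len)
        h, _ = self.bilstm(self.embed(token_ids))
        return self.proj(h)                       # (batch, seq_len, num_tags)

# toy usage: tag each token as Term / Definition / Other (hypothetical tag set)
model = BiLSTMTagger(vocab_size=10000, num_tags=3)
scores = model(torch.randint(0, 10000, (2, 20)))
loss = nn.CrossEntropyLoss()(scores.view(-1, 3), torch.randint(0, 3, (2 * 20,)))
```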
  • HU Junyi, LI Jinlong
    Computer Engineering. 2020, 46(3): 46-52,59. https://doi.org/10.19678/j.issn.1000-3428.0054521
    Not all words in a text have similar sentiment tendency and intensity, so encoding the context well and extracting key information are very important for sentiment classification tasks. Therefore, this paper proposes a hierarchical attention network framework based on sentiment evaluation to classify text sentiment effectively. A bidirectional recurrent neural network encoder encodes word vectors and sentence vectors respectively, and the final representation of the text is obtained as an attention-weighted sum. On this basis, an auxiliary network is designed to evaluate the sentiment of words and sentences, and the evaluation score is used to adjust the distribution of attention weights. After exploring the influence of the sentiment information of the text on classification performance, the model is further prompted, on the basis of hierarchical representation, to focus on information with strong sentiment through the auxiliary network. Experimental results on four commonly used sentiment classification datasets show that the proposed framework can focus on the sentiment expressions in the text and achieves high classification accuracy.
  • SHANG Yingjie, DONG Liya, HE Hu
    Computer Engineering. 2020, 46(3): 53-59. https://doi.org/10.19678/j.issn.1000-3428.0054208
    The Spiking Neuron Network (SNN) uses spike sequences for data processing, and therefore has the excellent characteristic of low power consumption. However, because its learning algorithms are immature, training a multilayer network has difficulty converging. Exploiting the mature learning algorithms and fast training speed of back propagation networks, this paper proposes a transfer learning algorithm. The algorithm completes the training process on a back propagation network and transfers the training results to the spiking neuron network through spike coding rules and an adaptive weight mapping relationship. Experimental results show that the transfer learning algorithm can effectively solve the convergence problem in training multilayer spiking neuron networks. The recognition accuracy on the MNIST and CIFAR-10 datasets reaches 98.56% and 56.00% respectively, with low power consumption at the microwatt level.
  • PENG Zhuliang, LIU Bowen, FAN Cheng'an, WANG Jie, XIAO Ming, LIAO Zeen
    Computer Engineering. 2020, 46(3): 60-65,72. https://doi.org/10.19678/j.issn.1000-3428.0053929
    Aspect-Based Sentiment Analysis (ABSA) has been widely used in text information mining, but it can hardly extract accurate feature information when the sentiment polarity of a sentence is fuzzy or a sentence carries the sentiment polarities of multiple aspects, which undermines the performance of sentiment polarity classification. To address this problem, this paper proposes a sentiment classification method that combines bidirectional long short-term memory with aspect attention modules. The method uses multiple aspect attention modules to train different aspects independently at the same time, so that the information and attention operations of each aspect are processed without affecting the others. The attention parameters of each aspect are learnt and updated independently, so the hidden information of a specific aspect can be fully extracted for more effective recognition of the sentiment polarities of different aspects. Experimental results on the SemEval dataset show that, compared with existing baseline sentiment analysis methods, the proposed method enhances sentiment classification performance, with the classification accuracy, recall and F1 value significantly improved.
  • GAO Maoting, WANG Ji
    Computer Engineering. 2020, 46(3): 66-72. https://doi.org/10.19678/j.issn.1000-3428.0054096
    User behavior preference is often affected by many factors such as social relationships and time. However, if only a single factor is considered when constructing the user preference model, the model can be one-sided, resulting in incorrect recommendations. Therefore, this paper proposes a topic model recommendation algorithm in which user social relationships and time factors are both considered. The topic model is used to model the user's labeling behavior, thus obtaining the user-item probability matrix. According to the time spent on labeling items, the time weight of the user's labeling behavior is calculated and combined with the labeling behavior weight to compute time-based user similarity. The user's weight is then obtained by weighting the user's social relationships and the time-based user similarity. On this basis, taking the influence of other users into consideration, the final user-item preference weight is calculated and the recommendation results are obtained by preference ranking. Experimental results on the Last.fm dataset show that the proposed algorithm considers user features in a more comprehensive way, thus improving recommendation quality.
  • XU Huijun, WANG Zhong, MA Liping, RAO Hua, HE Cheng'en
    Computer Engineering. 2020, 46(3): 73-78,86. https://doi.org/10.19678/j.issn.1000-3428.0054223
    The traditional collaborative filtering algorithm suffers from sparse data, weak scalability and user interest drift, causing low operating efficiency and low prediction accuracy. To address these problems, this paper proposes an improved Mini Batch K-Means time-weighted recommendation algorithm. The Pearson correlation coefficient is used to improve Mini Batch K-Means clustering, and the improved clustering algorithm is applied to cluster the sparse rating matrix, calculate user interest scores and fill the sparse matrix. Considering that user interest varies with time, a time weight based on Newton's law of cooling is introduced to improve the similarity measure. The filled rating matrix is used to weight the similarity, and on this basis the final rating is obtained. Experimental results show that, compared with the traditional collaborative filtering algorithm, the mean absolute error of the proposed algorithm is reduced by 31.08%, and the precision, recall and F1 value are substantially improved, which shows its high rating prediction accuracy.
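    The time weight mentioned in this abstract is commonly written, by analogy with Newton's law of cooling, as an exponential decay of the interval between the rating time and the current time. The sketch below shows such a weight and one way to fold it into a Pearson-style similarity; the decay coefficient and the exact combination with the clustering step are assumptions, not the authors' formulation.

```python
import numpy as np

def cooling_weight(t_rating, t_now, lam=0.05):
    """Newton's-law-of-cooling style time weight: recent ratings count more."""
    return np.exp(-lam * (t_now - t_rating))

def time_weighted_similarity(r_u, r_v, t_u, t_v, t_now):
    """Pearson-like similarity in which each co-rated item is scaled by the
    product of both users' time weights (illustrative only)."""
    w = cooling_weight(t_u, t_now) * cooling_weight(t_v, t_now)
    ru, rv = r_u - r_u.mean(), r_v - r_v.mean()
    num = np.sum(w * ru * rv)
    den = np.sqrt(np.sum(w * ru ** 2)) * np.sqrt(np.sum(w * rv ** 2))
    return num / den if den else 0.0
```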
  • ZHANG Yueping, LI Ru, WANG Yuanlong, CHAI Qinghua, WU Yujuan, GUAN Yong
    Computer Engineering. 2020, 46(3): 79-86. https://doi.org/10.19678/j.issn.1000-3428.0055783
    Null Instantiation (NI) recognition and filling is the process of finding fillers for the missing semantic roles of a given sentence from the discourse context, but existing methods that use classification to predict the correct filler in a candidate set limit NI filling performance. To address this problem, this paper proposes a new NI recognition and filling method. It combines heuristic rules with the decision tree algorithm to identify the NI contents to be filled, and collects the contents filled with frame elements in the context to form a candidate set. An improved SMOTE algorithm is then used to augment the minority class samples and resolve the data imbalance in the candidate set. On this basis, semantic similarity features are extracted from the knowledge base of Chinese FrameNet (CFN), and the mapping relationships between frame elements are used to improve filling performance. Experimental results show that this method increases the final F value by about 12% by relieving the imbalance of filling samples at the data level.
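    The paper uses an improved SMOTE variant; as a reference point, the standard SMOTE oversampling it builds on can be sketched as follows with the imbalanced-learn package, using made-up feature vectors for the candidate fillers.

```python
import numpy as np
from imblearn.over_sampling import SMOTE

# hypothetical candidate-filler features and labels (1 = correct filler, 0 = not)
X = np.random.rand(200, 16)
y = np.array([1] * 20 + [0] * 180)          # heavily imbalanced, as in NI filling

# synthesize minority samples by interpolating between nearest minority neighbours
X_res, y_res = SMOTE(k_neighbors=5, random_state=0).fit_resample(X, y)
print(np.bincount(y_res))                   # minority class brought up to parity
```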
  • NI Zhiwen, MA Xiaohu, SUN Xiao, BIAN Lina
    Computer Engineering. 2020, 46(3): 87-92,98. https://doi.org/10.19678/j.issn.1000-3428.0053928
    Feature engineering has a significant effect on the performance of machine learning algorithms, but its cost continues to grow with the scale of Internet data. In order to reduce the dependence on feature engineering, this paper proposes a fusion model that combines explicit and implicit feature interactions. Sparse structural elements and residual elements are combined to extract implicit features, while explicit features are learned using a compressed interaction network. The explicit feature vectors are then fused with the implicit feature vectors in the last fully connected layer. Experimental results on four different datasets show that the proposed model has better feature extraction performance than PNN, DCN and other models.
  • KANG Yan, YANG Qiyue, LI Hao, LIANG Wentao, LI Jinyuan, CUI Guorong, WANG Peiyao
    Computer Engineering. 2020, 46(3): 93-98. https://doi.org/10.19678/j.issn.1000-3428.0053717
    Traditional text classification methods use only one model for classification, so they easily ignore the overlap of feature words between different categories, which affects classification performance. To improve the accuracy of text classification, this paper proposes a text classification algorithm based on topic similarity clustering. The algorithm combines CHI with WordCount to extract category feature words, then performs clustering with the K-means algorithm and extracts cluster feature words to construct a cluster feature word library. On this basis, an adaptive strategy algorithm is used to choose the fasttext, TextCNN or RCNN model for classification and obtain the final result. Experimental results on the AG News dataset show that the proposed algorithm better resolves the overlap of feature words between categories, and significantly improves text classification performance compared with the fasttext, TextCNN and RCNN models used alone.
  • HENG Hongjun, LIU Jing
    Computer Engineering. 2020, 46(3): 99-104. https://doi.org/10.19678/j.issn.1000-3428.0055186
    Existing outlier detection models cannot accurately analyze abnormal driving behavior. To address this problem, this paper builds a driving outlier detection model for multidimensional time series based on an autoencoder and the isolation forest algorithm. The model uses sliding windows to compute the norm of the original multidimensional time series, the change rate of the norm and related statistics as data features. The feature data is reconstructed by an autoencoder, and on this basis the isolation forest algorithm is applied to detect outliers. Experimental results show that the proposed model generally outperforms other outlier detection models such as LOF, OCSVM, iForest and LSTM-AE, increasing the recall rate and F1 value by at least 6% and 2.4% respectively.
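    A minimal sketch of the two-stage idea (reconstruct window features, then score them with isolation forest) is shown below. The dimensions, window statistics and hyper-parameters are placeholders, and a small scikit-learn multilayer perceptron stands in for the paper's autoencoder.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor   # small autoencoder stand-in
from sklearn.ensemble import IsolationForest

X = np.random.rand(1000, 12)                 # per-window statistics of the series

# 1) reconstruct the window features through a bottleneck network
ae = MLPRegressor(hidden_layer_sizes=(8, 3, 8), max_iter=2000, random_state=0)
ae.fit(X, X)
X_rec = ae.predict(X)

# 2) run isolation forest on the reconstructed representation
iforest = IsolationForest(n_estimators=100, random_state=0).fit(X_rec)
scores = iforest.decision_function(X_rec)
outliers = np.where(scores < np.quantile(scores, 0.05))[0]   # lowest-scored windows
```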
  • Cyberspace Security
  • SONG He, WANG Xiaofeng
    Computer Engineering. 2020, 46(3): 105-113. https://doi.org/10.19678/j.issn.1000-3428.0054002
    The Low-rate Distributed Denial-of-Service (LDDoS) attack, as a complex large-scale network attack, is one of the major threats faced by modern networks. Studying LDDoS attack and defense methods requires an emulation platform that improves emulation fidelity while ensuring emulation scale. Therefore, based on lightweight virtualization, this paper proposes an LDDoS emulation method targeting BGP connections. The emulation architecture is built by combining network topology construction, attack scene configuration, and data acquisition and analysis, and an implementation of this architecture based on lightweight virtualization technology is presented. Experimental results show that, compared with the GTNetS and GNS3 methods, the proposed method offers high fidelity, strong scalability and large emulation scale. With this method, a single physical server can construct an LDDoS emulation scene with four hundred routing nodes, providing an emulation technology foundation for the research of large-scale LDDoS attack and defense strategies.
  • REN Dezhi, CHEN Juguang, WANG Yong, DUAN Xiaoran, HAO Yujie, WU Xiaohua
    Computer Engineering. 2020, 46(3): 114-119,128. https://doi.org/10.19678/j.issn.1000-3428.0054001
    In data outsourcing services, Spatial Polynomial Function (SPF) query authentication can ensure the authenticity of the query information returned to users, so it has high application value. In order to solve the problem of the high communication cost of the Inverted File (IF) in the MIR tree, this paper replaces the IF with a bitmap and constructs the MRH tree, a data index structure supporting query authentication. On this basis, a generation algorithm for the Verification Object (VO) is presented to verify query results. Experimental results show that, compared with the MIR tree, the MRH tree significantly reduces the communication overhead and computing time of queries while ensuring the reliability, correctness and integrity of query results.
  • CAO Suzhen, DU Xialing, WANG Youchen, LIU Xueyan
    Computer Engineering. 2020, 46(3): 120-128. https://doi.org/10.19678/j.issn.1000-3428.0054493
    To solve the problems of uncontrollable search behavior, untrusted search results and limited search semantics in existing searchable encryption schemes, this paper proposes a verifiable attribute-based multi-keyword ranked retrieval scheme under the multi-server model. In this scheme, a multi-dimensional B+ tree is constructed as the index storage structure to store the index and ciphertext separately. Subtrees of low relevance are pruned in advance, so as to realize fast multi-keyword ranked search. Attribute-based encryption is used to authorize search behavior, and the retrieval results are verified by an authorized verification server to ensure their correctness. Security and efficiency analysis shows that, based on the DL assumption and the q-BDHE assumption, the scheme resists chosen plaintext attacks and keyword guessing attacks in the random oracle model, and effectively reduces computational cost.
  • ZHANG Haijun, CHEN Yinghui
    Computer Engineering. 2020, 46(3): 129-137,143. https://doi.org/10.19678/j.issn.1000-3428.0053360
    In this paper, methods similar to image processing and vectorization are used for the vectorization of access traffic corpus big data, and intelligent detection of cross-site scripting attacks in big data is achieved. The paper applies image-processing-like methods for data acquisition, data cleaning, data sampling and feature extraction. Then, a word vectorization algorithm based on a neural network is designed to obtain word-vectorized big data. On this basis, DCNN intelligent detection algorithms of different depths are proposed. Finally, experiments with different hyper-parameters are conducted, and the obtained average recognition rate, variance and standard deviation show that the proposed algorithm has a high recognition rate and good stability.
  • JIN Ye, DING Xiaobo, GONG Guoqiang, Lü Ke
    Computer Engineering. 2020, 46(3): 138-143. https://doi.org/10.19678/j.issn.1000-3428.0054407
    Existing k-degree anonymous privacy protection methods usually damage the graph structure significantly and cannot resist attacks based on structural background knowledge. To address this problem, this paper proposes an improved k-degree anonymous privacy protection method. The method introduces the concept of community and divides nodes into two types: nodes within a community and edge nodes that connect communities. The importance of nodes is thus differentiated, and degree anonymity is applied to the nodes within communities while community sequence anonymity is applied to the edge nodes, thereby achieving k-degree anonymity for the entire social network. Experimental results show that the proposed method reduces the loss of data utility and can resist attacks that take node degree and community relationships as background knowledge, so privacy protection is enhanced.
  • DING Longbin, WU Zhongdong, SU Jiali
    Computer Engineering. 2020, 46(3): 144-150. https://doi.org/10.19678/j.issn.1000-3428.0053018
    In practical applications, Convolutional Neural Network (CNN)-based intrusion detection methods suffer from long training time, a large number of hyper-parameters, and a large amount of required data. In order to reduce training complexity and improve the efficiency of intrusion detection, this paper proposes a detection method based on the Ensemble Deep Forest (EDF). On the basis of analyzing the hidden layer structure of CNN and the Bagging strategy of ensemble learning, the method constructs Random Forest (RF) layers, trains each layer on features randomly selected from the layer input, splices the output class vectors with the feature vectors, and passes them to the next layer, continuing until the model converges. Experimental results on the NSL-KDD dataset show that, compared with the CNN algorithm, the EDF algorithm improves the convergence speed by more than 50% while maintaining classification accuracy, which proves its efficiency and feasibility.
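    The layer-wise idea, in which each layer's class-probability vectors are spliced with the original features and passed to the next layer, can be sketched as below. This is a generic cascade-forest illustration under assumed settings; the paper's random feature selection within each layer and its convergence check are not reproduced.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_predict

def cascade_forest(X, y, n_layers=3, trees=100):
    """Each layer appends out-of-fold class probabilities to the raw features."""
    feats, models = X, []
    for _ in range(n_layers):
        rf = RandomForestClassifier(n_estimators=trees, random_state=0)
        proba = cross_val_predict(rf, feats, y, cv=3, method="predict_proba")
        rf.fit(feats, y)
        models.append(rf)
        feats = np.hstack([X, proba])        # splice class vectors with raw features
    return models

models = cascade_forest(np.random.rand(300, 20), np.random.randint(0, 2, 300))
```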
  • Mobile Internet and Communication Technology
  • FAN Qiaoling, JIA Xiangdong, JI Pengshan, LU Yi
    Computer Engineering. 2020, 46(3): 151-156. https://doi.org/10.19678/j.issn.1000-3428.0054068
    Aiming at the unbalanced load of the Uplink (UL) and Downlink (DL) in Heterogeneous Networks (HetNet), this paper proposes a Decoupling UL and DL Association (DUDA) scheme. The probability density functions of the primary and secondary UL access distances are derived by simplifying the association condition of Dual Connectivity (DC). The general form of the DUDA UL coverage probability is calculated using stochastic geometry tools, thus obtaining the UL Average Coverage Probability (ACP) of the whole network. Numerical and simulation results show that the coverage performance of the DUDA scheme is better than that of the traditional coupled association (CUDA) scheme.
  • DU Gang, ZHANG Shanwen, QIU Lijun
    Computer Engineering. 2020, 46(3): 157-162,171. https://doi.org/10.19678/j.issn.1000-3428.0053662
    Existing node localization algorithms for Wireless Sensor Networks (WSN) are not suitable for environments with frequent topological changes, as the rapid motion of nodes reduces localization accuracy. To address this problem, this paper proposes a WSN signal detection algorithm based on motion trajectory capture and a coupled orthogonal coverage mechanism. The algorithm captures the radio frequency intensity of the anchor nodes to cover their motion trajectories, and obtains the best-performing anchor node and its coordinates, so that inaccurate localization caused by anchor node failure and weak signal strength can be mitigated. It then constructs a motion trajectory capture method based on the Lagrange interpolation function, using vertical and horizontal coordinates to precisely capture the motion vectors of nodes, and preliminarily predicts the next-moment coordinates of a node with controllable precision, so as to optimize the regional coverage of the anchor node over moving nodes. Orthogonal coverage is also adopted to design a coverage optimization method based on a filtering mechanism, improving the precision of coordinate sampling in the covered area and of network node localization. Simulation results show that the proposed algorithm has better dynamic path capture performance and higher coordinate localization accuracy than the 2S-HGR and TDLM mechanisms.
  • HAN Yijing, ZENG Fangling, WANG Haibing
    Computer Engineering. 2020, 46(3): 163-171. https://doi.org/10.19678/j.issn.1000-3428.0054573
    Half data modulation means that the signal sent by each node is modulated with two pseudo codes, an in-phase pseudo code and an orthogonal pseudo code. The in-phase channel carries no data information while the orthogonal channel modulates the data information, and the pseudo-random codes in the two channels are mutually orthogonal. According to the signal pattern of half data modulation, this paper proposes joint acquisition and joint tracking algorithms for baseband processing. In the acquisition phase, the acquisition probability and acquisition time of single channel acquisition are compared with those of other acquisition algorithms such as coherent joint, incoherent joint and differential coherent joint acquisition. In the code tracking and carrier tracking phase, the tracking error covariance of the single channel tracking and joint tracking algorithms in different interference scenarios is compared, so as to select the best acquisition and tracking algorithms. Simulation results show that the joint acquisition algorithm has the highest acquisition probability and the shortest average acquisition time, and that the joint tracking algorithm of the code loop and carrier tracking loop offers high tracking accuracy and reliability.
  • YING Wen, ZHOU Jie
    Computer Engineering. 2020, 46(3): 172-177. https://doi.org/10.19678/j.issn.1000-3428.0055071
    To study the characteristics of wireless Multiple Input Multiple Output (MIMO) system channels, this paper proposes a three-dimensional elliptic channel model for outdoor environments that considers the complexity and spatiality of signal transmission. With a Uniform Rectangular Array (URA) at the sending and receiving ends, the model is used to analyze the performance of the MIMO antenna system and to derive expressions for the probability density functions of the Angle of Arrival (AOA) and the Time of Arrival (TOA). The factors that affect the Spatial Correlation (SC) and capacity of the URA are also studied. Theoretical analysis and experimental results show that in a URA-based elliptic channel model in three-dimensional space, the Azimuth Spread (AS) is the main factor affecting spatial correlation, while the antenna spacing at the sending and receiving ends also affects the antenna system capacity. The results provide an important reference for extending the application of three-dimensional channel models and analyzing antenna array sensitivity.
  • PENG Daqing, LI Jing
    Computer Engineering. 2020, 46(3): 178-183,191. https://doi.org/10.19678/j.issn.1000-3428.0053961
    The Narrow Band Internet of Things (NB-IoT) features low cost, low energy consumption, massive connections and wide coverage. However, its low complexity and strong penetration fading reduce localization precision. This paper proposes a fingerprint matching localization algorithm based on the amplitude of Channel State Information (CSI) and the Narrowband Reference Signal Received Power (NRSRP). The algorithm uses the CSI amplitude and NRSRP to construct fingerprints offline, and collects the fingerprint information of the terminal to be localized online. The K-Nearest Neighbor (KNN) algorithm is used to obtain the K nearest neighboring points, and the NRSRP information of the terminal and the K neighboring points is used to estimate distance differences with a wireless channel transmission model. On this basis, the maximum likelihood estimation algorithm is used to obtain the final estimated position. Experimental results show that, compared with KNN, WKNN and other algorithms, the proposed algorithm effectively reduces localization errors and improves localization precision.
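    The offline/online fingerprint matching step can be illustrated with a plain K-nearest-neighbour lookup over stored fingerprints; the distance-difference and maximum-likelihood refinement described in the abstract is omitted, and all data shapes here are assumptions.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

# offline fingerprint database: one row per reference point
fingerprints = np.random.rand(500, 64)       # e.g. CSI amplitudes plus NRSRP per point
positions = np.random.rand(500, 2)           # known (x, y) of each reference point

knn = NearestNeighbors(n_neighbors=4).fit(fingerprints)

def coarse_position(online_fp):
    """Online phase: average the coordinates of the K nearest fingerprints."""
    dist, idx = knn.kneighbors(online_fp.reshape(1, -1))
    return positions[idx[0]].mean(axis=0)    # the paper refines this via ML estimation

print(coarse_position(np.random.rand(64)))
```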
  • SUN Zeyu, YAN Ben, NIE Yalin, LIU Baoluo, JIA Fuqian, LAI Chunxiao
    Computer Engineering. 2020, 46(3): 184-191. https://doi.org/10.19678/j.issn.1000-3428.0053550
    To address frequent interruptions of communication caused by large amounts of redundant data in sensor networks, this paper proposes an optimized clustering routing algorithm with controllable threshold parameters. The algorithm introduces the fitness function and heuristic function of the ant colony algorithm to make the selection of the next-hop cluster head node more targeted, and realizes the establishment of the network routing tree and distributed clustering of event-domain nodes. Controllable threshold parameters and a variation coefficient are then used to optimize the shortest path selected by network routing, so that the energy consumption of nodes is reduced and network delay is minimized. Finally, the algorithm uses a global pheromone update strategy to suppress the generation of long links, balance network energy and extend network lifetime. Experimental results show that, compared with the DMOA and MTTA algorithms, the proposed algorithm improves the suppression of network energy consumption and the extension of network lifetime by 13.72% and 12.06% respectively.
  • Computer Architecture and Software Technology
  • YIN Jiabao, ZHU Tao, CUI Kaihua
    Computer Engineering. 2020, 46(3): 192-197. https://doi.org/10.19678/j.issn.1000-3428.0054506
    In order to realize Controller Area Network (CAN) bus communication for the Loongson 3A3000 motherboard under VxWorks, an eight-channel CAN communication board based on the PCI bus is designed using the SJA1000T, and the corresponding driver design and optimization scheme are proposed. The design optimizes the driver for the Loongson 3A3000 processor by disabling the CAN interrupt and using query mode to send data, and by traversing all channels in the interrupt service routine when receiving data, so as to improve the utilization of interrupts. When creating device functions, each CAN communication board is identified from its PCI bus information to ensure that different CAN channels in the system have unique channel numbers. Experimental results show that the driver scheme runs stably and that data transmission is safe and reliable. After optimization, it effectively reduces the number of interrupts of the CAN communication board, improves the communication speed of the CAN bus, and prevents normal boards from being interfered with by failed boards in a multi-board environment, which improves the robustness of the system.
  • WANG Jin, ZUO Chun, ZHANG Zheng
    Computer Engineering. 2020, 46(3): 198-205,213. https://doi.org/10.19678/j.issn.1000-3428.0054162
    In order to improve the quality of industry software built from sample programs, this paper proposes an automated testing tool based on the analysis of sample program content and domain data. By analyzing the standardized skeleton comments in sample programs, a rule base is used to extract the element positioning parameters and business process identifiers needed for automated testing, and business data is extracted from the domain data accordingly. On this basis, a code engine is used to automatically generate test scripts. Experimental results show that the tool can quickly test and modify the business procedures of industry software based on sample programs, outperforming the commonly used QTP testing tool in terms of test efficiency and script accuracy.
  • WANG Yuqi, GAO Jianhua
    Computer Engineering. 2020, 46(3): 206-213. https://doi.org/10.19678/j.issn.1000-3428.0054962
    Web statistical testing can ensure the quality of Web applications, and test cases are the key factor in improving software reliability. Therefore, based on association rules, this paper proposes a method for generating Web statistical test cases and measuring system reliability. Information is extracted from Web server logs and saved through a hash table into a customized data structure, Note. Association rules are used to mine the Notes to obtain users' frequent access sequences, which are further modeled with a Markov model. On this basis, the roulette algorithm is used to generate test cases, the Nelson model is used to evaluate system reliability on these cases, and the MTBF is taken as the evaluation index. Experimental results show that the MTBF of the test cases generated by this method is close to that of the real environment, which verifies the effectiveness of the method.
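    The roulette-wheel step that turns a Markov usage model into test paths can be sketched as below; the transition matrix and page names are invented for illustration and do not come from the paper.

```python
import random

# hypothetical Markov usage model mined from server logs
transitions = {
    "home":   {"search": 0.6, "login": 0.3, "exit": 0.1},
    "search": {"item":   0.7, "home":  0.2, "exit": 0.1},
    "login":  {"home":   0.8, "exit":  0.2},
    "item":   {"search": 0.5, "exit":  0.5},
}

def roulette(prob_dict):
    """Roulette-wheel selection: pick the next state proportionally to its probability."""
    r, acc = random.random(), 0.0
    for state, p in prob_dict.items():
        acc += p
        if r <= acc:
            return state
    return state                       # numerical safety net for rounding

def generate_test_case(start="home"):
    path, state = [start], start
    while state != "exit":
        state = roulette(transitions[state])
        path.append(state)
    return path

print(generate_test_case())            # one random usage path = one test case
```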
  • LI Jiawei, ZHANG Ji, ZHAO Juncai, DING Ruyi
    Computer Engineering. 2020, 46(3): 214-221,228. https://doi.org/10.19678/j.issn.1000-3428.0056116
    Routing selection algorithms are one of the important factors affecting transmission performance in serial RapidIO networks. Aiming at the non-optimal path allocation of depth-first search in Serial Rapid Input and Output (SRIO) networks, this paper proposes a load-balancing shortest path routing algorithm. The algorithm enumerates the nodes in the SRIO network by Breadth First Search (BFS), establishes network topology information, and defines the routing cost by hop count. An improved Floyd-Warshall algorithm is used to calculate and save the K shortest paths between switching nodes. The link load is defined through the concepts of expected load and the number of routing paths on a link, and load balancing is used to select paths from the K shortest paths, so as to establish load-balanced routing under the shortest-path constraint of the SRIO network. Experimental results show that, compared with algorithms such as depth-first traversal routing configuration and minimum hop count, the proposed algorithm performs better in terms of average network transmission hops, average link load and link load balancing, and can effectively improve the stability of the SRIO network.
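    The hop-count shortest-path computation underlying this algorithm can be sketched with the textbook Floyd-Warshall recurrence; the K-shortest-path extension and the load-balancing selection described in the abstract are not reproduced here, and the small adjacency matrix is invented.

```python
import numpy as np

def floyd_warshall_hops(adj):
    """All-pairs shortest hop counts for an adjacency matrix (1 = direct link)."""
    n = len(adj)
    dist = np.where(adj == 1, 1.0, float("inf"))
    np.fill_diagonal(dist, 0.0)
    for k in range(n):                       # relax every pair through intermediate k
        for i in range(n):
            for j in range(n):
                if dist[i, k] + dist[k, j] < dist[i, j]:
                    dist[i, j] = dist[i, k] + dist[k, j]
    return dist

adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 1],
                [0, 1, 0, 1],
                [0, 1, 1, 0]])
print(floyd_warshall_hops(adj))
```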
  • ZHANG Hongwei, LI Xiaohuan, LI Chunhai, YAO Rongbin, TANG Xin
    Computer Engineering. 2020, 46(3): 222-228. https://doi.org/10.19678/j.issn.1000-3428.0054451
    Memory pre-copy migration suffers from repeated transmission of dirty pages under intensive load, which requires a large number of iterations and greatly reduces overall migration performance. Probabilistic prediction of dirty pages can effectively reduce their repeated transmission, but existing research on dirty page prediction only focuses on temporal correlation while ignoring the spatial correlation between memory pages. To address this problem, this paper proposes a pre-copy migration strategy based on memory association analysis. The dirty page rate is used to predict the probability of a page becoming dirty in the next round. The Memory_cor algorithm is designed to calculate the association rules and associated memory pages of dirty pages, so as to avoid transmitting memory pages with a high dirtying probability as well as their associated memory pages. Experimental results show that the proposed strategy is superior to the Xen pre-copy migration method in terms of total migration time and downtime.
  • Graphics and Image Processing
  • LI Ya, ZHANG Yu'nan, PENG Cheng, YANG Junqin, LIU Miao
    Computer Engineering. 2020, 46(3): 229-236. https://doi.org/10.19678/j.issn.1000-3428.0054327
    To address the complex models and slow recognition speed of traditional deep convolutional neural networks, this paper proposes a face attribute recognition method based on multi-task learning. The underlying network is constructed from lightweight residual modules. According to the correlations between attribute classes, shared branch networks are designed to greatly reduce the number of network parameters and the computation cost. The parameters of the branch networks and the underlying network are then jointly optimized in a multi-task learning manner, and the shared features of correlated attributes are exploited to achieve better recognition. The weighted cross entropy is used as the loss function to supervise training of the network model, so as to alleviate the imbalance between positive and negative samples. Experimental results on the public CelebA dataset show that the recognition error rate of the proposed method is reduced to 8.45% and its space cost is only 2.7 MB. Moreover, the prediction time per image on a CPU is reduced to 15 ms, making the method suitable for resource-limited portable devices and valuable in real-life applications.
  • ZHANG Qiang, ZHANG Yong, LIU Zhiguo, ZHOU Wenjun, LIU Jiahui
    Computer Engineering. 2020, 46(3): 237-245,253. https://doi.org/10.19678/j.issn.1000-3428.0054222
    Hand gesture recognition methods based on artificial modeling suffer from low accuracy and slow speed. Therefore, this paper proposes a static hand gesture recognition method based on an improved YOLOv3. Using the convolutional neural network YOLOv3 model, the commonly used RGB images are replaced with the IR, registered RGB, RGB and depth images collected by a Kinect device as the dataset, and the recognition results of these four image types are fused to improve recognition accuracy. The k-means clustering algorithm is used to optimize the initial anchor box parameters in YOLOv3 to improve recognition speed, and transfer learning is used to improve the basic feature extractor and shorten the training time of the model. Experimental results show that for the recognition of static hand gestures in video streams, the mean Average Precision (mAP) of the proposed method is 99.8% and the recognition speed reaches 52 FPS. The training time of the proposed model is 12 hours, and its recognition accuracy and speed are better than those of other deep learning methods such as Faster R-CNN, SSD and YOLOv2.
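    The anchor-box optimisation mentioned here is typically done by clustering the widths and heights of labelled boxes with k-means; the sketch below uses plain Euclidean k-means from scikit-learn as a simplification of the IoU-distance variant usually paired with YOLO, on invented box sizes.

```python
import numpy as np
from sklearn.cluster import KMeans

# hypothetical (width, height) of annotated hand boxes, normalised to [0, 1]
boxes_wh = np.random.rand(2000, 2) * 0.4 + 0.05

# YOLOv3 uses 9 anchors; Euclidean k-means stands in for the IoU-distance variant
kmeans = KMeans(n_clusters=9, n_init=10, random_state=0).fit(boxes_wh)
centers = kmeans.cluster_centers_
anchors = centers[np.argsort(centers.prod(axis=1))]   # sort small to large by area
print(anchors)    # these would be written into the YOLO anchor configuration
```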
  • TAO Fei, CHENG Keyang, ZHANG Jianming, TANG Yuhao
    Computer Engineering. 2020, 46(3): 246-253. https://doi.org/10.19678/j.issn.1000-3428.0054092
    Pedestrian re-identification is an important research branch of the image recognition field. While many breakthroughs have been made, many challenges remain in real-world applications. Differences in camera equipment and shooting scenes, together with the influence of clothing, scale, partial occlusion, posture and so on, on pedestrian appearance make pedestrian re-identification more difficult. Therefore, this paper proposes a pedestrian re-identification method. The method marks pedestrian attribute information through a parallel posture-based attribute learning task, and the labeled attribute information is then integrated into the pedestrian re-identification task as semantic attributes. The method reduces the impact of missing attributes on the model in real scenes and accelerates the training process. Experimental results show that the method achieves a 90% recognition accuracy on the VIPeR dataset.
  • XI Runping, XUE Shaohui
    Computer Engineering. 2020, 46(3): 254-260,266. https://doi.org/10.19678/j.issn.1000-3428.0054295
    The performance evaluation of existing moving target tracking algorithms has many drawbacks, such as massive amounts of data, redundant tests and insufficient consideration of algorithm performance under multi-factor situations. Therefore, this paper proposes a performance evaluation method for moving target tracking algorithms based on orthogonal tests. After a full analysis of the factors and levels that affect algorithm performance, an orthogonal test dataset is built and used to test the algorithm. The results are analyzed with the range analysis method to obtain the relationships between the factors affecting the algorithm as well as the combinations of factor levels under which the algorithm performs well. Experimental results show that the proposed method can evaluate the performance of moving target tracking algorithms in a comprehensive and effective way, reduces the number of tests and the data volume, and provides a reference for the performance evaluation of other image processing algorithms.
  • XUE Zhixin, ZHENG Yinghao, XIAO Jian, WEI Lingling
    Computer Engineering. 2020, 46(3): 261-266. https://doi.org/10.19678/j.issn.1000-3428.0054590
    The traffic sign recognition algorithm based on multi-column Convolutional Neural Networks (CNN) achieves an ideal recognition rate, but its recognition and training times are long, which limits its practicability. Therefore, a road traffic sign detection model based on a multi-scale CNN is constructed. By improving the base feature extraction network of the single-scale CNN, the features generated by different layers of the network are fused into multi-scale features and provided to the classifier, so as to improve the utilization of lower-level features. Experimental results on the GTSRB dataset show that the traffic sign recognition rate of the model is 99.25%. Compared with the multi-column CNN model, while maintaining high accuracy, the recognition and training times decrease by more than 90%, making the model more suitable for accurate detection of traffic signs under real road conditions.
  • MEI Xuzhang, JIANG Hong, SUN Jun
    Computer Engineering. 2020, 46(3): 267-272,279. https://doi.org/10.19678/j.issn.1000-3428.0054379
    The structural information of retinal vessels assists in the diagnosis of ophthalmic diseases, so efficient and accurate segmentation of retinal vessel images has become an urgent clinical demand. Traditional manual segmentation methods are time-consuming and frequently affected by personal subjective factors, leading to a decline in segmentation quality. To address this problem, this paper proposes an automatic image segmentation algorithm based on a dense attention network. The algorithm combines the basic encoder-decoder structure of the fully convolutional neural network with densely connected networks to fully extract the features of each layer, and introduces attention gate modules on the decoder side of the network to suppress unnecessary features and thus improve the accuracy of retinal vessel segmentation. Experimental results on the DRIVE and STARE fundus image datasets show that, compared with other deep learning based algorithms, the proposed algorithm has excellent segmentation performance, with the sensitivity, specificity, accuracy and AUC value all improved.
  • LIU Yande, ZENG Tiwei, CHEN Dongbin, WANG Guantian
    Computer Engineering. 2020, 46(3): 273-279. https://doi.org/10.19678/j.issn.1000-3428.0054445
    The detection performance of the ViBe algorithm is poor when dynamic background interference, ghost regions or stationary targets appear in video surveillance. To address this problem, this paper proposes an improved ViBe algorithm. First, the Otsu algorithm is used to obtain a dynamic threshold for each image, enhancing the ability to resist dynamic background interference. At the same time, region similarity is used to identify ghost regions, smears and stationary target areas, and the pixel updates of these different kinds of regions are suppressed adaptively. Experimental results show that the improved ViBe algorithm is robust against dynamic backgrounds and effectively suppresses ghost regions and the smears generated by stationary targets. The proposed algorithm improves detection accuracy by 0.309 while keeping real-time performance, and increases the overall evaluation index F value by 0.2, showing better detection performance.
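    The dynamic threshold step borrowed from Otsu can be sketched directly with OpenCV; how the threshold then feeds into the ViBe matching radius is an assumption here, and the per-frame ViBe update and region-similarity logic of the paper are not shown.

```python
import cv2
import numpy as np

frame = (np.random.rand(240, 320) * 255).astype(np.uint8)   # stand-in grey frame

# Otsu picks the threshold that maximises between-class variance of the histogram
otsu_thresh, _ = cv2.threshold(frame, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# hypothetical use: adapt the ViBe matching radius from this per-frame value
radius = max(10, int(0.1 * otsu_thresh))
print(otsu_thresh, radius)
```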
  • Development Research and Engineering Application
  • HU Yanan, LI Chunsheng, ZHANG Kejia, FU Yu
    Computer Engineering. 2020, 46(3): 280-291. https://doi.org/10.19678/j.issn.1000-3428.0053931
    To address fault diagnosis tasks with heavy computation and complex fault causes in industrial production, this paper proposes a rule activation multi-objective optimization algorithm based on MPSO/D. First, the Extended Belief Rule Base (EBRB) system is used to decompose the task. Next, the objective functions of the problem are formulated with the inconsistency of the activated rules and the sum of activation weights as the multiple objectives. The MPSO/D algorithm is then used to obtain a set of activated rules with minimal inconsistency, so as to improve inference accuracy. Experimental results on standard test functions and the fault diagnosis example of the well pump for ternary compound flooding show that the proposed algorithm can effectively improve the reasoning ability of the EBRB system and the accuracy of task decomposition for virtual guidance.
  • HUANG Hehe, ZENG Yuanyuan, ZHANG Yi, NAI He
    Computer Engineering. 2020, 46(3): 292-298,308. https://doi.org/10.19678/j.issn.1000-3428.0055439
    With the popularization of intelligent communication devices and the improvement of the positioning accuracy of communication base stations, it becomes feasible to monitor and predict crowd density using the user behavior data recorded by base stations. However, the prediction performance of the commonly used methods based on time series and probability models is reduced by the suddenness of crowd gathering events. To address this problem, this paper proposes a prediction method based on group behavior analysis. By analyzing the online behavior of crowds and the behavioral features of crowds moving between base stations, their correlation is obtained. On this basis, combined with the time series information of the crowd density at stations, the prediction result is obtained using a dilated causal convolutional neural network and a logistic regression model. Experimental results on an online behavior record dataset of mobile phone users provided by operators show that the accuracy of this prediction method is 0.93 and the recall rate is 0.97, significantly better than the ARIMA, LSTM and XGBoost algorithms, proving that introducing users' online behavior and movement features can effectively improve the accuracy of abnormal crowd aggregation prediction.
  • LIU Yi, MEI Yupeng, LI Guoyan, PAN Yuheng
    Computer Engineering. 2020, 46(3): 299-308. https://doi.org/10.19678/j.issn.1000-3428.0054613
    For nonlinear networked switched systems with perturbed controller gains, this paper adopts the T-S model to study the stability control problem under the coexistence of random time-varying delay and uncertainty. The switching law and a non-fragile feedback controller are designed using the Average Dwell Time (ADT) method. The ADT condition for exponential stability of the networked switched fuzzy time-delay system is given, and the matrix inequality condition is derived by combining the Lyapunov-Krasovskii Functional (LKF) method. The matrix inequality condition is then transformed into Linear Matrix Inequality (LMI) form, and the state curves of the system under the ADT method and the LKF method are compared through numerical simulation. The results show that the ADT method can improve the convergence speed and performance index of the system.
  • HAN Yunxiao, SHAO Qing, FU Yuxiang, GUO Qing
    Computer Engineering. 2020, 46(3): 309-314. https://doi.org/10.19678/j.issn.1000-3428.0053966
    To improve the accuracy of speech signal endpoint detection in complex noise environments, this paper proposes a multidimensional-feature speech endpoint detection algorithm based on MFCC distance. By calculating the MFCC distance of the speech signal and combining it with the short-time energy and short-time zero-crossing rate, the algorithm corrects the feature distance, updates the threshold and establishes an adaptive noise model to achieve speech endpoint detection under complex noise. Experimental results show that, at the same computational efficiency, the proposed algorithm has higher detection accuracy than the two classic detection algorithms based on double-threshold energy and cepstrum distance.
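    A per-frame MFCC distance against a noise template, as used by this kind of detector, can be sketched with librosa; the adaptive noise model, threshold updating, and the fusion with short-time energy and zero-crossing rate are left out, and the leading-frames-are-noise assumption and the threshold rule below are illustrative only.

```python
import numpy as np
import librosa

y, sr = librosa.load(librosa.ex("trumpet"))          # any mono signal works here
mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)   # shape: (13, n_frames)

noise_template = mfcc[:, :10].mean(axis=1)           # assume the first frames are noise
dist = np.linalg.norm(mfcc - noise_template[:, None], axis=0)

# crude frame-level decision: far-from-noise frames are treated as speech
speech_frames = dist > dist[:10].mean() + 2 * dist[:10].std()
```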
  • ZHANG Chunfu, WANG Song, WU Yadong, WANG Yong, ZHANG Hongying
    Computer Engineering. 2020, 46(3): 315-320. https://doi.org/10.19678/j.issn.1000-3428.0054109
    Diabetes is a metabolic chronic disease that cannot be completely cured, and early detection and treatment can reduce its risk. Machine learning models can effectively predict the disease and provide auxiliary diagnosis and treatment. Therefore, this paper proposes a GA_Xgboost model to predict diabetes risk. Based on the Xgboost algorithm, the method uses the good global search ability of the Genetic Algorithm (GA) to make up for the slow convergence of Xgboost, and an elite selection strategy is used to keep the best individuals in each generation. Experimental results show that the mean square error of GA_Xgboost in diabetes prediction is 0.606, so its prediction accuracy is better than those of the linear regression, decision tree, support vector machine and neural network models. Moreover, its parameter tuning time is 152 s, which is less than that of grid search and the random walk method.
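    The GA-driven hyperparameter search can be sketched with a tiny genetic loop around an XGBoost regressor. The population size, mutation rule and fitness (negative cross-validated MSE) are illustrative choices, not the paper's GA_Xgboost configuration, and the data here is synthetic.

```python
import random
import numpy as np
from xgboost import XGBRegressor
from sklearn.model_selection import cross_val_score

X, y = np.random.rand(300, 8), np.random.rand(300)

def fitness(ind):                       # ind = (max_depth, learning_rate, n_estimators)
    model = XGBRegressor(max_depth=ind[0], learning_rate=ind[1],
                         n_estimators=ind[2], verbosity=0)
    return cross_val_score(model, X, y, cv=3,
                           scoring="neg_mean_squared_error").mean()

pop = [(random.randint(2, 8), random.uniform(0.01, 0.3), random.randint(50, 300))
       for _ in range(10)]
for _ in range(5):                      # a few generations with elitist selection
    pop.sort(key=fitness, reverse=True)
    elite = pop[:3]                     # keep the best individuals unchanged
    children = [(max(2, e[0] + random.randint(-1, 1)),
                 min(0.3, max(0.01, e[1] * random.uniform(0.8, 1.2))),
                 max(50, e[2] + random.randint(-30, 30)))
                for e in elite for _ in range(3)]
    pop = elite + children[:7]          # population size stays at 10
print(pop[0])                           # best hyperparameters found
```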