
15 August 2017, Volume 43 Issue 8
    

  • ZHANG Hao,LIU Yuan,WANG Xiaofeng,JIANG Min
    Computer Engineering. 2017, 43(8): 1-7. https://doi.org/10.3969/j.issn.1000-3428.2017.08.001
    Aiming at the problem that the performance parameters of virtual links cannot be accurately emulated on the OpenStack cloud platform, a high-fidelity link emulation method is proposed. Virtual network nodes are emulated by multi-granularity virtualization technology, and the virtual links between them are constructed based on Software Defined Networking (SDN). According to the locations of the compute nodes on which the communicating virtual network nodes reside, intra-host and inter-host link emulation are realized respectively, so that flexible configuration and automatic deployment of the virtual link performance parameters (bandwidth, delay and packet loss rate) are supported. Experimental results show that the method can accurately emulate the performance parameters of virtual links and improve the fidelity of link emulation.
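    The bandwidth, delay and packet-loss parameters described above are exactly what Linux traffic control can impose on a virtual interface. A minimal sketch of how such a link configuration could be generated (the device name, `tbf` burst/latency values and the use of `tc`/`netem` here are assumptions for illustration; the paper's own deployment mechanism is not shown in the abstract):

```python
def netem_commands(dev, bandwidth_mbit, delay_ms, loss_pct):
    # One token-bucket qdisc for bandwidth, one netem qdisc for delay/loss.
    return [
        f"tc qdisc add dev {dev} root handle 1: tbf rate {bandwidth_mbit}mbit "
        f"burst 32kbit latency 400ms",
        f"tc qdisc add dev {dev} parent 1:1 handle 10: netem "
        f"delay {delay_ms}ms loss {loss_pct}%",
    ]

cmds = netem_commands("veth0", 100, 20, 0.1)
```

Automation would then run these commands on the compute node hosting each end of the emulated link.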
  • ZHU Jun,CHEN Linlin,ZHU Xian,XIE Ling,WEI Wei
    Computer Engineering. 2017, 43(8): 8-14. https://doi.org/10.3969/j.issn.1000-3428.2017.08.002
    This paper studies Proxy Re-Encryption (PRE) under Certificateless Public Key Cryptography (CL-PKC), gives a formal definition and security model for certificateless PRE, and constructs a new certificateless PRE scheme. The new scheme is proved secure against chosen-plaintext attack in the Random Oracle Model (ROM). With the proposed scheme, a cloud service provider can transform ciphertexts encrypted under one user's public key into ciphertexts that another user can decrypt with his own secret key, so that the other user can access the raw data and data sharing is realized. Analysis results show that the scheme can effectively ensure the safety and reliability of data in the cloud computing environment.
  • ZHANG Lei,ZHOU Jinhe,ZHANG Yuan
    Computer Engineering. 2017, 43(8): 15-20,25. https://doi.org/10.3969/j.issn.1000-3428.2017.08.003
    To improve the delivery efficiency of Content Delivery Network (CDN) in a cloud storage environment, this paper puts forward a cache resource allocation and pricing algorithm based on the Stackelberg game. It models the Web servers and cloud CDN service agents as a multi-leader multi-follower Stackelberg game and builds their respective utility functions. It also proves the existence of a Nash Equilibrium (NE) point for the Web servers when the CDN agents' prices are fixed. Finally, it uses a distributed iterative algorithm to solve the game model and finds the optimal price and the corresponding optimal cache allocation. Simulation results show that the proposed algorithm ensures that the Web servers' cache demands are efficiently allocated among agents. Compared with the user Quality of Service (QoS) priority algorithm, it enables the Web server to obtain a higher benefit per unit cost.
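    The leader-follower structure can be illustrated with a toy one-agent version of such a game. Here each Web server (follower) buys cache x to maximize b·ln(1+x) − p·x, giving the best response x = b/p − 1, and the agent (leader) iteratively adjusts its price until follower demand meets its cache capacity. The utility functions, benefit values and learning rate are illustrative assumptions, not the paper's model:

```python
def follower_best_response(b, p):
    # Maximizing b*ln(1+x) - p*x over x >= 0 gives x = b/p - 1 (clipped at 0).
    return max(0.0, b / p - 1.0)

def leader_price(benefits, capacity, p=1.0, lr=0.05, iters=2000):
    # Distributed iteration: followers respond to the posted price,
    # the leader nudges the price toward the capacity-clearing point.
    for _ in range(iters):
        demand = sum(follower_best_response(b, p) for b in benefits)
        p = max(1e-6, p + lr * (demand - capacity))
    return p

benefits = [4.0, 6.0, 8.0]          # hypothetical Web-server cache benefits
p = leader_price(benefits, capacity=10.0)
demand = sum(follower_best_response(b, p) for b in benefits)
```

At the fixed point, total demand 18/p − 3 equals the capacity 10, so the iteration converges to p = 18/13.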
  • TAO Linbo,SHEN Jianjing,LIU Bo,WEI Liang
    Computer Engineering. 2017, 43(8): 21-25. https://doi.org/10.3969/j.issn.1000-3428.2017.08.004
    Research on the security of cloud computing architectures mainly focuses on architecture design and technology improvements, and lacks overall, quantitative analysis of the architecture's environment and related factors. This paper abstracts the rules and factors of cloud computing architectures and, through probabilistic proof, identifies those that affect their security. A method for cutting data into blocks according to its length is given. Security analysis results show that the security of a cloud architecture is related to data classification; raising the data security level and cutting data into blocks, both of which increase the difficulty of data recovery, can improve the security of the cloud architecture.

  • YANG Peng,MA Zhicheng,PENG Bo,YAO Jianguo
    Computer Engineering. 2017, 43(8): 26-31. https://doi.org/10.3969/j.issn.1000-3428.2017.08.005
    Aiming at the problem that the virtual machines of OpenStack cloud platforms fail to make good use of host physical machine resources, this paper builds an OpenStack platform based on Docker container technology, combining the efficiency of OpenStack's component-based cloud platform management with the rapid deployment of Docker container virtualization. It tests the basic performance of the guest OSs in Docker containers on the cloud platform, including computing, scheduling, memory access and I/O performance, evaluates and analyzes the related performance, and compares it with that of traditional OpenStack virtual machines in a multi-instance running state. Analysis results show that the new cloud platform can optimize computation performance and file system I/O performance from a global perspective by using the lightweight virtualization capabilities of Docker containers.
  • WANG Xiaojie,XU Mingwei,WANG Sixiu,ZHU Yixin
    Computer Engineering. 2017, 43(8): 32-37. https://doi.org/10.3969/j.issn.1000-3428.2017.08.006
    Existing Virtual Machine (VM) placement research mostly aims at energy saving, reducing energy consumption by consolidating resources. However, excessive consolidation of resources may degrade network performance. Aiming at this problem, this paper studies the network-aware VM placement problem, analyzes the factors that influence VM placement, and proposes a two-phase heuristic VM placement algorithm. First, based on an analysis of the similarity between VMs, aggregation is performed to improve communication among VMs and reduce data center network traffic. Then, a modified knapsack algorithm is used to allocate VMs to physical hosts appropriately. Experimental results show that, compared with the Best Fit (BF) algorithm and a random algorithm, the proposed algorithm can optimize network traffic, reduce the number of activated physical hosts, and save energy more effectively.
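    The second phase, packing (possibly pre-aggregated) VMs onto as few hosts as possible, can be sketched with a first-fit-decreasing heuristic standing in for the paper's modified knapsack algorithm. The CPU demands and host capacity below are made-up numbers, and the similarity-based aggregation phase is omitted:

```python
def first_fit_decreasing(demands, capacity):
    # Sort items by decreasing demand, drop each into the first host that fits.
    order = sorted(range(len(demands)), key=lambda i: -demands[i])
    bins, loads = [], []
    for i in order:
        for b in range(len(bins)):
            if loads[b] + demands[i] <= capacity:
                bins[b].append(i)
                loads[b] += demands[i]
                break
        else:
            bins.append([i])       # no existing host fits: activate a new one
            loads.append(demands[i])
    return bins

demands = [4, 3, 3, 2, 2, 2]       # hypothetical CPU demands of 6 VMs
bins = first_fit_decreasing(demands, capacity=8)
```

Fewer activated hosts directly translates into the energy saving the abstract targets.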
  • SUN Xu,WEN Mi,ZHANG Xu,ZHOU Bo
    Computer Engineering. 2017, 43(8): 38-43. https://doi.org/10.3969/j.issn.1000-3428.2017.08.007
    To verify data integrity accurately and efficiently in a smart grid with cloud storage, this paper proposes a new scheme for dynamic data integrity verification. While protecting data confidentiality, the scheme applies BLS short signatures to the data and supports third-party auditing, so that the integrity of partial data can be verified and the computational cost reduced according to user requirements. Meanwhile, it uses Locality Sensitive Hashing (LSH) as a quick search method to improve the query efficiency of updating the stored data and to retrieve the data quickly while its integrity is being validated. Experimental results show that the proposed scheme can verify the integrity of power data accurately and supports efficient dynamic data updating.
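    The LSH lookup step can be sketched with random-hyperplane hashing: blocks whose feature vectors are close tend to fall into the same bucket, so a data update only probes one bucket instead of scanning the whole store. The 8-dimensional features, 12 hyperplanes and block identifiers are illustrative assumptions, not the paper's construction:

```python
import random

random.seed(7)
PLANES = [[random.gauss(0, 1) for _ in range(8)] for _ in range(12)]

def lsh_key(vec, planes=PLANES):
    # Sign of the projection onto each random hyperplane gives one bucket bit.
    return tuple(1 if sum(v * p for v, p in zip(vec, plane)) >= 0 else 0
                 for plane in planes)

index = {}

def store(block_id, feature):
    index.setdefault(lsh_key(feature), []).append(block_id)

def candidates(feature):
    # Only the matching bucket is scanned when a stored block is updated.
    return index.get(lsh_key(feature), [])

store("blk-1", [0.5, 0.1, -0.3, 0.8, 0.0, 0.2, -0.1, 0.4])
```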
  • SHI Baopeng,DUAN Xun,KONG Guangqian,WU Yun
    Computer Engineering. 2017, 43(8): 44-48,55. https://doi.org/10.3969/j.issn.1000-3428.2017.08.008
    The resource scheduling policies of cloud platforms are too simple to meet the needs of medical services effectively. Aiming at this problem, this paper analyzes the different resource demands of different medical systems and, on this basis, proposes the IB-Choose resource scheduling strategy. It builds a medical cloud platform comprising a doctor diagnosis and treatment system, a laboratory test system and an image archiving system on OpenStack, and implements the IB-Choose strategy on this platform. Experimental results show that, compared with Chance, the default resource scheduling policy in OpenStack, the IB-Choose policy shortens the service time of starting virtual machines by 25% to 30%. At the same time, it reduces cloud resource overhead and improves the utilization rate of cloud resources.
  • NIU Ruibiao,TANG Lun,CHEN Wan
    Computer Engineering. 2017, 43(8): 49-55. https://doi.org/10.3969/j.issn.1000-3428.2017.08.009
    The Quality of Experience (QoE) of mobile terminals is severely degraded by both the serious shortage of resources at the local service station and the limited resources of the mobile terminals themselves; furthermore, offloading tasks to a remote cloud introduces large time delay. This paper proposes a joint power and load allocation optimization algorithm for small cell clouds. It builds a Small Cell Cloud (SCC) based on channel quality and the remaining available computing resources, distributes the load (offloaded tasks) to the SCC accordingly, and uses a heuristic algorithm to seek an approximate suboptimal solution for the transmit power. Simulation results show that the algorithm can improve the utilization of radio and computing resources while enhancing user QoE.
  • HOU Xiaojing,JI Mengluo,HUANG Chenlin,SHU Yunxing,YAN Ben
    Computer Engineering. 2017, 43(8): 56-62,68. https://doi.org/10.3969/j.issn.1000-3428.2017.08.010
    Obtaining the program modes of a real-time control system from source code can not only verify the consistency of the design and implementation, but also improve the accuracy of Worst-Case Execution Time (WCET) calculation. Based on this consideration, this paper proposes an automatic analysis method for the program modes of real-time control systems. The Control Flow Graph (CFG) of the program is generated by analyzing the C source code, and an Input-dependent Control Flow Graph (ICFG) is formed by slicing the nodes that depend on input variables. A linear programming problem is constructed and solved for each path of the ICFG; if the problem has a solution, a potential program mode is obtained. On this basis, the WCET of a given mode of the program on a modern RISC processor is calculated. Experimental results for a benchmark program show that the proposed method is feasible and effective.
  • LI Shuangquan,DU Yajuan
    Computer Engineering. 2017, 43(8): 63-68. https://doi.org/10.3969/j.issn.1000-3428.2017.08.011
    Aiming at the shortcomings of existing real-time simulation platforms, such as inconvenient operation and low flexibility, this paper introduces the characteristics of the RT-LAB simulation platform and analyzes its advantages when applied to large-scale control systems. It builds a hardware-in-the-loop simulation system for space docking based on RT-LAB and develops the corresponding simulation software. It then downloads the programmed Matlab/Simulink mathematical model onto target machines through the RT-LAB platform and implements the simulation of space rendezvous and docking. Experimental results show that the proposed system can simulate the rendezvous and docking process of two spacecraft, and verify the accuracy of the mathematical model and the quality of the docking mechanism product, with high stability and reliability.
  • CHEN Xi,ZHU Jiantao,HE Xiaobin
    Computer Engineering. 2017, 43(8): 69-73. https://doi.org/10.3969/j.issn.1000-3428.2017.08.012
    Existing distributed file storage systems have problems such as inefficient data organization and redundant access semantics, which severely limit the performance of High Performance Computing (HPC) storage systems. Therefore, this paper presents an HPC-oriented distributed Object Storage System (COSS) based on the idea of object storage. It separates data access from data management to achieve more streamlined and efficient access semantics, uses a distributed global object data organization method, and designs a memory-based metadata management method to improve system performance. Experimental results show that, under large-scale concurrent access, COSS improves aggregate read and write bandwidth by 22.5% and 50.4% respectively compared with the Lustre system. The file creation and file deletion performance of COSS is respectively 2.15 times and 5.13 times that of Lustre. Meanwhile, COSS provides quasi-linear data read/write performance and metadata management performance, and has excellent scalability.
  • ZHANG Junwei,WANG Jing,ZHANG Weigong,QIU Keni
    Computer Engineering. 2017, 43(8): 74-81. https://doi.org/10.3969/j.issn.1000-3428.2017.08.013
    This paper analyzes the development and current applications of ARM servers. Combining the characteristics of data centers, it designs an energy-efficient data center server based on the ARMv8 architecture. It selects representative ARMv8 server parameters and quantitatively evaluates the microarchitecture indexes, performance and power consumption of the ARMv8 and X86 architectures using typical data center workloads. Experimental results show that, compared with the X86 architecture, the ARMv8 architecture server is more energy-efficient for data centers and can effectively reduce energy consumption and cost.
  • GAO Yuan,REN Sheng,GU Wenjie
    Computer Engineering. 2017, 43(8): 82-89. https://doi.org/10.3969/j.issn.1000-3428.2017.08.014
    Aiming at the insufficient write performance of the Hadoop Distributed File System (HDFS), a scheduling algorithm for concurrent transmission of HDFS data blocks in a heterogeneous environment is proposed. The algorithm monitors the resource status and memory queue of each node in the HDFS cluster in real time, dynamically matches receiving nodes with forwarding nodes, makes the network cards and disks of the whole system work concurrently, and reduces the time needed to write all copies to the distributed file system. The algorithm ensures that data is written to disk before the next data block is requested, for data security. Meanwhile, it matches the number of copies on each node to that node's performance, so that heterogeneous systems can achieve a high write rate. Performance tests show that the write performance of the distributed file system using the proposed algorithm is double that of the original HDFS.
  • ZHANG Jing,CHEN Yao,SUN Jun,FAN Hongbo
    Computer Engineering. 2017, 43(8): 90-94,100. https://doi.org/10.3969/j.issn.1000-3428.2017.08.015
    In the execution of state transfer tasks in embedded control systems, the schedulability of events is difficult to guarantee and preempting a task incurs high costs. Aiming at these problems, this paper constructs a predictive scheduling model combining a Feedforward prediction Model (FFM) and a Feedback Monitoring Model (FMM). It adds real-time labels carrying constraint conditions, including value and timing constraints, so as to adjust the order of events in the released queue in real time and improve the real-time performance and determinacy of the control system. Simulation results show that the proposed model can improve control system performance and stabilize the system's effect time.
  • XIAO Weimin,DENG Haojiang,HU Linlin,GUO Zhichuan
    Computer Engineering. 2017, 43(8): 95-100. https://doi.org/10.3969/j.issn.1000-3428.2017.08.016
    To implement efficient management and secure isolation of Web applications in the browser, this paper proposes a lightweight isolation method for the rendering process based on Chromium. It studies and analyzes the multi-process mechanism and management of Chromium as well as the virtualization of Docker containers, designs the ZygoteDocker solution that combines Chromium and Docker, separates the rendering process module from the browser content module, and implements a lightweight browser by simplifying browser functions. Experimental results show that the rendering process module achieves isolation in the container, with an obvious lightweight effect.
  • ZHOU Meng,JIA Xiangdong,XIE Mangang
    Computer Engineering. 2017, 43(8): 101-107,113. https://doi.org/10.3969/j.issn.1000-3428.2017.08.017
    In full-duplex massive Multiple Input Multiple Output (massive MIMO) relaying systems, employing numerous Analog-to-Digital Converters (ADCs) at the massive MIMO relay leads to extremely high hardware cost and energy consumption. Moreover, system performance is further affected by loop interference. Therefore, based on maximum-ratio-transmission/maximum-ratio-combining processing, this paper studies a multi-user full-duplex massive MIMO Amplify and Forward (AF) relaying system with low-resolution ADCs, which consists of a base station with a large number of antennas and multiple pairs of antenna users; the massive MIMO base station uses low-resolution ADCs and runs in full-duplex mode. Using random matrix theory, it analyzes the achievable spectral efficiency of each user and gives a closed-form expression. Based on these derivations, asymptotic analyses of the spectral efficiency are performed under three different power-scaling schemes. Performance analysis results show that the impact of both loop interference and the low-resolution ADCs on achievable spectral efficiency can be effectively restricted only under the power-scaling scenario where the relay's transmit power is scaled down with the number of transmit antennas at the relay while that of the sources is fixed. On the contrary, under the scenario where the transmit powers of both the sources and the relay are scaled down with the number of antennas, only the effect of loop interference is restricted while that of the low-resolution ADCs remains.
  • LI Zhenbi,WANG Kang,JIANG Yuanyuan
    Computer Engineering. 2017, 43(8): 108-113. https://doi.org/10.3969/j.issn.1000-3428.2017.08.018
    When the source signal consists of components with large energy differences, or the global matrix is not a row-element-dominant matrix, the variable step-size Equivariant Adaptive Separation via Independence (EASI) algorithm based on crosstalk error cannot correctly evaluate the separation effect, resulting in incorrect step-size selection. Aiming at this problem, this paper proposes an improved variable step-size EASI algorithm. It takes the square root of the crosstalk-error-based step-size to enlarge the step-size appropriately. Meanwhile, according to a defined separation degree, it adjusts the step-size to reduce the impact of the overly small steps caused by incorrect evaluation of the separation effect based on crosstalk error. Simulation results show that the proposed algorithm has good step-size adjustment ability, and that its separation effect is better than those of the traditional EASI algorithm and the variable step-size EASI algorithms based on the crosstalk error, the square root of the crosstalk-error-based step-size, or the separation degree alone.
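    The two step-size fixes the abstract describes can be shown numerically: square-rooting an error-driven step enlarges it whenever the crosstalk error is below one, and a separately computed separation degree keeps the step from collapsing when the crosstalk error is misleadingly small. The constants and the exact combination rule below are assumptions, not the paper's formulas:

```python
import math

def base_step(err, alpha=0.01):
    # Step size driven directly by the crosstalk error.
    return alpha * err

def improved_step(err, sep_degree, alpha=0.01, threshold=0.9):
    # Square-rooting enlarges the step whenever err < 1 ...
    mu = alpha * math.sqrt(err)
    # ... and a poor separation degree keeps the step from collapsing
    # when the crosstalk error alone underestimates how far we are
    # from a correct separation.
    if sep_degree < threshold:
        mu = max(mu, alpha * 0.5)
    return mu
```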
  • MU Zhuqing,HUANG Guoyong,WU Jiande
    Computer Engineering. 2017, 43(8): 114-119,125. https://doi.org/10.3969/j.issn.1000-3428.2017.08.019
    In the BeiDou Navigation Satellite System (BDS), cycle-slip signals with high Signal-to-Noise Ratio (SNR) are difficult to extract and locate. To solve this problem, a Singular Value Decomposition (SVD) method based on a sensitivity factor is used to detect the cycle-slip signal. The phase-minus-pseudorange method is used to construct the cycle-slip detection variable. The selected cycle-slip signal is used to construct a Hankel matrix, on which SVD is performed. Among the component signals obtained by decomposition, the sensitive component is found using the sensitivity factor, and the signal is reconstructed by selecting the singular values corresponding to the sensitive component via the positioning factor. The abrupt position in the reconstructed signal is the epoch that generates the cycle slip. Simulation results show that this method can locate the epoch generating the cycle slip more accurately than the traditional SVD method.
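    The detection-variable step can be sketched without the Hankel/SVD machinery (which needs a linear-algebra library): form the phase-minus-pseudorange series and flag the epoch with the largest jump in its first difference. The wavelength and the synthetic data are assumptions for illustration:

```python
def detect_cycle_slip(phase, pseudorange, wavelength=0.19):
    # Detection variable: carrier phase minus code pseudorange (in cycles).
    d = [phase[i] - pseudorange[i] / wavelength for i in range(len(phase))]
    # The epoch with the largest first-difference jump is flagged as the slip.
    diffs = [abs(d[i + 1] - d[i]) for i in range(len(d) - 1)]
    k = max(range(len(diffs)), key=lambda i: diffs[i])
    return k + 1

# Synthetic series: slow drift plus a 5-cycle slip injected at epoch 30.
phase = [0.01 * i for i in range(60)]
for i in range(30, 60):
    phase[i] += 5.0
pseudorange = [0.0] * 60
epoch = detect_cycle_slip(phase, pseudorange)
```

The paper's SVD-based denoising matters precisely when the jump is buried in noise and this naive maximum would misfire.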
  • XIA Jingming,ZHUANG Yong,LI Peng,TAN Ling
    Computer Engineering. 2017, 43(8): 120-125. https://doi.org/10.3969/j.issn.1000-3428.2017.08.020
    This paper proposes a low-complexity reliability constraint algorithm based on QR decomposition. The algorithm judges soft estimates by the shadow area constraint method. Besides, constellation points are introduced as candidate points, and the optimal candidate point is chosen from multiple candidates for feedback. Simulation results show that, compared with the conventional QR decomposition algorithm, the algorithm can significantly suppress system interference and greatly reduce error propagation in decision feedback; meanwhile, it can control the operation complexity and improve the error rate performance of detection.

  • LIU Guangzhong,ZHU Xiangyu
    Computer Engineering. 2017, 43(8): 126-131. https://doi.org/10.3969/j.issn.1000-3428.2017.08.021
    Aiming at the delay problem of the MACAW protocol in underwater acoustic sensor networks, this paper proposes a new Media Access Control (MAC) protocol for underwater acoustic sensor networks on the basis of the traditional handshake protocol. The protocol uses node pre-scheduling to reserve data transmission slots. While the upstream and downstream nodes exchange Request To Send (RTS)/Clear To Send (CTS) handshakes and data, the next node reuses the CTS and data signals as its RTS and confirmation signals, which forms a pipelined MAC protocol. Simulation results show that this protocol can effectively reduce the transmission delay between nodes, improve the throughput of the underwater acoustic sensor network, mitigate collisions and save energy.

  • WANG Wei,WANG Bin
    Computer Engineering. 2017, 43(8): 132-137,143. https://doi.org/10.3969/j.issn.1000-3428.2017.08.022
    In large-scale Radio Frequency Identification (RFID) application scenarios, existing RFID anti-collision algorithms suffer from disadvantages such as long search time and large amounts of transmitted data, so their efficiency decreases as the number of tags increases. Aiming at these problems, this paper proposes a new RFID anti-collision algorithm. By analyzing and processing tag identification data, the order and instruction parameters of the next tag identification are obtained, so that electronic tags can be identified quickly. Simulation results show that, when identifying the same quantity of tags, the RFID paging times are reduced by 86.37% and 33.67% on average, and the request data is reduced by 85.13% and 26.67% respectively, compared with the dynamic binary algorithm and the backward binary algorithm.
  • PENG Yunjian,HUANG Lu
    Computer Engineering. 2017, 43(8): 138-143. https://doi.org/10.3969/j.issn.1000-3428.2017.08.023
    The ReInForM routing protocol in Wireless Sensor Networks (WSN) only guarantees network reliability, without considering the dynamic changes of node energy and communication path conditions. Selecting the next-hop forwarding node randomly makes some nodes fail quickly because of repeated use, which further shortens the network life cycle. To solve these problems, this paper introduces the Ant Colony Optimization (ACO) algorithm and combines ant pheromone concentration with node residual energy to construct a ReInForM dynamic routing selection algorithm under multi-objective optimization. It takes energy consumption and residual energy as the path selection indicators to determine the optimal next hop. Simulation results show that, compared with the original ReInForM routing algorithm, the proposed algorithm can effectively balance node energy consumption while ensuring the reliability of data transmission.
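    An ACO-style next-hop choice combining pheromone concentration with residual energy can be sketched as a weighted random selection. The exponents, the energy heuristic and the neighbor data below are assumptions; the paper additionally weighs path energy consumption:

```python
import random

def choose_next_hop(neighbors, alpha=1.0, beta=2.0, rng=random):
    # neighbors: {node: (pheromone, residual_energy)}.
    # Selection probability grows with pheromone and residual energy,
    # steering traffic away from nearly depleted nodes.
    weights = {n: (tau ** alpha) * (e ** beta)
               for n, (tau, e) in neighbors.items()}
    total = sum(weights.values())
    r = rng.random() * total
    acc = 0.0
    for n, w in weights.items():
        acc += w
        if r <= acc:
            return n
    return n  # guard against floating-point rounding
```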
  • WAN Benting,QUAN Xiaofeng
    Computer Engineering. 2017, 43(8): 144-150. https://doi.org/10.3969/j.issn.1000-3428.2017.08.024
    The traditional Genetic Algorithm (GA) is questionable for mobile node path planning because of its high complexity, long iteration time and tendency to fall into local optima. For this reason, this paper proposes an improved GA based on walking points. It approximates obstacles as convex polygons, then starts from the starting point and searches convex polygon vertices until the target point is reached. The obtained sequence is encoded as genes to initialize the population. Iterations of selection, crossover and mutation are then applied step by step to obtain the optimized path. Simulation results show that the proposed strategy can shorten the path of sensor nodes, shorten the initialization and iteration time, reduce the energy consumption of mobile nodes and extend the life cycle of the Wireless Sensor Network (WSN).
  • LAN Geng,LIU Daowei
    Computer Engineering. 2017, 43(8): 151-155. https://doi.org/10.3969/j.issn.1000-3428.2017.08.025
    In the Internet of Things (IoT) environment, the tag privacy information of goods embedded with Radio Frequency Identification (RFID) tags may be leaked during trading. Aiming at this problem, an ultra-lightweight group RFID tag ownership transfer protocol is proposed based on a Trusted Third Party (TTP) symmetric key update management strategy, in which the word synthesis method is used to encrypt the information to be transmitted and the freshness of each message is maintained by different random numbers. Simulation results show that, compared with the group transfer protocol and the radio frequency identification and security group transfer protocol, the proposed protocol not only satisfies the security requirements of ownership transfer protocols, but also encrypts with bit operations, which reduces the cost and computation amount and saves calculation time.
  • LING Hang,WU Zhen,DU Zhibo,WANG Min,RAO Jintao
    Computer Engineering. 2017, 43(8): 156-160,168. https://doi.org/10.3969/j.issn.1000-3428.2017.08.026
    To assess the security of the EPCBC cipher, this paper proposes an algebraic side-channel attack method based on Hamming weight and studies the factors that affect attack efficiency. The algebraic equations of the algorithm are constructed, the power leakage is collected to infer Hamming weights, these weights are transformed into algebraic equations, and a solver is used to recover the key. Experimental results show that the complete key can be recovered whether or not the plaintext is known.
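    The leakage model behind such attacks is simple to state in code. The paper feeds inferred Hamming weights into an algebraic solver; the sketch below instead shows the brute-force view of the same information, filtering key-byte candidates against leaked weights of an intermediate value. The function f here is a stand-in XOR, not the real EPCBC round function, and all values are made up:

```python
def hamming_weight(x):
    return bin(x).count("1")

def filter_keys(plaintexts, leaked_hw, f):
    # Keep only key bytes whose intermediate value matches every leaked weight.
    cands = set(range(256))
    for pt, hw in zip(plaintexts, leaked_hw):
        cands = {k for k in cands if hamming_weight(f(pt, k)) == hw}
    return cands

def f(pt, k):
    # Stand-in intermediate value; NOT the real EPCBC round function.
    return pt ^ k

secret = 0x5A
pts = [0x00, 0x0F, 0x33, 0xAA, 0xC7]
leaks = [hamming_weight(f(pt, secret)) for pt in pts]
survivors = filter_keys(pts, leaks, f)
```

Each leaked weight prunes the candidate set; the algebraic approach encodes the same constraints as equations and lets a solver do the pruning across the whole cipher.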
  • DANG Xiaochao,LI Qi,HAO Zhanjun,ZHANG Yulei,ZHANG Linggang
    Computer Engineering. 2017, 43(8): 161-168. https://doi.org/10.3969/j.issn.1000-3428.2017.08.027
    Aiming at the weak security of communication between sensor nodes and Internet hosts, a CLPKC-to-TPKC online/offline Heterogeneous Signcryption (HSC) scheme is proposed. Based on online/offline technology, this paper defines a formal security model from the certificateless public key cryptography environment to the traditional public key cryptography environment. In the random oracle model, the scheme is proved to satisfy the security requirements under the q-SDH, mICDH and BDHI hardness assumptions. Analysis results show that, compared with the IDPKC-to-CLPKC online/offline heterogeneous signcryption scheme, the proposed scheme is more efficient because it uses only two bilinear pairing operations, so it is suitable for Wireless Sensor Networks (WSN).
  • MA Min,LI Zhihui,XU Tingting
    Computer Engineering. 2017, 43(8): 169-172. https://doi.org/10.3969/j.issn.1000-3428.2017.08.028
    This paper constructs a class of verifiable secret sharing schemes based on the Chinese Remainder Theorem and Bell states. In the distribution phase, Alice distributes the shares to the participants through a quantum secure channel. In the recovery phase, Alice generates a two-qubit Bell state in Hilbert space, and the participants and Alice then perform unitary operations on the Bell state to reconstruct the secret. Analysis results show that the scheme can resist intercept-and-resend attacks, entangled measurement attacks, participant attacks, and Trojan horse attacks.
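    The classical half of the construction, sharing a secret via the Chinese Remainder Theorem, can be sketched as a Mignotte-style (3, 4) threshold scheme: shares are residues modulo pairwise-coprime numbers, and any three of them pin the secret down by CRT. The moduli and secret are toy values, and the Bell-state verification layer from the paper is omitted:

```python
from functools import reduce

def crt(residues, moduli):
    # Chinese Remainder Theorem: the unique x modulo prod(moduli).
    M = reduce(lambda a, b: a * b, moduli)
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)   # pow(.., -1, m) is the modular inverse
    return x % M

moduli = [11, 13, 17, 19]              # pairwise coprime: n = 4 shares
secret = 1000                          # 17*19 < 1000 < 11*13*17, so t = 3
shares = [secret % m for m in moduli]

recovered = crt(shares[:3], moduli[:3])   # any 3 shares suffice
partial = crt(shares[:2], moduli[:2])     # 2 shares are not enough
```

Two shares only determine the secret modulo 11·13 = 143, which is smaller than the secret, so reconstruction fails below the threshold.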
  • SU Qing,LI Qian,PENG Jiajin,LIU Fuchun
    Computer Engineering. 2017, 43(8): 173-177,183. https://doi.org/10.3969/j.issn.1000-3428.2017.08.029
    In view of problems in the key generation process of Radio Frequency Identification (RFID) systems, including key escrow, the insecurity of the wireless channel and the high cost of the tag, a wireless key generation protocol for RFID systems is proposed. On the premise that both the forward and backward channels can be eavesdropped, the paper introduces a pseudonym identifier to prevent the disclosure of secret information. The protocol uses only simple bit operations to reduce tag cost and computation, employing Exclusive OR (XOR) and shift operations to encrypt transmitted information and ensure the security of the protocol. The proposed protocol is formally proved using GNY logic. The security and performance of the protocol are analyzed in three kinds of applications: single tag individual key generation, batch tag key generation, and group tag key generation. It is shown that the proposed protocol has higher security and lower cost.
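    The XOR-and-shift style of encryption such tag-side protocols rely on can be illustrated with a rotate-then-XOR round on 16-bit words. The word width, rotation amount and combination rule are illustrative assumptions; the paper's actual message construction is not given in the abstract:

```python
def rotl(x, r, w=16):
    # Circular left shift of a w-bit word.
    r %= w
    return ((x << r) | (x >> (w - r))) & ((1 << w) - 1)

def encrypt(word, key, r):
    # Rotate-then-XOR: cheap enough for a passive tag's gate budget.
    return rotl(word, r) ^ key

def decrypt(cipher, key, r, w=16):
    # Undo the XOR, then rotate back the other way.
    return rotl(cipher ^ key, w - r % w, w)
```

Both directions cost only shifts and XORs, which is what makes the protocol "ultra-lightweight" relative to hash- or cipher-based designs.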
  • CHEN Li,JIANG Dongdong,LU Jingqiao
    Computer Engineering. 2017, 43(8): 178-183. https://doi.org/10.3969/j.issn.1000-3428.2017.08.030
    To study the influence of the clustering coefficient on the interaction of virus propagation and cascading failure, an improved interaction model of virus propagation and cascading failure is proposed. Network clustering coefficients are adjusted by changing the average degree and the probability of triad formation, so as to observe the interaction process of virus propagation and cascading failure. When the probability of triad formation is not considered, the smaller the average degree, the stronger the network's ability to resist the interaction and the more obvious the difference, but a small average degree also enhances the destructive power of the cascading failure sub-procedure. When the average degree is small, the bigger the probability of triad formation, the weaker the network's ability to resist the interaction; when the average degree is large, different probabilities of triad formation have similar effects. Simulation results show that, compared with the single virus propagation model, the interaction model is more destructive on the same time scale.
  • CAO Suzhen,DAI Wenjie,SUN Han,WANG Xiuya
    Computer Engineering. 2017, 43(8): 184-187. https://doi.org/10.3969/j.issn.1000-3428.2017.08.031
    On the basis of certificateless short signatures, this paper proposes a certificateless aggregate signature scheme for resource-constrained environments. A signature protocol is initiated by a designated aggregator, and state information including random numbers is introduced so that each round of aggregate signing generates different state information. Experimental results show that, compared with traditional certificateless aggregate signature schemes, the proposed scheme reduces computation overhead while remaining provably secure, and it is proved unforgeable in the random oracle model.
  • GONG Yunbao,GAN Liang,HUANG Jiuming
    Computer Engineering. 2017, 43(8): 188-192,199. https://doi.org/10.3969/j.issn.1000-3428.2017.08.032
    Because closed atoms adopt hard constraints in Entity Resolution(ER) algorithms based on Markov Logic Network(MLN),the reasoning and weight learning processes hardly converge to the optimal solution,which decreases efficiency and accuracy.This paper proposes a method that applies the Probabilistic Soft Logic(PSL) model to entity resolution.Closed atoms in the model adopt soft constraints,making it feasible to carry out reasoning and weight learning over millions of pieces of knowledge.The paper explains the basic theory of the PSL model,and constructs first-order logic rules describing the matching process of entity resolution from three aspects:entity relationships,entity attributes,and ontology constraints.The reasoning mechanism is used to compute entity matching results accurately and efficiently.Experimental results show that,compared with the entity resolution method based on the MLN model,this method improves the F1 score and significantly improves execution efficiency.
  • WANG Luhui,WANG Guiling
    Computer Engineering. 2017, 43(8): 193-199. https://doi.org/10.3969/j.issn.1000-3428.2017.08.033
    Aiming at the problem of accompanying vehicle discovery and its real-time performance,this paper proposes a real-time accompanying vehicle group discovery algorithm using Parallel Frequent Itemsets Discovery(PFID) technology based on a time-varying license plate recognition data stream.The algorithm adopts the idea of the Eclat algorithm for frequent itemset mining,and generates maximal accompanying vehicle groups on the distributed data stream processing framework Spark Streaming.Experimental results show that,compared with the Permutation and Combination(PM) algorithm and the FP-Growth algorithm,the PFID algorithm consumes less memory and responds faster.Accompanying vehicle groups are found within seconds of response time,which meets the timely warning objective.
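    The Eclat idea the abstract refers to — mining frequent itemsets by intersecting vertical transaction-id lists — can be sketched in a few lines. This is an illustrative single-machine version (the paper runs it in parallel on Spark Streaming); the function name and the window-as-transaction encoding are assumptions for the example.

```python
from collections import defaultdict

def eclat(transactions, min_support):
    """Mine frequent itemsets of size >= 2 by intersecting vertical tid-lists.

    transactions: list of sets of items (e.g. plates seen in one time window).
    Returns {frozenset(itemset): support count}.
    """
    # Vertical layout: item -> set of transaction ids containing it.
    tidlists = defaultdict(set)
    for tid, items in enumerate(transactions):
        for item in items:
            tidlists[item].add(tid)
    # Keep only frequent single items, in a fixed order.
    singles = [(frozenset([i]), tids) for i, tids in tidlists.items()
               if len(tids) >= min_support]
    singles.sort(key=lambda p: sorted(p[0]))
    frequent = {}

    def extend(prefix, prefix_tids, candidates):
        # Try to grow `prefix` with each later candidate item.
        for k, (item, tids) in enumerate(candidates):
            new_tids = prefix_tids & tids  # tid-list intersection
            if len(new_tids) >= min_support:
                itemset = prefix | item
                frequent[itemset] = len(new_tids)
                extend(itemset, new_tids, candidates[k + 1:])

    for k, (item, tids) in enumerate(singles):
        extend(item, tids, singles[k + 1:])
    return frequent
```

For example, plates that appear together in at least 3 of 5 time windows form a candidate accompanying group.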
  • WANG Shengyu,ZENG Biqing,HU Pianpian
    Computer Engineering. 2017, 43(8): 200-207,214. https://doi.org/10.3969/j.issn.1000-3428.2017.08.034
    The training of a Convolutional Neural Network(CNN) model requires designers to set a large number of model parameters,and because the model's sensitivity differs across parameters,poorly chosen settings lead to poor experimental results.To address this problem,this paper analyzes Chinese text sentiment analysis and designs a one-layer CNN whose influence factors cover different model settings,including word vector dimensionality,word vector training scale,sliding window size,regularization method,and so on.Chinese sentiment classification experiments are conducted under the different influence factors.From the results,the sensitivity of the CNN to each parameter in Chinese sentiment analysis is derived,and optimized settings for the specific model parameters are proposed.
  • WU Jiehua
    Computer Engineering. 2017, 43(8): 208-214. https://doi.org/10.3969/j.issn.1000-3428.2017.08.035
    This paper studies the problem of link classification based on complex network features.Aiming at the situation that the original features have large noise and redundancy,it proposes an improved link classification model based on the RReliefF feature selection algorithm.Features associated with link information are constructed from local and global dimensions,the RReliefF algorithm is introduced to select features,and regression classification is carried out with the Partial Least Squares(PLS) method.Experimental results on artificial and real datasets show that the model can screen discriminative features to improve the quality of link classification,and provide a new idea for supervised link classification in complex networks.
  • ZHENG Suyang,JIANG Jiulei,WANG Xiaofeng
    Computer Engineering. 2017, 43(8): 215-218,224. https://doi.org/10.3969/j.issn.1000-3428.2017.08.036
    On the basis of general user similarity calculation,a recommendation algorithm based on user project experience degree is proposed according to the impact of user preferences and project characteristics on user ratings.This paper describes the potential impact of project experience degree on the user,and selects the Pearson similarity formula for further calculation.The user's experience weight for a project is obtained from the proportion of praise ratings among all reviews of the project and the proportion of the user's rating to the project's overall score.The long tail theory is used to balance user similarity against the user's attention to popular items.User similarity is then calculated,and predictions and recommendations are generated.Experimental results show that,compared with the traditional collaborative filtering algorithm,the proposed algorithm improves the accuracy of similarity computation and the recommendation effect under sparse data.
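    The Pearson similarity the abstract selects is the standard correlation over co-rated items. A minimal sketch over two users' rating dictionaries (function name and data layout are illustrative, not the paper's code):

```python
from math import sqrt

def pearson_similarity(ratings_u, ratings_v):
    """Pearson correlation over the items both users have rated.

    ratings_u, ratings_v: dict mapping item id -> rating.
    Returns 0.0 when there are fewer than two co-rated items or no variance.
    """
    common = set(ratings_u) & set(ratings_v)
    if len(common) < 2:
        return 0.0
    mean_u = sum(ratings_u[i] for i in common) / len(common)
    mean_v = sum(ratings_v[i] for i in common) / len(common)
    num = sum((ratings_u[i] - mean_u) * (ratings_v[i] - mean_v) for i in common)
    den_u = sqrt(sum((ratings_u[i] - mean_u) ** 2 for i in common))
    den_v = sqrt(sum((ratings_v[i] - mean_v) ** 2 for i in common))
    if den_u == 0 or den_v == 0:
        return 0.0
    return num / (den_u * den_v)
```

The paper then reweights this base similarity by the project experience weight before prediction.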
  • PU Mei,ZHOU Feng,ZHOU Jingjing,YAN Xin,ZHOU Lanjiang
    Computer Engineering. 2017, 43(8): 219-224. https://doi.org/10.3969/j.issn.1000-3428.2017.08.037
    In order to quickly find content of interest in a large volume of news,a method based on a weighted TextRank algorithm is proposed to extract topic sentences from a single document and obtain information about key news events.It classifies the sentences of news reports as event sentences or non-event sentences,and filters out the latter by calculating the mutual information value of the keywords in the sentences of the news text.It constructs a directed graph of event sentences on the basis of the TextRank algorithm,and calculates the influence weight between sentences by introducing three influence factors:sentence position,sentence similarity,and keyword coverage frequency.It then computes the weight of each vertex in the graph with the TextRank model and selects the top-ranked sentences as the topic sentences of the key events.Experimental results show that the proposed method outperforms methods based on Term Frequency-Inverse Document Frequency(TF-IDF) and on news titles in topic sentence extraction.
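    The vertex-weight computation in TextRank is a damped power iteration over the sentence graph. A minimal sketch, assuming a precomputed symmetric similarity matrix (the paper's version uses directed edges weighted by position, similarity, and keyword coverage; this plain variant is only illustrative):

```python
def textrank(sim, damping=0.85, iterations=100):
    """Rank sentences by TextRank over a weighted sentence graph.

    sim: square matrix, sim[i][j] = similarity weight between sentences i, j
         (diagonal assumed zero).  Returns one score per sentence.
    """
    n = len(sim)
    scores = [1.0] * n
    # Total outgoing weight of each vertex (guard against isolated vertices).
    out_sum = [sum(sim[j]) or 1.0 for j in range(n)]
    for _ in range(iterations):
        new = []
        for i in range(n):
            # Each neighbour j contributes its score, split over j's edges.
            rank = sum(sim[j][i] / out_sum[j] * scores[j]
                       for j in range(n) if j != i)
            new.append((1 - damping) + damping * rank)
        scores = new
    return scores
```

Sentences with the highest scores are taken as the topic sentences.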
  • ZENG Qianqian,ZENG An,PAN Dan,YANG Haidong,DENG Jiehang
    Computer Engineering. 2017, 43(8): 225-230. https://doi.org/10.3969/j.issn.1000-3428.2017.08.038
    An improved Bayesian network structure learning algorithm is proposed by introducing the Maximal Information Coefficient(MIC).For a given data set,MIC is used to measure the dependency between variables,and an initial Bayesian network is constructed according to the screened correlation factors.A greedy algorithm is then combined to locally modify the initial network,integrating local optimal solutions into a global optimal solution and generating the final network structure.Experimental results on the Asia and Car benchmark networks show that,compared with Bayesian network structure learning based on the traditional greedy algorithm and the random K2 algorithm,the proposed algorithm obtains a network structure close to the benchmark network,with a higher mean number of correct edges and higher classification accuracy.
  • OUYANG Chao,CHEN Zhibo,SUN Guodong
    Computer Engineering. 2017, 43(8): 231-235,242. https://doi.org/10.3969/j.issn.1000-3428.2017.08.039
    An improved Genetic Algorithm(GA) for Quality of Service(QoS) optimization of Web service composition is proposed by combining the Simulated Annealing(SA) algorithm with the traditional GA.To preserve population diversity,the SA idea of probabilistically accepting solutions is introduced into the selection of the reproduction and mutation operators in the GA,and a filter rate is set to dislodge inferior genes during reproduction.Experimental results show that,compared with the traditional GA,SA,and the Particle Swarm Optimization(PSO) algorithm,the improved GA performs better in both Web service composition quality and convergence speed.
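    The SA idea borrowed here — occasionally keeping a worse candidate so the population stays diverse — reduces to the Metropolis acceptance rule. A minimal sketch for a maximisation problem (the function name, signature, and temperature handling are assumptions for illustration, not the paper's operator):

```python
import math
import random

def sa_accept(parent_fitness, child_fitness, temperature, rng=random.random):
    """Metropolis-style acceptance: always keep a better child, and keep a
    worse one with probability exp(-(loss) / temperature), where
    loss = parent_fitness - child_fitness (maximisation convention)."""
    if child_fitness >= parent_fitness:
        return True
    if temperature <= 0:
        return False
    return rng() < math.exp((child_fitness - parent_fitness) / temperature)
```

Inside a GA loop, replacing greedy survivor selection with this rule lets slightly worse offspring survive early on (high temperature) while the search becomes greedier as the temperature cools.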
  • LI Weijiang,QI Jing,YU Zhengtao,ZHAO Tiejun
    Computer Engineering. 2017, 43(8): 236-242. https://doi.org/10.3969/j.issn.1000-3428.2017.08.040
    Aiming at the data sparsity of the user trust matrix,this paper designs a propagation rule for trust relationships among users.It computes the trust degree between users according to the rule,and then uses the trust degree to fill the user trust matrix.It proposes a social recommendation algorithm based on the users' trust propagation algorithm and the Singular Value Decomposition(SVD) model,combining the user rating matrix with the trust relation matrix to improve the prediction accuracy of the recommender system.Experimental results on the publicly available Epinions and Filmtrust datasets show that,compared with traditional recommendation algorithms,the proposed algorithm achieves higher recommendation quality.
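    Trust propagation of the kind described — filling missing trust entries transitively through intermediaries — can be sketched as follows. The decay factor, hop limit, and max-over-intermediaries rule are assumptions for illustration; the paper's actual propagation rule may differ.

```python
def propagate_trust(trust, max_hops=2, decay=0.7):
    """Fill missing entries of a sparse trust matrix by transitive propagation.

    trust: dict of dicts, trust[a][b] = direct trust in [0, 1].
    A missing pair (a, c) receives, per extra hop, the best value of
    trust[a][b] * trust[b][c] * decay over all intermediaries b.
    """
    filled = {u: dict(neigh) for u, neigh in trust.items()}
    for _ in range(max_hops - 1):
        updates = {}
        for a, neigh in filled.items():
            for b, t_ab in neigh.items():
                for c, t_bc in filled.get(b, {}).items():
                    if c == a or c in filled.get(a, {}):
                        continue  # keep direct trust; skip self-trust
                    cand = t_ab * t_bc * decay
                    if cand > updates.get((a, c), 0.0):
                        updates[(a, c)] = cand
        for (a, c), v in updates.items():
            filled.setdefault(a, {})[c] = v
    return filled
```

The densified matrix can then be combined with the rating matrix inside the SVD model.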
  • YI Sheng,LIANG Huagang,RU Feng
    Computer Engineering. 2017, 43(8): 243-248. https://doi.org/10.3969/j.issn.1000-3428.2017.08.041
    The feature extraction method adopted by the traditional Convolutional Neural Network(CNN) for video images with continuous frames tends to lose movement information along the target's time axis,resulting in low recognition accuracy.To solve this problem,a method based on a multi-column deep 3D CNN is proposed.A 3D convolution kernel is used to extract temporal and spatial features and capture the object's motion information.To avoid misclassification caused by the insufficient feature information of a single 3D CNN,the multi-column 3D CNN is composed of multiple component 3D CNNs,each with strong classification ability.The output of the structure is a weighted combination of the outputs of the individual 3D CNNs,and the category with the maximum weight is taken as the final result.The multi-column 3D CNN structure is applied to CHGD for hand gesture recognition.Experimental results show that the method achieves a recognition rate of 95.09%,which is nearly 7% higher than a single 3D CNN and nearly 20% higher than the traditional 2D CNN,showing excellent recognition ability for video image sequences.
  • ZHOU Kai,SU Juan
    Computer Engineering. 2017, 43(8): 249-252,257. https://doi.org/10.3969/j.issn.1000-3428.2017.08.042
    Aiming at the problem that traditional edge detection algorithms cannot automatically extract the iris contour from edge points,an edge detection algorithm based on supervised learning is proposed for iris segmentation.A set of multi-scale Haar and HOG features is used to characterize the edge points.The probabilistic boosting tree is used as the training framework to train models for the pupil,iris,and eyelid.Test samples are input to calculate the probability that an edge point lies on the true iris contour.The output true iris contour edges are fitted,and the local OTSU algorithm is used to segment the iris accurately.Experimental results show that,compared with iris segmentation algorithms based on the Hough transform and the active contour model,this algorithm requires less test time and has a lower error rate.
  • LU Minjun,WANG Ci
    Computer Engineering. 2017, 43(8): 253-257. https://doi.org/10.3969/j.issn.1000-3428.2017.08.043
    Quantization noise is one of the dominant distortions of JPEG image compression,so its amplitude usually needs to be estimated for image quality assessment.However,in practical applications the original image is unavailable,so the Peak Signal to Noise Ratio(PSNR) cannot be computed directly.The Mean Squared Difference Slope(MSDS) and the position of the maximum nonzero Discrete Cosine Transform(DCT) coefficient are therefore introduced as measurable characteristics based on image spatial correlation.A set of images together with their PSNR values is used to train a PSNR estimation model,and the test image is fed into the model to estimate its PSNR.Experimental results show that the proposed algorithm is superior to existing blind PSNR estimation algorithms.
  • ZHANG Yong,YUAN Jiazheng,LIU Hongzhe,LI Qing
    Computer Engineering. 2017, 43(8): 258-265,271. https://doi.org/10.3969/j.issn.1000-3428.2017.08.044
    The traditional GrabCut-based image segmentation method builds its graph model mainly from image pixel values and does not take the rich texture information of color images into account.This paper presents an image segmentation algorithm based on the GrabCut model and contrasts the results of the Structure Tensor(ST) GrabCut segmentation method with those of the traditional GrabCut segmentation method.The method uses the ST together with the pixel values to construct the tight ST.For concise and efficient calculation,the Gaussian Mixture Model(GMM) built in the GrabCut method is extended to tensor space,and the Kullback-Leibler(KL) divergence is used instead of the commonly used Riemannian metric.Extensive experiments on synthetic texture images and natural images show that,compared with the method of Carsten Rother et al. and the GACWRF method,the algorithm achieves more accurate segmentation;it not only fuses texture and color information parameters but also improves computational efficiency.
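    The computational appeal of the KL divergence over the Riemannian metric is easiest to see in the univariate Gaussian case, where it has a cheap closed form. The paper applies KL divergence to GMM components in tensor space; this scalar version is only a sketch of the idea:

```python
import math

def kl_gaussian(mu0, var0, mu1, var1):
    """KL divergence KL(N(mu0, var0) || N(mu1, var1)) for univariate Gaussians.

    Closed form: 0.5*log(var1/var0) + (var0 + (mu0 - mu1)^2) / (2*var1) - 0.5.
    A few arithmetic operations, versus the eigen-decompositions a
    Riemannian distance between covariance matrices would need.
    """
    return (0.5 * math.log(var1 / var0)
            + (var0 + (mu0 - mu1) ** 2) / (2 * var1)
            - 0.5)
```

Identical distributions give zero divergence; shifting one mean by one standard deviation gives 0.5.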
  • ZHONG Xiaofang,ZHOU Hao,GAO Zhishan,GAO Yun
    Computer Engineering. 2017, 43(8): 266-271. https://doi.org/10.3969/j.issn.1000-3428.2017.08.045
    Moving shadows deform or remove parts of foreground targets,which affects the tracking and analysis of the moving target foreground.To solve this problem,this paper designs a moving shadow removal algorithm by introducing the codebook model.The algorithm detects moving regions,which include foreground targets and moving shadows,with a background codebook model constructed in the YCbCr color space.According to the properties of moving shadows in the YCbCr color space,it obtains the pixel values that represent moving shadow within the moving region and establishes a moving shadow codebook model with self-adaptive thresholds of brightness range and color distortion for the pixels at each location in the video frame,so as to detect and remove the moving shadow.Experimental results show that the proposed algorithm effectively improves the detection rate and recognition rate of moving shadows.
  • WU Liyang,XIONG Lei,ZHONG Rouzai
    Computer Engineering. 2017, 43(8): 272-278,283. https://doi.org/10.3969/j.issn.1000-3428.2017.08.046
    Traditional regression-based facial feature point localization algorithms ignore local structure information and localize poorly under large pose deflection,so this paper proposes a localization algorithm based on fuzzy clustering regression.The face training set is clustered using the local structure information of the facial feature points,and the training samples are extended according to a threshold decision.Regression structures are trained separately for all sub-training sets,and shape constraints are added several times during testing to automatically adjust the clustering results and the selection of the regression structure,which improves the accuracy of facial feature point localization.Experimental results on the 300-W database show that,compared with ESR and RCPR,the proposed algorithm effectively improves positioning accuracy under large pose deflection.
  • SHI Haoliang,WU Lushen,YU Zheqi,WAN Chao
    Computer Engineering. 2017, 43(8): 279-283. https://doi.org/10.3969/j.issn.1000-3428.2017.08.047
    To solve the problems of low efficiency and high noise sensitivity in scattered point cloud feature extraction,this paper proposes a double-threshold point cloud feature information extraction algorithm.The Principal Component Analysis(PCA) method and local quadric surface fitting are used to estimate the differential geometry information of the point cloud model,yielding characteristic weights based on the average normal vector angle and the mean curvature of the k-neighborhood sampling points.The feature information of the scattered point cloud is then extracted by double-threshold detection.Experimental results show that the algorithm can quickly and accurately extract the feature information of scattered and noisy point cloud models,and has high robustness.
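    Once the per-point weights exist, the double-threshold detection step itself is simple: a point is kept only when both its normal-angle weight and its curvature weight exceed their thresholds, which suppresses isolated noise responses. A minimal sketch (names and threshold semantics are illustrative assumptions):

```python
def extract_feature_points(angles, curvatures, angle_thresh, curv_thresh):
    """Double-threshold feature detection on a point cloud.

    angles[i]:     mean normal-vector angle (radians) between point i and
                   its k-neighbourhood, from the PCA normal estimate.
    curvatures[i]: mean curvature estimate at point i, from quadric fitting.
    Returns the indices of points exceeding BOTH thresholds.
    """
    return [i for i, (a, c) in enumerate(zip(angles, curvatures))
            if a > angle_thresh and c > curv_thresh]
```

Requiring both cues to fire is what distinguishes a genuine sharp feature from a single noisy normal or curvature estimate.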
  • YANG Hongbai
    Computer Engineering. 2017, 43(8): 284-287. https://doi.org/10.3969/j.issn.1000-3428.2017.08.048
    Aiming at the high image distortion rate of existing frame memory compression algorithms for high-resolution flat panel displays,a frame memory compression algorithm based on image annular texture and bit reservoir technology is proposed.The intra-frame prediction mode based on annular texture adapts well to the large number of point-like structures in the image,and thus better removes the spatial correlation between annular texture pixels.Under the constraints of a limited image compression ratio and frame memory capacity,the bit reservoir technology dynamically allocates the bit space needed by the bit stream of each compressed image block,greatly improving the utilization of frame memory storage space.Test results show that the proposed algorithm significantly improves the Peak Signal to Noise Ratio(PSNR) of the compressed image and reduces image distortion.
  • ZHANG Jie,TIAN Yuan
    Computer Engineering. 2017, 43(8): 288-292,298. https://doi.org/10.3969/j.issn.1000-3428.2017.08.049
    Existing publications on consensus event-triggered control of multi-agent networks all assume that the interaction network is undirected,rather than the more practical case of directed topology.Motivated by this problem,a novel analysis method for directed interaction networks under event-triggered control is introduced,in which agents genuinely avoid continuous communication.Furthermore,a connection between the algebraic connectivity and the consensus performance of the network is established.It is proved that the inter-event times of each agent are strictly positive,which implies that Zeno behavior is excluded.Simulation results show the effectiveness of the proposed approach.
  • LIU Jun,YUAN Peiyan,QIU Hao
    Computer Engineering. 2017, 43(8): 293-298. https://doi.org/10.3969/j.issn.1000-3428.2017.08.050
    In order to accurately model the system detection probability against a complex background,a Multi-Target Tracking(MTT) method with unknown detection probability is proposed.The detection probability is modeled by a Time Varying AutoRegressive(TVAR) process,the parameterized model is combined with the Labeled Multi-Bernoulli(LMB) filter,and a Sequential Monte Carlo(SMC) implementation of the proposed algorithm is given.Simulation results show that the target number and target state estimates of the proposed algorithm are better than those of the Beta Cardinality Balanced Multi-target Multi-Bernoulli(Beta-CBMeMBer) filter,and its average Optimal Sub-Pattern Assignment(OSPA) distance is significantly smaller than that of the LMB algorithm with fixed detection probability.
  • YAN Chunjie,YU Han,ZHANG Pei,TANG Liping,ZHOU Kegui
    Computer Engineering. 2017, 43(8): 299-305. https://doi.org/10.3969/j.issn.1000-3428.2017.08.051
    To meet the localization requirements of data processing servers for airborne radar mission systems,this paper designs a high-performance data processing module.The module uses the SW1600 as its core processing unit and combines a PCIE switch chip,graphics chip,storage drive chip,and GbE controller to realize data reception,parallel processing,storage,and output.With the help of Built-in Test(BIT),the module realizes health management by monitoring voltage,current,and temperature in real time.The module's design follows the 6U Compact PCI Express(CPCIE) specification;it is rich in functional interfaces and easy to ruggedize.The module has been used in a domestic airborne ruggedized server.The results indicate that the module meets the application and environmental requirements of high performance,high reliability,and high security for server devices in an airborne environment.
  • ZHANG Le,ZHANG Xueying,SUN Ying,ZHANG Wei
    Computer Engineering. 2017, 43(8): 306-309,315. https://doi.org/10.3969/j.issn.1000-3428.2017.08.052
    Extracting features of emotional speech signals is particularly important in emotional speech recognition systems,as it determines overall recognition performance.Traditional feature extraction techniques assume the speech signal is linear and short-term stationary,and lack self-adaptability.Using the Ensemble Empirical Mode Decomposition(EEMD) algorithm,the features are extracted in a nonlinear way.First,the emotional speech signal is decomposed by EEMD into a series of Intrinsic Mode Functions(IMF),and an effective IMF set is selected by the correlation coefficient method.The IMF Energy(IMFE) features are then obtained by calculation over the functions in the set.In the experiment,the Berlin speech database is chosen as the data source,and the IMFE features,prosodic features,Mel-Frequency Cepstrum Coefficient(MFCC) features,and the fusion of the three are input into an SVM respectively.The recognition results of the different feature combinations are compared to validate the performance of the IMFE features.The experimental results show that the average recognition rate of the IMFE features fused with the acoustic features reaches 91.67%,and that IMFE can effectively distinguish between different emotional states.
  • WANG Yang,LI Nan,ZHANG Lei
    Computer Engineering. 2017, 43(8): 310-315. https://doi.org/10.3969/j.issn.1000-3428.2017.08.053
    On the basis of analysing and comparing existing evacuation simulation models,this paper proposes a new evacuation model for pedestrian panic based on hexagonal cellular automata.The evacuation space is divided into equal regular hexagons,and each pedestrian has seven moving directions,including remaining stationary;velocity levels are introduced to describe pedestrian movement.During evacuation,a pedestrian's exit choice is affected by the distance to the exits and the number and density of occupants within the pedestrian's field of view.To show the impact of panic on evacuation,a panic index is defined in the model,and the evacuation process is simulated with and without panic.Results show that the proposed model can accurately reproduce the real evacuation process,and it provides a useful reference for real pedestrian evacuation and the evaluation of public building designs.
  • HE Li,ZHOU Chuanwei,ZHANG Kun,SHI Chaoxia
    Computer Engineering. 2017, 43(8): 316-321. https://doi.org/10.3969/j.issn.1000-3428.2017.08.054
    Aiming at the waste of computing resources in the sampling process of classical feature matching algorithms,this paper proposes a new binary feature matching approach.The local features use strong corners as keypoints and a rotation-invariant binary string as the feature descriptor.Sequential sampling evaluation uses the Hamming distance to sort matching pairs,samples are selected sequentially,and mismatched pairs are eliminated by a model calculated with the least squares method.Experimental results show that the proposed approach significantly decreases computation time while achieving the same accuracy as the RANSAC and PROSAC algorithms.
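    Sorting candidate matches by Hamming distance over binary descriptors, as in the sequential sampling evaluation, is cheap because the distance is an XOR followed by a popcount. A minimal sketch (the descriptors-as-integers layout and function names are assumptions for the example):

```python
def hamming_distance(desc_a, desc_b):
    """Hamming distance between two binary descriptors stored as ints:
    XOR the bit strings, then count the set bits (popcount)."""
    return bin(desc_a ^ desc_b).count("1")

def rank_matches(pairs):
    """Sort candidate (desc_a, desc_b) pairs by ascending Hamming distance,
    so that the most promising matches are sampled first."""
    return sorted(pairs, key=lambda p: hamming_distance(p[0], p[1]))
```

Sampling the best-ranked pairs first is what lets the sequential evaluation find a consistent least-squares model with far fewer iterations than uniform RANSAC sampling.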