
15 June 2016, Volume 42 Issue 6
    

  • WU Guojin, HU Cheng
    Computer Engineering. 2016, 42(6): 1-6. https://doi.org/10.3969/j.issn.1000-3428.2016.06.001
    In distributed file systems, metadata prefetching can reduce the response latency of the metadata server. Existing metadata prefetching strategies prefetch metadata according to past file access patterns, without considering the correlations between a process and its corresponding files, such as a file's provenance. A metadata prefetching strategy based on provenance information is proposed for distributed storage systems. The strategy extracts provenance information windows and calculates the correlation degree of any two metadata files after analyzing the correlations between processes and metadata requests. It generates a correlation rules hash table and employs aggressive metadata prefetching. Experimental results show that the cache hit ratio of the proposed strategy is improved by up to 49% and 7% respectively. Additionally, the proposed strategy performs more effectively with less memory overhead than the Nexus algorithm.
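The correlation rules hash table the abstract describes can be sketched roughly as follows — a minimal, hypothetical reading in which file pairs that co-occur in provenance windows often enough become prefetch rules (the window format, threshold, and function names are assumptions, not the paper's actual design):

```python
from collections import defaultdict
from itertools import combinations

def build_correlation_table(windows, threshold=2):
    """Count how often two files co-occur in a provenance window and
    keep pairs whose co-occurrence count reaches the threshold."""
    counts = defaultdict(int)
    for window in windows:
        for a, b in combinations(sorted(set(window)), 2):
            counts[(a, b)] += 1
    table = defaultdict(set)
    for (a, b), c in counts.items():
        if c >= threshold:
            table[a].add(b)
            table[b].add(a)
    return table

def prefetch_candidates(table, accessed_file):
    """On a metadata request, aggressively prefetch every correlated file."""
    return table.get(accessed_file, set())
```

On an access to a file, everything correlated with it is prefetched at once, which matches the "aggressive" policy named in the abstract.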
  • LIU Lu, CAO Yuesheng, DUO Ruihua
    Computer Engineering. 2016, 42(6): 7-13. https://doi.org/10.3969/j.issn.1000-3428.2016.06.002
    Aiming at the signal integrity deficiencies in the Printed Circuit Board (PCB) design of a high-density Fourteen Data Rate (FDR) interconnection switch, this paper proposes a solution covering material selection, stackup assignment, wiring and anti-crosstalk measures. The material for PCB manufacture and the maximum channel length are determined from the conductor loss and dielectric loss characteristics of three typical high-speed materials. The wiring policy within the BGA area and the stackup are designed according to the number of wires and their width requirements. Based on theoretical analysis and calculation, the paper proposes wiring guidelines such as the width of the differential lines, the spacing between them, the placement of differential vias, and the minimum distance from a differential via to the partition lines of the power layer. Under implementation constraints, tradeoff policies are adopted, such as using striplines for wiring and reserving one stub side without backdrilling. Simulation results show that the FDR interconnection switch board designed according to these guidelines is successfully used in several supercomputer systems including TianHe-2, and solves the signal integrity problems of FDR high-speed PCB design.
  • XU Jian, LI He, GONG Donglei, FANG Ming
    Computer Engineering. 2016, 42(6): 14-20. https://doi.org/10.3969/j.issn.1000-3428.2016.06.003
    Bus interfaces implemented with ASIC chips suffer from miscellaneous configurations, inflexible hardware and software upgrades, chip monopoly and discontinued production, and an increasingly evident bottleneck in volume and power consumption. To solve these problems, this paper introduces the design of a partially reconfigurable intelligent I/O interface based on the ZYNQ-7000 series Field Programmable Gate Array (FPGA) from Xilinx. Using programmable System-on-Chip (SoC) technology, the PetaLinux development environment and the Vivado 2014.4 development tool, and taking RS232, RS422 and CAN bus interfaces as examples, the user can send bus interface configuration instructions via TCP/IP network data packets and dynamically switch the corresponding partial bit stream file, thus achieving on-demand configuration of and communication over each interface. Simulation results show that combining partial reconfiguration technology with SoC technology makes the product design process more flexible and reduces dependence on hardware, update cost, and resource and power consumption. To a certain extent, it also enhances product safety and reliability.
  • ZHAI Xiaofang, LIU Quanming, CHENG Yaodong, LI Haibo
    Computer Engineering. 2016, 42(6): 21-26. https://doi.org/10.3969/j.issn.1000-3428.2016.06.004
    In recent years, international researchers have proposed an Altmetrics measurement method based on social networks, which can accurately reflect the social impact of literature and compensate for the inherent defects of the traditional citation-based method, such as poor timeliness, incomplete evaluation, and the Matthew effect. To apply the Altmetrics method in the domestic academic environment, this paper uses both traditional citation-based indices and social network-based Altmetrics indices. Through statistics and calculation over metric sources in the domestic academic network environment, and the integrated use of the Delphi and Principal Component Analysis (PCA) weighting methods to obtain the weight of each influence factor, it designs an integrated literature measurement model suitable for the domestic academic environment. The results show that the combined subjective and objective weighting method compensates for the defects of either alone and yields results closer to the real situation, and that the citation ranking calculated by the ALS integrated measurement model is basically consistent with that of the traditional citation-based method.
  • GE Qiang, CHEN Qiancheng, ZHOU Ke, ZANG Wenqian, YAN Yunguang, FANG Xin
    Computer Engineering. 2016, 42(6): 27-30. https://doi.org/10.3969/j.issn.1000-3428.2016.06.005
    Transferring remote sensing data in the traditional client/server mode causes heavy server load and slow transmission. To solve this problem, this paper presents a fast remote sensing data transmission policy. It dynamically adjusts the number of download tasks according to changes in the download rate, and selects appropriate nodes to provide users with resource download service by retrieving client resources within the access network, thus improving the user's download speed. Experimental results show that the policy improves the rate of acquiring remote sensing data and the user's download experience. Especially with a large number of concurrent downloads, it better meets the speed requirements of remote sensing data transmission.
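The dynamic adjustment of the download task count might look like the following minimal controller sketch — purely illustrative, with the probe/back-off thresholds and limits chosen here as assumptions rather than taken from the paper:

```python
def adjust_tasks(current_tasks, prev_rate, new_rate, min_tasks=1, max_tasks=16):
    """AIMD-style controller: add a download task while the aggregate
    rate keeps improving, back off when adding tasks stops paying."""
    if new_rate > prev_rate * 1.05:        # rate still improving: probe upward
        return min(current_tasks + 1, max_tasks)
    if new_rate < prev_rate * 0.95:        # rate dropped: halve concurrency
        return max(current_tasks // 2, min_tasks)
    return current_tasks                   # plateau: hold steady
```

Called once per measurement interval, this converges the concurrency level toward the point where additional tasks no longer raise the aggregate download rate.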
  • WANG Shuang, FENG Zhiyong
    Computer Engineering. 2016, 42(6): 31-36. https://doi.org/10.3969/j.issn.1000-3428.2016.06.006
    Because traditional biological analysis methods cannot handle semantic information effectively, this paper applies a node similarity algorithm based on attribute co-occurrence to the ChEMBL database and constructs a bipartite graph of natural products and activities. Then, within the GraphLab framework, it calculates natural product similarity based on activity and recommends natural products with high similarity. Experimental results show that the method can effectively use the semantic information of biological datasets to discover the potential activities of natural products, thus guiding activity detection and drug target discovery and selection in the early stage of drug research.
  • ZHANG Peng, ZHU Li, DU Xiaozhi, HE Chaohui, CHEN Hao
    Computer Engineering. 2016, 42(6): 37-42. https://doi.org/10.3969/j.issn.1000-3428.2016.06.007
    In the high-radiation cosmic environment, rays often cause transient faults in space computers, and one major effect of these faults is the Control Flow Error (CFE). In this paper, a CFE detection algorithm based on structural tags is proposed, aimed at the high computational complexity, detection lag and inflexible configuration of existing CFE detection algorithms. It uses two signatures and introduces a double-instruction loop into the structure of basic blocks in order to detect CFEs occurring both between and within basic blocks, and it solves the problem of inter-block detection lag. Experimental results show that, compared with similar algorithms and under equivalent error-detecting capacity, the proposed algorithm decreases space overhead by 49.3%, decreases time overhead by 17%~45.3%, and increases fault coverage by 6.2%~8.6%.
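The signature mechanism underlying this family of detectors can be sketched as follows — a generic signature-register check in the style of classical control-flow checking, not the paper's specific double-instruction-loop scheme; all names here are illustrative:

```python
class SignatureMonitor:
    """Runtime signature register: each basic block i carries a static
    signature s[i]; a legal edge j -> i updates the register with the
    difference d = s[j] XOR s[i], so the register matches s[i] only if
    control arrived over a legal edge."""

    def __init__(self, signatures, edges):
        self.s = signatures        # block id -> static signature
        self.legal = edges         # set of legal (from, to) pairs
        self.G = None              # runtime signature register

    def enter(self, block, came_from=None):
        if came_from is None:      # program entry point
            self.G = self.s[block]
            return True
        # For an illegal edge there is no precomputed difference;
        # modeled here as d = 0, so the register keeps the old value.
        d = (self.s[came_from] ^ self.s[block]
             if (came_from, block) in self.legal else 0)
        self.G ^= d
        return self.G == self.s[block]   # mismatch => control flow error
```

A jump that skips the legal successor leaves the register holding the wrong signature, which the check at the next block entry flags.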
  • LEI Pengbin, WANG Ling, WU Yu, HUANG Zihong, LI Lanhua
    Computer Engineering. 2016, 42(6): 43-47,54. https://doi.org/10.3969/j.issn.1000-3428.2016.06.008
    To improve the real-time performance of the Common Object Request Broker Architecture (CORBA) in Software Defined Radio (SDR) systems, this paper chooses the TAO middleware as its research object. It tests the performance of TAO and analyzes the main causes of long transmission delay in SDR systems. To optimize the internal function call process of the TAO Object Request Broker (ORB), an optimized design scheme is proposed that simplifies horizontal function calls while keeping vertical function calls as far as possible. By establishing connections and handling messages in pushpacket, as well as cutting out some simple classes, the time overhead of function calls is reduced. Experimental results show that the transmission delay of the optimized TAO is decreased significantly, and its real-time performance meets the needs of SDR systems well.
  • KAN Wenxiao, WANG Cong, XU Qi, DU Ran, Andrei Tsaregorodtsev, CHEN Gang
    Computer Engineering. 2016, 42(6): 48-54. https://doi.org/10.3969/j.issn.1000-3428.2016.06.009
    Traditional cluster computing resources can only partly meet the demand for massive data processing in high-energy physics. A high-energy physics computing system based on desktop grid technology is proposed. The system uses virtualization technology to deploy high-energy physics data processing, designs an image metadata service, and manages a remote image database. The system is built on DIRAC, BOINC and 3G-bridge: BOINC provides the new computing resource platform, 3G-bridge handles job requests, and DIRAC is used for job submission. Test and analysis results show that the new system is stable, reliable and efficient for data processing, with good computing performance and scalability.
  • CHEN Weijian, GUO Yong, YIN Fei
    Computer Engineering. 2016, 42(6): 55-59,67. https://doi.org/10.3969/j.issn.1000-3428.2016.06.010
    Simultaneous Multithreading (SMT) allows independent instructions from multiple threads to execute simultaneously. It combines Thread-Level Parallelism (TLP) with Instruction-Level Parallelism (ILP) and further enhances processor performance. In SMT design, an architecture simulator can be used for feasibility analysis and correctness verification. Based on the Shenwei multi-core functional simulator, this paper puts forward a design method for a Shenwei SMT functional simulator and realizes it. Simulation results show that the design and realization of the simulator are correct. An RTL-level real-time verification platform is built on the Shenwei SMT functional simulator, which therefore has high application value in processor design and verification.
  • ZHANG Jing, LI Guoqing, YU Wenyang
    Computer Engineering. 2016, 42(6): 60-67. https://doi.org/10.3969/j.issn.1000-3428.2016.06.011
    The dispersion and heterogeneity of Earth Observation (EO) data sources make the discovery of and access to EO data strenuous and time-consuming. Aiming at this problem, a data discovery system integrated with OpenSearch EO data services is proposed. It integrates EO data sources based on the OpenSearch standard and provides a unified discovery interface for EO data. It extends the query mode to an unlimited set of query keywords, establishes a standard metadata model, and implements fast response over a massive query keyword collection together with unified presentation of multi-source search results through a Trie tree and a mapping mechanism. It uses Java Emitter Templates (JET2) technology to generate XML descriptions of exogenous data services and keep the system complete. A prototype system is implemented, and experimental results show that it provides accurate results for multi-source data set search, provides a unified metadata model, and reduces time cost by 40% compared with the one-by-one search approach.
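The Trie tree used for fast response over a large keyword collection works roughly as below — a minimal dictionary-based sketch (the keyword examples are invented; the paper's actual structure and mapping mechanism are not shown here):

```python
class Trie:
    """Minimal trie: each node is a dict of child characters; the key
    '$' marks the end of a stored keyword. Lookup cost is proportional
    to the keyword length, independent of collection size."""

    def __init__(self):
        self.root = {}

    def insert(self, word):
        node = self.root
        for ch in word:
            node = node.setdefault(ch, {})
        node["$"] = True                 # end-of-word marker

    def contains(self, word):
        node = self.root
        for ch in word:
            if ch not in node:
                return False
            node = node[ch]
        return "$" in node

    def starts_with(self, prefix):
        node = self.root
        for ch in prefix:
            if ch not in node:
                return False
            node = node[ch]
        return True
```

This is why a trie scales to massive keyword sets: response time depends only on the query string, not on how many keywords are stored.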
  • LEI Xiaofeng, LI Qiang, SUN Zhenyu, SUN Gongxing
    Computer Engineering. 2016, 42(6): 68-74,80. https://doi.org/10.3969/j.issn.1000-3428.2016.06.012
    To make full use of I/O resources and improve data analysis efficiency, according to the features of the data analysis procedure and data storage, this paper develops new C++ interfaces for accessing HBase through the Java Native Interface (JNI) and provides a fully localized data analysis platform for data access. Meanwhile, it redesigns and implements the related algorithms and software components of MapReduce, enabling optimal allocation and combination of Mapper tasks to improve CPU utilization. In addition, it provides new user-friendly interfaces by integrating the data analysis environment, the job management system and the ROOT graphics module. Test results show that the new platform is faster and more scalable than traditional data analysis systems based on file storage.
  • ZHANG Weiwei, GUO Jun, CHEN Suiyang
    Computer Engineering. 2016, 42(6): 75-80. https://doi.org/10.3969/j.issn.1000-3428.2016.06.013
    To avoid network segmentation in Mobile Wireless Sensor Networks (MWSN), a distributed node path selection algorithm based on the right-hand rule is proposed. The Sink node is mainly used to initialize the network. The algorithm uses the right-hand rule to calculate the position and orientation of mobile sensor nodes so as to maintain network connectivity, and each sensor node can send its sensed data back to the Sink node, improving the coverage of the MWSN. Experimental results show that the algorithm avoids network segmentation and improves the survival time and overall efficiency of the MWSN.
  • GE Liufei, LI Keqing, DAI Huan
    Computer Engineering. 2016, 42(6): 81-85,90. https://doi.org/10.3969/j.issn.1000-3428.2016.06.014
    The random fluctuation of signal strength makes it difficult for a Generalized Regression Neural Network (GRNN) to choose optimal parameters for establishing the location model and predicting the target location. For this reason, a location algorithm with an adaptive GRNN is put forward. The method introduces the Improved Artificial Bee Colony (IABC) algorithm to optimize the GRNN parameter, and applies the network to wireless indoor location to learn the mapping between signal characteristics and target locations, which reduces the influence of random signal strength fluctuation on location accuracy. Experimental results show that the average location error of the proposed algorithm is 0.65 m over a 12 m×12 m area. Compared with GRNN based on the Artificial Bee Colony algorithm (ABC-GRNN) and GRNN based on Particle Swarm Optimization (PSO-GRNN), its location accuracy is increased by 21.3% and 23.1% respectively, with the fastest convergence speed. Compared with the path loss model and the BP neural network, its location accuracy is increased by 17.86% and 3.1% respectively, effectively improving location accuracy.
  • CHEN Kangrun, LIU Yang, ZHANG Wei
    Computer Engineering. 2016, 42(6): 86-90. https://doi.org/10.3969/j.issn.1000-3428.2016.06.015
    This paper proposes a low-complexity Peak-to-Average Power Ratio (PAPR) reduction algorithm to deal with the high PAPR of a radar-communication integrated Orthogonal Frequency Division Multiplexing (OFDM) system based on the Fractional Fourier Transform (FRFT). The algorithm reduces system PAPR with a clipping method at the transmitter, then applies Compressive Sensing (CS) at the receiver and reconstructs the clipped signal with an improved Orthogonal Matching Pursuit (OMP) algorithm. Simulation results indicate that, at high Signal-to-Noise Ratio (SNR), the proposed algorithm avoids the signal distortion and out-of-band radiation of the plain clipping method, guarantees Bit Error Rate (BER) performance and reduces PAPR effectively. Moreover, it achieves lower computational complexity than the traditional Basis Pursuit (BP) algorithm.
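The transmitter-side clipping step can be illustrated with a small numeric sketch — generic amplitude clipping of an OFDM-like signal, with the IDFT, clipping ratio and test vector chosen here for illustration (the receiver-side CS reconstruction is not shown):

```python
import cmath
import math

def idft(symbols):
    """Inverse DFT: map N subcarrier symbols to N time-domain samples."""
    n = len(symbols)
    return [sum(symbols[k] * cmath.exp(2j * math.pi * k * t / n)
                for k in range(n)) / n for t in range(n)]

def papr_db(x):
    """Peak-to-average power ratio of a sample sequence, in dB."""
    peak = max(abs(v) ** 2 for v in x)
    mean = sum(abs(v) ** 2 for v in x) / len(x)
    return 10 * math.log10(peak / mean)

def clip(x, ratio):
    """Limit each sample's amplitude to ratio * RMS, keeping its phase."""
    rms = math.sqrt(sum(abs(v) ** 2 for v in x) / len(x))
    limit = ratio * rms
    return [v if abs(v) <= limit else v / abs(v) * limit for v in x]
```

Clipping caps the peaks and so lowers PAPR, but distorts the signal; that distortion is what the paper's CS-based receiver reconstructs away.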
  • CHEN Liuwei, LIANG Jun, ZHU Wei
    Computer Engineering. 2016, 42(6): 91-95. https://doi.org/10.3969/j.issn.1000-3428.2016.06.016
    A distributed relay selection strategy based on the Amplify-and-Forward (AF) mode is proposed to address the unfairness between relay nodes in a satellite-terrestrial cooperative system. It analyzes the Land Mobile Satellite (LMS) channel, derives a closed-form expression for the outage probability, and balances the relaying probability by finding the weight coefficient that achieves the required fairness for each relay node. In the AF mode, signals from the satellite and the terrestrial relay are combined through Maximum Ratio Combining (MRC), thereby balancing the power overhead of each relay node. Simulation results show that, compared with the centralized relay selection strategy and the best-path relay selection strategy, the proposed strategy improves fairness and reduces computational complexity.
  • ZHAO Huiqing, WAN Zhiping
    Computer Engineering. 2016, 42(6): 96-100. https://doi.org/10.3969/j.issn.1000-3428.2016.06.017
    On the basis of the traditional wavelet neural network blind equalization algorithm, this paper proposes an adaptive step size blind equalization algorithm based on dynamic parameter adjustment. According to the size of the equalizer output signal, it exploits the relationship between output signal power and convergence properties to realize dynamic adjustment of the iterative step size factor. It selects the tunable parameters through several comparative experiments and overcomes the mutual restriction between convergence rate and convergence precision. Experimental results show that the performance of the algorithm is basically consistent with the expected results. Especially with more iterations, it achieves a faster convergence rate and higher convergence accuracy than the traditional wavelet neural network blind equalization algorithm.
  • HE Jianwen, ZHOU Jipeng
    Computer Engineering. 2016, 42(6): 101-107. https://doi.org/10.3969/j.issn.1000-3428.2016.06.018
    Most existing time slot assignment algorithms do not adopt effective classification and statistical approaches for time slots, which greatly reduces slot reuse efficiency and therefore affects the regular transmission of multimedia data flows in Mobile Ad Hoc Networks (MANET). For this reason, after classifying the time slots between links on a route according to their usage, this paper proposes an end-to-end Quality of Service (QoS) bandwidth-guaranteed time slot assignment algorithm. During slot assignment it first considers time slots labeled Collision Free (CF) and then those labeled Collision Exist (CE), which satisfies the QoS bandwidth requirement of the route while maximizing time slot utilization. Simulation results show that, compared with the heuristic time slot selection algorithm (Heuristic) and the Least Cost First (LCF) algorithm, the proposed algorithm reallocates time slots effectively and achieves a higher average success rate for routes with QoS bandwidth requirements.
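The CF-before-CE preference can be sketched as a small greedy routine — a simplified single-link view with invented slot labels, not the paper's full end-to-end algorithm:

```python
def assign_slots(demand, slots):
    """Greedy slot assignment for one link: take Collision Free (CF)
    slots first, fall back to Collision Exist (CE) slots, and fail if
    the QoS bandwidth demand still cannot be met.
    `slots` maps slot id -> 'CF' or 'CE'."""
    cf = [s for s, label in slots.items() if label == "CF"]
    ce = [s for s, label in slots.items() if label == "CE"]
    chosen = cf[:demand] + ce[:max(0, demand - len(cf))]
    return chosen if len(chosen) == demand else None
```

Preferring CF slots keeps collision-prone CE slots free for links that have no safer option, which is the reuse-efficiency idea the abstract describes.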
  • ZHANG Fei, GENG Hongqin
    Computer Engineering. 2016, 42(6): 108-112. https://doi.org/10.3969/j.issn.1000-3428.2016.06.019
    To improve the energy efficiency of Wireless Sensor Networks (WSN), an energy utility optimization algorithm based on a random probability statistical model is proposed. Based on the Gamma function and the modified Bessel function, the algorithm builds a probability model reflecting fading severity. It then uses training sequence packets to estimate and synchronize the channel, and analyzes the energy levels of the two-channel data transmission mechanism. Finally, according to the severity of current channel fading and shadowing, it selects the best transmit energy level to improve the energy efficiency per bit. Experimental results show that, compared with the energy optimization algorithm based on a node awareness mechanism and the energy efficiency algorithm based on IEEE 802.15.4 mobile sensor networks, the average bit error rate is decreased by 276.3% and 147.6% respectively, and the total network energy efficiency is increased by 26.7% and 29.2% respectively. The algorithm is thus effective in reducing the bit error rate of data transmission and improving network energy efficiency.
  • LIU Jianxun, CHENG Zijing, LU Xiang, LIN Kai, WANG Chong
    Computer Engineering. 2016, 42(6): 113-119. https://doi.org/10.3969/j.issn.1000-3428.2016.06.020
    IP technology can be used to interconnect space and ground networks when constructing a space-ground integrated network. However, Space Data Links (SDL) are characterized by long delay and frequent interruption. Aiming at this problem, a space and ground network interconnection simulation system is proposed. To improve network transmission performance, segmented Transmission Control Protocol (TCP) technology is adopted at the transport layer, and a space-ground gateway implements protocol conversion at the data link layer or the transport layer. Based on this system, the paper tests the transmission speed of TCP and the Space Communications Protocol Specification-Transport Protocol (SCPS-TP) and their performance in supporting application services. Experimental results show that, compared with the TCP protocol, segmented TCP technology meets the transmission performance requirements of video, voice and other services and effectively supports a variety of application services.
  • GU Ting, DU Weizhang
    Computer Engineering. 2016, 42(6): 120-124. https://doi.org/10.3969/j.issn.1000-3428.2016.06.021
    Existing publicly verifiable multi-secret sharing schemes without a trusted center cannot make sub-secrets both self-selected and periodically renewable. To solve this problem, a publicly verifiable and renewable multi-secret sharing scheme without a trusted center is proposed. Every participant selects a sub-secret and generates a shadow secret from it. Shadow secrets can be transmitted over a public channel based on a signcryption algorithm, and the validity of the distributed shadow secrets can be checked by anyone. One-way hash chains are used to make the shadow sub-secrets renewable. The correctness and security of the scheme are analyzed, and cheating within the scheme can be detected. Compared with existing schemes, the proposed scheme needs no trusted center, participants can select their sub-secrets themselves, and multiple secrets can be publicly verified, renewed and shared.
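The one-way hash chain that makes shares renewable can be sketched as follows — the standard construction (publish the anchor once, release values backwards along the chain each period), with SHA-256 and these function names as illustrative choices:

```python
import hashlib

def h(data):
    return hashlib.sha256(data).digest()

def make_chain(seed, periods):
    """Build a one-way hash chain x_0 <- x_1 <- ... <- x_periods with
    x_i = H(x_{i+1}); the anchor x_0 = H^periods(seed) is published."""
    chain = [seed]
    for _ in range(periods):
        chain.append(h(chain[-1]))
    chain.reverse()              # chain[0] is the public anchor
    return chain

def verify_update(anchor, value, period):
    """Anyone can check a period-`period` value by hashing it back
    down to the published anchor."""
    for _ in range(period):
        value = h(value)
    return value == anchor
```

Because H is one-way, knowing the value for period i lets anyone verify it against the anchor, but nobody can compute the value for period i+1 ahead of time — which is exactly what periodic renewal needs.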
  • LI Jindong, WANG Tao, WU Yang, LEI Dong
    Computer Engineering. 2016, 42(6): 125-130. https://doi.org/10.3969/j.issn.1000-3428.2016.06.022
    Encrypted Session Initiation Protocol (SIP) traffic is difficult to identify and related research is scarce, which makes intrusion detection and network traffic monitoring inconvenient. Aiming at these problems, this paper proposes a SIP identification model based on Principal Component Analysis (PCA) and a Learning Vector Quantization (LVQ) network. By applying PCA to the network traffic properties of SIP, it extracts the flow characteristics whose cumulative contribution rate exceeds 85% as the main features for SIP identification. It then trains the LVQ network and builds a complete SIP identification model. Results show that the PCA-LVQ model identifies SIP with a recognition rate higher than 90%, indicating that the SIP flow properties extracted by PCA distinguish SIP from non-SIP traffic. The model is effective for identifying SIP.
  • GUO Fan, ZHOU Xuan
    Computer Engineering. 2016, 42(6): 131-138. https://doi.org/10.3969/j.issn.1000-3428.2016.06.023
    Taint analysis usually uses approximate or simplified methods to analyze large-scale programs, so its results are imprecise. To solve this problem, by extending the current definition of dependency relations and modeling the dependency relations between method parameters and between heap variables and parameters, this paper presents a new method to construct a demand-driven data dependency graph for J2EE programs. The method uses a predefined taint analysis and the extended dependency definition to build data dependency edges, and a multistage analysis traverses dependency paths in the graph so as to analyze large-scale programs efficiently. Experimental results show that the method considerably improves analysis precision and time performance compared with the Taint Analysis for Java (TAJ) method.
  • ZHANG Hongjun, LIU Ke, MOU Zhansheng
    Computer Engineering. 2016, 42(6): 139-143,150. https://doi.org/10.3969/j.issn.1000-3428.2016.06.024
    Secret sharing algorithms based on traditional mathematical assumptions are difficult to secure against quantum algorithms. Therefore, a lattice-based threshold secret sharing algorithm is proposed. The basic process of the secret sharing strategy is analyzed and divided into public key generation, share generation and secret reconstruction. The lattice-based threshold secret sharing algorithm is then discussed: it can be reduced to the Closest Vector Problem, and its security is analyzed on that basis. Calculation results prove that the algorithm is correct and secure.
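For context on the (t, n) threshold structure the lattice scheme generalizes, here is classical Shamir sharing over a prime field — deliberately the textbook construction, not the paper's lattice-based algorithm; the modulus and parameters are illustrative:

```python
import random

P = 2 ** 61 - 1   # a Mersenne prime used as the field modulus

def make_shares(secret, t, n):
    """(t, n) threshold sharing: a random degree t-1 polynomial with
    constant term `secret`, evaluated at x = 1..n."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x):
        acc = 0
        for c in reversed(coeffs):    # Horner evaluation mod P
            acc = (acc * x + c) % P
        return acc
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret from any
    t shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret
```

Any t of the n shares determine the polynomial and hence the secret; fewer than t reveal nothing. The lattice scheme keeps this threshold behavior while basing security on a lattice problem instead of field arithmetic.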
  • LIN Chunli, CUI Jie
    Computer Engineering. 2016, 42(6): 144-150. https://doi.org/10.3969/j.issn.1000-3428.2016.06.025
    Wireless Sensor Networks (WSN) use multipath routing and data splitting to improve intrusion tolerance: security improves with more paths and more data fragments, but at greater computing, storage and energy cost. Aiming at these problems, this paper presents an adaptive multipath secure routing protocol that provides transmission schemes matched to the data security level. First, the protocol uses neighbor nodes two hops away from the source node as identities to build multiple disjoint routes from the source node to the destination, and assesses the safety of each path based on on-demand routing. Second, the base station selects routes and data fragmentation parameters according to the data security requirements. Finally, the source node uses a weighted threshold secret sharing scheme to fragment the data into multiple shares and transfers them to the base station over multiple paths. Analysis results show that the protocol reduces nodes' computing time and storage space compared with routing protocols that fragment data using a (t, n) threshold scheme and RS coding.
  • CHEN Long, WANG Lisong
    Computer Engineering. 2016, 42(6): 151-155,160. https://doi.org/10.3969/j.issn.1000-3428.2016.06.026
    Traditional hazard analysis methods for Integrated Modular Avionics (IMA) reconfiguration analyze a static system structure: they cannot analyze failure conditions during dynamic IMA reconfiguration, and the classical Petri net based method must generate the complete reachability graph for dynamic analysis, whose state space easily becomes too large. Therefore, a new hazard analysis method for IMA reconfiguration is proposed. Based on the AADL model of the IMA reconfiguration control process, the reconfiguration control flow model is transformed into a Petri net model, a backwards critical state algorithm for Petri nets is applied, and the hazards of the IMA reconfiguration function are analyzed via reachability analysis of the Petri net. Analysis results show that the method overcomes the weakness of traditional system function hazard analysis, which can hardly analyze runtime hazard behavior, and avoids the state explosion problem of traditional Petri net reachability analysis, while identifying the critical control elements that can cause high-risk states.
  • WANG Xiaokai, SUN Yi, GUO Dabo, WANG Yunyan
    Computer Engineering. 2016, 42(6): 156-160. https://doi.org/10.3969/j.issn.1000-3428.2016.06.027
    In quantum cryptography communication, non-optimized Low Density Parity Check (LDPC) degree distributions perform poorly, yielding low reconciliation efficiency and short transmission distance. Aiming at this problem, a data reconciliation algorithm for continuous-variable quantum key distribution is proposed. Taking advantage of the Extrinsic Information Transfer (EXIT) chart, and starting from the results of Gaussian approximation based on Particle Swarm Optimization (PSO), the algorithm selects better-performing degree distributions by comparing threshold values on the EXIT chart and thus optimizes the LDPC code. Simulation results show that, compared with the density evolution algorithm and the Gaussian approximation algorithm, the EXIT chart optimizes the LDPC degree distribution effectively. Meanwhile, when the number of continuous-variable packets is 2×10^5, the optimized LDPC codes reduce the convergence Signal-to-Noise Ratio (SNR) and improve the reconciliation efficiency and maximum transmission distance compared with the non-optimized codes.
  • TANG Yan, LIU Ruiqi, YANG Panlong, FAN Xiaochen, LI Qingyu
    Computer Engineering. 2016, 42(6): 161-166. https://doi.org/10.3969/j.issn.1000-3428.2016.06.028
    Crowd sensing networks achieve large-scale sensing by using users' existing sensing equipment and already deployed communication networks, thereby efficiently addressing the high cost of large-scale networks. However, during task transfer and allocation among mobile users, group collusion as well as task copying and forwarding may seriously threaten specific users. Aiming at this problem, this paper proposes a secure task distribution technique for crowd sensing networks. Its main idea is to apply the d-choice method from balls-into-bins theory to the task allocation process, achieving load balancing across all users while an established threshold strategy ensures the security of user tasks. Simulation results show that, compared with random task allocation, the proposed technique balances the task load more efficiently and thus ensures task safety.
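The d-choice idea from balls-into-bins theory can be sketched in a few lines — each task samples d candidate users and goes to the least-loaded one; the parameters and the threshold-strategy part of the paper are omitted here:

```python
import random

def d_choice_allocate(num_tasks, num_users, d=2, rng=random):
    """Balls-into-bins with d choices: each task samples d users and
    goes to the currently least-loaded one. With d >= 2 the maximum
    load stays very close to the average, unlike single random choice."""
    load = [0] * num_users
    for _ in range(num_tasks):
        candidates = rng.sample(range(num_users), d)
        target = min(candidates, key=lambda u: load[u])
        load[target] += 1
    return load
```

The classical result is that moving from one random choice to two drops the expected gap between maximum and average load from O(log n / log log n) to O(log log n), which is why even d=2 balances well.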
  • JIANG Yanxia,WU Tengfei,LIU Ziyuan
    Computer Engineering. 2016, 42(6): 167-170. https://doi.org/10.3969/j.issn.1000-3428.2016.06.029
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    Based on the Local Maximal Margin Discriminant Embedding (LMMDE) method, an algorithm named Weighted Neighborhood Maximum Margin Discriminant Embedding (WNMMDE) is proposed. It is a manifold-based feature extraction algorithm that preserves the neighborhood geometric structure of the data while constructing the objective function from the optimal reconstruction coefficients of the data. In addition, the algorithm does not need to compute the inverse of a high-dimensional matrix and thus overcomes the small-sample-size problem in feature extraction. Recognition experiments on two common face image databases show that the proposed algorithm makes full use of the discriminant information of each manifold, minimizes the distance between neighboring nodes of the same class, and increases the distance between neighboring nodes of different classes, which effectively distinguishes the categories and achieves better recognition results.
  • ZHANG Xiaobing,LI Yanping,WANG Shuangjie
    Computer Engineering. 2016, 42(6): 171-174. https://doi.org/10.3969/j.issn.1000-3428.2016.06.030
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    Aiming at the problem that speech endpoints are difficult to detect against different noise backgrounds, especially non-stationary noise, an effective endpoint detection method based on the instantaneous energy of the Hilbert-Huang Transform (HHT) is proposed for low Signal-to-Noise Ratio (SNR) environments. Each frame of the signal is decomposed into a finite number of Intrinsic Mode Functions (IMF) by Empirical Mode Decomposition (EMD); the instantaneous energy of IMF1 is calculated and combined with IMF3 to detect the speech endpoints. The proposed method can effectively extract the noise component in non-stationary noise, avoiding the limitation of the traditional approach of taking the first several frames as noise, while selecting IMF3 for endpoint detection achieves a filtering effect. Experimental results show that the algorithm improves endpoint detection accuracy under different noise backgrounds and low-SNR conditions.
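    As a simplified illustration of energy-based endpoint detection, the sketch below uses plain short-time frame energy in place of the paper's EMD/IMF instantaneous energy; the frame length, noise-floor estimate and threshold ratio are hypothetical:

```python
def frame_energies(signal, frame_len=160):
    """Short-time energy per frame (a stand-in for the
    instantaneous IMF1 energy used in the paper)."""
    return [sum(x * x for x in signal[i:i + frame_len])
            for i in range(0, len(signal) - frame_len + 1, frame_len)]

def detect_endpoints(signal, frame_len=160, ratio=4.0, noise_frames=3):
    """Mark frames whose energy exceeds ratio * the noise floor
    estimated from the first few frames; return the first and last
    voiced frame indices, or None if no frame is voiced."""
    e = frame_energies(signal, frame_len)
    noise = sum(e[:noise_frames]) / noise_frames + 1e-12
    voiced = [i for i, v in enumerate(e) if v > ratio * noise]
    return (voiced[0], voiced[-1]) if voiced else None
```

Note that this sketch keeps the very limitation the paper removes: it still estimates the noise floor from the first few frames, whereas the EMD-based method extracts the noise component adaptively.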
  • LI Bo,CHEN Zhigang,HUANG Rui,ZHENG Xiangyun
    Computer Engineering. 2016, 42(6): 175-179,184. https://doi.org/10.3969/j.issn.1000-3428.2016.06.031
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    The popularity of the Internet and electronic music resources makes it easier for people to obtain music. However, as music libraries grow larger and richer, it becomes difficult to find favorite music accurately and timely, so music sites particularly need a suitable recommendation algorithm. Addressing the deficiencies of music recommendation based on audio information and of collaborative filtering, this paper analyzes users' listening and download data and, combined with the Latent Dirichlet Allocation (LDA) topic mining model, proposes a music recommendation algorithm named MR_LDA. Experimental results show that, compared with user-based and item-based collaborative filtering, the MR_LDA algorithm recommends music of interest to users more efficiently.
  • SHI Nianyun,GE Xiaowei,MA Li
    Computer Engineering. 2016, 42(6): 180-184. https://doi.org/10.3969/j.issn.1000-3428.2016.06.032
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    Aiming at the data sparsity and cold-start problems of the Collaborative Filtering (CF) algorithm, this paper puts forward an improved recommendation algorithm that combines user demographics with a trust mechanism. When calculating the similarity of user scores, it fuses user demographics to produce an overall similarity. In addition, it introduces a trust mechanism that considers both the local trust derived from users' interaction information and the global trust of a user in the whole system, and uses the hybrid value mixing the overall similarity and the trust degree as the recommendation weight. Experimental results show that the proposed algorithm effectively improves the prediction accuracy for cold-start users.
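    A minimal sketch of the hybrid weighting idea — rating similarity fused with demographic similarity, then mixed with a trust degree. The blending weights alpha and beta and the attribute-matching similarity are illustrative; the paper's exact fusion formula may differ:

```python
def cosine(u, v):
    """Cosine similarity between two rating vectors."""
    num = sum(a * b for a, b in zip(u, v))
    den = (sum(a * a for a in u) ** 0.5) * (sum(b * b for b in v) ** 0.5)
    return num / den if den else 0.0

def demographic_sim(d1, d2):
    """Fraction of matching demographic attributes (e.g. gender,
    age band, occupation)."""
    return sum(a == b for a, b in zip(d1, d2)) / len(d1)

def hybrid_weight(rating_sim, demo_sim, trust, alpha=0.5, beta=0.5):
    """Blend rating similarity with demographics into an overall
    similarity, then mix in trust to get the recommendation weight."""
    overall = alpha * rating_sim + (1 - alpha) * demo_sim
    return beta * overall + (1 - beta) * trust
```

For a cold-start user with few ratings, demo_sim and trust still carry signal even when rating_sim is unreliable, which is the point of the fusion.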
  • LIU Chao,HE Lijun,ZHU Guangyu
    Computer Engineering. 2016, 42(6): 185-190,195. https://doi.org/10.3969/j.issn.1000-3428.2016.06.033
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    A new fitness assignment strategy, the Fuzzy Relevance Entropy Method (FREM), is proposed for solving high-dimension multi-objective optimization problems. FREM integrates fuzzy information entropy theory with membership functions: the membership function transforms the ideal solution and the Pareto solutions into fuzzy sets, fuzzy information entropy measures the internal relation between the ideal-solution fuzzy set and each Pareto-solution fuzzy set, and the fuzzy relevance entropy coefficient serves as the fitness value guiding the evolution of the swarm intelligence algorithm. Experiments are conducted on the DTLZ test function set. The results show that FREM can solve high-dimension multi-objective optimization problems, avoids the influence of the increasing number of sub-objectives on the algorithm, and yields better solutions than the random weighting method and NSGA-II.
  • SHI Yongbin,YU Qingsong
    Computer Engineering. 2016, 42(6): 191-195. https://doi.org/10.3969/j.issn.1000-3428.2016.06.034
    Abstract ( ) Download PDF ( )   Knowledge map   Save

    New words and compound words are often not included in the dictionary of a text segmentation system, yet such words are strongly indicative of the theme. To address this problem, a keyword extraction algorithm based on the chi-square value of co-occurring words is proposed. Co-occurring word pairs are extracted from the associations among words established by the dependency parsing of the Language Technology Platform (LTP). The chi-square test is used to check whether obvious differences exist among the distributions of co-occurring words; pairs with more pronounced differences are more likely to be keywords. The algorithm is also valid for single words. Taking single words and co-occurring word pairs as candidate keywords, the algorithm extracts full-text keywords by considering the chi-square value, the word frequency and the number of candidate keywords. Experimental results show that the proposed algorithm outperforms the TextRank algorithm, with a keyword extraction precision of 38.07% and a co-occurring word accuracy of 80.15%.
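    The chi-square scoring step can be illustrated as below, assuming candidate counts per text section have already been extracted. Scoring against a uniform expectation is an illustrative simplification of the distribution test, and all names are hypothetical:

```python
def chi_square(observed, expected):
    """Pearson chi-square statistic between observed and expected counts."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected) if e > 0)

def keyword_scores(candidate_counts, total_per_section):
    """Score each candidate by how unevenly it is distributed across
    sections of the text: a higher chi-square against the uniform
    expectation suggests a stronger, more topical term."""
    scores = {}
    grand = sum(total_per_section)
    for word, counts in candidate_counts.items():
        n = sum(counts)
        expected = [n * t / grand for t in total_per_section]
        scores[word] = chi_square(counts, expected)
    return scores
```

A term concentrated in one section (a topical candidate) scores high; a term spread evenly (a common word) scores near zero, which matches the abstract's "obvious differences" criterion.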

  • SHI Weihang,LIN Nan
    Computer Engineering. 2016, 42(6): 196-200,207. https://doi.org/10.3969/j.issn.1000-3428.2016.06.035
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    Aiming at the problems of the enormous search space of feature sequences and the high computational complexity of feature learning for time-series data, this paper proposes a classification algorithm that jointly learns feature sequences and classification parameters. After feature transformation of the time-series data, it applies a linear classifier to learn model parameters from the minimum-distance matrix and predict the target variables. The objective function jointly learns the classifier's loss function and linear weights, and stochastic gradient descent is applied to solve the optimization problem. Experimental results show that, compared with F-Stat and the expression transformation method, the proposed algorithm achieves higher classification accuracy while keeping the execution time low.
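    A sketch of the distance-feature plus linear-classifier idea: each series is transformed into its minimum distances to candidate subsequences, and a logistic classifier is trained by SGD. Unlike the paper, the feature sequences here are fixed rather than learned jointly, and the learning rate and epoch count are arbitrary:

```python
import math

def min_distance(series, shapelet):
    """Smallest squared Euclidean distance between a candidate
    subsequence and any window of the series."""
    m = len(shapelet)
    return min(sum((series[i + j] - shapelet[j]) ** 2 for j in range(m))
               for i in range(len(series) - m + 1))

def transform(dataset, shapelets):
    """Minimum-distance matrix: one row per series, one column per
    candidate feature sequence."""
    return [[min_distance(s, sh) for sh in shapelets] for s in dataset]

def sgd_logistic(X, y, lr=0.1, epochs=200):
    """Plain SGD on the logistic loss over the distance features."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))
            g = p - yi                      # gradient of the logistic loss
            w = [wj - lr * g * xj for wj, xj in zip(w, xi)]
            b -= lr * g
    return w, b
```

Series containing a subsequence close to a discriminative shapelet get a small distance feature, so a linear boundary over the distance matrix separates the classes.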
  • LI Mingyao,YANG Jing
    Computer Engineering. 2016, 42(6): 201-207. https://doi.org/10.3969/j.issn.1000-3428.2016.06.036
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    Entity relation extraction is a part of Information Extraction (IE); its objective is to determine whether a semantic relationship exists between entities. Complex Chinese grammar, flexible expression and varied semantics make relations vague when verbs alone are used as relational expressions in Chinese. To overcome these limitations, this paper presents an open Chinese entity relation extraction method using dependency parsing. The method first performs dependency parsing on the input sentence; the dependency arcs indicate whether it is a verb-predicate sentence. If so, the relation expression is extracted with Chinese grammar heuristic rules, the locations of the arguments are determined according to distance, and the resulting triples are evaluated, with the qualified ones output. Experimental results on the SogouCA and SogouCS corpora show that the proposed method is suitable for large-scale corpora and has good performance and portability. Compared with the unsupervised clustering method based on the kernel tree, the F-measure is increased by 16.68%.
  • WEI Tao,JI Xinsheng
    Computer Engineering. 2016, 42(6): 208-212,217. https://doi.org/10.3969/j.issn.1000-3428.2016.06.037
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    In the image recognition field, labeled data is far scarcer than unlabeled data. To make full use of unlabeled data to improve image recognition, a Self-labeling Online Sequential Extreme Learning Machine (SLOSELM) algorithm is proposed. An Extreme Learning Machine (ELM) model is trained on the labeled data in the source domain and used to recognize the unlabeled data in the target domain. The high-confidence samples among the recognition results are selected to adaptively adjust the trained ELM model with the SLOSELM algorithm, which enhances the recognition accuracy. Experimental results on real data sets show that the average recognition accuracy of the ELM model improves by about 18% after applying the SLOSELM algorithm, and the recognition time is shorter than that of the Co-training algorithm.
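    The self-labeling loop can be illustrated with a nearest-centroid model standing in for the ELM; the confidence measure (margin between the two nearest class distances) and the online averaging update are simplified assumptions, not the SLOSELM update rule:

```python
def fit_centroids(X, y):
    """Tiny stand-in for the source-domain model: class centroids."""
    cents = {}
    for xi, yi in zip(X, y):
        s, n = cents.get(yi, ([0.0] * len(xi), 0))
        cents[yi] = ([a + b for a, b in zip(s, xi)], n + 1)
    return {c: [v / n for v in s] for c, (s, n) in cents.items()}

def predict(cents, x):
    """Return (predicted class, confidence); confidence is the gap
    between the nearest and second-nearest centroid distances."""
    dists = {c: sum((a - b) ** 2 for a, b in zip(m, x)) for c, m in cents.items()}
    order = sorted(dists, key=dists.get)
    conf = dists[order[1]] - dists[order[0]] if len(order) > 1 else 1.0
    return order[0], conf

def self_label(cents, unlabeled, conf_min):
    """Accept high-confidence predictions as pseudo-labels and fold
    them back into the model, in the spirit of SLOSELM's online update."""
    for x in unlabeled:
        c, conf = predict(cents, x)
        if conf >= conf_min:
            m = cents[c]
            cents[c] = [(a + b) / 2 for a, b in zip(m, x)]  # move centroid toward sample
    return cents
```

The key safeguard is the confidence threshold: only target-domain samples the current model is sure about are allowed to update it, limiting error propagation.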
  • GE Shun,XIA Xuezhi
    Computer Engineering. 2016, 42(6): 213-217. https://doi.org/10.3969/j.issn.1000-3428.2016.06.038
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    Intelligence-assisted decision making is usually an integrated evaluation problem over multiple criteria related to a particular decision event. Aiming at this multi-criteria intelligence decision support problem, a model is proposed that establishes a Bayesian Network (BN) from cause-and-effect relationships. It infers the state probability of each component of the decision criterion vector from the probability of the decision event based on the BN model, defines a multi-criteria decision evaluation function from the decision objective, and uses this function to find the best option for the decision event. The model inherits the probabilistic quantitative reasoning ability of the BN and can analyze multiple criteria synthetically. Experimental results show that the model can find the optimal alternative for intelligence decision problems with multiple, intricately related criteria.
  • PENG Lihong,LI Zejun,CHEN Min,REN Rili
    Computer Engineering. 2016, 42(6): 218-223,229. https://doi.org/10.3969/j.issn.1000-3428.2016.06.039
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    This paper considers the complexity of the classifier and the geometric distribution of the data, and proposes a semi-supervised learning algorithm that predicts drug-target associations by combining the drug-target interaction network with drug structure similarity and target sequence similarity. Experimental results show that this algorithm has better prediction performance than the DBSI algorithm, the KBMF2K algorithm and others. The drug-target interactions predicted by the proposed algorithm are scored and sorted, and part of the interactions among the 30% highest-scoring predictions can be retrieved from KEGG, DrugBank, SuperTarget and ChEMBL.
  • SUN Pan,HU Weiling,LIU Jiquan,WANG Bin,DUAN Huilong,SI Jianmin
    Computer Engineering. 2016, 42(6): 224-229. https://doi.org/10.3969/j.issn.1000-3428.2016.06.040
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    Image registration of endoscope video is the most important problem for Computer-aided Diagnosis (CAD) in endoscopy. For gastroscopy, this paper presents an image registration method based on a homography assumption. It obtains initial matching point pairs by a feature point detection algorithm and uses Delaunay triangulation to cluster the initial matching point pairs into corresponding triangles. If a triangle patch satisfies the homography assumption, its three vertexes are marked as matching points; otherwise, the inscribed circle center of the corresponding triangle patch is registered with the epipolar constraint. Afterwards, the corresponding triangle patches are divided into smaller ones using the inscribed circle centers and vertexes, and the newly generated patches are fed into the next loop. The procedure terminates when no new patches or matching point pairs can be generated. Experimental results show that this method outperforms FAST, SIFT, SURF and GFTT in both distribution uniformity and registration precision.
  • LU Lai,WANG Junmin,FAN Rui
    Computer Engineering. 2016, 42(6): 230-234,240. https://doi.org/10.3969/j.issn.1000-3428.2016.06.041
    Abstract ( ) Download PDF ( )   Knowledge map   Save

    Classic descriptors such as the Scale-Invariant Feature Transform (SIFT) and Speeded-Up Robust Features (SURF) have drawbacks in storage cost and adaptive parameter learning, so a binary image descriptor based on AdaBoost is proposed, which obtains the descriptor through optimal learning. A general framework for learning image descriptors is developed, and a modified similarity function is derived from the threshold-response similarity function, by which real-valued and binary descriptors can be learned quickly. Weak learners are constructed from the gradient features of the image, and their optimal weights and non-linear responses are computed with the AdaBoost method. The resulting local feature descriptor is discriminative and robust. Image matching experiments show that the proposed binary descriptor occupies less storage space and offers good matching performance.
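    A compact AdaBoost-over-stumps sketch of the learning framework: each weak learner is a thresholded response, and its ±1 output supplies one bit of the binary descriptor. The scalar features, weak learner form and round count are illustrative, not the paper's gradient features:

```python
import math

def stump_predict(x, feat, thresh, sign):
    """Thresholded weak response in {-1, +1}."""
    return sign if x[feat] > thresh else -sign

def adaboost(X, y, rounds=10):
    """Standard AdaBoost over exhaustive threshold stumps."""
    n = len(X)
    w = [1.0 / n] * n
    learners = []
    for _ in range(rounds):
        best = None
        for f in range(len(X[0])):
            for t in sorted({x[f] for x in X}):
                for s in (1, -1):
                    err = sum(wi for wi, xi, yi in zip(w, X, y)
                              if stump_predict(xi, f, t, s) != yi)
                    if best is None or err < best[0]:
                        best = (err, f, t, s)
        err, f, t, s = best
        err = min(max(err, 1e-10), 1 - 1e-10)   # clip for the log
        alpha = 0.5 * math.log((1 - err) / err)
        learners.append((alpha, f, t, s))
        w = [wi * math.exp(-alpha * yi * stump_predict(xi, f, t, s))
             for wi, xi, yi in zip(w, X, y)]
        z = sum(w)
        w = [wi / z for wi in w]
    return learners

def binary_descriptor(learners, x):
    """One bit per weak learner: the sign of its thresholded response."""
    return [1 if stump_predict(x, f, t, s) > 0 else 0
            for _, f, t, s in learners]
```

The weighted sum of the ±1 responses is the strong classifier used for matching, while the unweighted bit vector is the compact binary descriptor that saves storage.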

  • WANG Jianwen,LIN Jie
    Computer Engineering. 2016, 42(6): 235-240. https://doi.org/10.3969/j.issn.1000-3428.2016.06.042
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    This paper presents a color histogram feature annotation method based on the Pyramid Match Kernel (PMK) for automatic image annotation. The technique partitions the image into increasingly fine grids and computes a color histogram within each grid cell, then applies a binary logarithm transformation to the histogram to regularize the contribution of each color in the final descriptor. The histograms of all cells are concatenated, after weighting, into a single vector named the Color Histogram Pyramid (CHP). The similarity between vectors is measured by the histogram intersection distance. Experimental results on the Corel5K dataset show that the proposed method outperforms traditional methods based on the global color histogram, block-based color histograms, and features extracted by the Scale-Invariant Feature Transform. Compared with the Spatial Pyramid Matching (SPM) method based on the PMK, the F-measure is improved by 10%.
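    The pyramid histogram construction and histogram-intersection similarity can be sketched as follows for a small square grayscale image; the bin count, level count and per-level weighting are illustrative, and the paper's binary-logarithm regularization is omitted:

```python
def color_histogram(pixels, bins=4):
    """Quantize 8-bit values into `bins` buckets, normalized to sum 1."""
    h = [0] * bins
    for p in pixels:
        h[min(p * bins // 256, bins - 1)] += 1
    total = sum(h)
    return [v / total for v in h]

def pyramid_histogram(img, levels=2, bins=4):
    """Concatenate per-cell histograms over increasingly fine grids,
    weighting finer levels more, as in pyramid matching."""
    n = len(img)                      # square image as a list of rows
    feat = []
    for lvl in range(levels):
        cells = 2 ** lvl
        step = n // cells
        weight = 2.0 ** (lvl - levels + 1)
        for r in range(cells):
            for c in range(cells):
                block = [px for row in img[r * step:(r + 1) * step]
                         for px in row[c * step:(c + 1) * step]]
                feat.extend(weight * v for v in color_histogram(block, bins))
    return feat

def intersection(h1, h2):
    """Histogram intersection similarity: sum of coordinate-wise minima."""
    return sum(min(a, b) for a, b in zip(h1, h2))
```

Because cells at finer levels are weighted more, two images only match strongly when similar colors appear in similar spatial positions, not merely in similar global proportions.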
  • WANG Lingjiao,ZHONG Yiqun,GUO Hua,PENG Zhiqiang
    Computer Engineering. 2016, 42(6): 241-246. https://doi.org/10.3969/j.issn.1000-3428.2016.06.043
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    In image segmentation, the Conditional Random Field (CRF) model and its higher-order variants are widely used as energy functions. The higher-order model extends the second-order model by introducing a higher-order potential function to reflect the label consistency of the pixels within each segment, so that the segmented object boundary is more accurate, but the efficiency of its energy minimization is not ideal. Aiming at this problem, this paper presents an improved image segmentation algorithm based on the robust P^n Potts higher-order CRF model, which computes a local optimal solution by running the max-flow/min-cut algorithm over the set of labels. The local optimal solution is used to fix the labels of some nodes, and α-expansion is run on the remaining nodes. The flows and residual capacities of the edges are updated dynamically in each iteration, so the running time of each iteration decreases rapidly. Experimental results show that, compared with plain α-expansion, the improved algorithm maintains the original segmentation quality while achieving a 2~3 times speed-up of energy minimization on the same image.
  • QU Shaojun,LI Qiaoliang
    Computer Engineering. 2016, 42(6): 247-254. https://doi.org/10.3969/j.issn.1000-3428.2016.06.044
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    Owing to the diversity of human body postures, differences in clothing color and texture, the presence of noise, low contrast, uneven illumination and complex backgrounds, human image segmentation is extremely difficult. This paper therefore proposes an automatic human image segmentation method based on face detection and Cellular Automata (CA). It uses a face detection algorithm to recognize human faces and obtain facial contours, then establishes an object and background seed point estimation model from the position of the detected face and obtains the object and background seed points. Pixel labeling is performed by cellular automata, dividing the image into object and background, so the method realizes fully automatic human image segmentation. Different kinds of human images are segmented over a segmentation database. Experimental results demonstrate that, compared with the GrabCut method, the proposed method segments human images automatically and accurately, and effectively improves segmentation efficiency.
  • TAO Zhiqiang,LI Hailin,ZHANG Hongbing
    Computer Engineering. 2016, 42(6): 255-260. https://doi.org/10.3969/j.issn.1000-3428.2016.06.045
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    The Iterative Back Projection (IBP) Super-Resolution Reconstruction (SRR) algorithm based on the improved Keren registration method uses bilinear interpolation to obtain the initial estimate of the high-resolution image, which leads to sawtooth artifacts at the edges of the reconstructed image. To solve this problem, an IBP super-resolution reconstruction based on New Edge-Directed Interpolation (NEDI) is proposed. The NEDI method computes the local covariance coefficients of the low-resolution image and uses these covariance estimates to adapt the interpolation at the higher resolution, based on the geometric duality between the local covariance of the low-resolution image and that of the high-resolution image. Experimental results indicate that the proposed method reduces edge sawtooth, increases the Peak Signal-to-Noise Ratio (PSNR), reduces the Root Mean Squared Error (RMSE) and improves the subjective visual quality of the image.
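    The iterative back-projection loop itself can be sketched as below; nearest-neighbour upsampling and average-pool downsampling stand in for the NEDI initialization and the true degradation model, so this is a structural sketch only:

```python
def downsample(img, factor=2):
    """Average-pool by `factor` (the assumed degradation model)."""
    n = len(img) // factor
    return [[sum(img[r * factor + i][c * factor + j]
                 for i in range(factor) for j in range(factor)) / factor ** 2
             for c in range(n)] for r in range(n)]

def upsample(img, factor=2):
    """Nearest-neighbour expansion, used both for the initial estimate
    and for back-projecting the residual."""
    size = len(img) * factor
    return [[img[r // factor][c // factor] for c in range(size)]
            for r in range(size)]

def ibp(low, factor=2, iters=10, step=1.0):
    """Iterative back-projection: start from an interpolated estimate,
    simulate the low-resolution image, and feed the residual back until
    the simulated image matches the observation."""
    high = upsample(low, factor)
    for _ in range(iters):
        simulated = downsample(high, factor)
        err = [[low[r][c] - simulated[r][c] for c in range(len(low))]
               for r in range(len(low))]
        err_up = upsample(err, factor)
        high = [[high[r][c] + step * err_up[r][c] for c in range(len(high))]
                for r in range(len(high))]
    return high
```

The paper's contribution is the initialization: replacing the interpolated starting estimate with a NEDI estimate so the loop converges to an edge-preserving result instead of amplifying sawtooth from a bilinear start.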
  • LI Yunfeng,LI Shengyang,HAN Xixi
    Computer Engineering. 2016, 42(6): 261-264. https://doi.org/10.3969/j.issn.1000-3428.2016.06.046
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    A monitoring image super-resolution algorithm based on multi-neighborhood information is proposed to address the difficulty of target identification caused by the low image resolution of current video surveillance. On the basis of bi-cubic interpolation, and according to the gray-gradient change of each pixel along multiple directions in the source image, an optimized algorithm is used to produce the super-resolution image. The multiplicity of directional gray-scale changes in the edge parts of the source image is fully considered, so the detail regions of the reconstructed image are more natural and clear. Experimental results show that, compared with the traditional bi-cubic interpolation algorithm, the proposed algorithm represents the edge information of the image well and obviously improves its subjective visual effect. The evaluation indexes of the obtained super-resolution image, including peak signal-to-noise ratio, mean square error and image similarity degree, are better than those of traditional methods.
  • CHANG Jian,BAI Jiahong
    Computer Engineering. 2016, 42(6): 265-273. https://doi.org/10.3969/j.issn.1000-3428.2016.06.047
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    Aiming at the halo phenomenon, noise amplification and image graying of the single-scale Retinex algorithm, this paper proposes a Retinex image enhancement algorithm based on rotationally symmetric bilateral filtering. The illumination image is estimated with a rotation-invariant bilateral filter and its influence on the visual effect is removed to obtain the reflectance image, which effectively overcomes the halo phenomenon. Thresholds determined by the Otsu segmentation algorithm distinguish the bright and dark areas of the image, and the noise in these areas is removed to varying degrees by bilateral filtering, which suppresses noise while preserving image details. The contrast of low-contrast regions is enhanced by a piecewise linear transformation, which alleviates the graying of the image. Experimental results show that the proposed algorithm removes the halo phenomenon effectively, suppresses noise amplification significantly, preserves richer image information and enhances image contrast, outperforming both the single-scale Retinex algorithm and the Retinex algorithm based on plain bilateral filtering.
  • KE Weiyang,GUO Lijun,ZHANG Rong,WANG Yadong
    Computer Engineering. 2016, 42(6): 274-279. https://doi.org/10.3969/j.issn.1000-3428.2016.06.048
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    Person re-identification is a key technology in human object extraction and tracking. Most existing methods extract human appearance features and compute feature similarity to match people. These methods produce accurate results for images with apparent differences, but struggle to distinguish images that share similar areas. Considering that the regional structure of appearance features is more similar across different images of the same pedestrian than across similar images of different pedestrians, this paper proposes a person re-identification algorithm based on dense patch matching with local saliency. It extracts a combined feature for each dense block, calculates the similarity of each block within its local area, and derives the block's saliency weight in the local region from that similarity. Experimental results on the VIPeR and CUHK datasets show that the proposed algorithm achieves a higher recognition rate, is robust to variations of viewpoint, pose and lighting, and provides accurate matches for pedestrians sharing highly similar areas.
  • RUAN Yingying,WANG Xili,LIN Hongshuai
    Computer Engineering. 2016, 42(6): 280-286. https://doi.org/10.3969/j.issn.1000-3428.2016.06.049
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    The p-voltages algorithm is a semi-supervised classification algorithm based on the theoretical voltages of samples in an electrical network. For image classification it has two disadvantages: only two labeled samples (the source and sink nodes) are selected, resulting in poor classification accuracy, and the high complexity of graph construction prevents it from processing large images. To solve these problems, an image classification method combining the p-voltages algorithm with mean shift is proposed. Mean shift smooths the image to reduce the diversity of image features, and multiple labeled samples are chosen from the smoothed image as source and sink nodes to improve learning effectiveness. A sample is randomly selected in each smoothed region as an unlabeled sample, ensuring that it carries rich image feature information. To reduce the scale of graph construction and thereby enable large-scale image classification, the labeled and unlabeled samples are used as a subset of the original image for graph construction. Experimental results indicate that the proposed method not only improves classification accuracy but also achieves higher time efficiency, making it suitable for large-scale images with complex features.
  • LU Xingjia,WANG Yujin,CHEN Zhirong,LIN Yong
    Computer Engineering. 2016, 42(6): 287-292. https://doi.org/10.3969/j.issn.1000-3428.2016.06.050
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    This paper studies the composite detection template and deformation constraint problems of object detection. On the premise that the object appearance follows a Gaussian distribution, an object detection algorithm based on the Latent Support Vector Machine (LSVM) and the Gaussian Mixture Model (GMM) is proposed. It utilizes a sliding-window algorithm to extract Histogram of Oriented Gradients (HOG) features. In the training phase, the semi-convex constraint of the LSVM is transformed into a convex optimization problem by introducing a quadratic loss function, and the globally optimized detection results are then obtained through the GMM. Experimental results show that the proposed algorithm achieves higher object detection accuracy than the Dual Tree Branch and Bound (BB) algorithm and the Deformable Part Model (DPM) algorithm.
  • YANG Jianhui,LIU Zhen,CHEN Hao
    Computer Engineering. 2016, 42(6): 293-298,304. https://doi.org/10.3969/j.issn.1000-3428.2016.06.051
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    Human-computer emotion interaction is one of the research hotspots in virtual reality, and with the popularity of depth cameras, research on depth-camera-based human-computer interaction is valuable. Using the Microsoft Kinect 2.0 as the interaction device, this paper defines postures and gestures with different emotional semantics and designs a gesture recognition method based on template matching. To realize harmonious human-computer emotion interaction, it proposes a method for the emotional interaction of virtual characters and constructs a cognitive structure and emotional interaction rules for the virtual avatar. A prototype system is implemented. Experimental results show that the virtual avatar can respond to the user's body movements through facial animation and head motion, demonstrating that the proposed technical solution is feasible.
  • SHANG Xiongwei,ZHANG Zhixiang,QIU Shuting
    Computer Engineering. 2016, 42(6): 299-304. https://doi.org/10.3969/j.issn.1000-3428.2016.06.052
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    The style of computerized learning is shifting from content delivery toward personalized online learning with Intelligent Tutoring Systems (ITS). A spoken ITS for regulations and operational procedures, which must be remembered accurately and obeyed strictly, can provide an immersive interactive learning environment that helps students grasp and apply the relevant knowledge. This paper proposes a series of methods for building an ITS for regulations and operational procedures training: a type-related knowledge representation; keyword-based speech recognition and word segmentation, with an answer assessment algorithm designed accordingly; and a hierarchical dialogue management framework for spoken tutoring, consisting of a subject control level, a tutoring level and a discussion level. A prototype system for shipboard damage control tutoring shows that the proposed methods support the convenient construction of domain-specific ITSs, and the approach can be extended to other domains with minor modification.
  • LU Tao,LIU Zhen,LIU Tingting,LIU Cuijuan,CHAI Yanjie,FANG Hao
    Computer Engineering. 2016, 42(6): 305-309. https://doi.org/10.3969/j.issn.1000-3428.2016.06.053
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    In urban traffic simulation, vehicles start slowly under the existing Intelligent-Driver Model (IDM). To solve this problem, this paper proposes an improved IDM that considers how the inter-vehicle distance affects vehicle velocity: it uses an acceleration term based on the inter-vehicle distance and introduces the difference between the actual and the expected distance. These changes give vehicles a larger starting acceleration and reduce the distance between vehicles. To verify the rationality and effectiveness of the model, this paper models 3D urban roads, simulates the traffic situation at a crossroad and realizes animated traffic flow simulation. Experimental results show that the improved model reduces the phenomena of slow starting and abnormal changes of inter-vehicle distance, making driving behavior much closer to real urban driving, so that car-following behavior in urban traffic can be reproduced realistically.
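    The standard IDM acceleration rule that the paper modifies can be written down directly; the parameter values below are conventional illustrative choices, not those of the paper:

```python
import math

def idm_acceleration(v, v_lead, gap,
                     v0=15.0, T=1.5, a=1.0, b=2.0, s0=2.0, delta=4):
    """Intelligent-Driver Model acceleration: a free-road term minus an
    interaction term built from the desired gap s*(v, dv).
    v/v_lead are own/leader speeds (m/s), gap the bumper-to-bumper
    distance (m); v0 desired speed, T time headway, a/b max
    acceleration and comfortable braking, s0 minimum gap."""
    dv = v - v_lead
    s_star = s0 + max(0.0, v * T + v * dv / (2.0 * math.sqrt(a * b)))
    return a * (1.0 - (v / v0) ** delta - (s_star / gap) ** 2)
```

At standstill the interaction term reduces to (s0/gap)^2, which is what makes plain IDM starts sluggish in dense queues; the paper's distance-based acceleration term targets exactly this regime.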
  • ZHOU Xiaobo,LIU Sang
    Computer Engineering. 2016, 42(6): 310-315. https://doi.org/10.3969/j.issn.1000-3428.2016.06.054
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    Aiming at the degradation of foggy images, a fast haze removal algorithm for video based on the dark channel prior is proposed. Because many applications require real-time processing of video images, the algorithm is implemented on a Field-Programmable Gate Array (FPGA) hardware platform, exploiting the characteristics of the FPGA. The algorithm is first simulated and verified in Matlab and then realized on an FPGA platform specially designed for video processing. Experimental results show that the system is stable and effective, is capable of processing video at up to 60 frames per second, and effectively mitigates the degradation caused by haze and other factors in captured images.
  • LUO Shicao,DING Yongsheng,HAO Kuangrong
    Computer Engineering. 2016, 42(6): 316-321. https://doi.org/10.3969/j.issn.1000-3428.2016.06.055
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    In industrial production lines, due to the randomness of component placement, the traditional genetic algorithm has difficulty producing a good sorting scheme. This paper proposes a Cataclysm Symbiotic Evolutionary Algorithm (CSEA) framework based on the idea of biological symbiotic evolution to optimize assembly paths. The algorithm adopts a cataclysm strategy to avoid premature convergence and to reduce the influence of the initial population on the optimal solution. A threshold T is set first: if the algorithm does not find a better solution after T consecutive generations, the symbiotic population is regenerated, while the new population's topological structure is inherited from the previous one. As a result, global information can be obtained efficiently on the basis of the optimal information from the original population. Experimental results indicate that, compared with the hierarchical approach and the simple symbiotic evolutionary algorithm, the new algorithm improves the convergence rate and obtains a shorter path for a specific sorting assembly operation.
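    The cataclysm (restart-on-stagnation) idea can be sketched with a plain real-valued GA; the symbiotic subpopulation structure of CSEA is omitted, and the stall limit, mutation scales and other parameters are illustrative:

```python
import random

def cataclysm_ga(fitness, dim, pop_size=20, generations=200,
                 stall_limit=15, keep=2, seed=1):
    """Minimal GA with a 'cataclysm': if the best fitness does not
    improve for stall_limit generations, the population is regenerated
    around the elite individuals, so the search inherits the best
    information found so far instead of restarting from scratch."""
    rng = random.Random(seed)
    pop = [[rng.random() for _ in range(dim)] for _ in range(pop_size)]
    best, best_f, stall = None, float("-inf"), 0
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        if fitness(pop[0]) > best_f:
            best, best_f, stall = pop[0][:], fitness(pop[0]), 0
        else:
            stall += 1
        if stall >= stall_limit:            # cataclysm: rebuild around elites
            elite = pop[:keep]
            pop = elite + [[g + rng.gauss(0, 0.3) for g in rng.choice(elite)]
                           for _ in range(pop_size - keep)]
            stall = 0
            continue
        parents = pop[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            p1, p2 = rng.sample(parents, 2)
            cut = rng.randrange(dim) if dim > 1 else 0
            child = p1[:cut] + p2[cut:]     # one-point crossover
            if rng.random() < 0.2:
                i = rng.randrange(dim)
                child[i] += rng.gauss(0, 0.1)
            children.append(child)
        pop = parents + children
    return best, best_f
```

For assembly-path sorting the genome would encode a placement order rather than real values, but the restart logic, which is the point of the cataclysm strategy, is the same.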