
20 January 2012, Volume 38 Issue 2
    

Networks and Communications
  • HONG Lu, CHENG Yao-Dong, CHEN Gang
    Computer Engineering. 2012, 38(2): 1-3. https://doi.org/10.3969/j.issn.1000-3428.2012.02.001
    This paper designs and implements a metadata server for the GRASS mass storage system, analyzes and optimizes three performance factors of the metadata service: metadata organization, communication performance and search efficiency, and introduces the implementation of the Bloom filter algorithm in namespace search. Experimental results show that the performance of the target system is significantly improved and basically fulfills the requirements of large-scale computation in high energy physics.
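As a rough illustration of the namespace-lookup idea mentioned above, the sketch below shows a generic Bloom filter; the bit-array size, hash count and file paths are assumptions for the example, not parameters from the paper.

```python
import hashlib

class BloomFilter:
    def __init__(self, size=1 << 20, num_hashes=4):
        self.size = size
        self.num_hashes = num_hashes
        self.bits = bytearray(size // 8)

    def _positions(self, key: str):
        # Derive several positions from salted hashes of the key.
        for i in range(self.num_hashes):
            h = hashlib.md5(f"{i}:{key}".encode()).hexdigest()
            yield int(h, 16) % self.size

    def add(self, key: str):
        for p in self._positions(key):
            self.bits[p // 8] |= 1 << (p % 8)

    def might_contain(self, key: str) -> bool:
        # False means definitely absent; True may be a false positive.
        return all(self.bits[p // 8] & (1 << (p % 8)) for p in self._positions(key))

bf = BloomFilter()
bf.add("/grass/run2011/event_001.root")          # hypothetical namespace entry
print(bf.might_contain("/grass/run2011/event_001.root"))  # True
print(bf.might_contain("/grass/run2011/event_999.root"))  # almost certainly False
```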
  • GAO Jian-Wei, LI Lei, TAO Rui, SUN Jin-Qiu, ZHANG Yan-Ning
    Computer Engineering. 2012, 38(2): 4-7. https://doi.org/10.3969/j.issn.1000-3428.2012.02.002
    This paper presents a real-time detection and tracking method for dim small targets based on Kalman Filtering(KF). The same reference stars are selected in adjoining frames, and the distance from every star to the reference stars is calculated. Because background star points move differently from the targets, true targets can be separated from the stars. To handle target loss, KF is used to forecast the target position, the image is segmented to find the lost target, and target chains are built according to movement stability. Experimental results show that the method meets the requirements of real-time space target detection with a high detection probability and a low false alarm rate.
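A minimal constant-velocity Kalman filter sketch for predicting a lost target's position between frames is given below; the state model, noise levels and measurements are assumed for illustration and are not taken from the paper.

```python
import numpy as np

F = np.array([[1., 1.], [0., 1.]])   # state transition for [position, velocity]
H = np.array([[1., 0.]])             # only the position is observed
Q = np.eye(2) * 1e-3                 # assumed process noise
R = np.array([[1e-1]])               # assumed measurement noise

x = np.array([[0.], [0.]])           # initial state estimate
P = np.eye(2)                        # initial covariance

def predict(x, P):
    x = F @ x
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, z):
    y = z - H @ x                    # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

for z in [1.0, 2.1, 2.9, 4.2]:       # made-up per-frame position measurements
    x, P = predict(x, P)
    x, P = update(x, P, np.array([[z]]))
print(x.ravel())                      # estimated position and velocity
```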
  • QU Hai-Beng, HU Lu
    Computer Engineering. 2012, 38(2): 8-10. https://doi.org/10.3969/j.issn.1000-3428.2012.02.003
    In order to reduce energy consumption in data centers, energy consumption effectiveness is studied with a framework named EADC. The framework contains two types of modules: the Virtual Environment Management System(VEMS) and the Data Center Management System(DCMS), which reduce energy usage by changing the status of computing nodes(VEMS) and satisfy the Quality of Service(QoS) by controlling node migration among VEs(DCMS). Test results show that EADC lets the data center balance energy consumption and performance, achieving the lowest energy consumption while meeting the QoS.
  • YUAN Yu-Qian, HU Xiao-Hui, YANG Ji
    Computer Engineering. 2012, 38(2): 11-13. https://doi.org/10.3969/j.issn.1000-3428.2012.02.004
    In service composition, due to dynamic changes in the environment and in the services themselves, the behavior of the Web services involved may evolve and influence the composition results. An adaptive service selection framework is proposed to solve this problem. Information from public service registries is captured and recorded locally in a binding repository, and a link analysis algorithm is applied to the repository to obtain the currently highly linked services for service selection. Simulation results show that the framework and algorithm can select high-quality services dynamically and reduce the failure rate of composition results caused by evolving Quality of Service(QoS).
  • LI Liang-Bin, WANG Jin-Lin, CHEN Jun
    Computer Engineering. 2012, 38(2): 14-16. https://doi.org/10.3969/j.issn.1000-3428.2012.02.005
    By analyzing the state transition process of the components of a survivable system under attack, resistance and recovery, this paper designs a simulation platform for system survivability based on Coloured Petri Net(CPN), which simulates the behavior of the survivable system in terms of attack intensity, attack density, recovery intensity, attack strategy and recovery strategy. The structure and operating mechanism of the platform are illustrated in detail, and taking an IPTV network service system as an example, its service delivery capacity under different attacks is simulated with the platform. Simulation results show that the platform supports survivability analysis well.
  • BO Hua-Wei, MENG Ai, GAO Chun-Ming, LEI Yuan
    Computer Engineering. 2012, 38(2): 17-20. https://doi.org/10.3969/j.issn.1000-3428.2012.02.006
    When calculating the rotational information of skeleton joints, the traditional data conversion method reduces the accuracy of the rotation data. Aiming at this problem, a data conversion method for motion capture is proposed. It builds a tree-structured human skeleton model using the structural relationships of the skeleton's own coordinate frames, designs a construction and decomposition method to solve for the three-degree-of-freedom rotation of the key joints, and uses this information to drive the skeleton model. Experimental results demonstrate the effectiveness of this method.
  • LIU Chun-Feng, YANG Shan-Lin
    Computer Engineering. 2012, 38(2): 21-24. https://doi.org/10.3969/j.issn.1000-3428.2012.02.007
    This paper deals with project scheduling to minimize project duration, in which a learning workforce has multiple skills and increasing efficiency. A zero-one integer nonlinear programming model is constructed, and a discrete Hybrid Particle Swarm Optimization(HPSO) algorithm is proposed. HPSO applies a priority-rule-based heuristic to generate good initial particles, introduces discrete operators to modify the classical velocity and position update equations, and employs a revised forward recursion algorithm to compute the objective value of each particle. Numerical experiments show that the proposed HPSO converges to better solutions than the conventional particle swarm optimization algorithm within the same runtime.
  • TIAN Zheng, XU Cheng, YANG Zhi-Bang
    Computer Engineering. 2012, 38(2): 25-28. https://doi.org/10.3969/j.issn.1000-3428.2012.02.008
    This paper proposes a complex airbag control system based on Freescale's chip technology. It includes an initial module, a startup self-check module, an airbag control module, a real-time self-check module and timer modules. The system employs a 9S12 series 16-bit microcontroller and integrates several MMA series Micro Electro Mechanical Systems(MEMS) acceleration sensors and MC33797 squib drivers. Experimental results show that the whole system has a high level of integration and reliability, and good real-time performance.
  • ZHANG Li-Beng, LI Song, HAO Xiao-Gong, WANG Miao, CA Zhi-Chao
    Computer Engineering. 2012, 38(2): 29-31. https://doi.org/10.3969/j.issn.1000-3428.2012.02.009
    To handle the Nearest Neighbor(NN) query problem on cylindrical and cone surfaces effectively, two methods are proposed: one based on the Voronoi diagram and one that converts the curved surface into a plane. In the first method, the Voronoi diagram is constructed on the cylindrical or cone surface and queries are answered using its properties. In the second method, the cylindrical or cone surface is converted into a 2D plane, and the conversion rule and query algorithm are given. The performance of the two methods is analyzed by experiment. Experimental results show that the Voronoi-diagram-based method handles nearest neighbor queries on static datasets well, and the surface-to-plane conversion method handles dynamic datasets effectively.
  • LIU Beng-Feng, ZHANG Pei-Lu, CHEN Dong-Lin
    Computer Engineering. 2012, 38(2): 32-35. https://doi.org/10.3969/j.issn.1000-3428.2012.02.010
    To solve the problem of ontology instance redundancy in heterogeneous electronic catalog integration, this paper proposes an instance duplicate elimination mechanism for electronic catalog ontology merging. The mechanism incorporates ontology concepts, attributes and relationships into the calculation of semantic similarity between two ontology instances: similarity on concepts is calculated by string matching and WordNet, similarity on attributes is calculated according to both data-type and object-type attributes, and similarity on relationships is calculated based on multi-inheritance. If two instances have a semantic similarity above a preset threshold, one of them is deleted from the ontology base to keep operations on ontology instances efficient. Experimental results show the validity of this mechanism.
  • XIANG He-Lin, ZHANG Meng-Xi, LI Po-Han, HE Shen-Ying, HONG Wei
    Computer Engineering. 2012, 38(2): 36-38. https://doi.org/10.3969/j.issn.1000-3428.2012.02.011
    Latent Semantic Analysis(LSA) suffers from poor computational efficiency and storage deficiencies when used for large-scale semantic retrieval. To solve this problem, this paper proposes a clustering-based semantic retrieval algorithm. The algorithm clusters the documents using their structural information and applies the LSA process on those clusters, efficiently reducing the number of documents each pass handles. Experimental results show that the algorithm can exponentially decrease query time while achieving good retrieval accuracy.
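A rough sketch of the cluster-then-LSA idea follows: documents are grouped first, and truncated SVD (the core of LSA) is then run per cluster so each decomposition sees fewer documents. The tiny corpus, cluster count and component count are invented for illustration; the paper's structural clustering is replaced here by plain k-means.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans
from sklearn.decomposition import TruncatedSVD

docs = ["storage metadata server", "metadata lookup bloom filter",
        "wireless sensor routing", "sensor network energy routing"]
X = TfidfVectorizer().fit_transform(docs)                 # term-document matrix
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

for c in set(labels):
    idx = [i for i, lab in enumerate(labels) if lab == c]
    svd = TruncatedSVD(n_components=1, random_state=0)    # per-cluster LSA
    reduced = svd.fit_transform(X[idx])
    print(f"cluster {c}: {len(idx)} docs -> latent shape {reduced.shape}")
```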
  • ZHENG Di, WANG Dun, BEN Ge-Rong
    Computer Engineering. 2012, 38(2): 39-41. https://doi.org/10.3969/j.issn.1000-3428.2012.02.012
    In order to support component-based applications in pervasive environments, traditional component adaptation is extended, and a Context-aware Component Adaptation Model(CACAM) is put forward based on component middleware together with a Context-aware Component Adaptation(CACA) algorithm, so as to support dynamic awareness and reconfiguration for context-based adaptation. Experimental results show that this model supports context-aware component adaptation efficiently in pervasive computing environments.
  • GUO Chao-Zhong, WANG Wei, ZHOU Gang, HU Yan
    Computer Engineering. 2012, 38(2): 42-44. https://doi.org/10.3969/j.issn.1000-3428.2012.02.013
    Based on an analysis of existing parallel Single Source Shortest Path(SSSP) algorithms, and aiming at the problem of dynamic data processing on the Graphic Processor Unit(GPU), this paper designs and implements a parallel Moore SSSP algorithm on the GPU. The algorithm applies strategies such as hierarchical task arrangement, hierarchical work queues and hierarchical kernel invocation in the key steps. Experimental results indicate that the algorithm reduces idle thread cost, memory access cost and synchronization cost.
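For reference, a plain CPU sketch of Moore's (Bellman-Ford-Moore) SSSP, the algorithm the paper parallelizes, is shown below; the GPU-specific hierarchical work queues and kernel organization are not reproduced, and the toy graph is an assumption.

```python
from collections import deque

def moore_sssp(adj, src):
    """adj: {node: [(neighbor, weight), ...]}; returns shortest distances from src."""
    dist = {v: float("inf") for v in adj}
    dist[src] = 0.0
    queue, in_queue = deque([src]), {src}
    while queue:
        u = queue.popleft()
        in_queue.discard(u)
        for v, w in adj[u]:
            if dist[u] + w < dist[v]:      # relax edge (u, v)
                dist[v] = dist[u] + w
                if v not in in_queue:      # re-enqueue only if not pending
                    queue.append(v)
                    in_queue.add(v)
    return dist

graph = {"a": [("b", 2), ("c", 5)], "b": [("c", 1)], "c": []}
print(moore_sssp(graph, "a"))  # {'a': 0.0, 'b': 2.0, 'c': 3.0}
```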
  • LIU Jin-Fen, YIN Jing, JIANG Lie-Hui, LIU Tie-Ming
    Computer Engineering. 2012, 38(2): 45-47. https://doi.org/10.3969/j.issn.1000-3428.2012.02.014
    Aiming at the problem that IDA cannot currently provide disassembly support for all kinds of processor modules, this paper proposes a formalized description language to support the extension of IDA processor module plug-ins. The description language uses context-free grammar and attribute grammar, covering the declaration of the memory system and the syntax and semantic description of the processor instruction set. Application results show that this method is suitable for extending IDA processor module plug-ins.
  • KONG Yan-Yan, SHI Hua-Ji
    Computer Engineering. 2012, 38(2): 48-50. https://doi.org/10.3969/j.issn.1000-3428.2012.02.015
    Aiming at the problem that noise information may interfere with the identification of the data region in Deep Web search result pages, this paper proposes an automatic approach to identify the data region in Deep Web search result list pages. It employs continuous repetitive structure and similar URLs to divide sample pages into different semantic blocks, and identifies the block where the data region is located. Experimental results show that the approach can improve the recall rate and accuracy of data region identification.
  • LIANG Bao-Hua, HONG Shi-Xi, CA Min
    Computer Engineering. 2012, 38(2): 51-53. https://doi.org/10.3969/j.issn.1000-3428.2012.02.016
    Using an ordered list to store the objects of the data set and borrowing the idea of allocation by keys from radix sorting, the time and space complexity for computing U/C are and O(U) respectively. To avoid the large space needed to store the discernibility matrix while still using its intuition, an expression for computing the number of discernibility objects when computing attribute reduction sets is presented. Two algorithms are designed with time and space complexity of only and max( ). Theoretical analysis and experimental results show that the algorithms are effective and feasible.
  • CHENG Xiao-Ju, LI Ren-Fa
    Computer Engineering. 2012, 38(2): 54-56. https://doi.org/10.3969/j.issn.1000-3428.2012.02.017
    In order to improve the performance of regression testing for large software and avoid retesting an entire embedded software system because of small changes in the source code, this paper presents FunctionSlice and introduces the concept of the function slice. It reduces the regression test suite through slicing, selecting the test cases associated with source code changes for regression testing. Experimental results show that the algorithm substantially reduces the embedded software regression test suite and improves the efficiency of regression testing. It can easily be applied to the regression testing of more complex embedded systems and has strong practicality.
  • HUANG Xian-Zhen, YANG Yu-Zhen, LIU Pei-Yu
    Computer Engineering. 2012, 38(2): 57-59. https://doi.org/10.3969/j.issn.1000-3428.2012.02.018
    In most studies of the Vector Space Model(VSM), term selection and weight calculation are treated separately, which causes defects such as the semantic vacancy of words after segmentation and the low discrimination of frequency-based weight calculation. To overcome these shortcomings, a keyword extraction method based on statistics and rules is proposed. Base phrases are extracted by phrase syntax rules and used as terms instead of words. Taking full account of feature frequency, position, distribution, grammatical role and other information, a joint feature weight function is constructed to improve the discrimination of terms and reduce the semantic vacancy of words. Experimental results show that keywords extracted based on statistics and rules are more effective than others in text information filtering.
  • XU Jin-Long, JIANG Lie-Hui, DONG Wei-Yu, WANG Li-Xin, CHEN Jiao
    Computer Engineering. 2012, 38(2): 60-62. https://doi.org/10.3969/j.issn.1000-3428.2012.02.019
    Common Translation Cache(TransCache) management methods in Dynamic Binary Translation(DBT) are studied. Aiming at their weak points, a TransCache division method is designed in which the entire large TransCache is subdivided into a few equal areas. Experimental results show that this method gives translated blocks a higher chance of remaining in the TransCache, introduces no unusable TransCache fragmentation, and improves efficiency.
  • HUA Zhu-Han, WANG Gui-Rong, XU Nan, LIU Zhi-Qiong, YANG Xiang-Quan
    Computer Engineering. 2012, 38(2): 63-65. https://doi.org/10.3969/j.issn.1000-3428.2012.02.020
    Selection testing of Enterprise Service Bus(ESB) products is difficult to evaluate effectively, and test data cannot quantify the differences well. This paper proposes a test design and evaluation model for ESB middleware. A functional model is established, and evaluation methods such as sub-period and multi-weight sampling are proposed to validate the assessment model. Evaluation of ESB products from five major vendors shows that the targeted approach can effectively detect the differences between different ESB products.
  • JIAN Chong-Jun, HONG Xin
    Computer Engineering. 2012, 38(2): 66-68. https://doi.org/10.3969/j.issn.1000-3428.2012.02.021
    Focusing on the characteristics of supply chain management systems and aiming at the integration requirements of Web services in cross-organizational processes, this paper brings forward an improved modeling method supporting the integration of external Web services for supply chain management systems, called the Web Service-based Process Model(WSBPM), and extends the abilities of the model, i.e. the ability to control the execution of services, to sense the external environment, and to dynamically select and integrate services. The architecture of a supply chain management prototype system based on the WSBPM model is presented. The model can efficiently support the integration of external Web services in cross-organizational processes and is suitable for the software development of supply chain management systems.
  • CHOU Shu-Li, CHU Dian-Hui, MENG Fan-Chao
    Computer Engineering. 2012, 38(2): 69-71. https://doi.org/10.3969/j.issn.1000-3428.2012.02.022
    To address the issue that the open source Spring framework does not support dynamic evolution, this paper proposes a component dynamic evolution mechanism based on Spring, from the viewpoint of the development mode and architecture of a software system. The mechanism divides the system's business logic and configuration files into many modules from the development-mode point of view. In the architecture aspect, it introduces an evolution agent to decouple the invocations among modules, while modules and their calls are managed and controlled uniformly by an instance management center. The mechanism is implemented on the Spring framework and verified by computing the value of π. Experimental results show that the mechanism lets the system evolve at run time without affecting system efficiency.
  • BO Dong-Sheng, ZHANG Zhao-Hui, DAI Xiu-Juan, YANG Juan
    Computer Engineering. 2012, 38(2): 72-74. https://doi.org/10.3969/j.issn.1000-3428.2012.02.023
    Classic data mining algorithms produce a large number of frequent itemsets, which is not suitable for massive data mining in Intelligent Transportation Systems(ITS). This paper proposes an algorithm based on level grads without candidate item analysis for computing association rules in a heterogeneous environment. It uses the concepts of level grads and mining-topic transaction databases to form level transaction databases and mine local frequent items. The main node uses the concept of weak entropy to extract association rules. Simulation results show that this algorithm has better performance in collaborative mining without candidate support.
  • ZHANG Zong-Yu, ZHANG E-Beng, ZHANG Jing-Yuan, ZHANG Xiao-Jun
    Computer Engineering. 2012, 38(2): 75-77. https://doi.org/10.3969/j.issn.1000-3428.2012.02.024
    This paper proposes an improved Apriori algorithm, VGApriori, based on bit vectors and an undirected graph. The algorithm maps the database into a Boolean matrix by scanning the database, and frequent itemsets are generated by simple vector operations and search over an undirected itemset graph. The efficiency of discovering frequent itemsets is distinctly improved. The algorithm is applied in college teaching management and achieves good results.
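A toy sketch of the bit-vector idea behind such an algorithm is shown below: each item becomes a Boolean column, and the support of an itemset is the popcount of the AND of its columns. The transactions and support threshold are made up, and the undirected itemset-graph pruning step is omitted.

```python
from itertools import combinations

transactions = [{"a", "b", "c"}, {"a", "c"}, {"b", "c"}, {"a", "b", "c"}]
items = sorted(set().union(*transactions))
# One bit per transaction for each item (the Boolean matrix as integer columns).
bitvec = {i: sum(1 << t for t, tx in enumerate(transactions) if i in tx) for i in items}

def support(itemset):
    v = (1 << len(transactions)) - 1
    for i in itemset:
        v &= bitvec[i]                 # AND of the item columns
    return bin(v).count("1")           # popcount = support count

min_sup = 2
frequent = [set(c) for k in (1, 2, 3)
            for c in combinations(items, k) if support(c) >= min_sup]
print(frequent)
```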
  • YANG Bei, TUN Zhen-Jiang, FU Xiang-Ping
    Computer Engineering. 2012, 38(2): 78-81. https://doi.org/10.3969/j.issn.1000-3428.2012.02.025
    Static integrity measurement cannot ensure the integrity of a system at run time. This paper presents a Dynamic Integrity Measurement(DIM) model based on trusted computing. Compared with existing integrity measurement architectures, this architecture introduces virtualization technology to help system administrators control the integrity of the system at run time. It monitors process behavior at run time and completes the DIM. Results show that malicious attacks which damage the integrity of the running system are defended against and the security of the system is improved.
  • BANG Zhi-Beng, JIA Zhan-Feng, ZHOU Chao
    Computer Engineering. 2012, 38(2): 82-84. https://doi.org/10.3969/j.issn.1000-3428.2012.02.026
    The independence and intelligence of data in enterprise supply chains are low. To solve this problem, this paper applies multiple knowledge base integration technology to the enterprise supply chain. The knowledge base integration method is divided into finding the overlapping region between TBoxes, establishing the relevance of concepts, and eliminating data redundancy and inconsistency. The paper designs a multiple ABox optimization technique and its implementation algorithm, and introduces the service request subsystem, service receiving subsystem and knowledge base integration centre. Experimental results show that the approach is effective and can reduce the running time of the system.
  • LI Feng
    Computer Engineering. 2012, 38(2): 85-87. https://doi.org/10.3969/j.issn.1000-3428.2012.02.027
    In order to realize knowledge acquisition and knowledge sharing in Web 2.0, a case-based knowledge management system is presented. In line with the bottom-up approach of knowledge innovation in Web 2.0, the methodology of Case-based Reasoning(CBR) is adapted to manage knowledge. In the case representation phase, tags are introduced to label each case for better case classification and identification by system users. In the case retrieval phase, domain ontology is introduced to find the most similar historical case. To maintain the case base and ontology base, an artificial neural network algorithm and the tags created by system users are used. Experimental results show that the system improves retrieval precision and efficiency, and the retrieved historical cases have high similarity.
  • CAO Fu-Hu, ZHANG Yan-Mei
    Computer Engineering. 2012, 38(2): 88-90. https://doi.org/10.3969/j.issn.1000-3428.2012.02.028
    For the topology control problem of dynamic and heterogeneous wireless sensor networks, a self-organized network topology model based on aggregation is proposed, including a distributed node aggregation algorithm. The algorithm uses a deterministic annealing technique and considers the factors that affect sensor performance. Simulation results show that the aggregations formed by the algorithm are of appropriate scale, the time cost grows moderately, and the topology is scalable and self-adjustable.
  • SONG Xian-Feng, CHEN Guang-Chi, LI Xiao-Long
    Computer Engineering. 2012, 38(2): 91-93. https://doi.org/10.3969/j.issn.1000-3428.2012.02.029
    This paper proposes a secure routing algorithm for Wireless Sensor Networks(WSN) based on average Hamming distance. It judges whether nodes are normal according to the average Hamming distance between the real and the normal short packet-ratio sequences of the nodes, and introduces a kind of neighborhood table, which saves node energy during re-clustering when a malicious node emerges. Energy, hop count and route reliability are considered comprehensively during route selection. Simulation results show that the algorithm has low power consumption and high detection accuracy.
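A minimal sketch of the detection idea is given below: a node's reported bit sequence is compared against the expected normal sequence by average Hamming distance, and nodes above a threshold are flagged before route selection. The sequences and the threshold value are assumptions for illustration only.

```python
def avg_hamming(observed: str, expected: str) -> float:
    assert len(observed) == len(expected)
    return sum(a != b for a, b in zip(observed, expected)) / len(observed)

normal_profile = "11011011"     # expected packet-ratio bit sequence (assumed)
node_report    = "11000001"     # sequence observed from a node (assumed)
suspicious = avg_hamming(node_report, normal_profile) > 0.25  # assumed threshold
print(suspicious)  # True -> flag the node and avoid it when selecting routes
```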
  • SUN Yong-Beng, MA Jian-Guo, GENG Ling
    Computer Engineering. 2012, 38(2): 94-96. https://doi.org/10.3969/j.issn.1000-3428.2012.02.030
    The overall behavior of network users accessing the network is regarded as the superposition of user interests, and a model of the overall degree of user activity is established. Based on the model, an adaptive mechanism for distributed proxies is proposed, and the structure of the cluster system and the model proxy rules are given. Experimental results show that the capacity of the adaptive distributed proxy mechanism for dealing with network links is better than that of regular networks and gathering networks.
  • TUN Dan, WANG Gai-Yun, LI Xiao-Long
    Computer Engineering. 2012, 38(2): 97-99. https://doi.org/10.3969/j.issn.1000-3428.2012.02.031
    Considering the limited energy in Wireless Sensor Networks(WSN) and the low error detection rate of traditional in-network aggregation, a data aggregation algorithm for WSN based on the Minimal Covering Set(MCS) is proposed. It constructs a tree rooted at the sink node that contains the minimum number of intermediate forwarding nodes; the forwarding nodes form the minimal covering set of the tree. To remove redundant and erroneous data, intermediate nodes introduce a similarity judgment on reading vectors. Experimental results show that the algorithm reduces the energy consumption of communication within the network and improves the accuracy of data collection.
  • DIAO Jin-Long, GAO Zhong-Ge, GU Ku-Wen
    Computer Engineering. 2012, 38(2): 100-102. https://doi.org/10.3969/j.issn.1000-3428.2012.02.032
    This paper presents an identification method for network topology based on end-to-end unicast measurements. For data measurement, it improves the sandwich probe group to obtain delay differences from which node correlations are inferred. For topology inference, it compares the correlation sequences of nodes in depth-first order over the tree, and uses a combination of iterative and recursive methods to build the network topology, reducing the number of probes sent. Simulation results show that the topology identification method is effective.
  • BO Nan, WANG Yong, DAO Xiao-Ling
    Computer Engineering. 2012, 38(2): 103-105. https://doi.org/10.3969/j.issn.1000-3428.2012.02.033
    In order to improve the efficiency of link layer network topology discovery, this paper proposes a link layer topology discovery algorithm based on the Simple Network Management Protocol(SNMP). By describing the connections between switches as a tree, the connection relationship of each switch is established layer by layer in a top-down manner. The conditions for connections between switches are refined, and a thread pool and hash search are combined to improve the efficiency of topology discovery. Experimental results indicate that the algorithm can discover link layer topology rapidly and completely, and all discoverable network elements can be found.
  • LIANG An-Min, SHAO Dan
    Computer Engineering. 2012, 38(2): 106-108. https://doi.org/10.3969/j.issn.1000-3428.2012.02.034
    This paper analyzes the network topology of a national Internet Service Provider(ISP) by introducing the concept of attraction degree, and argues that the growth of Internet topology is driven by the interaction between internal node factors such as bandwidth and external factors such as geographic location. A new modeling algorithm for Internet router-level topology is proposed that considers both node property evolution and geographic limits. Analysis of the power law and the non-Signal Laplacian Spectral(non-SLS) properties shows that the proposed modeling algorithm simulates the Internet router-level topology more accurately.
  • SUN Lin, JU Guo-Wei, LI Fei, CHEN Dan-Ning
    Computer Engineering. 2012, 38(2): 109-112. https://doi.org/10.3969/j.issn.1000-3428.2012.02.035
    This paper designs a centralized monitoring system for busbar temperature using Wireless Sensor Network(WSN) technology. By deploying several wireless nodes with temperature sensors at key positions of the busbar, the running temperature of the busbar can be monitored in real time and in multiple dimensions. The monitoring center provides functions such as centralized monitoring of busbar running status, early warning, analysis and diagnosis of busbar faults, and maintenance of the busbar fault database. Application results show that the packet reception ratio of each node is above 97%, the packet reception ratio of all nodes stays above 96% over 24 hours, and the system is stable.
  • CUI Huan-Qiang, WANG Yang-Long, LV Jia-Liang
    Computer Engineering. 2012, 38(2): 113-115. https://doi.org/10.3969/j.issn.1000-3428.2012.02.036
    A survey of mobile beacon assisted localization methods for Wireless Sensor Networks(WSN) is presented. According to their characteristics, the methods are classified into range-based and range-free methods, centralized and distributed methods, single-power and multi-power beacon assisted localization methods, omni-directional and directional antenna localization methods, and deterministic and probabilistic localization methods. Typical algorithms for static and dynamic path planning are introduced and their deficiencies pointed out. Analysis shows how these methods balance localization accuracy against WSN energy consumption.
  • ZHANG Shou-Zhu, LI Jing, CUI Hui-Juan, TANG Hun
    Computer Engineering. 2012, 38(2): 116-118. https://doi.org/10.3969/j.issn.1000-3428.2012.02.037
    Based on the classic decoding algorithm of Block Turbo Codes(BTC), the relationship between decoding parameters and decoding complexity, and the impact of different parameters on performance, are analyzed. Taking a (15, 11)×(13, 9) BTC as an example, a software implementation scheme on a C55 series Digital Signal Processor(DSP) is presented that trades off performance against complexity. Optimization is done at different levels such as fixed-point processing, compiler options, high-level language and assembly language. Computational complexity is reduced by 89% after optimization.
  • WANG Pan-Zhe, HONG Xin, QIU Yi-Cuan, BO Jiang
    Computer Engineering. 2012, 38(2): 119-122. https://doi.org/10.3969/j.issn.1000-3428.2012.02.038
    This article studies test methods for Services Supporting Information Processing(SSIP) in sensor networks, and on this basis presents a validation and test platform. It analyzes both active and passive test methods, adopting the active method, in which the server sends test stimuli to the stimulating node and the sink node sends back test results. It gives the test flows and the contents of the request, indication and confirm messages of each service, and designs and implements a validation and test platform with strong reusability. Relevant applications show that the test methods are effective.
  • FEI Ze-Gen, XIAO Meng-Jun, HUANG Liu-Sheng
    Computer Engineering. 2012, 38(2): 123-125. https://doi.org/10.3969/j.issn.1000-3428.2012.02.039
    This paper proposes a Location Related Routing(LRR) algorithm for Delay Tolerant Networks(DTN). By introducing location information into DTN routing, it turns the routing problem among mobile nodes into a routing problem among static locations, and then selects the relay location according to the probabilities of nodes' access to locations. LRR does not need to know global contact probabilities. Experimental results show that, compared with existing algorithms, LRR obtains a higher successful transmission rate and a smaller average transmission delay.
  • NIU Chu-Fen, WANG Cai-Fen
    Computer Engineering. 2012, 38(2): 126-128. https://doi.org/10.3969/j.issn.1000-3428.2012.02.040
    Network coding is highly susceptible to pollution attacks, which cannot be prevented by standard signatures. Based on homomorphic functions and bilinear pairings, an efficient signature scheme against pollution attacks for multi-source network coding is proposed. Intermediate nodes with the corresponding public keys can verify the integrity of received messages signed by different source nodes with their private keys. Under the random oracle model, the scheme is proved to be secure against attacks from source nodes and intermediate nodes.
  • ZUO Li-Meng, SHANG Feng-Zhi, LIU Er-Gen, XU Bao-Gen
    Computer Engineering. 2012, 38(2): 129-131. https://doi.org/10.3969/j.issn.1000-3428.2012.02.041
    This paper studies a model for detecting malicious code based on the characteristics of malicious behaviors, and analyzes the key techniques in its realization. The method uses customized malicious-behavior code for matching and takes two malicious behaviors within a short period as the decision-making standard; the information entropy characteristics of the two malicious behaviors are analyzed with the maximum entropy principle. Experimental results show that the method works in most detection cases, only has minor errors in a few conditions, and is valuable for unknown malicious code detection.
  • ZHOU Cai-Hua, ZHOU Kun, HU Ri-Xin, JIANG Yong-He
    Computer Engineering. 2012, 38(2): 132-134. https://doi.org/10.3969/j.issn.1000-3428.2012.02.042
    This paper analyzes three identity-based signcryption schemes, attacks the first two schemes using Indistinguishability under Chosen Plaintext Attack(IND-CPA), attacks the last one using IND-CPA and forgery attacks, and provides improved schemes respectively. Analysis shows that the improved schemes maintain high efficiency while satisfying confidentiality, unforgeability, non-repudiation, public verifiability and forward security.
  • DIAO Ti-Hua, ZHOU Mo-Qing
    Computer Engineering. 2012, 38(2): 135-137. https://doi.org/10.3969/j.issn.1000-3428.2012.02.043
    In order to overcome the shortcoming that a host packet filtering firewall only provides single-level, static network security protection, a scheme for dynamically generating firewall filtering rules is designed. Attack behavior information from network layer packets and application processes is detected using expert knowledge, and corresponding filtering rules are generated by expert system reasoning. Experimental results on Windows systems demonstrate that the scheme can detect various attacks and generates corresponding rules in time.
  • YANG Lu
    Computer Engineering. 2012, 38(2): 138-140. https://doi.org/10.3969/j.issn.1000-3428.2012.02.044
    Due to the large computing cost of bilinear pairings, this paper proposes a certificateless implicit authentication and key agreement protocol without pairing operations, and proves its security in the random oracle model. The new protocol is based on the discrete logarithm problem and the Computational Diffie-Hellman(CDH) assumption, and requires only three exponentiations and two hash evaluations. The computational cost of this protocol is lower than that of other protocols of the same type.
  • LIN Jun, JIANG Wen-Jun, WANG Guo-Jun
    Computer Engineering. 2012, 38(2): 141-143. https://doi.org/10.3969/j.issn.1000-3428.2012.02.045
    In P2P environments, most existing trust models cannot reflect the subjectivity, fuzziness and uncertainty of trust. This paper presents a new trust model for P2P environments called CloudBT. In calculating a node's global trust value, it introduces a time weight function and integrates the cloud model with a reputation-based model to describe the measure and uncertainty of trust, and it considers both node trust values and the fluctuation of node behavior when making trust decisions. Simulation results show that the CloudBT model achieves a high transaction success rate and strong resistance against attacks in P2P e-commerce environments.
  • DENG Yu-Jiao
    Computer Engineering. 2012, 38(2): 144-145. https://doi.org/10.3969/j.issn.1000-3428.2012.02.046
    Based on the hardness of the quadratic residue problem, this paper proposes a forward-secure proxy re-signature scheme. Through a semi-trusted proxy, the proxy's signature on a message is transformed into the consignor's signature on it. Theoretical analysis proves that the scheme resists forgery attacks, and it maintains the security of former signatures even if the signing key of the current period leaks.
  • WANG Meng-Hui, WANG Jian-Dong
    Computer Engineering. 2012, 38(2): 146-147. https://doi.org/10.3969/j.issn.1000-3428.2012.02.047
    Most three-party authenticated key exchange protocols are not secure enough and cannot resist undetectable online dictionary attacks. Aiming at these problems, this paper proposes a password-based three-party authenticated key exchange protocol. It analyzes the vulnerability of the simple three-party authenticated key exchange protocol and proposes an improved, more secure protocol. Analysis shows that, compared with the simple 3PAKE and other protocols, the new protocol has better computational efficiency.
  • HU Cheng-Jun, LI Chuan-Dong
    Computer Engineering. 2012, 38(2): 148-150. https://doi.org/10.3969/j.issn.1000-3428.2012.02.048
    Based on Lyapunov stability theory and differential inequality techniques, this paper gives a sufficient condition for exponential synchronization using the linear matrix inequality technique, and designs an exponential synchronization controller. The synchronization controller is applied to secret communication through chaotic masking. Simulation results based on the Ikeda chaotic system show that this method can accurately and rapidly recover the useful signal and is robust to noise, achieving the purpose of secret communication.
  • MA Qiao-Mei, WANG Chang-Beng
    Computer Engineering. 2012, 38(2): 151-152. https://doi.org/10.3969/j.issn.1000-3428.2012.02.049
    Aiming at the security hole of the UMA-RFID protocol, an ultra-lightweight RFID mutual authentication protocol is proposed. By improving the interaction of UMA-RFID, leakage of the tag identifier is avoided and the freshness of the tag's reply message is guaranteed in each session. Only XOR and shift operations are used in the proposed protocol, reducing the tag's computation and storage requirements. Analysis of security and performance shows that this protocol can efficiently resist spoofing attacks and is suitable for low-cost RFID systems.
  • ZHONG Meng-Quan, LI Huan-Zhou, TANG Zhang-Guo, ZHANG Jian
    Computer Engineering. 2012, 38(2): 153-155. https://doi.org/10.3969/j.issn.1000-3428.2012.02.050
    To address the high miss rate of current Trojan detection methods, a Trojan detection system based on the weighting of dynamic and static characteristics is designed and realized using both dynamic and static Trojan characteristics. Through in-depth research on the working mechanism of Trojans, a custom Trojan characteristic library is built. The detection idea and the working logic of the detection system are introduced, the extraction procedure for Trojan characteristics is analyzed, and the weight distribution method for Trojan characteristics is given. Experimental results prove that the Trojan detection system has a high accuracy rate.
  • LIU Yong-Wen, LI Tian-Rui, CHEN Gong-Mei, GAO Zi-Zhe, GU Xiao-An
    Computer Engineering. 2012, 38(2): 156-158. https://doi.org/10.3969/j.issn.1000-3428.2012.02.051
    This paper studies the variation of approximate sets in covering generalized rough sets when the attribute set changes. By analyzing the properties of approximate sets and discussing the relation between the boundary and the approximate set when the attribute set varies with time, it derives the variation trend of the approximate set, and on this basis proposes a method for updating approximate sets incrementally. Examples show the validity of the proposed method.
  • FENG Xiao-Lei, XU Hong-Chao
    Computer Engineering. 2012, 38(2): 159-162. https://doi.org/10.3969/j.issn.1000-3428.2012.02.052
    To solve the problem that the Affinity Propagation(AP) algorithm performs poorly on non-convex datasets with asymmetrical density, kernel clustering is introduced into the algorithm. The dataset becomes more separable in kernel space through non-linear mapping. A similarity measure based on shared nearest neighbors is then introduced, and a density-insensitive affinity propagation algorithm named Density-insensitive Affinity Propagation(DIS-AP) is proposed. DIS-AP overcomes the shortcoming of the original Euclidean-distance-based AP, which is easily influenced by the dimension and density of the dataset, effectively handles the clustering of non-convex datasets with asymmetrical density, and broadens its range of application. Experimental results show that the algorithm has a better clustering effect.
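The sketch below illustrates the shared-nearest-neighbor ingredient: a precomputed SNN similarity matrix is fed to affinity propagation instead of the default negative Euclidean similarity. The synthetic data, neighborhood size k, and use of scikit-learn's AP implementation are assumptions, not the paper's exact DIS-AP procedure.

```python
import numpy as np
from sklearn.cluster import AffinityPropagation
from sklearn.neighbors import NearestNeighbors

# Two well-separated blobs stand in for a real dataset.
X = np.vstack([np.random.randn(30, 2), np.random.randn(30, 2) + 5])

k = 10
nn = NearestNeighbors(n_neighbors=k).fit(X)
_, idx = nn.kneighbors(X)
neighbors = [set(row) for row in idx]
# Shared-nearest-neighbor similarity: size of the overlap of k-NN lists.
S = np.array([[len(a & b) for b in neighbors] for a in neighbors], dtype=float)

labels = AffinityPropagation(affinity="precomputed", random_state=0).fit_predict(S)
print(len(set(labels)), "clusters found")
```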
  • BAI Li, FANG Chi, DING Xiao-Jing
    Computer Engineering. 2012, 38(2): 163-165. https://doi.org/10.3969/j.issn.1000-3428.2012.02.053
    This paper proposes a novel high resolution face recognition method based on skin texture features. In this method, texture features are extracted based on the facial contour, and Gabor wavelets are used to extract skin texture features. A feature matching algorithm based on texture regional relevance is introduced. The proposed method is evaluated on the FRGC v2.0 experiment and obtains a 97.8% verification rate at a False Accept Rate(FAR) of 0.1%, which is comparable to the best known results. This shows that face recognition performance can be significantly increased by using high resolution images.
  • YANG Jin, LI Ken-Li, TUN Fan
    Computer Engineering. 2012, 38(2): 166-168. https://doi.org/10.3969/j.issn.1000-3428.2012.02.054
    The classic Genetic Algorithm(GA) limits evolution because the next generation cannot inherit the fittest chromosome. To improve the algorithm, this paper proposes a dynamic genetic algorithm. It creates a model for the heterogeneous system, formulates a criterion for measuring load balance according to the model, and then uses this criterion when scheduling jobs on the heterogeneous system. The algorithm allows the maximum number of evolution generations to be configured dynamically. Experimental results show that the improved algorithm has better load balancing performance.
  • QIN Yu-Jiang, ZHANG Xue-Yang
    Computer Engineering. 2012, 38(2): 169-171. https://doi.org/10.3969/j.issn.1000-3428.2012.02.055
    This paper presents a concept of emotion space modeling drawing on psychological research. Based on this concept, it studies the Valence-Arousal-Power(VAP) distribution of seven emotions for speech emotion recognition, including joy, anger, surprise, fear, disgust, sadness and neutral, in the three-dimensional VAP space, and analyses the relationship between the dimensional ratings and Zero Crossings with Peak Amplitudes(ZCPA) prosodic characteristics in terms of the maximum, minimum, mean and absolute squared difference sum of ZCPA. Experimental results show that the emotion modeling concept is helpful for describing and distinguishing speech emotions.
  • ZUO Jing-Long, TU Gui-Lan
    Computer Engineering. 2012, 38(2): 172-174. https://doi.org/10.3969/j.issn.1000-3428.2012.02.056
    In order to meet the requirements of Quality of Service(QoS) multicast routing with constraints on bandwidth, delay and other aspects, a constrained QoS multicast routing algorithm based on a quantum ant colony algorithm is proposed by combining quantum computation with the ant colony algorithm. The ant position is represented by a group of quantum bits, and the pheromone is updated with a dynamically adjusted rotation angle strategy. The proposed algorithm can find feasible paths quickly and avoid being trapped in local optima. Simulation results indicate that the proposed algorithm performs well in both global optimization ability and convergence speed.
  • HUANG Li-Jin, SHI Dun, ZHONG Jin
    Computer Engineering. 2012, 38(2): 175-177. https://doi.org/10.3969/j.issn.1000-3428.2012.02.057
    Aiming at the curse of dimensionality, Kernel Entropy Component Analysis(KECA) is used to reduce the dimension of data and is compared with Principal Component Analysis(PCA) and Kernel PCA(KPCA). The low-dimensional data after reduction are classified by a Support Vector Machine(SVM) to compare accuracy. Experimental results indicate that high classification accuracy can be obtained with a low number of dimensions using KECA, which reduces processing complexity and running time. This suggests that KECA-based dimension reduction is feasible for machine learning, pattern recognition and related fields.
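Since KECA is not available in common libraries, the sketch below only reproduces the two baselines named in the abstract, PCA and kernel PCA, each followed by an SVM on the reduced features; the dataset, component count and kernel parameters are illustrative assumptions.

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA, KernelPCA
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

for name, reducer in [("PCA", PCA(n_components=10)),
                      ("KPCA", KernelPCA(n_components=10, kernel="rbf", gamma=1e-3))]:
    Ztr = reducer.fit_transform(Xtr)       # fit the projection on training data
    Zte = reducer.transform(Xte)           # project test data with the same mapping
    acc = SVC().fit(Ztr, ytr).score(Zte, yte)
    print(name, round(acc, 3))
```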
  • ZHANG Wei-Song, GAO Zhi-Yang
    Computer Engineering. 2012, 38(2): 178-180. https://doi.org/10.3969/j.issn.1000-3428.2012.02.058
    This paper studies fast Multi-classifier Ensemble(MCE) algorithms. MCE obtains a number of classifiers and assigns weights to them; a certain number of best classifiers can be selected based on the error rate of every classifier. The assignment of classifier weights is studied, and two training methods are presented. The first is a Biased AdaBoost algorithm that computes the classifier weights sequentially. The second is DE-MCE, based on the Differential Evolution(DE) algorithm, which optimizes the weights of all selected classifiers. Experimental results on face recognition show that the training time of the algorithm is shorter than that of the AdaBoost algorithm, while the accuracy rate remains high.
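A toy sketch of the weighted-vote combination step follows; in the paper the weights would come from Biased AdaBoost or DE optimization, whereas here they are simply set by hand for illustration.

```python
import numpy as np

def weighted_vote(predictions, weights, n_classes):
    """Combine base-classifier predictions by summing their weights per class."""
    votes = np.zeros(n_classes)
    for pred, w in zip(predictions, weights):
        votes[pred] += w
    return int(np.argmax(votes))

classifier_outputs = [1, 1, 0]      # predictions of three base classifiers (assumed)
weights = [0.5, 0.3, 0.9]           # e.g. derived from each classifier's error rate
print(weighted_vote(classifier_outputs, weights, n_classes=2))  # -> 0 (0.9 > 0.8)
```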
  • DIAO Tie-Fang, MA Li-Xiao, HU Ji-Wei
    Computer Engineering. 2012, 38(2): 181-183. https://doi.org/10.3969/j.issn.1000-3428.2012.02.059
    This paper analyzes the selection problem of manufacturing resources and establishes a mathematical model. It presents a hybrid evolutionary algorithm combining Differential Evolution(DE) with orthogonal design to solve this problem. Subspace shrinking and a multi-offspring competition method are applied to the hybrid evolutionary algorithm, which markedly improve its convergence speed. Compared with other algorithms, experimental results show that the hybrid evolutionary algorithm is excellent in solution quality, stability and convergence speed.
  • CHEN Xue-Fang, YANG Ji-Chen
    Computer Engineering. 2012, 38(2): 184-185. https://doi.org/10.3969/j.issn.1000-3428.2012.02.060
    To improve the precision of speaker indexing, a speaker indexing algorithm with a three-layer criterion is proposed. In the first layer, a penalty distance is proposed to judge whether the speaker changes. In the second layer, speaker model bootstrapping is used for a first-pass speaker identification. In the third layer, the GMM Speaker Supervector(GMMSS) is used to identify the speaker further, addressing the data mismatch problem in speaker model bootstrapping. Experimental results show that no penalty factor needs to be tuned compared with BIC, F1 improves by 2% compared with DISTBIC, and using GMMSS in speaker identification improves speaker indexing accuracy by 8.95% and the accuracy of the estimated number of speakers by 18.25%.
  • CHEN Wei, LI Hui, ZHANG Kun-Lei
    Computer Engineering. 2012, 38(2): 186-188. https://doi.org/10.3969/j.issn.1000-3428.2012.02.061
    Channel mismatch in speaker verification systems results from mismatches between training and test voices. A speaker verification system based on Nuisance Attribute Projection(NAP) is proposed. A large number of voices with known channel information are used to train a higher-dimensional space mapping matrix. Gaussian Mixture Model(GMM) supervectors, which are used as Support Vector Machine(SVM) input parameters, are then projected to eliminate the effects of channel information. Experimental results indicate that the system reduces the negative effects of channel mismatch.
  • WANG Pan, XIE Xiao-Fang, TUN Long-Bao, MA Yu, SHU Zong-Jian, WANG Feng
    Computer Engineering. 2012, 38(2): 189-191. https://doi.org/10.3969/j.issn.1000-3428.2012.02.062
    In order to automatically recognize random cracks in X-ray film images of metal castings, this paper proposes a new crack recognition algorithm. It constructs an ideal probability model according to the characteristics of crack images and transforms the model into an ideal template. It computes the Bhattacharyya coefficient between the ideal template and the original image to construct a Bhattacharyya coefficient matrix, and transforms the matrix into a grayscale image. A complete and accurate outline of the crack is obtained by morphological operations. Experimental results show that the algorithm is both adaptive and accurate.
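A minimal sketch of the Bhattacharyya coefficient between two normalized distributions (for example, an ideal crack template and a local image patch) is shown below; the histograms are invented for illustration and the full matrix construction over the image is omitted.

```python
import numpy as np

def bhattacharyya(p, q):
    """Bhattacharyya coefficient of two histograms; 1.0 means identical distributions."""
    p = np.asarray(p, dtype=float); p /= p.sum()
    q = np.asarray(q, dtype=float); q /= q.sum()
    return float(np.sum(np.sqrt(p * q)))

template_hist = [0.1, 0.7, 0.2]   # assumed ideal crack template histogram
patch_hist    = [0.2, 0.6, 0.2]   # assumed local image patch histogram
print(round(bhattacharyya(template_hist, patch_hist), 3))  # close to 1 -> similar
```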
  • ZHOU Chi-Beng, DAO Li
    Computer Engineering. 2012, 38(2): 192-194. https://doi.org/10.3969/j.issn.1000-3428.2012.02.063
    Because the traditional Harris detection algorithm does not suit tracking well, a corner point detection algorithm based on different ellipse regions is proposed. Against the influence of body attitude variation or occlusion, a point updating strategy is adopted at the same time, and the result of multi-zone tracking is balanced against global tracking based on dominant hue and multi-feature fusion. Experiments show that the proposed algorithm can cope with body posture change or occlusion effectively, meets real-time requirements, and achieves higher tracking accuracy than the traditional algorithm in non-occluded environments.
  • WANG Shi-Bo, ZHANG Da-Meng, LUO Bin, ZHANG Chun-Yan
    Computer Engineering. 2012, 38(2): 195-197. https://doi.org/10.3969/j.issn.1000-3428.2012.02.064
    In order to calculate landslide areas from remote sensing images, a semi-automatic landslide extraction algorithm based on spectral matting is proposed. The matting Laplacian matrix is established to calculate feature vectors and automatically determine the number of clusters, and a hill-climbing algorithm is used to cluster the image. Matting components are calculated from the feature vectors and the user's input, the smoothing term is removed, and the alpha matte is obtained. Experimental results show that the method can effectively extract landslides with high accuracy and strong stability.
  • HU Dan, MEI Xue
    Computer Engineering. 2012, 38(2): 198-200. https://doi.org/10.3969/j.issn.1000-3428.2012.02.065
    A method that applies the Fourier transform and an edge wavelet moment descriptor to recognize human behavioral motion is proposed. In contour feature extraction, for concave-convex complex images where the line from the centroid to the contour is not single-valued, a directional distance contour description matrix with multiple lines is proposed. Simulation experiments are carried out on two kinds of human bodies and four kinds of behavioral motion, and the results show that the edge wavelet moment descriptor describes the local features of shape contours well and achieves good recognition rates.
  • FENG Jun, LI Gang, SUN Xia, FENG Hong-Wei
    Computer Engineering. 2012, 38(2): 201-203. https://doi.org/10.3969/j.issn.1000-3428.2012.02.066
    To support practical teaching of course content, this paper puts forward a definition of teaching-oriented knowledge points and segments text into units based on the maximum matching algorithm. For knowledge relation extraction, it combines association rules with a hybrid classification method, improving the accuracy of knowledge relation extraction. Using the obtained domain text knowledge and its relations, a knowledge structure graph oriented to teaching content is established. Experimental results show that the method is suitable for extracting relations among Chinese knowledge points.
  • HUANG Guang-Qiu, LIU Jia-Fei, TAO Yu-Xia
    Computer Engineering. 2012, 38(2): 204-206. https://doi.org/10.3969/j.issn.1000-3428.2012.02.067
    This paper studies the Artificial Fish Swarm Algorithm(AFSA). The continuous search space is discretized according to the interval in which each component of a feasible solution lies; each point in the discrete space is a position state of an artificial fish, and its energy(food density) is the objective function value at this point. The whole discrete space and the set of all artificial fish are also divided into a series of non-empty subsets. During the preying, swarming or following activities of the artificial fish, each fish's transition probability from one position to another can be simply calculated. Each position state corresponds to a state of a finite Markov chain, so the stability condition of a reducible stochastic matrix is satisfied. In conclusion, the global convergence of AFSA is proved.
  • WU Ning, XIAO Xing-Xing, FENG Rui
    Computer Engineering. 2012, 38(2): 207-209. https://doi.org/10.3969/j.issn.1000-3428.2012.02.068
    Based on a home robot platform, this paper implements a classical speaker recognition algorithm, the Gaussian mixture model-universal background model algorithm, and introduces speaker recognition theory into the robot system. In order to improve the robustness of the system in real home environments, it makes some improvements in the framework and algorithms. The recognition system can also be applied to access control or check-in systems.
  • SU Bing, LI Gang, WANG Hong-Yuan
    Computer Engineering. 2012, 38(2): 210-212. https://doi.org/10.3969/j.issn.1000-3428.2012.02.069
    The traditional Gaussian Mixture Model(GMM) is very sensitive to sudden illumination changes and converges slowly. This paper presents a moving object detection method based on an improved GMM. The method eliminates the effect of illumination using mismatching pixels. The background image is extracted by the improved GMM, a binary difference image is obtained by background subtraction, and the moving object is then extracted from the difference image. Experimental results show that the detection adapts well to illumination changes and improves the accuracy and robustness of moving object detection.
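As a rough stand-in for the improved GMM described above, the sketch below uses OpenCV's stock MOG2 Gaussian-mixture background subtractor followed by simple noise suppression and contour extraction; the video path, parameter values and area threshold are placeholders, and the paper's illumination handling is not reproduced.

```python
import cv2

cap = cv2.VideoCapture("video.avi")   # placeholder path to a surveillance clip
subtractor = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=25,
                                                detectShadows=False)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)                       # binary foreground mask
    mask = cv2.medianBlur(mask, 5)                       # suppress isolated noise
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    moving = [c for c in contours if cv2.contourArea(c) > 100]  # keep large blobs
cap.release()
```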
  • CHANG Feng, FENG Nan, MA Hui
    Computer Engineering. 2012, 38(2): 213-214. https://doi.org/10.3969/j.issn.1000-3428.2012.02.070
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    This paper presents a document clustering algorithm based on word co-occurrence to address the loss of topic information in text representation. It uses the word co-occurrences of the document set to establish a document theme vector model, applies hierarchical clustering to it, finds the best partition level through a clustering entropy criterion, and thus accurately reflects the topical relationships between documents. Experimental results show that the algorithm outperforms other phrase-based hierarchical document clustering algorithms.
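    The sketch below illustrates the general idea of clustering documents by word co-occurrence: each document is represented by its co-occurring word pairs and the documents are grouped hierarchically. The paper's theme-vector model and clustering-entropy cut selection are simplified to a cosine-distance threshold, and the toy corpus is invented for illustration.

```python
# Word co-occurrence document representation + hierarchical clustering (sketch).
import numpy as np
from itertools import combinations
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

docs = [
    "gene expression analysis of cancer cells",
    "tumor gene mutation and cancer therapy",
    "stock market price prediction model",
    "financial market risk and price volatility",
]
tokenized = [d.split() for d in docs]
vocab_pairs = sorted({tuple(sorted(p)) for t in tokenized for p in combinations(set(t), 2)})
pair_index = {p: i for i, p in enumerate(vocab_pairs)}

# Each document is represented by the word pairs that co-occur in it.
X = np.zeros((len(docs), len(vocab_pairs)))
for row, tokens in enumerate(tokenized):
    for p in combinations(sorted(set(tokens)), 2):
        X[row, pair_index[p]] += 1

Z = linkage(pdist(X, metric="cosine"), method="average")
labels = fcluster(Z, t=0.9, criterion="distance")   # threshold is a placeholder
print(labels)   # e.g. biomedical docs in one cluster, finance docs in another
```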
  • LI Chu-Fei, TAN Chang-Geng, HAN Yu
    Computer Engineering. 2012, 38(2): 215-217. https://doi.org/10.3969/j.issn.1000-3428.2012.02.071
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    Combining a link-repair routing protocol with vehicular networks, this paper proposes a new algorithm that predicts the link break time from turning time. It creates two acceleration models: one accelerates from the current speed to the maximum speed, and the other decelerates in the opposite way. The movement of a vehicle is analyzed over three time zones to design the movement prediction algorithm, and empirical and environmental prediction factors are added to improve the weighted prediction algorithm. Experimental results show that the algorithm improves the prediction accuracy by 12.8%.
  • TAO Yan, SHANG Jin, LUO Bin
    Computer Engineering. 2012, 38(2): 218-220. https://doi.org/10.3969/j.issn.1000-3428.2012.02.072
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    Based on the high compressibility of fractal codes, an effective watermarking scheme using iterative fractal decoding and inpainting for image tamper detection and recovery is proposed. In the watermark insertion stage, a look-up index table is generated, and the fractal code and the parity-check bits are inserted into the original image according to this table to obtain the watermarked image. During watermark extraction, the altered region can be localized automatically, and fractal decoding and image inpainting are applied iteratively for image recovery. Experimental results show the effectiveness of the proposed algorithm.
  • JIAN Ya-Ru, GUO Zhong-Hua, YONG Hui
    Computer Engineering. 2012, 38(2): 221-223. https://doi.org/10.3969/j.issn.1000-3428.2012.02.073
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    An improved normalized block semi-norm algorithm is presented. The codebook sampling contraction mode and the search for the nearest father block are improved, which reduces the search range of the matched block and improves the matching accuracy, thereby improving the coding rate and the decoded image quality. Experimental results show that the coding time of the improved algorithm is shorter than that of the fast algorithm based on the normalized block semi-norm.
  • ZHANG Xian-Quan, YANG Jian-Zhong, FU Nian, WANG Xian-Hui, DAI Xuan
    Computer Engineering. 2012, 38(2): 224-225. https://doi.org/10.3969/j.issn.1000-3428.2012.02.074
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    This paper proposes an edge-preserving image denoising method. Noise points and noise-free points are identified by four directional convolution operators, and noise-free points are left untouched. For each noise point, the minimum-distance noise-free points in its 3×3 neighborhood are obtained; if there is no noise-free point, the neighborhood is extended to 5×5. When noise-free points exist, the noise point is replaced by their median value; otherwise, it is replaced by the average value of the maximum-distance pixels. Experimental results show that the algorithm can effectively remove image noise and better protect image edges.
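    A simplified sketch of the detect-then-replace idea: four directional convolutions flag likely impulse-noise pixels, and each flagged pixel is replaced by the median of the noise-free pixels in its 3×3 (or, failing that, 5×5) neighborhood. The kernels and the detection threshold are illustrative choices, not the paper's operators.

```python
# Detect impulse noise with directional convolutions, then median-replace (sketch).
import numpy as np
from scipy.ndimage import convolve

def denoise(img, threshold=80.0):
    img = img.astype(float)
    kernels = [
        np.array([[0, 0, 0], [1, -2, 1], [0, 0, 0]], float),   # horizontal
        np.array([[0, 1, 0], [0, -2, 0], [0, 1, 0]], float),   # vertical
        np.array([[1, 0, 0], [0, -2, 0], [0, 0, 1]], float),   # diagonal
        np.array([[0, 0, 1], [0, -2, 0], [1, 0, 0]], float),   # anti-diagonal
    ]
    # A pixel consistent with its neighbours along at least one direction is kept.
    responses = np.stack([np.abs(convolve(img, k, mode="reflect")) for k in kernels])
    noisy = responses.min(axis=0) > threshold

    out = img.copy()
    h, w = img.shape
    for y, x in zip(*np.nonzero(noisy)):
        for r in (1, 2):                                  # 3x3 first, then 5x5
            y0, y1 = max(0, y - r), min(h, y + r + 1)
            x0, x1 = max(0, x - r), min(w, x + r + 1)
            window = img[y0:y1, x0:x1]
            clean = window[~noisy[y0:y1, x0:x1]]
            if clean.size:
                out[y, x] = np.median(clean)              # median of noise-free neighbours
                break
    return out.astype(np.uint8)
```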
  • ZHENG Jiang-Yun, JIANG Ju-Lang, HUANG Zhong
    Computer Engineering. 2012, 38(2): 226-228. https://doi.org/10.3969/j.issn.1000-3428.2012.02.075
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    To preserve hue during color image enhancement, this paper presents a color image enhancement algorithm based on RGB gray value scaling. It uses a twicing function to enhance the maximum of the RGB values of each pixel, obtains the scaling factor k from the resulting gain, and uses k to adjust the RGB values. Experimental results show that the algorithm is efficient for enhancing color images compressed at different quality levels and outperforms existing algorithms in most cases.
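    The hue-preserving scaling step can be sketched as follows: the per-pixel maximum of R, G and B is enhanced and every channel is multiplied by the same gain k, so the R:G:B ratios, and hence the hue, stay unchanged. A simple gamma curve stands in for the paper's twicing function.

```python
# Hue-preserving enhancement by uniform per-pixel RGB scaling (sketch).
import numpy as np

def enhance_rgb(img, gamma=0.6):
    rgb = img.astype(float) / 255.0
    v = rgb.max(axis=2)                         # per-pixel maximum of R, G, B
    v_enh = np.power(v, gamma)                  # stand-in enhancement of the max channel
    k = np.where(v > 0, v_enh / np.maximum(v, 1e-6), 1.0)   # per-pixel gain
    out = np.clip(rgb * k[..., None], 0.0, 1.0)             # same gain on all channels
    return (out * 255).astype(np.uint8)
```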
  • GU Zong-Yun, LV Huan-Li, LUO Bin, HAN Cheng-Mei
    Computer Engineering. 2012, 38(2): 229-230. https://doi.org/10.3969/j.issn.1000-3428.2012.02.076
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    For the problem of region duplication forgery in digital images, a passive detection algorithm based on the image's own SIFT feature points, which is robust to geometric transformation, is put forward. It extracts the image feature points with the SIFT algorithm and matches them. Because feature points in the same natural image normally do not match one another, tampered regions that have undergone translation, rotation, scaling and other geometric transformations can be detected. Experiments verify the effectiveness of the algorithm for detecting region duplication forgery involving geometric transformation.
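    A rough sketch of SIFT-based copy-move detection: keypoints of one image are matched against themselves, trivial self-matches are discarded, and surviving close matches between distant locations hint at duplicated regions. The ratio and distance thresholds and the input path are illustrative, not values from the paper.

```python
# Copy-move (region duplication) detection by self-matching SIFT keypoints (sketch).
import cv2
import numpy as np

img = cv2.imread("suspect.png", cv2.IMREAD_GRAYSCALE)    # hypothetical input
sift = cv2.SIFT_create()
kps, desc = sift.detectAndCompute(img, None)

matcher = cv2.BFMatcher(cv2.NORM_L2)
matches = matcher.knnMatch(desc, desc, k=3)              # first hit is the point itself

suspicious = []
for m in matches:
    if len(m) < 3:
        continue
    _, best, second = m                                  # skip the trivial self-match
    if best.distance < 0.6 * second.distance:            # Lowe-style ratio test
        p1 = np.array(kps[best.queryIdx].pt)
        p2 = np.array(kps[best.trainIdx].pt)
        if np.linalg.norm(p1 - p2) > 30:                 # ignore near-identical locations
            suspicious.append((tuple(p1.astype(int)), tuple(p2.astype(int))))

print(f"{len(suspicious)} matched keypoint pairs suggest duplicated regions")
```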
  • LIU Yan-Qi, HU Heng-Wu
    Computer Engineering. 2012, 38(2): 231-233. https://doi.org/10.3969/j.issn.1000-3428.2012.02.077
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    Aiming at the limitation of the Expectation Maximization(EM) algorithm for estimating mixture model parameters, this paper presents a fuzzy-constrained mixture model for image segmentation. Starting from the mixture model under the assumption of pixel independence, the model parameters are solved by the EM algorithm. Pixel spatial information introduced through a fuzzy method corrects the independence assumption and reduces the interaction among the parameters of the mixture components. Experimental results show that the algorithm adds no extra model parameters, though it still needs a model selection criterion to choose a suitable number of mixture components.
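    As a baseline for comparison, the sketch below segments an image by fitting a Gaussian mixture to pixel intensities with EM (scikit-learn); the paper's fuzzy spatial constraint, which corrects the pixel-independence assumption, is not reproduced, and the input path and component count are placeholders.

```python
# Baseline EM/GMM segmentation of pixel intensities (no spatial constraint).
import numpy as np
import cv2
from sklearn.mixture import GaussianMixture

img = cv2.imread("brain.png", cv2.IMREAD_GRAYSCALE)        # hypothetical input
X = img.reshape(-1, 1).astype(float)

gmm = GaussianMixture(n_components=3, covariance_type="full", max_iter=100)
labels = gmm.fit_predict(X).reshape(img.shape)             # EM fit + posterior argmax

# Map each component index to a gray level for visualisation.
seg = (labels * (255 // max(labels.max(), 1))).astype(np.uint8)
cv2.imwrite("segmentation.png", seg)
```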
  • LIU Xun, TUN Jin, HAO Ying-Meng, SHU Feng
    Computer Engineering. 2012, 38(2): 234-236. https://doi.org/10.3969/j.issn.1000-3428.2012.02.078
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    Inspired by the way human eyes obtain information, this paper proposes a new image enhancement method that ensures the gray differences between adjacent regions are maximally perceived by human eyes while keeping the image information. Based on the adjacency relations of image regions, a gray consolidation strategy is proposed to represent the image with the fewest gray levels. Then, according to the Just Noticeable Difference(JND) curve, a gray mapping is assigned for maximum human perception to enhance the image. Experimental results show that the algorithm clearly outperforms current image enhancement methods.
  • XU Jian-Hua, LI Yuan
    Computer Engineering. 2012, 38(2): 237-239. https://doi.org/10.3969/j.issn.1000-3428.2012.02.079
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    Aiming at the schedulability test problem for task sets that contain only periodic tasks, this paper proposes a simulation-based schedulability test tool. It simulates the system clock during scheduling with a clock variable; as this variable grows, it analyzes the deadline constraint of each task in order of priority from high to low and judges the schedulability of each task. The tool's results are verified to be correct on schedulability test cases.
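    The simulation idea can be sketched as follows for synchronous periodic tasks under fixed-priority preemptive scheduling: a clock variable is advanced tick by tick over the hyperperiod and every released job is checked against its deadline. The task set in the example is invented.

```python
# Simulation-based schedulability check for synchronous periodic tasks (sketch).
from math import gcd
from functools import reduce

def lcm(a, b):
    return a * b // gcd(a, b)

def schedulable(tasks):
    """tasks: list of (period, wcet, deadline), listed from high to low priority."""
    hyperperiod = reduce(lcm, (p for p, _, _ in tasks))
    remaining = [0] * len(tasks)          # unfinished execution of the current job
    deadline = [0] * len(tasks)           # absolute deadline of the current job
    for t in range(hyperperiod):          # advance the simulated clock tick by tick
        for i, (p, c, d) in enumerate(tasks):
            if t % p == 0:                # new job released
                if remaining[i] > 0:
                    return False          # previous job still unfinished at its period
                remaining[i], deadline[i] = c, t + d
        for i in range(len(tasks)):       # run the highest-priority pending job
            if remaining[i] > 0:
                remaining[i] -= 1
                break
        for i in range(len(tasks)):
            if remaining[i] > 0 and t + 1 >= deadline[i]:
                return False              # deadline miss detected
    return True

print(schedulable([(5, 1, 5), (10, 3, 9), (20, 6, 20)]))   # True for this task set
```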
  • GAO Hong-Wei, LI Bin, CHEN Fu-Guo
    Computer Engineering. 2012, 38(2): 240-241. https://doi.org/10.3969/j.issn.1000-3428.2012.02.080
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    For the rock sampling problem in planetary exploration tasks, a binocular vision system is used to capture images of different rocks, and 3D point cloud data of the rock surfaces are acquired by stereo vision algorithms. After triangulation, the relatively flat planes on a rock are automatically identified by the proposed method based on C-means clustering of the normal vectors of the triangular patches, and different planes are marked with different colors. Simulation results show the validity of the algorithm.
  • DAN Chang-Zhen, YANG Xue, WANG Zhen-Song
    Computer Engineering. 2012, 38(2): 242-244. https://doi.org/10.3969/j.issn.1000-3428.2012.02.081
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    This paper proposes a design for a high-performance parallel Fast Fourier Transform(FFT) processor. It uses four butterfly units working in parallel and an improved conflict-free memory addressing method, so that 16 data can be read, processed and written back simultaneously in one cycle. An FPGA implementation of the processor is given; performance evaluation results show that it is superior to other FFT processors and can meet application needs.
  • LIU Yao
    Computer Engineering. 2012, 38(2): 245-247. https://doi.org/10.3969/j.issn.1000-3428.2012.02.082
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    A universal timing-optimization-oriented FPGA packing algorithm is proposed in this paper. Configuration and user circuits are converted to directed graphs so that packing becomes a sub-graph isomorphism problem. Aiming at timing optimization, net delay is used as the variable that defines criticality, which enters the cost function guiding the packing procedure. Experimental results on the VPR platform show that this algorithm achieves lower timing delay than other similar packing algorithms and can be applied to various kinds of FPGA CLBs.
  • LI Shi-Meng, CHEN Bin, CENG Xiao-Xiang
    Computer Engineering. 2012, 38(2): 248-249. https://doi.org/10.3969/j.issn.1000-3428.2012.02.083
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    This paper presents a configurable FFT/IFFT processor oriented to Multiple Input Multiple Output(MIMO) Orthogonal Frequency Division Multiplexing(OFDM). It gives the pipelined architecture and introduces a data reordering module to handle four multiplexed data sequences with different rates. Performance analysis shows that the processor, implemented in SMIC 0.13 μm technology, has a core size of 1.800 mm×1.500 mm.
  • TUN Xu, GONG Hua, LI Hong-Gen, FANG Qun
    Computer Engineering. 2012, 38(2): 250-252. https://doi.org/10.3969/j.issn.1000-3428.2012.02.084
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    Oriented to X-ray fluorescence spectrometer data collection in the heterogeneous architectures of large cement factories, this paper proposes a data collection algorithm based on characteristic string matching and develops a general Cement Quality Management System(CQMS) to automatically collect, store and upload real-time quality data. Application results show that the system runs stably and adapts robustly. It can help cement firms improve their quality management and production control capabilities.
  • LI Gui-Qi, HAN Jiang-Hong, LIU Xiao-Beng
    Computer Engineering. 2012, 38(2): 253-255. https://doi.org/10.3969/j.issn.1000-3428.2012.02.085
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    In order to achieve low-cost but highly reliable data communication over several kilometers, a communication scheme is designed around an embedded modem and a Micro Control Unit(MCU). The MCU configures the MT9234 module through AT commands, and the PC connects to the MT9234 module by a level-shifting interface circuit. Two MT9234 modules use a twisted pair to establish data communication after a three-stage handshake process in leased-line mode, ultimately building communication between the PC and the remote MCU. Hardware simulation results show that, compared with a traditional modem, remote data communication in leased-line mode reaches farther and runs faster.
  • ZHANG Yu-Pei, KONG Min, DI Su-Lan, LUO Bin
    Computer Engineering. 2012, 38(2): 256-258. https://doi.org/10.3969/j.issn.1000-3428.2012.02.086
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    This paper presents a video summary generation method based on shot marking and a dynamic sliding window. The video is segmented into a shot set, and the shots are divided into two categories, static shots and dynamic shots, by accumulating frame differences. The marked shots are classified using a dynamic sliding window method. According to certain rules, key frames with little redundancy that cover the rich content of the video are extracted to generate the video summary. Experimental results show that the method can generate the video summary effectively and quickly.
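    The first two steps (shot segmentation and static/dynamic labelling) might look like the sketch below, which cuts the video where the histogram difference between consecutive frames exceeds a threshold and labels each shot by its average intra-shot difference; both thresholds and the input path are placeholders.

```python
# Shot segmentation by frame differences + static/dynamic labelling (sketch).
import cv2
import numpy as np

def frame_diff(a, b):
    ha = cv2.calcHist([a], [0], None, [64], [0, 256])
    hb = cv2.calcHist([b], [0], None, [64], [0, 256])
    return cv2.compareHist(ha, hb, cv2.HISTCMP_BHATTACHARYYA)

cap = cv2.VideoCapture("input.mp4")                  # hypothetical input
frames, diffs = [], []
prev = None
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    if prev is not None:
        diffs.append(frame_diff(prev, gray))
    frames.append(gray)
    prev = gray
cap.release()

cut_threshold, motion_threshold = 0.4, 0.05
boundaries = [0] + [i + 1 for i, d in enumerate(diffs) if d > cut_threshold] + [len(frames)]
for start, end in zip(boundaries[:-1], boundaries[1:]):
    if end - start < 2:
        continue
    motion = np.mean(diffs[start:end - 1])           # average intra-shot difference
    label = "dynamic" if motion > motion_threshold else "static"
    print(f"shot {start}-{end - 1}: {label}")
```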
  • LI Jin-Wei, CHEN Geng-Sheng, YIN Wen-Bei
    Computer Engineering. 2012, 38(2): 259-260. https://doi.org/10.3969/j.issn.1000-3428.2012.02.087
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    Motion estimation is one of the most important parts of the digital video encoding process, so a motion estimation algorithm with low computation and complexity is valuable. This paper proposes a modified Bit Plane Matching(BPM) algorithm that chooses the threshold value and the candidate motion vectors so as to reduce the computation and complexity of motion estimation and improve image quality for the test sequences.
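    The core of bit plane matching can be sketched as follows: frames are binarised (here against the block mean), so the matching cost of a candidate motion vector is a Hamming distance (XOR plus a bit count) rather than a sum of absolute differences. The paper's threshold and candidate-vector refinements are not reproduced.

```python
# Bit plane matching motion estimation with a small full search (sketch).
import numpy as np

def bit_plane(block):
    return (block >= block.mean()).astype(np.uint8)

def bpm_motion_vector(cur, ref, x, y, block=16, search=7):
    """Motion vector for the block at (x, y) of `cur`, searched in `ref`."""
    cur_bits = bit_plane(cur[y:y + block, x:x + block])
    best, best_mv = None, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            yy, xx = y + dy, x + dx
            if yy < 0 or xx < 0 or yy + block > ref.shape[0] or xx + block > ref.shape[1]:
                continue
            ref_bits = bit_plane(ref[yy:yy + block, xx:xx + block])
            cost = int(np.count_nonzero(cur_bits ^ ref_bits))   # Hamming distance
            if best is None or cost < best:
                best, best_mv = cost, (dx, dy)
    return best_mv

# Toy usage: a shifted random frame should recover the shift.
rng = np.random.default_rng(0)
ref = rng.integers(0, 256, (64, 64)).astype(np.uint8)
cur = np.roll(ref, shift=(2, 3), axis=(0, 1))                   # moved down 2, right 3
print(bpm_motion_vector(cur, ref, x=16, y=16))                  # expected (-3, -2)
```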
  • XUE Rui, SU Guang-Da
    Computer Engineering. 2012, 38(2): 261-263. https://doi.org/10.3969/j.issn.1000-3428.2012.02.088
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    Audio Video coding Standard(AVS) and H.264 achieve significant gains in coding efficiency by adopting variable block sizes in inter-frame coding, which leads to high computational complexity when the full mode decision scheme is used. This paper presents an inter-frame mode decision algorithm based on evaluating motion homogeneity and spatial-temporal correlation. The 16×16 mode of the current macro-block is predicted from neighboring macro-blocks and the collocated macro-block in the previous frame. If this prediction fails, the inter-frame mode is predicted from the motion vector generated with the 8×8 inter-frame mode. Simulation results demonstrate that the algorithm achieves a 41.2% reduction in encoding time without any noticeable loss of PSNR.
  • BANG Wei, LI Jian-Xin, YAN Bin, TONG Chi, CHEN Jian
    Computer Engineering. 2012, 38(2): 264-266. https://doi.org/10.3969/j.issn.1000-3428.2012.02.089
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    This paper designs an optimization strategy for Compute Unified Device Architecture(CUDA) storage models and presents an improved empty voxel skipping method adapted to the CUDA storage model to reduce the operations spent on skipping empty voxels. A high-quality accelerated ray casting algorithm based on the Phong lighting model is realized. Experimental results illustrate that the algorithm significantly accelerates the rendering speed of ray casting while maintaining high quality of the rendered image.
  • LIU Qiang-Hua, LIU Xiao-Lin, CHEN Zi-Jiang
    Computer Engineering. 2012, 38(2): 267-269. https://doi.org/10.3969/j.issn.1000-3428.2012.02.090
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    To optimize the degree distributions of irregular Low-Density Parity-Check(LDPC) codes, a differential-evolution-based extreme value search algorithm is introduced. The variable nodes' degree distribution of the LDPC code is determined from the best member vectors, the check nodes' degree distribution is adjusted according to both the expected rate and the variable nodes' degree distribution, and LDPC codes with the expected rate are designed. To control the number of iterations more efficiently, the evolution stop criterion is modified. A group of irregular LDPC codes for the AWGN channel is designed. Experimental results show that this method has low decoding complexity and the resulting LDPC codes have high noise thresholds.
  • LIU Feng, YANG Zhao-Shua, HAN Dong, SUN Qi
    Computer Engineering. 2012, 38(2): 270-271. https://doi.org/10.3969/j.issn.1000-3428.2012.02.091
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    Non-uniform illumination and complex backgrounds degrade Data Matrix 2D barcode decoding in real applications. Aiming at this problem, this paper proposes a hierarchical method for real-time Data Matrix decoding. The method segments the barcode region from the background with the Otsu algorithm to get a binary barcode image, localizes the barcode region coarsely and then finely, and develops a Digital Signal Processing(DSP)-based terminal for real-time Data Matrix decoding. Experimental results demonstrate that the method is real-time and robust.
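    The segmentation and coarse localisation steps can be sketched with standard OpenCV calls: Otsu's method separates the barcode from the background, and morphological closing merges the module pattern into one region whose bounding box gives the coarse location. The fine localisation and DSP decoding stages are not shown; the input path and kernel size are placeholders.

```python
# Otsu segmentation and coarse Data Matrix localisation (sketch).
import cv2

gray = cv2.imread("datamatrix.png", cv2.IMREAD_GRAYSCALE)      # hypothetical input
_, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)

kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (15, 15))
closed = cv2.morphologyEx(binary, cv2.MORPH_CLOSE, kernel)      # merge barcode modules

contours, _ = cv2.findContours(closed, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
if contours:
    x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
    roi = binary[y:y + h, x:x + w]                              # coarse barcode region
    print("coarse barcode region:", (x, y, w, h))
```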
  • GU Ji, WANG Hui-Qin, HU Yan, MA Zong-Fang
    Computer Engineering. 2012, 38(2): 272-275. https://doi.org/10.3969/j.issn.1000-3428.2012.02.092
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    To overcome the long solution time of Support Vector Machine(SVM) when the data quantity is large, this paper puts forward a video fire smoke recognition algorithm based on Least Squares Support Vector Machine(LS-SVM). Through a second segmentation of suspicious smoke areas, the color feature, correlation coefficient and area change rate are selected as the input feature vector, which reduces the dimension of the input vector and the training time. Experimental results show that the algorithm improves both classification speed and identification accuracy.
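    The speed argument rests on the fact that LS-SVM training reduces to one linear system instead of SVM's quadratic program. The sketch below shows that system for a generic binary classifier with an RBF kernel; the smoke features (color, correlation coefficient, area change rate) would form the rows of X, but the data here is synthetic and the hyperparameters are arbitrary.

```python
# Minimal LS-SVM binary classifier: training is a single linear solve (sketch).
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def lssvm_train(X, y, C=10.0, gamma=0.5):
    n = len(y)
    K = rbf_kernel(X, X, gamma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:], A[1:, 0] = 1.0, 1.0
    A[1:, 1:] = K + np.eye(n) / C          # kernel matrix plus regularisation
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]                  # bias b, coefficients alpha

def lssvm_predict(X_train, b, alpha, X_test, gamma=0.5):
    return np.sign(rbf_kernel(X_test, X_train, gamma) @ alpha + b)

# Synthetic two-class example (labels must be +/-1).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (20, 3)), rng.normal(3, 1, (20, 3))])
y = np.array([1.0] * 20 + [-1.0] * 20)
b, alpha = lssvm_train(X, y)
print(lssvm_predict(X, b, alpha, np.array([[0.1, 0.2, 0.0], [3.1, 2.8, 3.0]])))
```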
  • SHU Xue-Qiang, XU Cheng, LIU Pan, YANG Zhi-Bang
    Computer Engineering. 2012, 38(2): 276-278. https://doi.org/10.3969/j.issn.1000-3428.2012.02.093
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    Aiming at the temporary overload problem in real-time systems, this paper presents an adaptive elastic scheduling algorithm for tasks with dynamically varying execution times. Considering that task deadlines are less than task periods, the algorithm dynamically adjusts the periods of the elastic tasks based on the basic elastic scheduling algorithm and a feedback mechanism. Simulation results show that the algorithm effectively improves the scheduling success rate for random elastic task sets.
  • ZHANG Yu-Dong, XU Cheng, YANG Zhi-Bang
    Computer Engineering. 2012, 38(2): 279-281. https://doi.org/10.3969/j.issn.1000-3428.2012.02.094
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    In a real-time system, each task must complete and produce correct output by its specified deadline, but when the system is overloaded it is not possible to meet every deadline. Current feedback-based closed-loop scheduling focuses on treatment after overload has occurred; this paper instead proposes a strategy to prevent system overload based on a regression model and imprecise computation techniques. Experimental results demonstrate the effectiveness of the proposed scheduling algorithm when there are bursts of total load in the system.
  • JIAO Xin-Quan, CHEN Jian-Jun, CHAN Pan-Hu
    Computer Engineering. 2012, 38(2): 282-283. https://doi.org/10.3969/j.issn.1000-3428.2012.02.095
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    A method to construct quasi-cyclic Low-Density Parity-Check(LDPC) codes with flexible length or rate, based on Balanced Incomplete Block Design(BIBD) and cyclic permutation matrices, is proposed. According to actual demand, the corresponding template matrix is constructed by the BIBD method and then extended with appropriate cyclic permutation matrices. The quasi-cyclic LDPC codes constructed in this way have good structure and flexibility, and the method improves the efficiency of constructing good codes that meet practical application needs. Simulations show that the constructed quasi-cyclic LDPC codes perform well over AWGN channels with iterative decoding and have a low error floor.
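    The expansion step can be sketched as follows: each entry s >= 0 of the template (base) matrix is replaced by the identity matrix cyclically shifted by s, and each -1 entry by the all-zero block. The BIBD construction of the template itself is not reproduced; the template and circulant size below are illustrative only.

```python
# Expand a quasi-cyclic LDPC template matrix into a full parity-check matrix (sketch).
import numpy as np

def expand_qc_ldpc(template, z):
    """template: 2-D array of shifts (-1 means all-zero block); z: circulant size."""
    I = np.eye(z, dtype=np.uint8)
    rows = []
    for row in template:
        blocks = [np.zeros((z, z), np.uint8) if s < 0 else np.roll(I, s, axis=1)
                  for s in row]
        rows.append(np.hstack(blocks))
    return np.vstack(rows)

template = np.array([[ 0,  1, -1,  3],
                     [ 2, -1,  0,  1],
                     [-1,  3,  2,  0]])
H = expand_qc_ldpc(template, z=8)
print(H.shape)            # (24, 32): a small quasi-cyclic parity-check matrix
```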
  • HONG Hai-Bin, CHA Dai-Feng, LONG Dun-Bei
    Computer Engineering. 2012, 38(2): 284-284. https://doi.org/10.3969/j.issn.1000-3428.2012.02.096
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    The performance of Direction of Arrival(DOA) estimation based on conventional Spatial Time-Frequency Multiple Signal Classification(STF-MUSIC) degrades in α-stable distribution environments. A new Time-Frequency Fractional Lower Order Moment MUSIC(TF-FLOM-MUSIC) method is proposed: the second-order covariance matrix is replaced by the Fractional Lower Order Moment(FLOM) matrix, a Fractional Lower Order Moment Spatial Time-Frequency Distribution Matrix(FLOM-STFDM) is defined, and this matrix is decomposed in the method. The DOA estimation Mean Squared Error(MSE) and Generalized Signal-to-Noise Ratio(GSNR) are analyzed and the algorithm steps are summarized. Simulation results show that the TF-FLOM-MUSIC algorithm effectively reduces the DOA estimation MSE and improves estimation resolution.
  • LI Yan, CHEN Cai, LI Tie-Song, SU Lan-Meng
    Computer Engineering. 2012, 38(2): 288-289. https://doi.org/10.3969/j.issn.1000-3428.2012.02.097
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    Game agents must respond quickly to changes of dynamic terrain in large environments, and the efficiency of the Lifelong Planning A*(LPA*) algorithm is sensitive to the location of the updated nodes. This paper proposes a Hierarchical Path Finding and Lifelong Planning A*(HPLPA*) algorithm that combines LPA* with the HPA* algorithm. Based on HPA*, an abstract graph is generated and updated in time; LPA* is run on the abstract graph to produce an abstract path, which is then refined to a local path. Experiments comparing LPA*, HPA* and HPLPA* show that HPLPA* is more effective.
  • SU Cheng, CHEN Wen-Na, CHEN Meng, HUANG Dong-Mei
    Computer Engineering. 2012, 38(2): 290-292. https://doi.org/10.3969/j.issn.1000-3428.2012.02.098
    Abstract ( ) Download PDF ( )   Knowledge map   Save
    To satisfy the intelligence and scalability demands of digital marine chart coordination processing, and based on the characteristics of the "Digital Ocean" system, this paper designs a cooperative digital chart processing model that uses multi-Agent techniques on the basis of a social model. It analyzes the specific collaborative process of this model and creates a mechanism for collaborative work. Simulation results show the validity of the model.