
15 May 2015, Volume 41 Issue 5
    

  • SHI Jingyan,CHEN Deqing
    Computer Engineering. 2015, 41(5): 1-5.
    Traditional file system management tools achieve management and monitoring functions by traversing the file directory tree to obtain metadata. For large file systems this is a time-consuming task that cannot meet the management demands of today's large-data environments. This paper integrates the Robinhood policy engine with the TORQUE job management system: distributed parallel computing is used to gather the file system metadata, which is saved into a MySQL database. Based on the metadata stored in the database, the tool provides monitoring, file management, and system backup. Tests indicate that distributed computing fully exploits the computing capacity of the cluster, noticeably enhances the speed of traversing the file system, and ensures timely monitoring, management, and backup of the file system.
  • YANG Xingyao,YU Jiong,Turgun Ibrahim,LIAO Bin,YING Changtian
    Computer Engineering. 2015, 41(5): 6-13.
    Aiming at the data sparsity problem in traditional collaborative filtering models, a collaborative filtering recommendation model based on trust-model filling is proposed. The model emphasizes trust attributes and pre-fills the rating matrix by establishing a trust model, in order to increase the density of the rating data. It obtains inter-item similarity from the perspectives of both items and user attributes through similarity models, and coordinates the two similarity measurements with a self-adaptive coordination factor to produce the final rating predictions. Experimental results on different datasets show that the proposed model efficiently alleviates data sparsity in the rating matrix and provides better rating prediction accuracy, with an average improvement of 8% over traditional collaborative filtering models.
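    For illustration, the final blending step described above can be sketched as follows; the names pred_item, pred_attr, and alpha are hypothetical, and the paper learns the coordination factor adaptively rather than taking it as an input.

```python
# Minimal sketch of the final prediction step: two similarity-based
# predictions are blended by a coordination factor. All names are
# illustrative, not from the paper.
import numpy as np

def blended_prediction(pred_item: np.ndarray,
                       pred_attr: np.ndarray,
                       alpha: float) -> np.ndarray:
    """Blend item-based and attribute-based rating predictions.

    alpha is the self-adaptive coordination factor in [0, 1]; the paper
    learns it adaptively, here it is simply passed in.
    """
    assert 0.0 <= alpha <= 1.0
    return alpha * pred_item + (1.0 - alpha) * pred_attr

# Example: blend two prediction vectors for five items.
p_item = np.array([3.5, 4.0, 2.5, 5.0, 3.0])
p_attr = np.array([3.0, 4.5, 3.0, 4.0, 3.5])
print(blended_prediction(p_item, p_attr, alpha=0.6))
```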
  • XIE Chunli,WANG Shuqin
    Computer Engineering. 2015, 41(5): 14-18,25.
    In order to improve the accuracy of reliability prediction for composite services, this paper proposes a new dynamic reliability prediction model for Web services. A composite service is decomposed by composition granularity into execution paths, service composition modules, and atomic services. Binding graphs are built for the composition units, reliability models for the units are derived from these graphs, and the unit models are then integrated into a reliability model for the whole composite service. Example analysis shows that, compared with traditional reliability prediction models, the proposed model recomputes only the updated reliability, which reduces computational complexity, and it supports a more flexible sensitivity analysis for determining which service component has the greatest impact on improving composite service reliability.
  • LENG Yonglin,CHEN Zhikui,ZHANG Qingchen,LU Fuyu
    Computer Engineering. 2015, 41(5): 19-25.
    Traditional big data imputation algorithms fill missing values based on statistics of the data set, so they are corrupted by noisy data, which lowers imputation accuracy. This paper proposes an algorithm that fills missing values based on distributed clustering of incomplete big data. It clusters incomplete big data directly by introducing a new similarity metric, and improves clustering efficiency with cloud computing technology by designing a MapReduce-based distributed Affinity Propagation (AP) clustering algorithm. Data in the same cluster is then used to fill the missing values. Experimental results demonstrate that the proposed algorithm can cluster incomplete big data directly and effectively improves the accuracy of missing-data imputation.
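    A minimal sketch of the final filling step, assuming a per-cluster column mean as the filling rule (the abstract does not specify the exact rule, and the MapReduce-based AP clustering itself is not reproduced):

```python
# Fill each missing entry from the members of its own cluster.
# The per-cluster column mean used here is an assumption.
import numpy as np

def fill_by_cluster(data: np.ndarray, labels: np.ndarray) -> np.ndarray:
    """data: (n, d) array with np.nan for missing values.
    labels: (n,) cluster label per record."""
    filled = data.copy()
    for c in np.unique(labels):
        rows = labels == c
        block = filled[rows]
        col_mean = np.nanmean(block, axis=0)    # mean of observed values per column
        idx = np.where(np.isnan(block))
        block[idx] = np.take(col_mean, idx[1])  # fill gaps with the cluster mean
        filled[rows] = block
    return filled

X = np.array([[1.0, np.nan], [1.2, 2.0], [8.0, 9.0], [np.nan, 9.2]])
print(fill_by_cluster(X, np.array([0, 0, 1, 1])))
```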
  • HUANG Qiulan,CHENG Yaodong,DU Ran,CHEN Gang
    Computer Engineering. 2015, 41(5): 26-32,37.
    To solve the problems caused by storage expansion in high energy physics mass storage systems, a scalable distributed metadata management system is designed, comprising metadata management, metadata service, cache service, and a monitoring information collector. On this basis, a new Adaptive Directory Sub-tree Partition (ADSP) algorithm is proposed. ADSP divides the file system namespace into sub-trees at directory granularity and adjusts the sub-trees adaptively according to the load of the metadata cluster, achieving the storage and distribution of metadata across the cluster. Experimental results show that the algorithm improves metadata access and retrieval performance and provides scalable, dynamically load-balanced metadata service, so that the availability, scalability, and I/O performance of the metadata management system are unaffected by storage scale; it can therefore meet the growing storage requirements of high energy physics experiments.
  • SUN Ya,LI Zhihua
    Computer Engineering. 2015, 41(5): 33-37.
    Measuring similarity or dissimilarity is a key issue in data mining, and time series data is hard to measure because of its original structure. Aiming at the time series similarity measurement problem, this paper proposes a re-description method based on the local extreme points of a time series: the original series is described by its extracted local extreme points, which effectively reflect the main features of the series and compress the data. Measuring the extreme-point series after equal-length treatment enhances the flexibility of the algorithm and reduces its limitations. On this basis, the method is applied to hierarchical clustering of time series. Simulation results show that the clustering effect and data compression are obvious, and that clustering accuracy greatly improves compared with other algorithms based on extracting time series trend features.
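    A minimal sketch of the re-description step, keeping only the local extreme points of a series (endpoint handling here is a simplifying assumption):

```python
# Keep only the strict local extreme points (local minima/maxima) of a
# series, compressing it while preserving its main shape.
import numpy as np

def local_extrema(x: np.ndarray):
    """Return (indices, values) of strict local extreme points of x."""
    idx = [0]  # keep the first point so the shape is anchored
    for i in range(1, len(x) - 1):
        if (x[i] - x[i - 1]) * (x[i + 1] - x[i]) < 0:  # sign change of slope
            idx.append(i)
    idx.append(len(x) - 1)  # keep the last point
    idx = np.array(idx)
    return idx, x[idx]

t = np.linspace(0, 4 * np.pi, 200)
series = np.sin(t) + 0.1 * t
print(local_extrema(series)[0])   # positions of the extreme points
```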
  • BO Luo,ZHAO Gangyao
    Computer Engineering. 2015, 41(5): 38-44,55.
    Route planning is an effective way to improve the survivability of an Unmanned Aerial Vehicle (UAV), since it enables the UAV to reach its destination safely and quickly. This paper puts forward a route planning algorithm named RPMA, based on MapReduce and multi-objective Ant Colony Optimization (ACO). A multi-objective ACO algorithm is designed within RPMA, and several optimization strategies are used to improve it. RPMA employs cloud computing technology so that route planning problems are solved in a distributed cloud environment with parallel processing, and a number of paths are planned in advance, which enables the UAV to choose different routes for different missions or to pick an appropriate route for temporary needs. Simulation experiments yield favorable results, indicating that RPMA solves route planning problems efficiently, converges well, scales well, and can handle large-scale data.
  • CHEN Chao,CHAI Yunpeng
    Computer Engineering. 2015, 41(5): 45-49.
    Flash memory has developed rapidly in recent years. It has become a new non-volatile storage product with large capacity, high performance, and low power consumption, and can effectively bridge the performance gap between memory and disk to improve storage systems. On this basis, this paper designs and implements a flash-based hybrid storage simulation system called HybridArch. By adding a file distribution layer, it supports file splitting and distribution as well as file-interface access, and it implements several typical hybrid storage architectures, including Two Layers (TL), Pure Flash (PF), Flash-as-Part-of-Disks (FPD), Tower (TO), and Horizontal Cache (HC). A series of evaluation experiments based on HybridArch compares the performance and flash write volumes of the different architectures. The results indicate that the Horizontal Cache (HC) architecture strikes a good balance among performance, cost-efficiency, and lifetime.
  • YANG Li,QIN Zhidong,XIAO Fangxiong,WANG Shaoyu
    Computer Engineering. 2015, 41(5): 50-55.
    The existing Row Rippling Column Stealing (RRCS) topology reconfiguration algorithm, based on the idea of hierarchical optimization, reaches an overall optimal solution by searching for local optimal solutions. However, the local neighborhood search of RRCS is unidirectional, which easily leads to poor suboptimal solutions or to serious chains of column-stealing operations in which a unit occupies the optimal solution of the next unit in sequence. To address this, an optimized RRCS algorithm is proposed that improves the local solution and avoids the chain operations by using bidirectional local neighborhood search. Experimental results show that the optimized algorithm performs better when there are more faulty cores in the physical topology.

  • TANG Dingyong,LIN Zhenghong,JIANG Hong
    Computer Engineering. 2015, 41(5): 56-61.
    To solve the service queue management problem in an Enterprise Service Bus (ESB) integration platform, a buffer management strategy for priority message service queues is proposed. The strategy puts business data of different priorities into different queues, and the data is serviced in order of packet priority. Before the next packets arrive, the strategy uses grey prediction to estimate, in real time, the buffer size that each priority queue can be assigned, which makes the queue buffer allocation more reasonable. Experimental results show that the proposed strategy not only guarantees that both high-priority and low-priority traffic run smoothly on the ESB integration platform, but also reduces the average waiting time and average residence time of high-priority traffic, as well as the risk of message queue congestion.
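    The abstract only says "grey prediction"; the standard single-variable choice is the GM(1,1) model, sketched below as an assumption rather than the paper's exact predictor:

```python
# Minimal GM(1,1) grey prediction sketch (an assumption: the abstract does
# not name the model; GM(1,1) is the classic single-variable grey model).
import numpy as np

def gm11_forecast(x0: np.ndarray, steps: int = 1) -> np.ndarray:
    """Forecast `steps` future values of the non-negative series x0."""
    n = len(x0)
    x1 = np.cumsum(x0)                          # accumulated generating operation
    z1 = 0.5 * (x1[1:] + x1[:-1])               # mean sequence of x1
    B = np.column_stack((-z1, np.ones(n - 1)))  # data matrix
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]  # develop/grey coefficients
    k = np.arange(n, n + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a         # predicted x1 at k+1
    x1_prev = (x0[0] - b / a) * np.exp(-a * (k - 1)) + b / a  # predicted x1 at k
    return x1_hat - x1_prev                     # restore to the original series

# Example: predict the next buffer demand from recent observations.
demand = np.array([52.0, 55.0, 59.0, 61.0, 66.0])
print(gm11_forecast(demand, steps=2))
```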
  • ZHANG Daoguang,NIE Lanshun,JIN Jintao,ZHAN Dechen
    Computer Engineering. 2015, 41(5): 62-69,76.
    The spatial-resource project scheduling problem must satisfy not only the conventional constraints of task ordering, human resources, equipment, etc., but also constraints on spatial resources occupied by activity groups; the constraints between activity groups make the problem extremely complicated. In the context of block manufacturing in shipbuilding, a mathematical model of the project scheduling problem with spatial resources occupied by activity groups is proposed by modeling the different kinds of resources. On the basis of a parallel scheduling scheme, a heuristic scheduling algorithm based on priority rules is proposed. Tests on examples of different data scales show that the algorithm is correct and effective.
  • ZHAO Lu,ZHANG Jianpei,YANG Jing
    Computer Engineering. 2015, 41(5): 70-76.
    To verify multithreaded concurrent programs, software model checking based on stateless or stateful search is combined with Dynamic Partial-order Reduction (DPOR) to significantly reduce the explored state space. DPOR uses the current candidate backtrack set to refine the corresponding backtrack set; however, the cost of computing the candidate set actually exceeds what the refinement requires. To solve this problem, a stateful DPOR method that shrinks the candidate backtrack set is presented. The shrinking candidate set is formally defined so that redundant transitions under the same backtrack condition can be deleted. For every interleaving backtrack state, the proposed method uses the current shrunken candidate set to refine the corresponding backtrack set, and then performs stateful DPOR to verify concurrent programs. Experimental results show that, compared with existing DPOR methods, the method reduces the number of explored transitions, speeds up the refinement process, and increases the efficiency of dynamic model checking.
  • DANG Xiaochao,LI Fenfang,HAO Zhanjun
    Computer Engineering. 2015, 41(5): 77-82,88.
    To effectively reduce the influence of complex environmental factors on node localization accuracy in Wireless Sensor Networks (WSN), this paper proposes a three-dimensional node localization algorithm based on a mobile anchor node and the fuzzy information among nodes. Node positions are determined by measuring direction angles and pitch angles. The algorithm updates the speed and direction of the anchor node, and already-located nodes then act as static anchor nodes to localize other nodes. Simulation results show that, compared with the APIT-3D and Bounding-cube algorithms, the algorithm improves localization accuracy and lowers the delay and energy consumption of the network.
  • HOU Zhiwei,AN Lixia,BAO Liqun,WANG Haiyong
    Computer Engineering. 2015, 41(5): 83-88.
    Because a single-core processor can hardly achieve synchronized data acquisition and real-time processing in a multi-parameter data acquisition system, this paper presents a solution for multi-parameter parallel data acquisition and inter-processor communication on a Nios II based dual-core processor. The dual-core system is built in an FPGA chip: CPU1 is in charge of analog-to-digital conversion and preprocessing of the input signal, while CPU2 is responsible for displaying the collected data and communicating with the PC. A high-speed data transmission channel between the cores is established using a Scatter-Gather Direct Memory Access (SG-DMA) communication mechanism with a two-level data cache, realizing data exchange and synchronization. The solution is applied to an air-quality on-line monitoring system, and the results show that the inter-core transmission speed reaches 496 MB/s, which meets the demands of multi-parameter synchronous data acquisition and parallel processing.
  • XU Tongyang
    Computer Engineering. 2015, 41(5): 89-92.
    An optimized localization model for Wireless Sensor Networks (WSN) is proposed based on the Unknown Node-oriented Disk of Scatterers Model (UN-DSM). It uses the Time of Arrival (TOA) measured by every anchor node and the Angle of Arrival (AOA) measured only by the main anchor node to estimate the positions of the scatterers and the unknown node, yielding preliminary position estimates and the related ranges. The redundant range information is used to refine the estimation and obtain the information of two hypothetical scatterers; all scatterer information and range information is then combined to solve for the position of the unknown node. Experimental results show that the proposed method effectively restrains Non-Line-of-Sight (NLOS) error and achieves better localization than traditional positioning methods, with good robustness.
  • WANG Shiguo,YI Jin,PENG Haixia
    Computer Engineering. 2015, 41(5): 93-96,101.
    Combining relay cooperation with Cognitive Radio (CR) technology is an effective way to improve the utilization efficiency of spectrum resources and reduce power consumption. Assuming no direct link between the cognitive users' ends, a relay working in Amplify-and-Forward (AF) mode, and limits on the transmit power of the cognitive users and the relay, this paper studies power allocation in relay-cooperative CR systems. An optimal power allocation algorithm is proposed by applying the KKT conditions and the sub-gradient method to solve the optimization problem. The algorithm ensures that the authorized users' average and worst-case communication quality are not affected, while maximizing the cognitive users' communication capacity. Numerical simulation over a Rayleigh fading channel further verifies the algorithm.
  • FENG Jiangpeng,ZHENG Liming
    Computer Engineering. 2015, 41(5): 97-101.
    In Vehicular Ad Hoc Networks (VANET), the highly dynamic network topology caused by high-speed vehicle movement leads to frequent link breakage and unstable paths. Aiming at this problem, a routing protocol for VANET based on Link Expiration Time (LET) control is proposed. A vehicle builds a path whose links follow the same travel direction and have the longest LET. In the route maintenance phase, a node sets a timer to trigger a new route discovery process to find an effective link and form a new path. If no efficient link can be found, the vehicle increases its transmit power to extend the LET and enlarge the search range, which minimizes the probability of link breakage. Simulation results show that, compared with the AODV and LED-AODV protocols, the new routing protocol achieves an obvious improvement: longer path expiration time, lower transmission delay, higher average throughput, and more stable links.
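    The abstract does not quote its LET formula; the classic link expiration time expression from mobility prediction (two nodes with known positions, speeds, and headings, and transmission range r) is a likely candidate and is sketched here as an assumption:

```python
# Classic Link Expiration Time (LET) computation from mobility prediction.
# The paper's exact formula is not quoted in the abstract, so this
# standard form is an assumption.
import math

def link_expiration_time(xi, yi, vi, ti, xj, yj, vj, tj, r):
    """Time until nodes i and j, moving at speeds vi, vj along headings
    ti, tj (radians), drift out of transmission range r."""
    a = vi * math.cos(ti) - vj * math.cos(tj)
    b = xi - xj
    c = vi * math.sin(ti) - vj * math.sin(tj)
    d = yi - yj
    if a == 0 and c == 0:          # same velocity: link never expires
        return math.inf
    disc = (a * a + c * c) * r * r - (a * d - b * c) ** 2
    if disc < 0:                   # nodes never come within range r
        return 0.0
    return (-(a * b + c * d) + math.sqrt(disc)) / (a * a + c * c)

# Two vehicles 100 m apart on the same road, range 250 m: LET = 70 s.
print(link_expiration_time(0, 0, 20, 0.0, 100, 0, 15, 0.0, 250))
```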
  • LI Haifeng
    Computer Engineering. 2015, 41(5): 102-105.
    To address the heavy computation caused by the overly large Multidimensional Scaling (MDS) matrix in the MDS-MAP algorithm, this paper presents a fast improved MDS-MAP algorithm. On the basis of the original MDS-MAP, it fuses an improved iterative positioning scheme with the centroid algorithm as an alternative to the classical MDS solution. The distances among the wireless sensor nodes are arranged as a matrix: first, the average per-hop distance is estimated from the hop counts between anchor nodes and the actual distances between them; then, from the constructed inter-node hop-count matrix, the distances between nodes are calculated; finally, the relative coordinates of the nodes are obtained and converted to absolute coordinates. Simulation results show that, within the error bounds, the algorithm retains accuracy comparable to MDS-MAP while significantly reducing running time.
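    For reference, the classical MDS step that MDS-MAP applies to the estimated distance matrix can be sketched as follows (this is the textbook computation, not the paper's centroid-based replacement):

```python
# Classical MDS: double-center the squared distances, take the top
# eigenpairs, and read off relative coordinates.
import numpy as np

def classical_mds(D: np.ndarray, dim: int = 2) -> np.ndarray:
    """D: (n, n) matrix of estimated pairwise distances.
    Returns (n, dim) relative coordinates (up to rotation/translation)."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                # double-centered Gram matrix
    w, V = np.linalg.eigh(B)                   # ascending eigenvalues
    idx = np.argsort(w)[::-1][:dim]            # top `dim` eigenpairs
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))

# Four nodes on the corners of a unit square.
P = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], dtype=float)
D = np.linalg.norm(P[:, None] - P[None, :], axis=-1)
print(classical_mds(D))   # recovers the square up to a rigid transform
```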
  • QIN Xiean,WANG Ling,ZHAO Haitao,DENG Yi
    Computer Engineering. 2015, 41(5): 106-110.
    To improve the utilization of spectrum resources under real wireless channel conditions, a software radio system consisting of Matlab and a Universal Software Radio Peripheral (USRP) is used as a new experimental platform for wireless communication systems. On this platform, an energy-based detection algorithm is implemented for spectrum sensing and for estimating the available spectrum bandwidth, providing a basis for the secondary user's spectrum access decisions. Experimental results show that the method can be used to design wireless communication systems quickly and accurately; at high Signal-to-Noise Ratio (SNR) and with larger numbers of samples, it realizes spectrum sensing well and meets the requirements of cognitive radio.
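    A minimal sketch of energy-based detection, assuming a Gaussian approximation for the decision threshold (the paper's threshold calibration is not specified in the abstract):

```python
# Energy detection for spectrum sensing: average the energy of N samples
# and compare with a threshold set for a target false-alarm rate.
import numpy as np

Q_INV_5PCT = 1.6449  # standard normal quantile for a 5% false-alarm rate

def energy_detect(x: np.ndarray, noise_var: float):
    """Return (busy?, test statistic) for complex baseband samples x."""
    n = len(x)
    stat = np.mean(np.abs(x) ** 2)
    # Under noise only, stat ~ N(noise_var, noise_var**2 / n) for large n.
    thresh = noise_var * (1.0 + Q_INV_5PCT / np.sqrt(n))
    return stat > thresh, stat

rng = np.random.default_rng(0)
noise = (rng.standard_normal(4096) + 1j * rng.standard_normal(4096)) / np.sqrt(2)
tone = 0.5 * np.exp(2j * np.pi * 0.1 * np.arange(4096))   # primary user signal
print(energy_detect(noise, noise_var=1.0))        # idle band: expect False
print(energy_detect(noise + tone, noise_var=1.0)) # occupied: expect True
```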
  • CAO Yatao,LENG Wen,WANG Anguo,LIU Lihong
    Computer Engineering. 2015, 41(5): 111-117,124.
    In digital wireless communication systems with OQPSK modulation, frequency offset estimation based on time-domain autocorrelation requires a large number of autocorrelation computations, which leads to high computational complexity. To overcome this drawback, an optimized frequency offset estimation algorithm suitable for Field Programmable Gate Array (FPGA) implementation is proposed, based on the phase difference of the autocorrelations of adjacent received signals. Serial computation is realized by controlling the read address of a three-port RAM to achieve data concatenation, which saves considerable hardware resources. On the algorithm side, the sliding autocorrelation is updated with additions and subtractions only, which reduces computational complexity. Timing simulation of the whole system shows good agreement between the simulated results and the true values, validating the feasibility of the scheme and the correctness of the algorithm.
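    The add/subtract update of the sliding autocorrelation is the key complexity saving; a software sketch follows (window length and lag are illustrative, and the three-port-RAM addressing of the FPGA design is not modeled):

```python
# Sliding lag-L autocorrelation: each window shift adds one new product
# and removes the oldest one instead of recomputing the whole sum.
import numpy as np

def sliding_autocorr(x: np.ndarray, lag: int, win: int) -> np.ndarray:
    """Lag-`lag` autocorrelation over a sliding window of `win` samples."""
    prod = x[lag:] * np.conj(x[:-lag])       # products x[i] * conj(x[i-lag])
    out = np.empty(len(prod) - win + 1, dtype=complex)
    acc = prod[:win].sum()                   # initial window
    out[0] = acc
    for n in range(1, len(out)):
        acc += prod[n + win - 1] - prod[n - 1]   # one add, one subtract per shift
        out[n] = acc
    return out

x = np.exp(2j * np.pi * 0.01 * np.arange(256))   # tone with frequency offset
R = sliding_autocorr(x, lag=4, win=32)
print(np.angle(R[0]) / (2 * np.pi * 4))           # ~0.01: offset from the phase
```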
  • WU Wentie,LI Min,WEN Yongge
    Computer Engineering. 2015, 41(5): 118-124.
    The energy of nodes in a Wireless Sensor Network (WSN) directly affects the network lifetime. From the perspective of saving node energy, a sink-node moving scheme based on energy awareness, denoted EASM-INL, is therefore proposed. In EASM-INL, sensor nodes adjust their transmission range according to their energy level: when energy runs low, a node shortens its transmission range to save power. In addition, the sink node collects information about the energy of the sensor nodes and computes the Maximum Capacity Path (MCP); whenever some path's capacity value falls below the threshold, the sink prepares to move. It computes the maximum capacity path in the north, south, east, and west directions, and moves along the direction with the minimum value. A theoretical derivation model is then established to analyze how EASM-INL improves network lifetime, and numerical simulation is used to verify its performance. Simulation results show that EASM-INL prolongs network lifetime compared with traditional sink moving schemes.
  • YANG Zhicai,QIU Hangping,QUAN Jichuan,LEI Zhipeng,HUANG Liang
    Computer Engineering. 2015, 41(5): 125-129,138.
    Considering the different functions of nodes, the collaboration among nodes, and the directionality of information flows, a network flow path model matching the features of military networks is established according to network flow theory. Based on this model, communication reliability and the importance of communication nodes are defined in terms of communication delay, which reflects the reliability of the network in operation. The communication reliability of a military communication network is then analyzed for both the general case and the emergency case. The results show that a communication reliability peak exists in both cases, indicating the reliable communication capability of the military communication network. The importance of the communication nodes is calculated, and the bottleneck nodes are identified from the results. Finally, the validity of the model is verified by analyzing the load of the communication nodes.
  • ZHUO Zepeng,CHONG Jinfeng,YU Lei,WEI Shimin
    Computer Engineering. 2015, 41(5): 130-132.
    The correlation coefficients of cryptographic functions are very important concepts in the design of cryptographic functions. Using some known conclusions and the definitions of Fourier coefficients and cross-correlation coefficients, this paper gives the relationship between the cross-correlation coefficients and the Fourier coefficients of two q-ary cryptographic functions. On this basis, the relationships between the Fourier coefficients and the auto-correlation coefficients of a single function, and between the cross-correlation coefficients of two functions and their auto-correlation coefficients, are obtained. Regular Bent functions are also discussed. In particular, the duality of regular Bent functions is studied using known conclusions, and the relationship between the Fourier coefficients of the derivatives of two regular Bent functions and the Fourier coefficients of the derivatives of their dual functions is obtained.
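    For reference, the standard definitions these relationships build on are (notation illustrative; the paper may normalize differently):

```latex
% Standard definitions for q-ary functions f, g : Z_q^n -> Z_q.
\[
  \zeta_q = e^{2\pi i/q}, \qquad
  W_f(\lambda) = \sum_{x \in \mathbb{Z}_q^n} \zeta_q^{\,f(x) - \lambda \cdot x}
  \quad \text{(Fourier coefficient)},
\]
\[
  C_{f,g}(\alpha) = \sum_{x \in \mathbb{Z}_q^n} \zeta_q^{\,f(x+\alpha) - g(x)}
  \quad \text{(cross-correlation)}, \qquad
  C_f(\alpha) = C_{f,f}(\alpha)
  \quad \text{(auto-correlation)}.
\]
```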
  • TAO Wenqing,GU Xingyuan,LI Jing
    Computer Engineering. 2015, 41(5): 133-138.
    Aiming at the security of a typical masked Data Encryption Standard (DES) implementation, this paper introduces a Correlation Power Analysis (CPA) method that combines the last two rounds of the DES algorithm and selects discrete bits of intermediate data as the target function. Using the Hamming Weight (HW) model, it guesses the 16th-round DES key and computes the correlation between the power consumption and the HW of the data; by ranking the correlation values, the masked DES key can be broken. Experimental results from attacking a smartcard running masked software DES show that the method successfully recovers the 64-bit DES key.
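    The core CPA loop can be sketched as follows; intermediate() is a hypothetical stand-in for the paper's last-two-rounds DES target function, and the toy S-box below exists only for the self-test:

```python
# CPA: for every key guess, correlate the Hamming weight of a predicted
# intermediate value with the measured power traces and rank the guesses.
import numpy as np

def hamming_weight(v: np.ndarray) -> np.ndarray:
    return np.unpackbits(v.astype(np.uint8)[:, None], axis=1).sum(axis=1)

def cpa_rank(traces: np.ndarray, inputs: np.ndarray, intermediate):
    """traces: (n_traces, n_samples) power measurements.
    inputs: (n_traces,) known cipher inputs.
    intermediate(inputs, guess) -> (n_traces,) predicted value."""
    scores = []
    for guess in range(64):                    # e.g. a 6-bit DES subkey chunk
        hw = hamming_weight(intermediate(inputs, guess)).astype(float)
        hw -= hw.mean()
        t = traces - traces.mean(axis=0)
        # Pearson correlation of hw against every sample point:
        corr = (hw @ t) / (np.sqrt((hw ** 2).sum()) *
                           np.sqrt((t ** 2).sum(axis=0)) + 1e-12)
        scores.append(np.max(np.abs(corr)))
    return np.argsort(scores)[::-1]            # best key guesses first

# Toy target function standing in for the DES last-round computation.
toy_sbox = np.arange(64, dtype=np.uint8)[::-1]
toy_intermediate = lambda x, k: toy_sbox[(x ^ k) & 0x3F]

rng = np.random.default_rng(1)
x = rng.integers(0, 64, 500, dtype=np.uint8)
true_k = 0x2A
leak = hamming_weight(toy_intermediate(x, true_k)).astype(float)
traces = leak[:, None] + rng.standard_normal((500, 10))   # leaky traces
print(cpa_rank(traces, x, toy_intermediate)[0] == true_k)  # expect True
```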
  • LI Guobing,HUI Hui
    Computer Engineering. 2015, 41(5): 139-143.
    In cooperative communication networks, traditional relay selection is usually based only on the channel conditions of the legitimate channels, so the security and privacy of the transmission are hard to guarantee in the presence of eavesdroppers. To tackle this problem, an opportunistic relay selection method that considers the Channel State Information (CSI) of both the legitimate users and the eavesdropper is proposed to minimize the Secrecy Outage Probability (SOP) in a cooperative Decode-and-Forward (DF) network. First, the optimal power allocation is derived for a constraint on the total transmit power of the source and relay. Based on this, the best relay selection algorithm is proposed and a closed-form expression for the secrecy outage probability is derived. Simulation results verify the correctness of the closed-form expression and show that the proposed algorithm greatly reduces the secrecy outage probability of the system compared with traditional methods.
  • YANG Ping,NING Hongyun
    Computer Engineering. 2015, 41(5): 144-148.
    Improvements are made based on an analysis of the security of the traditional Kerberos protocol. To address password-guessing attacks and the complexity of symmetric key storage, a public-key encryption and private-key decryption mechanism is introduced. A new method combining message sequence numbers with random numbers helps the application server distinguish messages replayed by an attacker from messages legitimately resent by the client, solving the problem of encrypted request messages being captured and replayed. To address session key interception, non-volatile memory on the client and application server stores a key chain and a message list, and messages between client and application server are encrypted with keys from the key chain instead of the session key issued by the Ticket Granting Server (TGS); the dynamic keys ensure message integrity. Analysis shows that the improved protocol increases the security of the system.
  • YIN Fengmei,HOU Zhengfeng,PU Guangning,CHEN Hong
    Computer Engineering. 2015, 41(5): 149-152,158.
    In existing anonymous authentication schemes, the authentication process is complex and the number of members required to trace an anonymous user is small, which increases authentication time and weakens privacy. A new anonymous authentication scheme is presented that builds on a threshold secret sharing scheme. Each member's secret share and the group public key are obtained through linear-equation-based secret sharing. Anonymous authentication is achieved based on the idea of 1-out-of-n signatures, with which the prover can freely choose an anonymity set from the group U. To improve the safety of anonymity, only t cooperating members can trace the identity of the prover, and the verifier can check whether the traced identity is genuine. Analysis shows that, compared with anonymous authentication schemes without a trusted center, this scheme satisfies the security requirements of anonymous authentication at a lower computational cost.
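    The threshold property the scheme relies on can be illustrated with the classic Shamir (t, n) construction; note the paper derives shares from a system of linear equations instead, so this is an illustration of the primitive, not the paper's scheme:

```python
# Shamir (t, n) threshold secret sharing: any t shares recover the secret,
# fewer reveal nothing.
import random

P = 2 ** 127 - 1  # a Mersenne prime field for the arithmetic

def make_shares(secret: int, t: int, n: int):
    """Split `secret` into n shares; any t of them recover it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def poly(x):  # evaluate the degree-(t-1) polynomial at x
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, poly(x)) for x in range(1, n + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 over GF(P)."""
    secret = 0
    for j, (xj, yj) in enumerate(shares):
        num = den = 1
        for m, (xm, _) in enumerate(shares):
            if m != j:
                num = num * (-xm) % P
                den = den * (xj - xm) % P
        secret = (secret + yj * num * pow(den, P - 2, P)) % P
    return secret

shares = make_shares(123456789, t=3, n=5)
print(recover(shares[:3]) == 123456789)   # any 3 shares suffice: True
```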
  • ZHU Jialiang,WEI Yongzhuang
    Computer Engineering. 2015, 41(5): 153-158.
    As a new lightweight block cipher, LBlock has received much attention for its excellent performance on hardware and software platforms. Currently, the security evaluation of LBlock relies heavily on traditional mathematical attacks. The cache attack is a type of side-channel attack that poses a real threat to secure cipher implementations; among cache attacks, the trace-driven variant has the advantages of needing fewer samples and being more efficient. Based on the structure of the cipher and the properties of its key schedule, this paper proposes a trace-driven cache attack on LBlock that recovers the secret key by capturing information leaked during cache accesses. Analysis shows that the attack requires a data complexity of about 106 chosen plaintexts and a time complexity of about 2^7.71 encryption operations. Compared with the published side-channel cube attacks on LBlock and the trace-driven cache attack on DES (which also has a Feistel structure), this attack is more favorable.
  • CHEN Wen
    Computer Engineering. 2015, 41(5): 159-162,168.
    Aiming at the lack of absence privacy protection in mobile social networks, a parameterized privacy preservation model is designed to support personalized privacy settings, the necessary conditions for privacy protection are proposed, and a generalization method is used to achieve absence privacy protection. Experimental results show that, for equal computation time, the algorithm has lower release delay than WYSE.
  • HAN Zhigeng,CHEN Geng,WANG Liangmin,JIANG Jian
    Computer Engineering. 2015, 41(5): 163-168.
    To improve the reliability of reputation evaluation from the perspective of its timeliness, a general reputation re-evaluation model named GMRR, based on a time-lag weakening strategy, is proposed. The distinguishing feature of GMRR is that, starting from four kinds of basic data (raw reputation, reputation history, reputation fluctuation rate, and reputation fluctuation trend), it can re-evaluate raw reputation dependably along multiple dimensions by integrating a timeliness mechanism with existing dependable reputation evaluation mechanisms. After presenting the time-lag weakening strategy for the reputation timeliness mechanism, the GMRR model description, and the related algorithms, the re-evaluation effect of the new model is assessed. Experimental results show that, compared with the Srivatsa reputation re-evaluation model, GMRR brings the re-evaluated reputation closer to the real behavior of the target entity and better suppresses fluctuating behavior by malicious entities.
  • WANG Ying,YUAN Kaiguo,XI Minchao
    Computer Engineering. 2015, 41(5): 169-174.
    Robustness and transparency are two of the most important criteria for evaluating watermarking algorithms, yet in many published algorithms transparency is assured only at the expense of robustness. This paper describes a removable video watermarking algorithm that resolves this trade-off. To improve robustness, the proposed algorithm does not limit the embedding strength of the watermark. Before the video is played, the watermark is removed from the media according to the user's secret key; an unauthorized key leads to additional distortion of the video, which improves the security of the watermarking algorithm. Experimental results show that the proposed algorithm is strongly robust against resolution change, re-encoding, and bit-rate reduction.
  • LUO Banghui,ZENG Jianping,DUAN Jiangjiao,WU Chengrong
    Computer Engineering. 2015, 41(5): 175-179.
    Traditional text classification models and latent semantic analysis map text into a vocabulary or semantic space, focusing on the discriminative power of words, but they cannot give a clear semantic description of that space, which limits the scalability and accuracy of text classification algorithms. Based on the psychological classification of human emotions, this paper assumes a strong association between emotions and opinions. It uses lexical semantic extension and feature selection methods to build three emotion representation models, and maps documents that express human emotional tendencies into the emotion space. Using emotion features obtained from stock message boards by the emotion representation models, it builds the emotion space model and designs an opinion classification method. Experimental results on an actual stock forum show that the method achieves high classification accuracy.
  • SHENG Yaqi,ZHANG Han,LV Chen,JI Donghong
    Computer Engineering. 2015, 41(5): 180-184.
    This paper analyzes the main methods of recognizing textual entailment and proposes a mixed topic model for the task, describing a probabilistic model based on the assumption that texts are generated by mixtures of latent topics. It treats the T (Text) and H (Hypothesis) as different expressions of the same semantic meaning, which can be represented as multi-modal data: if the text entails the hypothesis, the two have similar topic probability distributions and share the same mixed bag of words and topics. The model is applied to the RTE-8 task; parallel tests of the mixLDA and LDA models are designed, and a system experiment uses a Support Vector Machine (SVM) to classify features consisting of the textual similarity produced by this model together with other features. Experimental results demonstrate the high accuracy of the mixed topic model in recognizing textual entailment.
  • SHAO Jingfeng,WANG Jinfu,BAI Xiaobo,LEI Xia,LIU Congying
    Computer Engineering. 2015, 41(5): 185-190,196.
    To ensure the stability of fabric quality and the accuracy of yield data acquisition in the weaving process, existing data fitting methods are comparatively analyzed; given their shortcomings in processing the nonlinear loom sound signal, the reasons for fabric quality fluctuation are analyzed theoretically from the formation mechanism of the uncertainty factors. Exploiting the advantages of Empirical Mode Decomposition (EMD) for nonlinear signal processing, an online weaving data fitting method based on EMD is constructed, and EMD is applied to the real-time extraction of acoustic signal characteristics from the weaving machinery. Experimental results show that, compared with existing data fitting methods, the EMD-based method significantly improves the fabric quality index and effectively ensures both the stability of fabric quality during weaving and the accuracy of yield data acquisition.
  • ZHANG Naizhou
    Computer Engineering. 2015, 41(5): 191-196.
    A common practice for search engines is to exploit query suggestion to expose the diversity of users' query intents and automatically provide alternatives. Among current studies on query suggestion, however, few focus on how time influences the formulation of suggestions, even though in many cases users' query intent changes over time. This paper presents a query suggestion method based on mining a temporal click graph. A raw query log file is preprocessed to generate a temporal click-through graph; to eliminate the graph's non-connectivity, two basic operations, checking and merging disconnected subgraphs, are executed over it. A random-walk-based graph mining algorithm then generates a set of suggestions for a given query. Extensive experiments in a real Web environment show that the approach improves the precision and diversity of query suggestions and thereby generates more reliable suggestions.
  • WANG Libin,AN Zhipeng,LIN Dan
    Computer Engineering. 2015, 41(5): 197-201.
    A mathematical model based on expectation and variance is constructed, and a Probabilistic Neighborhood Search (PNS) is proposed for the Capacitated Arc Routing Problem with Stochastic Demand (CARPSD). The heuristic generates initial solutions through Stochastic Path Scanning (SPS) to construct the candidate solution set. Four neighborhood structures are built according to four key indicators that influence solution quality, and a probabilistic mechanism calculates the intensity of the neighborhood search, varying the size of the neighborhood to guide the search. A restart strategy expands the explored region of the solution space and avoids excessive local search, improving the efficiency of the algorithm. Computational results show that CARPSD is solved effectively and that the algorithm outperforms the Adaptive Large Neighborhood Search (ALNS) algorithm.
  • MA Huifang,JIA Meihuizi,YUAN Yuan,ZHANG Zhichang
    Computer Engineering. 2015, 41(5): 202-206,212.
    A novel semi-supervised learning algorithm is presented that fully explores inner semantic information to compensate for the limited length of microblog messages. The key idea is to exploit term correlation data, which captures semantic information for term weighting and provides greater context for short texts. Direct and indirect dependency weights between terms are defined to reveal their semantic correlation, and must-link and cannot-link relations are encoded as constraints on terms. Microblog clustering is then formulated as a semi-supervised non-negative matrix factorization co-clustering framework that uses this feature knowledge as pairwise constraints. Extensive experiments on two real-world microblog datasets show the effectiveness of the proposed algorithm: it greatly reduces the labor-intensive labeling process while deeply exploiting the hidden information in the microblogs themselves.
  • ZHANG Liyuan,JI Donghong
    Computer Engineering. 2015, 41(5): 207-212.
    For the Gene Ontology Evidence Sentences (GOES) extraction problem, the recall and efficiency of systems built on traditional features and a Bayesian classification model are relatively low. To solve this problem, two systems are built for single-sentence and joined-sentence retrieval. System 1 is built on a Support Vector Machine (SVM) model with a new combination of features, which solves the problem of incomplete coverage. A Conditional Random Field (CRF) model and rules for identifying candidate sentences are added to System 1 to build System 2, which solves the sentence combination problem. Experimental results show that, on single-sentence extraction, the recall and F-value of System 1 are 39.7% and 12.9% higher, respectively, than those of the Bayesian-model-based system; on joined-sentence extraction, the recall of System 2 is 37.1% higher than that of the Learning from Positive and Unlabeled Documents for Retrieval (LPU) system.
  • ZHAO Caiguang,ZHANG Shuqun,LEI Zhaoyi
    Computer Engineering. 2015, 41(5): 213-218.
    Contrastive divergence, one of the mainstream training algorithms, gives good experimental results for training restricted Boltzmann machines. An improved contrastive divergence algorithm based on the Exponential Moving Average (EMA) is proposed by combining the exponential moving average learning rule with Parallel Tempering (PT), covering the updates of both model parameters and samples. The improved algorithm is applied to training the parameters of a speech recognition model in a Gaussian-Bernoulli Restricted Boltzmann Machine (GRBM). Experimental results for digit speech recognition on the TI-Digits core test set show that the proposed algorithm works better than traditional GRBM training algorithms: accuracy reaches 80.53%, an increase of about 7%; recognition accuracy of other GRBM models also increases appreciably, and performance remains stable.
  • MA Chao,SHEN Wei,DONG Jingfeng
    Computer Engineering. 2015, 41(5): 219-223.
    To counter the adverse effects of complex backgrounds on moving object detection and tracking in video scenes, a new method is proposed that detects and tracks specific moving objects using an adaptive motion energy threshold combined with a compact colored SIFT descriptor. For detection, disturbances from the complex environment are filtered out automatically by the adaptive motion energy threshold; for matching, Principal Component Analysis is applied to the Colored SIFT descriptor (PCA-CSIFT), thereby achieving continuous tracking of specific moving objects. Extensive experiments on benchmark datasets show that, in complex backgrounds, the tracking method is insensitive to the number of objects, with the error ratio stable at 6.5% to 34%; PCA-CSIFT retains high distinctiveness and robustness with a mismatch ratio of 25.33% to 28%; and the average processing time per frame is no more than 0.26 s, meeting real-time requirements.
  • SUN Jun,HUANG Zhiyong,CHEN Yilin
    Computer Engineering. 2015, 41(5): 224-227.
    To study how saliency prior information and appearance information in an image influence object contours, this paper presents an image foreground segmentation method. The method combines a spectrum-based saliency probability map with a codebook-based appearance prior in a unified probabilistic framework to obtain the foreground probability distribution. For a test image, the spectral saliency gives the foreground probability at each position, and the appearance model gives the probability that a region belongs to the foreground; combining the two yields the probability that an area is foreground, and areas whose probability exceeds a threshold are taken as foreground. The method needs only a small amount of learning and produces segmentations very close to the ground truth. Experimental results on a standard image set show that the method is simple, fast, and effective.
  • ZHAO Chunlan,WANG Kailing,LIN Cheng,SUN Yu,XIU Yahui,WANG Yexing,HAO Liguo
    Computer Engineering. 2015, 41(5): 228-231.
    It is difficult to extract brain tumors accurately because of the low contrast of brain Magnetic Resonance Images (MRI), blurred tumor edges, and complicated tumor shapes. To solve this problem, this paper puts forward an algorithm that combines normalized cut with an active contour model based on the Signed Pressure Force (SPF) function to extract tumors from brain MRI. The SPF-based active contour model is used to make the tumor edge extracted by normalized cut converge; the number of iterations and the smoothing coefficient control the convergence speed and shape of the tumor edge, so that the edge curve stops at the real tumor boundary. Simulation results show that the algorithm overcomes the negative impact of changing tumor shape and contrast, and extracts brain tumors stably and accurately.
  • WANG Yu,YAN Mo
    Computer Engineering. 2015, 41(5): 232-236,242.
    The local region-based Active Contour Model (ACM) is easily influenced by the location of the initial curve when segmenting images with intensity inhomogeneity, and its numerical implementation based on the Level Set (LS) method is slow. For this reason, a new segmentation model is proposed. The model takes the Local Signed Difference (LSD) energy as the data-driven term for curve evolution. To reduce dependence on the initial curve location, a Globally Convex Segmentation (GCS) scheme is used to derive a discrete convex segmentation model, and a second-order smoothing term from the Mumford-Shah segmentation model is included to make the segmented regions smoother. Split Bregman iteration gives a fast numerical implementation. Experimental results show that, compared with the Local Binary Fitting (LBF) model and the LSD model, the proposed model segments images with intensity inhomogeneity correctly and is more efficient and more robust.
  • LEI Qin,SHI Chaojian,CHEN Tingting
    Computer Engineering. 2015, 41(5): 237-242.
    To enhance the quality of hazy sea images, this paper presents a dehazing method for sea images. It segments a hazy sea image using the mean shift method together with confidence-embedded edge detection, applies morphological dilation and erosion with binarization to separate the sky region from the non-sky region, and finally dehazes the sky region with contrast-restricted histogram equalization and the non-sky region with the dark channel prior and guided filtering. The guided filter computes its output from the content of a guidance image and refines transmission maps about as well as soft matting at a much lower computational cost. Experimental results show that, relative to the dark channel prior method, the proposed method avoids transition artifacts and color cast in the sky region and achieves strong haze removal.
  • LIU Yan,HU Yanhong,SUN Zhanli
    Computer Engineering. 2015, 41(5): 243-248,253.
    Technologies for 2D cartoon animation fall roughly into two types, computer-aided 2D animation and rendering of 3D models, each with its own pros and cons. This paper therefore proposes a 2.5D cartoon modelling method that combines the two, constructing cartoon rotation by texture mapping. In 2D animation a character has an implicit geometric structure, so a character is defined here as a cartoon object composed of a single typical geometric structure. Deformation algorithms for the corresponding structures are built through the two-dimensional projective transformation of typical geometries, with detailed analysis of the projection transformation patterns of rectangles and spheres. Based on these patterns, the paper gives an algorithm for implementing pseudo-3D rotation effects for the corresponding cartoon elements, and presents results for cuboids, spheres, and cylinders to support the validity of the algorithm.
  • YANG Aiping,TIAN Yuzhen,HE Yuqing,DONG Cuicui
    Computer Engineering. 2015, 41(5): 249-253.
    In view of the limited denoising performance of the K-Singular Value Decomposition (K-SVD) method, a new algorithm is proposed. Denoising is improved by refining K-SVD with a correlation-coefficient matching criterion and a dictionary-cutting method, and performance is further enhanced by incorporating non-local self-similarity into the denoising model as a constrained regularization term. Experimental results show that, compared with the traditional K-SVD method, the algorithm effectively improves the smoothness of homogeneous regions while preserving more texture and edge details.
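    One plausible reading of the correlation-coefficient matching criterion is atom selection by normalized correlation instead of a plain inner product, sketched below (the full K-SVD update and the dictionary-cutting step are not shown):

```python
# Atom selection by correlation coefficient: pick the dictionary atom
# most correlated with the current residual.
import numpy as np

def best_atom_by_corrcoef(D: np.ndarray, r: np.ndarray) -> int:
    """D: (n, K) dictionary with unit-norm columns; r: (n,) residual.
    Returns the index of the atom most correlated with the residual."""
    Dc = D - D.mean(axis=0)               # center each atom
    rc = r - r.mean()                     # center the residual
    num = Dc.T @ rc
    den = np.linalg.norm(Dc, axis=0) * np.linalg.norm(rc) + 1e-12
    return int(np.argmax(np.abs(num / den)))

rng = np.random.default_rng(0)
D = rng.standard_normal((64, 256))
D /= np.linalg.norm(D, axis=0)
r = D[:, 17] + 0.05 * rng.standard_normal(64)   # noisy copy of atom 17
print(best_atom_by_corrcoef(D, r))              # expect 17
```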
  • JI Xuye,CHEN Ming,FENG Guofu,ZHAO Haile
    Computer Engineering. 2015, 41(5): 254-258,263.
    The Gaussian Mixture Model (GMM) is widely adopted for foreground detection, but GMM cannot detect objects that stop moving in the scene. This paper proposes a multi-model cooperative method to detect foreground objects in complex scenes. Under the assumption of a fixed camera, an adaptive GMM first builds a background that is updated by a light detection model and a scene detection model, and a shadow detection model is also applied. The method is compared with two other algorithms, and experimental results show that it extracts complete object contours with a shorter single-frame processing time.
  • LIU Yujiao,FAN Yong,GAO Lin,YOU Xia
    Computer Engineering. 2015, 41(5): 259-263.
    Considering that the number of underlying local spatial-temporal features is small and the expressive ability of mid-level features is weak, this paper proposes a human action recognition method combined with Spatial Temporal Depth Features (STDF). Since intense motion areas provide more discriminative information for action recognition, the method finds kinetic regions in the depth image, computes the optical flow in those regions as an energy function, and samples features within the kinetic regions according to a Gaussian distribution. The collected sample points serve as low-level features and are combined with a Bag-of-Words (BoW) model and a Support Vector Machine (SVM) classifier for action recognition. Experimental data show that the STDF-based algorithm reaches an average accuracy of 92% for human action recognition on the SwustDepth dataset, with high robustness.
  • LU Jian,SUN Yi
    Computer Engineering. 2015, 41(5): 264-269,273.
    Classic multi-frame Super-Resolution (SR) techniques rely strongly on the support provided by the Low-Resolution (LR) frames; when the frames contain insufficient information, annoying artifacts often appear in the SR result. To solve this problem, a multi-frame SR method combined with sparse coding is proposed. A high-resolution frame is reconstructed with the help of probabilistic motion estimation, while effective and ineffective regions are identified by adaptive threshold segmentation; a sparse-coding-based completion technique is then applied to recover the ineffective regions. Experimental results show that the proposed algorithm exploits information from both the LR frames and the sparse coding dictionary, and has better robustness and wider applicability than SR methods that depend only on the image sequence itself or on a single frame.
  • SUN Yu,LI Zhanli
    Computer Engineering. 2015, 41(5): 270-273.
    To enhance the precision of spatial coordinates solved from 2D images, a new projective-geometry-based method is proposed for computing the 3D-model coordinate transformation between image space and object space. The imaging geometry of a Charge Coupled Device (CCD) camera is analyzed on the basis of perspective transformation and cross-ratio invariance, and an imaging geometric model based on linear features is established using the parallel, perpendicular, and intersecting relationships between straight lines. With the acquired photographs and camera parameters, the shape and size of the corresponding scene space are deduced and proved from the cross-ratio invariance of collinear points under perspective projection; the coordinate transformation between image space and object space of the 3D model is calculated, and the perspective correspondence from image space to object space is built. Experimental results prove that the calculation accuracy of spatial point coordinates can be enhanced by image geometric information. The method can be applied to improve the precision of extracting and locating geometric features such as circles, and to increase the measurement precision of photogrammetry systems and 3D coordinate reconstruction.
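    For reference, the cross-ratio invariant the derivation relies on:

```latex
% For four collinear points A, B, C, D and their perspective images
% A', B', C', D', the cross-ratio is preserved:
\[
  (A,B;C,D) \;=\; \frac{\overline{AC}\cdot \overline{BD}}
                       {\overline{BC}\cdot \overline{AD}}
  \;=\; (A',B';C',D').
\]
```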
  • YIN Ming,SHUI Jun,LUAN Jing,BAI Ruifeng
    Computer Engineering. 2015, 41(5): 274-279.
    For the single-image Super-Resolution (SR) reconstruction problem, a novel SR algorithm is proposed that fuses Soft-decision Adaptive Interpolation (SAI), bicubic interpolation, and the Shift-Invariant Shearlet Transform (SIST). Each source image is interpolated separately by SAI and by bicubic interpolation, and the SIST decomposes the two interpolated images at different scales and directions to obtain their low-frequency and high-frequency sub-band coefficients. For the low-frequency sub-band coefficients, an adaptive weighted fusion rule combined with an improved sigmoid function is presented, using the regional variance to determine fuzzy similarity. For the high-frequency sub-band coefficients, a new Sum-Modified Laplacian (SML) is combined with a weighted-average fusion rule. The high-resolution image is obtained by performing the inverse SIST on the fused coefficients. Compared with SAI, the proposed algorithm markedly improves the clarity and the Peak Signal-to-Noise Ratio (PSNR) of the reconstructed image.
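    A minimal sketch of the Sum-Modified-Laplacian (SML) activity measure used to weight the high-frequency coefficients (window size is illustrative, and the paper's "new SML" variant may differ):

```python
# Per-pixel SML: the modified Laplacian summed over a win x win window.
import numpy as np

def sml(img: np.ndarray, win: int = 3) -> np.ndarray:
    p = np.pad(img.astype(float), 1, mode='edge')
    ml = (np.abs(2 * p[1:-1, 1:-1] - p[:-2, 1:-1] - p[2:, 1:-1]) +
          np.abs(2 * p[1:-1, 1:-1] - p[1:-1, :-2] - p[1:-1, 2:]))
    r = win // 2
    mp = np.pad(ml, r, mode='edge')
    out = np.zeros_like(ml)
    for dy in range(-r, r + 1):              # box-sum the modified Laplacian
        for dx in range(-r, r + 1):
            out += mp[r + dy: r + dy + ml.shape[0],
                      r + dx: r + dx + ml.shape[1]]
    return out

a = np.zeros((8, 8)); a[4:, :] = 1.0              # a step edge
print(sml(a).max() > sml(np.zeros((8, 8))).max())  # edge has higher SML: True
```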
  • DAI Huihui,SANG Qingbing
    Computer Engineering. 2015, 41(5): 280-284,289.
    Most video quality algorithms take the average quality over all frames as the quality of the video, but such methods consider only spatial image quality and ignore the inherent temporal characteristics of video, so they cannot accurately describe the correlation between objective and subjective video quality assessment. Taking the temporal characteristics of video into account, this paper proposes a video quality assessment algorithm based on the wavelet and temporal domains. Each video frame is divided into a smooth region and an edge region; the wavelet transform is applied to both to obtain the wavelet coefficients of each region, and the frame quality is obtained by weighting the two parts. Temporal pooling then yields the quality of the whole video. Experimental results on the LIVE video database show that the algorithm is well consistent with human subjective evaluation, with a Spearman Rank Order Correlation Coefficient (SROCC) of 0.7885.
  • LI Lun,WANG Yigang,FAN Shengli,BAI Zhiqiang,LAI Jianning
    Computer Engineering. 2015, 41(5): 285-289.
    To overcome the difficulty of eliminating stereo mismatches in Scale Invariant Feature Transform (SIFT) feature matching, an algorithm is proposed for removing erroneous match pairs in Stereo Light Microscope (SLM) images. SIFT matching pairs are found coarsely, and the 3D coordinates of the point cloud are calculated using the SLM calibration parameters. New match pairs are obtained by projecting the point cloud onto the left and right images, and projection vector sets are generated by subtracting the image coordinates of each original point from those of its new projection. Outliers in magnitude and orientation in the left and right projection vector sets are eliminated to remove erroneous match pairs. Experimental results show that the algorithm removes 100% of the erroneous pairs in the test images without eliminating correct matches, improving the precision of stereo matching.
  • TANG Xiaofang,ZHOU Jinzhi
    Computer Engineering. 2015, 41(5): 290-294.
    To select effective frequency and electrode components and improve classification accuracy, this paper proposes an Electroencephalogram (EEG) feature selection method based on divergence analysis. For the five subjects of Brain-Computer Interface (BCI) Competition III dataset IVa, a divergence analysis algorithm selects the k-dimensional feature subspace with the maximum divergence values from the original data features; features are then extracted from this subspace with the Common Spatial Pattern (CSP) and classified by Linear Discriminant Analysis (LDA). Experiments show that the average classification accuracy reaches 95.54% under the training pattern and 84.57% under the test pattern, higher than selection algorithms based on the correlation coefficient and on mutual information.
  • CHEN Jinguang,HE Shan,MA Lili
    Computer Engineering. 2015, 41(5): 295-299.
    Aiming at the problem of nonlinear filtering with inequality state constraints, this paper presents an unscented iterated Kalman filtering algorithm based on Sequential Quadratic Programming (SQP). SQP is used to solve the nonlinear inequality-constrained optimization problem: at each iteration, a quadratic programming sub-problem determines a descent direction, and the steps are repeated until the solution of the original problem is obtained. To guarantee convergence, the method balances the objective function against the inequality constraints, and a positive definite matrix is used to approximate the Hessian matrix to reduce complexity. A constrained tracking simulation shows that the new algorithm effectively improves accuracy at low time complexity.
  • LIU Yan,LIU Dingjia,HAN Zhipan
    Computer Engineering. 2015, 41(5): 300-305.
    Action recognition from motion data is a frontier of computer vision and pattern recognition, but extracting human sentiment from action recognition results is relatively rare; this paper explores that direction. The research starts with the construction of a human emotion dataset: combining computer graphics, artificial intelligence, and machine learning, a standard dataset is built containing four basic emotions (happiness, anger, sadness, fear) and two derived emotions (surprise and disgust), and the relationship between action and emotion is obtained from the action perspective. Following the segmentation used in action recognition, the Period is taken as the emotion extraction unit, an emotion list is determined for each Period, and, combined with the rate, the emotion of the action is extracted. Experimental results prove that emotion information can be extracted from 3D action data with this approach.
  • CHEN Si,ZHAO Ji,WU Jiaofeng,JIN Zijun
    Computer Engineering. 2015, 41(5): 306-310,315.
    To model beverage filling production-line systems, a behavior model based on Petri nets with messages is proposed. Behavior is classified into system behavior and entity behavior, described respectively by system state transitions and entity state transitions. A message, represented as a six-tuple, acts as the carrier of information exchanged between entities; message generating and message consumption functions bind messages to behavior and bring messages into the Petri net to control the system flow. An example taking part of the production line as a small system shows how the proposed behavior model works, and an immersive 3D dynamic simulation of the beverage filling line is implemented on the Virtools virtual reality platform. Simulation results demonstrate that the proposed behavior model can build, control, and analyze virtual production-line systems effectively.
  • YAO Jiaxin,TIAN Huixin
    Computer Engineering. 2015, 41(5): 311-315.
    In iron and steel enterprises, the power load shows little pronounced cyclical variation, and process changes cause instantaneous load fluctuations, so traditional load prediction models cannot effectively predict such sudden disturbances. A data-driven subspace method is therefore used to establish an ultra-short-term daily power load prediction model for steel enterprises. To improve prediction accuracy, a feedback factor and a forgetting factor are introduced to improve the performance of the standard subspace algorithm. Tests on actual load data verify the practicality of the approach; the results can provide effective decision support for power load prediction and smart secondary-energy management in steel enterprises.
  • WANG Chunling,SHE Zuobin
    Computer Engineering. 2015, 41(5): 316-321.
    To raise the scientific level of monitoring and managing ancient and famous trees and to better protect and manage these living fossils, this paper studies their monitoring and management based on the Internet of Things (IoT). By deploying 433 MHz active Radio Frequency Identification (RFID) tags and ZigBee wireless temperature and humidity sensors on the trees, real-time monitoring of the trees and of temperature and humidity values is realized. A monitoring and management system for ancient and famous trees based on the IoT is developed in Visual C# with a SQL Server 2008 database on the Microsoft .NET platform. The system realizes intelligent storage of tree information, real-time monitoring of the temperature and humidity of the trees' living environment, and real-time alarms when trees are illegally transplanted. The work also provides a new approach to monitoring and managing rare forest resources.