
15 November 2016, Volume 42 Issue 11
    

  • YANG Huaizhou,ZHANG Liumei
    Computer Engineering. 2016, 42(11): 1-7. https://doi.org/10.3969/j.issn.1000-3428.2016.11.001
    Aiming at the problems that more and more software and hardware functions of the Internet of Things(IoT) are provided as services and that the Quality of Service(QoS) of IoT services is difficult to predict, a method of QoS prediction and service recommendation for IoT networks is presented. Based on the historical usage data of service invocations and their QoS, the missing values of the user-item matrix are predicted by combining the user-based and item-based collaborative filtering approaches. The QoS values of services are then predicted for active users from the densified user-item matrix. Furthermore, the prediction results are used to realize effective service selection and recommendation. The presented method is validated on a large-scale data set of real service invocations. The experimental results indicate that the method obtains better prediction accuracy and is suitable for IoT service selection and recommendation.
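    The hybrid prediction step above can be illustrated with a short sketch (Python, not from the paper; the cosine similarity and the equal 0.5/0.5 weighting are illustrative assumptions):

        import numpy as np

        def predict_missing(Q, lam=0.5):
            """Fill np.nan entries of a user-service QoS matrix Q."""
            mask = ~np.isnan(Q)
            W = mask.astype(float)
            filled = np.where(mask, Q, 0.0)

            def cosine_sim(M):
                norm = np.sqrt((M ** 2).sum(axis=1, keepdims=True)) + 1e-12
                return (M / norm) @ (M / norm).T

            user_sim = cosine_sim(filled)            # user-user similarity
            item_sim = cosine_sim(filled.T)          # service-service similarity

            # user-based view: similarity-weighted average over users that observed the service
            pred_u = (user_sim @ filled) / (np.abs(user_sim) @ W + 1e-12)
            # item-based view: similarity-weighted average over services observed by the user
            pred_i = (filled @ item_sim) / (W @ np.abs(item_sim) + 1e-12)

            pred = lam * pred_u + (1 - lam) * pred_i # combine both views
            return np.where(mask, Q, pred)           # keep observed QoS values

        Q = np.array([[0.8, np.nan, 0.4], [0.7, 0.9, np.nan], [np.nan, 0.85, 0.5]])
        print(predict_missing(Q))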
  • YAO Yukun,YANG Jikai
    Computer Engineering. 2016, 42(11): 8-14. https://doi.org/10.3969/j.issn.1000-3428.2016.11.002
    In the existing Routing Protocol for Low-power and Lossy Networks(RPL), which selects the best parent according to the expected lifetime of the bottleneck node on the path, joining nodes do not consider the traffic generated by the other sub-nodes of the same parent node or the change of the bottleneck node after the node joins. Aiming at these problems, this paper presents an Energy-balanced and High-efficiency RPL(EBHE-RPL). Firstly, it uses a classifying sending mechanism for DAO-ACK messages to reduce the number of control packets sent by the parent node. Secondly, it uses a traffic accumulation mechanism to calculate the Expected Life Time(ELT) of the bottleneck node on each path, so that the calculated lifetime is closer to the actual value. Finally, it uses a bottleneck re-estimating mechanism after a node joins to avoid the bottleneck change. EBHE-RPL can improve the accuracy of selecting the best parent and balance the energy of the network. Simulation results show that, compared with the RPL and EB-RPL algorithms, EBHE-RPL improves the extension of network lifetime and the balance of network energy.
  • MIN Minghui,YANG Zhijia,LI Zhongsheng,LIU Zhifeng
    Computer Engineering. 2016, 42(11): 15-21,26. https://doi.org/10.3969/j.issn.1000-3428.2016.11.003
    This paper researches the delay-constrained scheduling of data with multiple sampling periods in industrial Internet of Things(IoT) networks. To achieve low-power, low-latency communication and increase network capacity in the process of data transmission, a load-based multi-slot frame scheduling algorithm is proposed. The strategy with a faster scan rate is performed with a higher priority to meet the real-time requirements of data with different sampling periods. To avoid conflict and interference in the network, the graph theory methods of matching and coloring are adopted to achieve deterministic allocation of communication resources. Simulation results demonstrate that the proposed algorithm can achieve highly reliable and real-time communication, and it can maximize the utilization of communication resources to increase network capacity and reduce power consumption.
  • ZHOU Rui,LUO Lei,LI Zhiqiang,SANG Nan
    Computer Engineering. 2016, 42(11): 22-26. https://doi.org/10.3969/j.issn.1000-3428.2016.11.004
    Advances in smartphones and built-in inertial sensors have given rise to pedestrian dead reckoning using smartphone sensors. However, an accurate Pedestrian Dead Reckoning(PDR) system using smartphone sensors is not available yet, because smartphone sensors are not accurate enough and pedestrians sway naturally while walking. Based on the analysis of pedestrian walking patterns, a new PDR algorithm using smartphone sensors is proposed. The algorithm first preprocesses the original acceleration data, then uses a finite state machine to detect the walking gait and thereby count steps. Step length is estimated by using the relationship between step length and acceleration as well as that between two consecutive steps, and the estimated result is smoothed by Kalman filtering. Experimental results show that the proposed algorithm is able to provide accurate step counts and step lengths, thus providing an accurate location service.
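    A minimal sketch of the step-length and smoothing stage (the Weinberg-style acceleration model, the constant K and the scalar Kalman filter below are assumptions for illustration, not the paper's exact formulas):

        import numpy as np

        def step_length(acc_window, K=0.45):
            # step length grows with the 4th root of the acceleration swing within one step
            return K * (acc_window.max() - acc_window.min()) ** 0.25

        def kalman_smooth(lengths, q=1e-3, r=1e-2):
            x, p, out = lengths[0], 1.0, []
            for z in lengths:
                p += q                      # predict: step length varies slowly
                k = p / (p + r)             # Kalman gain
                x += k * (z - x)            # update with the raw per-step estimate
                p *= 1 - k
                out.append(x)
            return out

        windows = np.random.default_rng(0).normal(9.8, 1.0, (5, 50))  # 5 fake step windows
        print(kalman_smooth([step_length(w) for w in windows]))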
  • LI Qi,YAO Long
    Computer Engineering. 2016, 42(11): 27-31,37. https://doi.org/10.3969/j.issn.1000-3428.2016.11.005

    In order to solve problems existing in Hulun lake's environmental monitoring system, such as insufficient monitoring points, high complexity and poor scalability, this paper puts forward an Internet of Things(IoT) platform for lake environmental monitoring based on the Representational State Transfer(REST) architecture. Sensor nodes and node status data are abstracted into resources. Due to its conciseness, high efficiency and loose coupling, this platform is suitable for lake environmental monitoring. The development environment is built on an Ali cloud ECS server, and with the aid of the Laravel framework and the lightweight jQuery front-end framework it realizes user management, a Geographic Information System(GIS), real-time data display and historical data query. This platform can meet the requirements of fishery, livestock, scientific research and government users, and eventually realize data sharing for the Hulun lake monitoring system.

  • HU Ying,LOU Hong
    Computer Engineering. 2016, 42(11): 32-37. https://doi.org/10.3969/j.issn.1000-3428.2016.11.006
    In Internet of Things(IoT)environment,in order to improve the IoT system efficiency,this paper proposes a message scheduling scheme considering node failure.It realizes message order rearrangement by using the shortest time scheduling mechanism.The best alternative node is found through the backup node selection mechanism based on saving energy and reducing the cost of backup node deployment.Simulation results show that the scheme can provide a faster response time of messages and a lower energy consumption of the IoT system.From the perspective of prolonging network life,it also provides the optimal number configuration of IoT backup nodes under the condition of different node failure probability.
  • FAN Cunqun,ZHAO Xiangang,HUANG Binbin,MA You,XIE Lizi,FENG Xiaohu
    Computer Engineering. 2016, 42(11): 38-42. https://doi.org/10.3969/j.issn.1000-3428.2016.11.007
    The satellite ground application system based on a cloud computing platform carries more and more tasks, which requires improving virtual resource utilization and task processing efficiency within limited system resources. Aiming at this problem, this paper proposes a resource mapping method based on cloud model similarity. Firstly, the cloud models of the resources required by task processing and of the distributable resources in the resource pool are defined. Then, the digital signatures of the task resource and usable resource cloud models are calculated according to the reverse cloud algorithm. Finally, a cloud model similarity measure is introduced to calculate the similarity between cloud models, thereby determining the prior resource mapping. Simulation results show that the proposed method can improve the resource utilization of cloud systems, and it can meet the timeliness requirements of multi-task processing.
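    The digital-signature step can be sketched with the standard reverse (backward) cloud generator, which summarizes a sample set by (Ex, En, He); the distance-based similarity below is an illustrative stand-in for the paper's measure:

        import numpy as np

        def reverse_cloud(samples):
            x = np.asarray(samples, dtype=float)
            ex = x.mean()                                       # expectation Ex
            en = np.sqrt(np.pi / 2.0) * np.abs(x - ex).mean()   # entropy En
            he = np.sqrt(max(x.var(ddof=1) - en ** 2, 0.0))     # hyper-entropy He
            return ex, en, he

        def cloud_similarity(sig_a, sig_b):
            # similarity decays with the distance between digital signatures
            return 1.0 / (1.0 + np.linalg.norm(np.subtract(sig_a, sig_b)))

        task_sig = reverse_cloud([2.1, 2.4, 1.9, 2.2, 2.0])   # resources a task requires
        node_sig = reverse_cloud([2.0, 2.3, 2.1, 1.8, 2.2])   # resources a node can offer
        print(cloud_similarity(task_sig, node_sig))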
  • YANG Changchun,WANG Weiwei,YE Shiren,SHEN Yongmei
    Computer Engineering. 2016, 42(11): 43-49,56. https://doi.org/10.3969/j.issn.1000-3428.2016.11.008
    The traditional community detection algorithm directly introduces a third-party algorithm, which reduces computation efficiency. Aiming at this problem, this paper proposes a microblog community detection method based on a finite interval limitation algorithm with multi-partition weight reduction. Firstly, the R-C model of the microblog community is studied and the properties of the weighted reduction curves of the parameters are analyzed, and an optimal partition algorithm is proposed for most parameter values based on the solution of a convex optimization problem. Secondly, the parameter range is confined to a set of finite intervals by partitioned sequential search of breakpoints, and the synchronous optimization of partition parameters is implemented, which solves the multi-information equilibrium problem of a single partition. Finally, a data set obtained from Sina microblog is used for experiments, and the results show that the proposed algorithm is more effective for detecting users' microblog communities than microblog detection algorithms based on theme-link relationships or label propagation.
  • MA Leilei,LI Hongwei,LIAN Shiwei,LIANG Rupeng,CHEN Hu
    Computer Engineering. 2016, 42(11): 50-56. https://doi.org/10.3969/j.issn.1000-3428.2016.11.009
    This paper introduces ontology semantics and proposes a new disaster focused crawler strategy to retrieve disaster theme webpages from the Internet efficiently and accurately. Firstly, the framework and process of the disaster focused crawler are designed, and an improved ontology semantic similarity calculation method is proposed. Secondly, the thematic semantic vector is calculated based on semantic similarity, the webpage text feature vector is obtained based on HTML location weighting, and the thematic relevance is calculated. Then a relevance calculation method for URL anchor text is proposed, URL link priority is analyzed, and the crawling queue is optimized. Earthquake disasters and meteorological disasters are selected for testing and analysis, and the experimental results show that the proposed strategy can improve stability and accuracy.
  • WU Wenming,LIU Xiping
    Computer Engineering. 2016, 42(11): 57-63,69. https://doi.org/10.3969/j.issn.1000-3428.2016.11.010
    It is increasingly important to perform service recommendation as more and more users and services are involved in service computing, but inauthentic service evaluations from a few users decrease the dependability and effectiveness of service recommendation. This paper proposes a service recommendation method based on trusted similar users. Suspicious evaluations are identified from the history of user evaluations and used to compute the degree of user reliability, and unreliable users are subsequently removed. The neighbor set of each trusted user is determined by computing user similarity. According to the preference of each user, candidate services from the neighbors are comprehensively evaluated and effectively recommended. Experimental results show that the method, by rejecting malicious users, recommends more reliable services and effectively improves the quality of recommendation.
  • GAO Yongbing,WANG Yu,MA Zhanfei
    Computer Engineering. 2016, 42(11): 64-69. https://doi.org/10.3969/j.issn.1000-3428.2016.11.011
    Automatic document summarization is an approach to obtaining important information from microblogs, but the short text, high redundancy and high noise of microblogs cause great difficulties for automatic summarization. For this problem, an event summary extraction algorithm based on the content and relativity of individual microblogs, called Content and Relativity PageRank(CR-PageRank), is presented. It uses a set of microblogs about an event to build an event graph, combines it with the content quality of each microblog, calculates the total weight of each microblog by using the CR-PageRank algorithm, and extracts representative microblogs to generate the initial summary. Readability processing is then applied to make the final summary more readable. Experimental results show that, compared with the TextRank and LexRank algorithms, the precision and recall rates are increased significantly, and the generated summary is more concise, contains more comprehensive information, and has better readability.
  • XIAO Kejun,YU Haibo,CHEN Yuting,ZHONG Hao
    Computer Engineering. 2016, 42(11): 70-75. https://doi.org/10.3969/j.issn.1000-3428.2016.11.012
    Many demand-driven points-to analysis techniques are proposed to suit environments bounded by strict limits of time and memory usage. Improving demand-driven points-to analysis in a flow-sensitive manner helps achieve precise points-to relations for some variables. Due to the strong flow and data dependencies in large-scale software systems, it is difficult to effectively identify all the program statements contributing to the points-to relations of the objective variables. This paper proposes a flow-sensitive program representation approach and defines a notion of Context Free Language(CFL) reachability which helps explore all the flow-sensitive points-to relations of the objective variables. It also develops a tool named Seeker, which can compute the points-to sets of the variables of interest. Experimental results show that the demand-driven points-to analysis algorithm improves the efficiency of Flow Sensitive Context Insensitive(FSCI) points-to analysis.
  • LI Qiang,SUN Zhenyu,LEI Xiaofeng,SUN Gongxing
    Computer Engineering. 2016, 42(11): 76-82. https://doi.org/10.3969/j.issn.1000-3428.2016.11.013
    Maximum use of local disk I/O resources is the key to improve computing cluster performance,but most of the scheduling algorithms in Hadoop system do not consider this factor.Aiming at this problem,a new task selection strategy is proposed,which takes the disk workload as a parameter in the procedure of MAP task selection and refers to each disk workload to choose the appropriate task during task scheduling,so as to achieve balanced disk workload on data nodes.Besides,a new task selection module is designed and integrated into the task scheduler of Hadoop.In order to further improve Hadoop system’s performance,an appropriate fully localized job execution mechanism is implemented.Experimental results prove that the proposed strategy makes full use of disk I/O resources,reduces I/O Wait by 5% on average,increases CPU utilization rate by 15% on average,and reduces the job execution time by 20%.
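    A minimal sketch of the disk-workload-aware task selection idea (the data structures and names are illustrative, not Hadoop's actual scheduler hooks):

        def pick_map_task(pending_tasks, disk_load):
            """pending_tasks: list of (task_id, disk_id) holding each task's input split.
            disk_load: dict disk_id -> current queued I/O on that disk."""
            best, best_load = None, float("inf")
            for task_id, disk_id in pending_tasks:
                load = disk_load.get(disk_id, 0)
                if load < best_load:        # prefer the task whose input disk is idlest
                    best, best_load = task_id, load
            return best

        tasks = [("t1", "sda"), ("t2", "sdb"), ("t3", "sda")]
        print(pick_map_task(tasks, {"sda": 7, "sdb": 2}))   # -> t2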
  • ZHOU Yilei,YU Haibo,ZHONG Hao
    Computer Engineering. 2016, 42(11): 83-88. https://doi.org/10.3969/j.issn.1000-3428.2016.11.014
    During the evolution of software, debugging programs is an important and necessary process. For multithreaded programs, interleaving and scheduling are non-deterministic, and different schedules can give different results, so it is difficult for developers to debug a multithreaded program. Therefore, this paper analyzes the concurrency bugs in real projects, presents a new multithreaded debugging tool, and proposes the concept of the sequence point. It designs a scheduling language for debugging, instruments the test programs at the bytecode level, and makes threads execute in a desired schedule. A debugging plugin is implemented in Eclipse. Experimental results show that, compared with the existing tool IMunit, the tool reduces developers' workload, performs better in two debugging scenarios, and has better usability.
  • YU Xinsheng,PANG Tao
    Computer Engineering. 2016, 42(11): 89-94. https://doi.org/10.3969/j.issn.1000-3428.2016.11.015
    To meet the demand for high availability of military computer systems,combined with the analysis of the concurrent processing architecture among the multiple system boards,a CPCI-based high availability hot swap technology is designed and implemented.The bus control modules are deployed on the system boards to manage the connection and disconnection between on-board processors and PCI buses.Once some failures are detected within the current PCI master device,the bus control module chooses some standby system board as the next PCI master device using the PCI master device selection algorithm.Experimental results show that this technology has little impact on the system startup time and memory occupation,and it meets the real-time,efficient processing and high availability requirements of military computer systems.
  • ZHANG Shaohua,JIAO Yi,CHEN Gang
    Computer Engineering. 2016, 42(11): 95-101,108. https://doi.org/10.3969/j.issn.1000-3428.2016.11.016
    In order to realize efficient collaboration in the construction domain, this paper presents a Building Information Model(BIM) cloud platform modeling method for collaborative governance, proposes a BIM collaborative governance framework and divides general BIM applications into three patterns: preBIM, BasicBIM and BIMinstance. It defines each BIM pattern with respect to operation, procedure and parallelism, and forms a formal BIM cloud platform model based on object-oriented Petri net theory. This model abstracts the BIM cloud platform as four subnets and their interaction relationships. A Hadoop-based BIM cloud platform is built under the guidance of the proposed model. Collision detection case results show the effectiveness of the BIM cloud platform in collaborative governance.
  • SHI Yun,CHEN Zhong,MENG Xianyong
    Computer Engineering. 2016, 42(11): 102-108. https://doi.org/10.3969/j.issn.1000-3428.2016.11.017
    In order to solve the excessive energy consumption and decline of network transmission quality caused by improper clustering in large-scale Wireless Sensor Network(WSN) deployment, this paper presents a data transmission support method for WSN based on self-partitioned cluster heads. It specifies the node with the strongest radio frequency signal as the Cluster Head(CH) node. In each area, by calculating the energy threshold of each Cluster Member(CM) node, it selects the node with the best threshold and the shortest polling time as a backup node for the CH node. It analyzes and calculates the best time to trigger this rotation so as to find the constraints that minimize network energy consumption and reduce the energy expenditure during the reconstruction of the network topology, thereby prolonging the life cycle of the WSN and guaranteeing efficient data transmission during its lifetime. Experimental results show that the proposed mechanism can effectively increase the stable operation time of the network, improve the quality of network transmission, and save energy consumption in the whole network.
  • LIAO Jie,ZHANG Lei,MA Sasa
    Computer Engineering. 2016, 42(11): 109-113. https://doi.org/10.3969/j.issn.1000-3428.2016.11.018

    Aiming at problems of poor adaptability and complex calculation of node distribution density for the traditional node deployment strategy,a Wireless Sensor Network(WSN) coverage control optimization strategy based on adjustable node distribution density and hybrid scheduling of redundant nodes is proposed.An adjustable parameter is introduced to make a certain proportion of nodes uniformly distributed in the whole monitoring area,and the remaining nodes are deployed in a certain density relationship.A hybrid scheduling strategy of redundant nodes is realized by combining the node scheduling algorithm of Redundant Node Judging Based on Grid(RNJBG) and the Energy Balanced Non-uniform Distribution Node Scheduling Algorithm(EBNDNS) to optimize network coverage.The simulation results show that the strategy can effectively prolong the network lifetime under the premise of ensuring the quality of network coverage.

  • HUA Hailiang,GUAN Weiguo,LIU Zhijian,SUN Zehong
    Computer Engineering. 2016, 42(11): 114-119. https://doi.org/10.3969/j.issn.1000-3428.2016.11.019
    Aiming at the problem that beacon coverage is limited and positioning accuracy is low when indoor WiFi and Bluetooth are used separately for localization, an optimal Bayesian fusion localization algorithm based on WiFi and Bluetooth positioning data is proposed. The algorithm first uses a Gaussian kernel function to process the separate WiFi and Bluetooth positioning results as prior sample information. The rank sum test is used to calculate the credibility of the WiFi and Bluetooth information sources. Then the fusion results and their distribution are obtained by fusing the multi-source prior information. Finally, the optimal estimate of the WiFi-Bluetooth fused positioning coordinates is obtained by optimizing the Bayesian distribution density function and estimating the coordinate deviation. Experimental results show that this algorithm can effectively improve the cooperative positioning accuracy of WiFi and Bluetooth. With a Gaussian noise standard deviation of 3 dBm, the probability of a positioning error of less than 1 meter reaches 95%, and the positioning performance is obviously superior to that of the separate WiFi and Bluetooth localization algorithms.
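    The fusion idea can be sketched as reliability-weighted averaging of the two position estimates (the Gaussian-kernel weighting below stands in for the paper's rank-sum-based credibility and is only an assumption):

        import numpy as np

        def kernel_credibility(residuals, sigma=1.0):
            # credibility of one source from its recent positioning residuals (m)
            r = np.asarray(residuals, dtype=float)
            return float(np.exp(-(r ** 2) / (2 * sigma ** 2)).mean())

        def fuse(pos_wifi, pos_bt, w_wifi, w_bt):
            w = np.array([w_wifi, w_bt]); w /= w.sum()     # normalized credibilities
            return w[0] * np.asarray(pos_wifi) + w[1] * np.asarray(pos_bt)

        w_wifi = kernel_credibility([1.2, 0.8, 1.5])       # WiFi residuals
        w_bt = kernel_credibility([0.4, 0.6, 0.5])         # Bluetooth residuals
        print(fuse((3.0, 4.2), (2.6, 4.0), w_wifi, w_bt))  # fused (x, y) estimate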
  • FAN Xinyue,SU Yantao,ZHOU Fei
    Computer Engineering. 2016, 42(11): 120-124,130. https://doi.org/10.3969/j.issn.1000-3428.2016.11.020
    Traditional Compressed Sensing(CS) based channel estimation methods are difficult to implement due to their high computational complexity. To solve this problem, the Generalized Orthogonal Matching Pursuit(GOMP) algorithm is used for channel estimation, which reduces computational complexity by selecting multiple indices in each iteration. Considering the power distribution of the wireless channel, a classifying-back-tracing based Generalized Orthogonal Matching Pursuit(Cbt-GOMP) algorithm is proposed. This algorithm utilizes classification to select atoms and takes advantage of back-tracing to remove non-matching atoms, which guarantees fast and accurate atom selection. Simulation results show that this algorithm can not only guarantee channel estimation accuracy, but also reduce computational complexity effectively. Additionally, the method has good robustness.
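    The GOMP core that Cbt-GOMP builds on can be sketched as follows (a plain GOMP iteration in Python; the classification and back-tracing refinements are not reproduced, and the sizes are illustrative):

        import numpy as np

        def gomp(A, y, sparsity, S=2):
            n = A.shape[1]
            support, r = [], y.copy()
            while len(support) < sparsity:
                corr = np.abs(A.T @ r)                 # correlation with the residual
                corr[support] = -np.inf                # do not reselect chosen atoms
                picked = np.argsort(corr)[-S:]         # take S best atoms per iteration
                support = sorted(set(support) | set(int(i) for i in picked))
                x_s, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
                r = y - A[:, support] @ x_s            # least-squares residual update
            x = np.zeros(n)
            x[support] = x_s
            return x

        rng = np.random.default_rng(1)
        A = rng.normal(size=(32, 64))
        x_true = np.zeros(64); x_true[[3, 17, 40]] = [1.0, -0.7, 0.5]
        print(np.argsort(np.abs(gomp(A, A @ x_true, sparsity=3)))[-3:])  # recovered atoms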
  • FENG Zhengyong
    Computer Engineering. 2016, 42(11): 125-130. https://doi.org/10.3969/j.issn.1000-3428.2016.11.021
    In a single-input single-output wireless data packet transmission system, the Markov Decision Process(MDP) model is generally used to obtain an optimal transmission rate control policy that reduces the packet loss rate. However, the existing model ignores packet loss caused by queue overflow, so the result obtained by model calculation is inaccurate compared with the practical result. To resolve this problem, an improved cross-layer MDP model is proposed. In this model, a queue overflow packet loss factor is introduced into the one-step transition reward function, and a new optimal transmission rate control policy is obtained from the improved MDP model. Simulation results show that the optimal transmission rate control policy obtained from the improved MDP model is more accurate than the previous one, reducing the simulation deviation error from 12% to 3% on average and yielding a better optimal target value.
  • HU Min,LUO Lan,KOU Lan,HUANG Hongcheng
    Computer Engineering. 2016, 42(11): 131-138. https://doi.org/10.3969/j.issn.1000-3428.2016.11.022
    In Delay Tolerant Network(DTN), the forwarding mechanism can cause relay nodes to refuse to forward messages because of their selfish behavior, which results in a decline in network performance such as a lower message delivery ratio and longer transmission latency. Focusing on this problem, this paper puts forward a selfish DTN message forwarding mechanism Based on Node Behavior Analysis(BNBA). On the basis of multi-copy transmission and by studying the characteristics of cooperation and non-cooperation behaviors between encountered nodes, a node state probability transfer model is established and the process of message delivery between nodes is predicted. According to the message copy forwarding conditions, neighbor nodes are selected for message copy injection and forwarding. If a message copy does not reach its destination, a submission method based on the encountering probability of the destination is enabled. Simulation results show that, compared with Epidemic+TFT, Spray and Wait+TFT and Bubble Rap, the proposed mechanism achieves better message delivery ratio, transmission delay and network overhead.
  • QIU Yang,WANG Yijun,XUE Zhi
    Computer Engineering. 2016, 42(11): 139-146. https://doi.org/10.3969/j.issn.1000-3428.2016.11.023
    The traditional static analysis method cannot handle the interaction between a script and the network, and it introduces inaccessible paths, while dynamic analysis requires setting up an experimental environment and manual analysis. To solve these problems, this paper proposes a Python attack script analysis platform called PyExZ3+ based on symbolic execution. Through dynamic symbolic execution and path exploration of Python scripts, it can obtain the input traffic and the corresponding output attack payload, which realizes automatic analysis of Python attack scripts. PyExZ3+ uses loop identification and runtime solver optimization strategies to improve path coverage and the efficiency of symbolic execution. Experimental results show that PyExZ3+ has higher path coverage and execution efficiency compared with existing symbolic execution tools such as CHEF and PyExZ3. Besides, PyExZ3+ can dynamically detect the target script's payload and perform feasible automated analysis efficiently.
  • WU Weibin,LIU Gongshen
    Computer Engineering. 2016, 42(11): 147-151. https://doi.org/10.3969/j.issn.1000-3428.2016.11.024
    In order to improve the anti-attack capability of classifiers in an adversarial environment during the training stage, this paper proposes a new attack simulation algorithm. The member classifiers are fitted to simulate and obtain the decision boundary used by the worst-case attack, and member classifiers with poor performance are removed according to a threshold setting. The final result of the proposed algorithm is superior to that of the mimicry attack algorithm. Experimental results show that this algorithm has no need to obtain the specific information of the target classifier, and it achieves higher security while maintaining classification accuracy.
  • WAN Liuchan,WEI Yongzhuang
    Computer Engineering. 2016, 42(11): 152-157. https://doi.org/10.3969/j.issn.1000-3428.2016.11.025
    SPECK is a family of lightweight block cipher algorithms. The SPECK cipher receives much attention due to its excellent platform applicability and software implementation performance. Aiming at the security risk of the SPECK algorithm in resisting cube attacks, as well as the confusion and diffusion of key bits in the internal structure of the algorithm, this paper applies a cube attack to the SPECK32/64 algorithm by combining quadraticity tests with cube tests. It finds that 17 key bits can be recovered with a time complexity of about 2^47 when the SPECK32/64 algorithm is reduced to three rounds. Applying cube tests to five to seven rounds of the SPECK32/64 algorithm, it is found that neutral key bits can be captured. The results show that SPECK32/64 can effectively resist cube analysis only if more than 8 rounds are iterated.
  • WANG Hui,KANG Kaihang,LIU Shufen
    Computer Engineering. 2016, 42(11): 158-164. https://doi.org/10.3969/j.issn.1000-3428.2016.11.026
    For the characteristics of insider threats in the Internet, such as high concealment and difficult management, this paper proposes a Bayesian Network Attack Graph(BNAG) model aimed at insider threats. It takes the behavior of the attacker in the attack process as the research object, takes the resource point that the behavior points to as the basis, and analyses internal threats through two-tuples of resources and behaviors. Relying on the BNAG model, it quantitatively analyses the relationships between resources and behaviors, and between behaviors and resources, in the model, and proposes a Probability of Attack Structure Graph(PASG) computational model based on Bayesian reasoning. It forecasts and analyses internal threats with a modified likelihood weighting method. The example analysis result shows that the proposed model can effectively forecast and prevent insider threats.
  • WANG Wen,HUANG Kaizhi,MA Lin
    Computer Engineering. 2016, 42(11): 165-169,176. https://doi.org/10.3969/j.issn.1000-3428.2016.11.027
    In a multi-user network, selfish users refuse to cooperate with other users in order to save their own energy and computing resources. To encourage selfish users to offer their resources, a cooperation incentive mechanism based on user trust degree evaluation is proposed. Relays decide whether to participate in cooperation by comparing the trust values of communicating users. The lower the trust value of a selfish user, the more likely it is to be rejected when it needs cooperation, while the higher the trust value of a user, the more likely it is to obtain cooperation. Simulation results demonstrate the effectiveness of the proposed scheme: selfish users actively participate in cooperation to increase their own trust values, and the communication quality is close to that of cooperation with selfless relays.
  • MA Yang,QIANG Xiaohui,CAI Bing,WANG Linru
    Computer Engineering. 2016, 42(11): 170-176. https://doi.org/10.3969/j.issn.1000-3428.2016.11.028
    Existing domain name detection schemes face difficulties in dealing with large-scale data and various malicious domains.Aiming at this problem,this paper designs a malicious domain detection scheme based on the features of the timeliness,relevant domain set and the corresponding IP.It uses parallelized random forests algorithm to build the classifier and process large-scale data,which improves classification precision and fault tolerance.Experimental result shows that,compared with decision tree classifier,the combined classifier has better performance in precision and accuracy,which can solve the problem of malicious domain detection in large-scale network environment more efficiently.
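    A minimal sketch of the classifier stage (scikit-learn's random forest with parallel tree building; the three feature columns are illustrative stand-ins for the paper's timeliness, related-domain-set and IP features, and the data are synthetic):

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(0)
        # columns: domain lifetime (days), related-domain set size, distinct IP count
        X = np.vstack([rng.normal([400, 5, 2], [100, 2, 1], (200, 3)),    # benign-like
                       rng.normal([20, 40, 15], [10, 10, 5], (200, 3))])  # malicious-like
        y = np.array([0] * 200 + [1] * 200)

        clf = RandomForestClassifier(n_estimators=200, n_jobs=-1)  # trees built in parallel
        clf.fit(X, y)
        print(clf.score(X, y), clf.feature_importances_)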
  • WANG Lei,HOU Zhengfeng,XIANG Runzhao,SHI Zhaopeng
    Computer Engineering. 2016, 42(11): 177-181,188. https://doi.org/10.3969/j.issn.1000-3428.2016.11.029
    The garbage code control flow obfuscation algorithm based on random insertion strategy has uncertain obfuscation potency and extra overhead.To solve the problem,a control flow Obfuscation algorithm based on Nesting Complexity(OB_NC) is proposed.The algorithm quantitatively calculates the overhead introduced by obfuscation,measures the complexity of control flow with nesting complexity metric and selects the insertion point of garbage code considering the intensity and cost of obfuscation with the idea of decision table based on group knapsack theory,which improves the obfuscation potency as much as possible within the threshold.Analysis and experiment results show that compared with the control flow obfuscation algorithm based on random insertion strategy,the OB_NC algorithm has higher obfuscation potency and can effectively control the extra overhead caused by obfuscation.
  • FENG Wenlin
    Computer Engineering. 2016, 42(11): 182-188. https://doi.org/10.3969/j.issn.1000-3428.2016.11.030
    The traditional security services mostly put virus killing entities inside the users’ operating systems,which causes huge resource expenditure and waste,and the security software itself is in the unsafe environment,thus being easily destroyed by malicious programs,so it is difficult to ensure the integrity of the security service.Therefore,this paper proposes a new virus killing mechanism with no proxy.The service entity is placed outside the protected operating system,and no plug-in or proxy program is installed in the protected system.Based on virtualization platform,online and offline virus killing are realized by using virtualization technology,and no plug-ins are required to be inserted into the system,which ensures the integrity of the service.Experimental results show that,the proposed offline virus killing security service has better user transparency,can effectively detect whether the current process of the system is malicious code or not,and successfully terminates malicious programs.
  • LIU Jie,ZHAO Lei
    Computer Engineering. 2016, 42(11): 189-194,201. https://doi.org/10.3969/j.issn.1000-3428.2016.11.031
    Aiming at the influence of secondary disasters on resource distribution in emergency rescue, this paper proposes an algorithm for collaborative decision making in emergency resource distribution. Firstly, it takes minimizing the time to complete the distribution tasks as the optimization objective and constructs a two-layer optimization model for emergency resource distribution based on a random strategy, which considers secondary disasters such as damaged roads and mudslides. Meanwhile, in order to solve the multi-extremum problem of the optimization model, a Memetic distribution algorithm composed of the Single Objective Vehicle Routing Problem(SVRP) and the Two-layer Multi Objective Vehicle Routing Problem(MVRP) is built based on Differential Evolution(DE) and Q reinforcement learning theory. Experimental results show that the proposed algorithm has higher convergence speed and convergence precision than the Multi-start and Branch-and-cut algorithms.
  • LIU Liangxuan,HUANG Mengxing
    Computer Engineering. 2016, 42(11): 195-201. https://doi.org/10.3969/j.issn.1000-3428.2016.11.032
    To solve the problem that traditional topic models based on multinomial distribution cannot properly capture the condition of word burstiness, a continuous-time topic model with Dirichlet Compound Multinomial(DCM) for word burstiness is proposed, which integrates the inherent temporal information in the corpus. In this model, the phenomenon of word burstiness is modeled by the DCM distribution, while temporal features are characterized by the Beta distribution. Gibbs sampling and the fixed-point iteration method are employed to estimate the parameters of the model. Experimental results demonstrate that the model has obvious advantages over ToT and DCMLDA in terms of generalization performance when the given number of topics is small, and it can also effectively reveal the latent evolution of topics in the corpus.
  • SUN Daming,ZHANG Bin,ZHANG Shubo,MA Anxiang
    Computer Engineering. 2016, 42(11): 202-206. https://doi.org/10.3969/j.issn.1000-3428.2016.11.033
    Some users have only a few records in the search log, which is not enough to provide them with personalized query recommendation services. For these users, providing targeted recommendation services can improve their satisfaction with the recommendation results. To solve this problem, this paper proposes a differentiated query recommendation method for different search backgrounds. Users are divided into different groups according to their search behavior, and different recommendation services are provided for different groups, realizing differentiated recommendation among user groups. Experimental results on real datasets show that this method can reduce the risk of recommendation failure and improve user satisfaction with the recommendations.
  • LIU Xianfeng,GUO Linyuan
    Computer Engineering. 2016, 42(11): 207-212. https://doi.org/10.3969/j.issn.1000-3428.2016.11.034
    In order to improve the subgraph mining efficiency of the Genetic Algorithm(GA) in complex networks, this paper designs a new data structure named Adjacency Tree(AT). AT uses a double-tree structure developed from the chain structure of the Adjacency List(AL), which means that the head nodes and list nodes of the original adjacency list are both organized by AVL trees. AT reduces the time complexity to O(lb(n^2)) and the space complexity to O(n). Experiments with a Multi-objective Genetic Algorithm(MOGA) on biological network and social network datasets show that AT achieves better mining performance than the AL and Orthogonal List(OL) on large datasets, and it also has better generality.
  • WU Jie,LIANG Yan,MA Yuan
    Computer Engineering. 2016, 42(11): 213-218. https://doi.org/10.3969/j.issn.1000-3428.2016.11.035
    This paper researches on attribute reduction method of concept lattice.It proves that any attribute defective value of concept lattice is a discernible attribute set.It also proves that discernibility functions for attribute defective values set of infimum irreducible concepts and all attribute defective value set have the same minimum disjunction normal form.Meanwhile,this paper proves that concept element is an infimum irreducible concept if and only if it is an attribute concept.The set composed of arbitrary elements taken from attribute defective values of each attribute concept definitely is an attribute reduction.A rapid method is presented for obtaining all attribute reductions of concept lattice from the large context based on the existing correlative method and corresponding algorithm is given.The time and space complexity of the algorithm are proved in the polynomial form.Analysis result shows that the proposed method has the advantages of no harsh terms,wide reduction range,short running time and good reduction effect.
  • TAO Wenhua,LIU Hongtao
    Computer Engineering. 2016, 42(11): 219-224. https://doi.org/10.3969/j.issn.1000-3428.2016.11.036
    In order to avoid conflict between sub-objectives and improve the quality of Pareto optimal solutions for multi-objective optimization problems, a Hybrid algorithm based on Differential Evolution(DE) and Non-dominated Sorting Genetic Algorithm Ⅱ(HDE-NSGA-Ⅱ) is proposed. First of all, a DE algorithm with self-adaptive parameters is used for the mutation and crossover operations of the initial population, so that population diversity is improved. Secondly, a new population marking strategy is adopted to apply domination sorting to the initial population and the trial population of DE and obtain a new population whose individuals are marked, which enables DE to handle multi-objective problems. Finally, the new population, as the initial population of NSGA-Ⅱ, generates the next generation population by NSGA-Ⅱ, which further improves the quality of the Pareto optimal solutions. Four multi-objective benchmark functions are tested with HDE-NSGA-Ⅱ, NSGA-Ⅱ and SADE. Experimental results show that the convergence rate of the proposed algorithm is faster and the spatial distribution of its Pareto optimal solution set is more uniform than those of the other two algorithms.
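    One DE generation of the kind used in the first stage can be sketched as follows (DE/rand/1/bin with per-individual F and CR drawn at random, which is one common self-adaptive scheme and not necessarily the paper's; the single-objective sphere selection here stands in for the NSGA-Ⅱ domination marking):

        import numpy as np

        def de_generation(pop, fitness, rng):
            n, d = pop.shape
            new_pop = pop.copy()
            for i in range(n):
                F, CR = rng.uniform(0.4, 0.9), rng.uniform(0.1, 0.9)  # self-adaptive params
                r1, r2, r3 = rng.choice([j for j in range(n) if j != i], 3, replace=False)
                mutant = pop[r1] + F * (pop[r2] - pop[r3])            # DE/rand/1 mutation
                cross = rng.random(d) < CR
                cross[rng.integers(d)] = True                         # keep at least one gene
                trial = np.where(cross, mutant, pop[i])               # binomial crossover
                if fitness(trial) <= fitness(pop[i]):                 # greedy selection
                    new_pop[i] = trial
            return new_pop

        rng = np.random.default_rng(0)
        sphere = lambda x: float((x ** 2).sum())
        pop = rng.uniform(-5, 5, (10, 3))
        for _ in range(50):
            pop = de_generation(pop, sphere, rng)
        print(min(sphere(x) for x in pop))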
  • ZHU Jing,PENG Zhan
    Computer Engineering. 2016, 42(11): 225-232. https://doi.org/10.3969/j.issn.1000-3428.2016.11.037
    When dealing with filtering results in dense environments or with many maneuvering tracks, fuzzy track correlation algorithms associate much better than basic algorithms. However, when the number of elements in the fuzzy factor set is large and the fuzzy factor weights are relatively evenly distributed, it becomes difficult to differentiate the appraisal results, which leads to ambiguous track association. Therefore, this paper studies and improves the fuzzy track association algorithm and, by introducing a double-layer structure into the synthetic appraisal of fuzzy mathematics, puts forward a fuzzy synthetic decision track association algorithm based on a double-layer architecture. Simulation results show that, compared with the former fuzzy synthetic decision track association algorithm, the improved algorithm increases the average correct track association rate by 2.85 percent.
  • YANG Kaibin,TANG Lijun,LIU Xiaochun,WU Dingxiang,BIAN Yijie,LI Zhenglong
    Computer Engineering. 2016, 42(11): 233-237. https://doi.org/10.3969/j.issn.1000-3428.2016.11.038
    The difference in imaging mechanisms between images from different sources leads to large deformation, illumination and other differences between two such images of the same scene, so it is difficult to directly use Scale Invariant Feature Transform(SIFT) or Speeded Up Robust Features(SURF) for their registration. For this reason, a registration method based on texture common factors is proposed for images from different sources. The images are transformed to the frequency domain by the Fourier transform and filtered with Gabor templates, and the Sobel operator is used to extract the texture common factors in the spatial domain. Matching points are selected by a custom rule and purified by Random Sample Consensus(RANSAC). From the purified matching points, the method obtains the homography transformation parameters. Finally, after coordinate transformation and interpolation analysis, registration between the images from different sources is achieved. Experimental results show that the proposed method has higher matching accuracy and robustness compared with SIFT and SURF.
  • ZHOU Xianchun,ZENG Bin
    Computer Engineering. 2016, 42(11): 238-243. https://doi.org/10.3969/j.issn.1000-3428.2016.11.039
    Aiming at the staircase effect and isolated points in the classical PM and YK models, this paper proposes a combinatorial operator model. Regularization is performed on the image to be processed, and the gradient in the above classical models is combined with the Laplace operator according to a certain weighting function. Then, in view of the excessive smoothing in nonlinear diffusion, an edge harmonic operator is used for modification. Experimental results show that this model not only effectively controls the staircase effect and isolated points, but also retains the detailed texture features of the image quite well, and it has an obvious denoising effect.
  • YE Xueyi,SONG Qianqian,GAO Zhen,HUAN Tianshu,WANG Yunlu
    Computer Engineering. 2016, 42(11): 244-248,254. https://doi.org/10.3969/j.issn.1000-3428.2016.11.040
    Because of the target uncertainty in underwater acoustic data and the gray value overlap between various substances,it is hard to solve the classification problem by only using maximum histogram or entropy of histogram.This paper proposes a classification algorithm which is based on the conditional entropy of histogram.Firstly,the underwater acoustic data is segmented according to its cumulative histogram and the largest value of each section histogram is calculated.Secondly,by using the discrimination of conditional entropy of subsection histogram,a classification threshold is calculated for each section.Finally,the classification threshold is assigned an opacity transfer function.The classification of underwater acoustic data is completed.Experimental results show that the algorithm proposed in the paper achieves better effect in classification of underwater acoustic data.The rendering result of suspected target is clearer and the detail information is richer.
  • HUANG Yonggang,LIANG Xingang
    Computer Engineering. 2016, 42(11): 249-254. https://doi.org/10.3969/j.issn.1000-3428.2016.11.041
    To overcome the disadvantages of the Local Binary Fitting(LBF) model that it cannot segment texture images and converges too slowly, an active contour model combining local fitting energy and regional similarity is presented. It introduces an acceleration factor based on image gradient information and a similarity term based on local similarity fitting. Image noise is inhibited and segmentation accuracy is improved by introducing the similarity information of regional intensity distributions into the energy function, while segmentation speed is improved by introducing gradient information to amplify the driving force near the object contour. Compared with the LBF model, the presented model is more robust to noise, can segment texture images, and has a higher segmentation speed.
  • YANG Tao,TIAN Huaiwen,LIU Xiaomin,KE Xiaotian,GAO Songsong,MA Mengjie
    Computer Engineering. 2016, 42(11): 255-260,266. https://doi.org/10.3969/j.issn.1000-3428.2016.11.042
    As a classical algorithm,Otsu algorithm is applied widely in image segmentation field.The application of Two-dimensional(2D) Otsu algorithm based on Otsu algorithm is limited for the long computing time and poor anti-noise capacity.Therefore,this paper proposes a modified Otsu algorithm.By using new area partition method,the proposed algorithm respectively applies three edge detection operators(Sobel,Log and Canny) combined with linear fitting method to limit the object and background between a pair of boundaries which are parallel to the diagonal,and then uses the domain average value of noise pixels instead of the grey value.Finally the 2D Otsu oblique segmentation method is used to separate object from background.Experimental results show that compared with the traditional 2D Otsu algorithm and its modified algorithms,the proposed algorithm not only has a short operation time but also has a high quality of segmentation,a strong anti-noise capacity and a preferable adaptive ability.
  • YU Shuineng,WEI Ning,DONG Fangmin
    Computer Engineering. 2016, 42(11): 261-266. https://doi.org/10.3969/j.issn.1000-3428.2016.11.043
    This paper proposes a novel image rotation object detection algorithm, which is especially suitable for detecting objects with non-regular and random rotation symmetry characteristics in real-world images. Based on the implicit model representation, the algorithm counts the spatial distribution of key points in the image and estimates the rotation center of the object. It extracts visual interest points from the image's middle-level features, and unsupervised learning is employed in the key point feature space to label each key point position as one of the symmetry feature clusters. A center probability map is then obtained for each cluster by summing the values voted at every position under a given radius, and the maps are weighted and summed to obtain the global rotation center probability map, from which the coordinates of the rotation center are obtained by a saliency detection method. Experimental results show that the algorithm can effectively detect irregular rotationally symmetric objects in real-world images, and it also has higher accuracy in estimating the rotational symmetry center.
  • ZHAO Zhengkang,LIU Ningzhong,LI Wei
    Computer Engineering. 2016, 42(11): 267-271. https://doi.org/10.3969/j.issn.1000-3428.2016.11.044
    The performance of image inpainting depends on how well the known information is used to repair the missing parts of the image, and fast inpainting is also required. Aiming at these two key issues, an inpainting algorithm is proposed which selects samples according to the average gray entropy and obtains the final matching block through weighted synthesis. The algorithm divides the image to be repaired into grids and calculates the average local entropy in each grid. The Otsu threshold segmentation algorithm is used to classify all grid areas into two mutually exclusive sets. The proposed algorithm determines the range of the sample blocks according to the average gray entropy of the grid in the area to be repaired, picks sample blocks according to the Sum of Squared Differences(SSD) criterion, uses an attenuation function to determine the weight of each sample block, and ultimately synthesizes the final sample block. Experimental results show that the proposed algorithm can achieve a good inpainting effect and greatly improve the speed of inpainting.
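    The SSD-plus-weighted-synthesis step can be sketched as follows (the window size and the exponential attenuation form are illustrative assumptions, not the paper's exact choices):

        import numpy as np

        def ssd(patch_a, patch_b, valid):
            return float((((patch_a - patch_b) ** 2)[valid]).sum())  # compare known pixels only

        def synthesize_patch(target, valid, candidates, sigma=10.0):
            scores = np.array([ssd(target, c, valid) for c in candidates])
            w = np.exp(-scores / (2 * sigma ** 2))       # attenuation: worse matches weigh less
            w /= w.sum()
            return np.tensordot(w, np.stack(candidates), axes=1)     # weighted synthesis

        rng = np.random.default_rng(0)
        target = rng.random((9, 9))
        valid = np.ones((9, 9), bool); valid[3:6, 3:6] = False       # unknown center region
        candidates = [rng.random((9, 9)) for _ in range(5)]
        print(synthesize_patch(target, valid, candidates).shape)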
  • TANG Chunming,JIANG Ang
    Computer Engineering. 2016, 42(11): 272-276,280. https://doi.org/10.3969/j.issn.1000-3428.2016.11.045
    An improved algorithm is proposed based on cross-scale cost aggregation for stereo matching because the existing stereo matching algorithms cannot take both accuracy and speed into account.Firstly the matching cost volume is computed by the intensity and gradient algorithm.Then the matching cost volume is aggregated by guide filtering and the matching cost volume of different scales is aggregated by the cross-scale model.The patch matching approximation algorithm is used instead of the traditional Winner Taker All(WTA) algorithm to calculate the initial parallax,and the weighted median filtering is adopted for subsequent processing in parallax elaboration.Experimental results show that the algorithm can rapidly obtain the disparity map and also improve the matching accuracy.
  • ZHANG Zhe,YANG Min,ZHU Zhengtao
    Computer Engineering. 2016, 42(11): 277-280. https://doi.org/10.3969/j.issn.1000-3428.2016.11.046
    Because of rust on metal surfaces, line trace images are easily affected by noise, which causes problems for image feature extraction, comparison and analysis. Traditional denoising methods such as the Gaussian filter damage and shift the edges, while the mean filter cannot effectively distinguish between edges and background. Therefore, a new denoising method is proposed in this paper. The diffusion model of this partial differential equation filtering algorithm is the PM equation. According to the textural features of striation mark images, different weight functions are introduced in different diffusion directions, and the diffusion threshold is set via the gray level histogram of the image. The method can eliminate image noise without destroying image edges. Experimental results show that the algorithm is superior to the PM equation and the LinShi operator, and it has good application value in line trace image processing.
  • REN Yao,LI Guofu,YING Xiaogang,WANG Xiaodan
    Computer Engineering. 2016, 42(11): 281-284. https://doi.org/10.3969/j.issn.1000-3428.2016.11.047
    Frictional vibration and noise signals are difficult to acquire, their anti-interference ability is poor, and large numbers of samples are hard to obtain in production. In view of this, a scheme for recognizing the friction and wear state of machine tool guideways based on wavelet packet entropy and Support Vector Machine(SVM) is put forward in this paper. The signal is decomposed into independent adjacent node bands by wavelet packet decomposition, and a comparative experiment is designed to obtain the wavelet packet node sequences corresponding to the characteristic frequency band of guideway friction. Feature vectors are established from the sequence of wavelet packet energy entropy and serve as the input parameters of the SVM. Experimental results show that the average recognition rates of the SVM classifiers established with the polynomial kernel function and the radial basis kernel function reach 72.2% and 83.3% respectively, showing good prediction generalization ability and high recognition accuracy.
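    A minimal sketch of the feature pipeline (PyWavelets wavelet-packet decomposition feeding an RBF SVM; the wavelet, decomposition level, synthetic signals and labels are all illustrative assumptions):

        import numpy as np
        import pywt
        from sklearn.svm import SVC

        def wp_energy_entropy(signal, wavelet="db4", level=3):
            wp = pywt.WaveletPacket(signal, wavelet, maxlevel=level)
            energy = np.array([np.sum(np.square(node.data))
                               for node in wp.get_level(level, order="natural")])
            p = energy / energy.sum()              # band energy distribution
            return -p * np.log(p + 1e-12)          # per-band energy entropy vector

        rng = np.random.default_rng(0)
        X = np.array([wp_energy_entropy(rng.normal(0, s, 1024)) for s in (1, 1, 3, 3)])
        y = [0, 0, 1, 1]                           # two synthetic wear-state labels
        clf = SVC(kernel="rbf").fit(X, y)
        print(clf.predict(X))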
  • HU Xia,YANG Yuhong,JIANG Lin
    Computer Engineering. 2016, 42(11): 285-289. https://doi.org/10.3969/j.issn.1000-3428.2016.11.048
    Matching Pursuit(MP) is a greedy sparse representation algorithm applied in signal processing. Its computational complexity is high because atom selection requires traversal matching, and the matching process has to know the complete signal to be processed, which limits its application in real-time circumstances. To solve these problems, this paper presents a new audio matching pursuit algorithm, which adopts a short-term, non-complete dictionary to sparsely express the signal, so that the signal to be processed is free from length limits. Besides, according to the distribution of signal energy, the atoms are preprocessed before matching to improve the execution speed of the matching process. Experimental results show that the algorithm is comparable in signal expression efficiency to Krstulovic's fast algorithm, while reducing computational complexity and improving running speed.
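    Plain matching pursuit, the baseline that the energy-based atom preselection accelerates, can be sketched as follows (the dictionary and signal below are synthetic; the preselection itself is not reproduced):

        import numpy as np

        def matching_pursuit(signal, dictionary, n_atoms):
            """dictionary: columns are unit-norm atoms with the same length as signal."""
            r = signal.copy()
            coeffs = np.zeros(dictionary.shape[1])
            for _ in range(n_atoms):
                corr = dictionary.T @ r              # inner products with the residual
                k = int(np.argmax(np.abs(corr)))     # best-matching atom
                coeffs[k] += corr[k]
                r -= corr[k] * dictionary[:, k]      # remove its contribution
            return coeffs, r

        rng = np.random.default_rng(2)
        D = rng.normal(size=(64, 128)); D /= np.linalg.norm(D, axis=0)
        x = 0.9 * D[:, 5] - 0.4 * D[:, 70]
        coeffs, residual = matching_pursuit(x, D, n_atoms=5)
        print(np.argsort(np.abs(coeffs))[-2:], np.linalg.norm(residual))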
  • MENG Qingqing,WANG Jianyong
    Computer Engineering. 2016, 42(11): 290-294,299. https://doi.org/10.3969/j.issn.1000-3428.2016.11.049
    In order to solve the complex network problems in the process of large-scale monitoring video interconnection,such as different packet loss rate,network types and system bandwidth,according to GB/T 28181 national standard and related industry standards,a streaming media protocol framework for the city monitoring and alarming networking platform is proposed,which is based on Session Initiation Protocol(SIP) and Real-time Transport Protocol(RTP).On the basis of this proposed protocol framework,a streaming media system implementation scheme for the city monitoring and alarming networking platform is designed.Based on the central streaming media server and cloud storage technology,this scheme supports wide range multi-level and cross-domain media interconnection as well as efficient and centralized media data storage and distribution.Practical application proves that the proposed scheme is able to reduce the response time of single-domain real-time video request to less than 2 s,and effectively adapt to the network environment whose loss rate is up to 30%.It is better at universality,stability and adaptability,and is able to meet the requirements of public security video monitoring interconnection.
  • LIN Haibo,KE Jingjing,ZHANG Yi
    Computer Engineering. 2016, 42(11): 295-299. https://doi.org/10.3969/j.issn.1000-3428.2016.11.050
    To solve the problems that particle degeneration exists in the re-sampling procedure and the proposal distribution of the Rao-Blackwellized Particle Filter(RBPF) is not accurate, an improved RBPF algorithm with Particle Swarm Optimization(PSO) genetic re-sampling is proposed. To improve the accuracy of the proposal distribution, the improved algorithm fuses the robot's odometer information and the distance information collected by a laser sensor. A PSO policy is introduced to adjust the particle set during sampling according to the energy efficiency of the particles. Meanwhile, Genetic Variation(GV) is performed on particles with smaller weights during re-sampling to relieve particle depletion, improve the consistency of the robot's pose estimation, and maintain the diversity of particles. The algorithm is verified on a Pioneer3-DX robot equipped with a URG laser sensor and based on the Robot Operating System(ROS). Experimental results show that the improved RBPF algorithm can significantly improve the accuracy of robot pose estimation while ensuring the diversity of the particle set.
  • HAN Chengchun
    Computer Engineering. 2016, 42(11): 300-304. https://doi.org/10.3969/j.issn.1000-3428.2016.11.051
    Clinicians diagnose functional ankle instability through X-ray films and experience, which increases the clinical diagnosis cost and raises the diagnostic threshold. For this problem, this paper develops an ankle function evaluation system based on three-dimensional(3D) measurement and a reconstruction model. The system adopts positive-negative structured light projection technology, detects the structured light edges and achieves sub-pixel localization from the positive-negative stripe images. It uses the sub-pixel positions of the structured light edges and the active stereo vision principle to realize 3D measurement and reconstruction of the ankle posture, and analyzes ankle functional instability through the 3D reconstruction model. Experimental results show that the system can analyze and evaluate ankle functional instability effectively, and provide a favorable basis for diagnosis and decision making in the early detection and rehabilitation treatment of ankle functional instability.
  • LU Bin,JIANG Xinghao,SUN Tanfeng
    Computer Engineering. 2016, 42(11): 305-308. https://doi.org/10.3969/j.issn.1000-3428.2016.11.052
    In order to detect objectionable videos on the Internet, a detection method for objectionable videos based on unsupervised learning features is proposed. An Independent Subspace Analysis(ISA) network is trained on a set of unlabeled videos to learn motion patterns, and the trained network is employed to extract motion features from the videos to be detected. Combined with a bag-of-words representation, the motion features are used to classify objectionable videos from normal ones with a Support Vector Machine. Compared with traditional hand-designed features such as optical flow and motion histograms, the proposed features have high computing efficiency and are not sensitive to video quality. Experiments on the video library show that the detection accuracy of the proposed method is about 10% higher than that of the compared approach.
  • LI Zhi,SUN Yubao,WANG Feng,LIU Qingshan
    Computer Engineering. 2016, 42(11): 309-315. https://doi.org/10.3969/j.issn.1000-3428.2016.11.053
    Aiming at the problem that clothing image retrieval algorithms based on deep learning have low classification accuracy, this paper proposes an improved clothing image classification and retrieval algorithm based on a Deep Convolutional Neural Network(DCNN). A clothing image database containing 100 000 images with 16 attributes, called B_DAT Clothing, is established. Due to the complex appearance of clothing images, the algorithm uses the DCNN to learn features adaptively from the B_DAT Clothing database, designs a hash index of the CNN features to build an efficient attribute-based retrieval model, and realizes efficient classification and quick retrieval of clothing images. Experimental results show that the algorithm achieves better classification and retrieval performance than traditional visual feature classification algorithms.
  • XING Yahong,DU Xinhui
    Computer Engineering. 2016, 42(11): 316-321. https://doi.org/10.3969/j.issn.1000-3428.2016.11.054
    In view of the complicated data collection and low operability of power grid load forecasting, a greedy algorithm without aftereffect is proposed in this paper. In the greedy algorithm, top-down man-machine cooperation is used to greedily select data and review the selection. Then, integrated land classification is used to re-divide land types and determine the optimal allocation factor, thereby simplifying data processing. Afterwards, a spatial load forecasting platform combining the greedy and grid algorithms is established to divide the load levels. The power supply situation can be intuitively displayed on a grayscale map through threshold segmentation of the land-use planning diagram combined with the substation power supply situation. A network planning method based on power supply zones is then proposed. Analysis results show that the proposed method guarantees the stable operation of the distribution network and plays an important role in guiding power grid planning.