
20 December 2012, Volume 38 Issue 24
    

  Networks and Communications
  • XU Feng, LIU Chao, SHI Xue-Meng
    Computer Engineering. 2012, 38(24): 1-4. https://doi.org/10.3969/j.issn.1000-3428.2012.24.001
    To ensure user data security in the cloud computing environment, this paper applies homomorphic encryption to protect both the data and the privacy of the encryption function, and designs a new fully homomorphic encryption algorithm based on the ring of integer polynomials. The algorithm comprises both homomorphic encryption and re-encryption: homomorphic encryption encrypts the data, and re-encryption encrypts the already-encrypted data. Analysis shows that the computational complexity of the proposed algorithm, O(n^5), is lower than that of ideal-lattice fully homomorphic encryption.
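A toy integer-based scheme (DGHV-style) illustrates the homomorphic property the abstract relies on; this is an illustration only, not the integral-polynomial-ring algorithm the paper designs, and all parameters are made up.

```python
import random

# Toy symmetric somewhat-homomorphic scheme over the integers:
# the plaintext bit hides in the parity of the ciphertext mod p.

def keygen(bits=64):
    # Secret key: a random odd integer p.
    return random.getrandbits(bits) | (1 << (bits - 1)) | 1

def encrypt(p, m, noise_bits=8):
    # c = m + 2r + p*q, with small noise r and random multiplier q.
    r = random.getrandbits(noise_bits)
    q = random.getrandbits(32)
    return m + 2 * r + p * q

def decrypt(p, c):
    return (c % p) % 2

p = keygen()
c0, c1 = encrypt(p, 0), encrypt(p, 1)
assert decrypt(p, c0 + c1) == 1   # addition acts as XOR on plaintext bits
assert decrypt(p, c0 * c1) == 0   # multiplication acts as AND
```

Both asserts hold because, as long as the accumulated noise stays far below p, reduction mod p strips the p*q terms and parity recovers the bit.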
  • GONG Yu, LI Shuai, LI Yong, SU Li, JIN De-Feng, CENG Lie-Guang
    Computer Engineering. 2012, 38(24): 5-8. https://doi.org/10.3969/j.issn.1000-3428.2012.24.002
    Conventional approaches to network experiment platforms, which construct the platform either directly as a physical network or as an overlay network, cannot guarantee high resource utilization and good network link quality at the same time. To solve this problem, this paper proposes a scheme for a network experiment platform based on cloud computing and virtualization technology, and presents its preliminary implementation, called TUNIE. Results show that the platform is flexible in function support and convenient to use, while providing high-bandwidth connections and a well-isolated network environment for multi-user support.
  • WANG Xiao-Wei, DIAO Yi-Ming
    Computer Engineering. 2012, 38(24): 9-13. https://doi.org/10.3969/j.issn.1000-3428.2012.24.003
    Data security is a major obstacle to the adoption of cloud computing, stemming mainly from data sharing and vendor privileges. This paper analyzes the features of data storage and user groups in cloud computing, and proposes a cloud computing access control model based on the Task-Role-Based Access Control(T-RBAC) model, which applies different access control policies to different objects to achieve hierarchical safety; vendors enjoy no privileges in this model. Analysis indicates that the model provides more reliable security that no longer relies on the credibility of the server.
  • CAO Ze-Wen, ZHOU Tao
    Computer Engineering. 2012, 38(24): 14-16. https://doi.org/10.3969/j.issn.1000-3428.2012.24.004
    This paper analyzes the prevalent problems of ordinary text clustering algorithms, such as the massiveness, high dimensionality and sparseness of feature vectors, and proposes massive text clustering based on cloud computing technology as a feasible solution. The classical Jarvis-Patrick(JP) algorithm is chosen as a case study. It is implemented in the MapReduce programming model and tested on the Hadoop cloud computing platform with the Sogou corpus provided by Sogou laboratory. Experimental results indicate that the JP algorithm can be parallelized in the MapReduce framework, and that the parallelized algorithm can handle massive textual data with better time performance than a single-node environment.
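A minimal single-machine sketch of the Jarvis-Patrick (shared nearest neighbor) rule that the paper parallelizes with MapReduce; the 1-D data set and parameter values below are illustrative only.

```python
# Jarvis-Patrick: link two points iff each is in the other's k-NN list and
# they share at least min_shared neighbors; clusters are the components.

def jarvis_patrick(points, k=3, min_shared=2):
    n = len(points)
    # k nearest neighbors of each point (by absolute distance here).
    knn = {
        i: set(sorted(range(n), key=lambda j: abs(points[i] - points[j]))[1:k + 1])
        for i in range(n)
    }
    parent = list(range(n))  # union-find over the link graph

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    for i in range(n):
        for j in range(i + 1, n):
            if j in knn[i] and i in knn[j] and len(knn[i] & knn[j]) >= min_shared:
                parent[find(i)] = find(j)
    return [find(i) for i in range(n)]

labels = jarvis_patrick([0.0, 0.1, 0.2, 10.0, 10.1, 10.2])
assert labels[0] == labels[1] == labels[2]
assert labels[3] == labels[4] == labels[5]
assert labels[0] != labels[3]
```

In a MapReduce setting, the k-NN lists and the pairwise shared-neighbor checks are the natural map and reduce stages.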
  • LIAO Fu-Rong, WANG Cheng-Liang, CHEN Shu-Yu
    Computer Engineering. 2012, 38(24): 17-20. https://doi.org/10.3969/j.issn.1000-3428.2012.24.005
    Cloud computing services face a huge user group; as the node scale expands and task execution time grows, the failure rate of cloud computing increases. To solve this problem, a fault-tolerant scheduling algorithm for cloud computing based on task backup is proposed. It maps each task to the node that holds the task's input data and carries the smallest load. According to the cloud computing security level, it backs up tasks and re-schedules failed ones. Simulation results show that the algorithm has good fault tolerance, with a task scheduling success rate of 99%.
  • TAN Mao, DUAN Bin, BANG Bang-Lun, ZHANG Jian-He
    Computer Engineering. 2012, 38(24): 21-26. https://doi.org/10.3969/j.issn.1000-3428.2012.24.006
    The rise of Cloud Manufacturing(CMfg) and the trend toward large-scale manufacturing enterprises create new demands for business collaboration. The new demands of collaborative workflow management for enterprise groups in cloud manufacturing are analyzed, and four features of such management are identified: autonomy, simultaneity, collaboration, and dynamism. A collaborative workflow control model based on the Hierarchical Finite State Machine(HFSM) is presented; the simplified class structure of the model is designed, and the dynamic reconfiguration and behavior control mechanisms of the HFSM are described. The proposed model is applied to the marketing business collaboration of an iron and steel corporation; results show that the model increases the efficiency of business collaboration and markedly improves marketing capability.
  • HUANG Xiao-Ling, CHEN Gui-Lin, DIAO Sheng-Hui
    Computer Engineering. 2012, 38(24): 27-31. https://doi.org/10.3969/j.issn.1000-3428.2012.24.007
    With demands for software test cases growing exponentially, problems such as the relative shortage of test resources, the high cost of testing and the low efficiency of test case execution are increasingly prominent. To solve these problems, a parallel test scheme based on cloud computing is proposed, which uses a Finite State Machine(FSM) to define the test object and the state transitions of the testing process. A parallel test case generation algorithm based on the random-routes idea is proposed, and a parallel test script based on MapReduce and the cloud computing platform is designed. Experimental results show that, compared with executing test sequences sequentially, the scheme achieves a speedup of up to 20, significantly improving test efficiency.
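Generating test sequences by random walks over an FSM, as the random-routes idea suggests, can be sketched as follows; the login FSM is a made-up example, not the paper's test object.

```python
import random

# Each test case is a random path through the FSM's transition table.

def random_walk(transitions, start, length, rng):
    state, path = start, []
    for _ in range(length):
        action, nxt = rng.choice(transitions[state])
        path.append(action)
        state = nxt
    return path

fsm = {
    "logged_out": [("login_ok", "logged_in"), ("login_fail", "logged_out")],
    "logged_in":  [("logout", "logged_out"), ("query", "logged_in")],
}
rng = random.Random(7)
cases = [random_walk(fsm, "logged_out", 5, rng) for _ in range(4)]
assert len(cases) == 4 and all(len(c) == 5 for c in cases)
```

Because each walk is independent, the generated cases partition naturally across map tasks for parallel execution.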
  • HUANG Jian-Wen
    Computer Engineering. 2012, 38(24): 32-36. https://doi.org/10.3969/j.issn.1000-3428.2012.24.008
    With the centralized deployment of power grid enterprises' network education and training systems, the system faces peak business traffic. To solve this problem, this paper presents a design for a dynamically extensible application system based on cloud computing and puts forward its architecture, which includes dynamic cluster extension, system integration based on a shared publish/subscribe message bus, and data decoupling. An experimental environment is constructed to simulate dynamically allocating resources to withstand heavy concurrent access. The simulation results show that the cloud-based network education and training system can meet the resource demand under heavy concurrency and preserve the user experience.
  • CANG Dong-Song, Vincent Garonne, SUN Gong-Xing
    Computer Engineering. 2012, 38(24): 37-41. https://doi.org/10.3969/j.issn.1000-3428.2012.24.009
    Monitoring and analyzing large-scale distributed applications in grid or cloud environments is difficult because of the complexity of the platform and network environment. This paper describes a system for monitoring and analyzing such applications. The system is based on the concept of data stream management: it uses message queues to collect, cache and distribute trace messages, and a distributed computing framework to analyze the trace messages in real time. The prototype is deployed in a real Petabyte-scale distributed data management system, and the usefulness of the collected trace messages is demonstrated by examples. Application results show that the system is easy to deploy, has little impact on the monitored applications, suits the requirements of big data analysis and real-time computation, and provides a platform for analyzing the performance of large-scale distributed systems and predicting user behavior.
  • LI Dong, XIE Yong-Jiang, FAN Yan-Fang, TAO Jie, WANG Shi
    Computer Engineering. 2012, 38(24): 42-45. https://doi.org/10.3969/j.issn.1000-3428.2012.24.010
    Resource requirements under the Service-Oriented Architecture(SOA) are characterized by dynamic variation and on-demand acquisition as a result of high workload fluctuations. To address this, the paper analyzes and discusses the major techniques of SOA and virtualization, then puts forward a resource guarantee model for SOA comprising a service application layer, a resource matching layer, a logical virtual layer and a physical resource layer. Experimental results show that the model resolves the optimal adaptation problem between user-level applications and underlying physical resources, and improves the utilization rate of those resources.
  • LIU Jie, WANG Jia-Cha, OU Yang-Yong-Ji, WANG Qing-Xian
    Computer Engineering. 2012, 38(24): 46-49.
    Tainted pointers are serious threats to the security of data flow and control flow. A method for binary defect detection is proposed based on dynamic taint propagation, dynamic symbolic execution and bound constraint analysis, including the introduction of pointer propagation rules and the generation of trigger conditions by combining path constraints with bound constraints. It can generate inputs for four types of code defects caused by tainted pointers. Test results show that the method effectively reduces the number of generated test cases; a virtual function call hijack and two pointer memory corruption defects are found in tests of Linux system tools.
  • WANG Meng-Jia, HAN Jing-Diao, HAN Song-Jiao
    Computer Engineering. 2012, 38(24): 50-52. https://doi.org/10.3969/j.issn.1000-3428.2012.24.012
    To deal with the sparsity and scalability problems of the traditional collaborative filtering algorithm, which limit the accuracy of its recommendations, a collaborative filtering algorithm based on fuzzy clustering is proposed. It applies a fuzzy clustering method to cluster the items, and computes the similarity between users by analyzing the average ratings the users give to the items of each cluster. It predicts a user's ratings of items from the ratings given by that user's neighbors, and chooses the top n items as recommendations. Experimental results demonstrate that the algorithm improves recommendation accuracy under extremely sparse user rating data.
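The cluster-profile idea can be sketched as below: users are compared on their average ratings per item cluster instead of on the sparse raw ratings. Hard clusters stand in for the paper's fuzzy clusters, and the tiny item/user data are made up.

```python
from math import sqrt

# Illustrative item clusters (hypothetical names).
clusters = {"c1": ["i1", "i2"], "c2": ["i3", "i4"]}

def profile(ratings):
    # Average rating the user gave to the items of each cluster.
    prof = []
    for items in clusters.values():
        rated = [ratings[i] for i in items if i in ratings]
        prof.append(sum(rated) / len(rated) if rated else 0.0)
    return prof

def cosine(u, v):
    num = sum(a * b for a, b in zip(u, v))
    den = sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v))
    return num / den if den else 0.0

# alice and bob rated disjoint items, yet their cluster profiles align.
alice = profile({"i1": 5, "i3": 1})
bob   = profile({"i2": 4, "i4": 1})
carol = profile({"i2": 1, "i3": 5})
assert cosine(alice, bob) > cosine(alice, carol)
```

This is exactly how clustering mitigates sparsity: two users with no co-rated items can still be compared through their cluster averages.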
  • WANG Xiao-Yang, JIN Li, WANG Xiao-Jing, HUANG Wei-Tong
    Computer Engineering. 2012, 38(24): 53-56. https://doi.org/10.3969/j.issn.1000-3428.2012.24.013
    To help teachers verify the originality of students' reports during teaching, this paper presents the design and development of a similarity detection system based on sequence matching. An explicit similarity measurement model is established: the length of the common subsequence is calculated with a sequence matching algorithm, and the similarity between each pair of students' documents in the same group is obtained. The similarity matrix is further normalized and classified into groups, incorporating the impact of document templates. Comparison results are visualized in a form that lets teachers grasp the similarity distribution across the whole class at a glance. Experimental results show the feasibility and practicability of the system, which helps teachers quickly detect plagiarism.
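The sequence-matching core can be sketched with the classic longest-common-subsequence dynamic program; the token-level similarity formula below is a common normalization, assumed here for illustration.

```python
# Similarity of two documents from the length of their longest common
# subsequence (computed at the token level).

def lcs_len(a, b):
    # Classic O(len(a) * len(b)) dynamic program.
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i, x in enumerate(a, 1):
        for j, y in enumerate(b, 1):
            dp[i][j] = dp[i-1][j-1] + 1 if x == y else max(dp[i-1][j], dp[i][j-1])
    return dp[len(a)][len(b)]

def similarity(doc1, doc2):
    a, b = doc1.split(), doc2.split()
    return 2 * lcs_len(a, b) / (len(a) + len(b))

s = similarity("the quick brown fox", "the slow brown fox")
assert abs(s - 0.75) < 1e-9
```

Running this over every document pair in a group yields the similarity matrix the abstract then normalizes and visualizes.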
  • ZHOU Feng, ZHOU Hai-Ying, ZUO De-Cheng, LI Tao
    Computer Engineering. 2012, 38(24): 57-61. https://doi.org/10.3969/j.issn.1000-3428.2012.24.014
    This paper analyzes existing On-Line Transaction Processing(OLTP) test methods and presents a Spirent-based Web application performance evaluation method. It treats the Web server and database server as a whole, and comprehensively evaluates the hardware and software performance of OLTP applications by testing the Transaction Response Time(TRT), the Resource Utilization(RU) of the system, and the Transactions Per Second(TPS) of Web applications. The method can guide the purchase, design and deployment of high-performance computers, and it rapidly uncovers the performance bottlenecks of a Web application, providing guidance for optimization. Test results show that, guided by the method, Web performance is improved by more than 40 times and response time is reduced by more than 10 times.
  • DIAO Yi-A, ZHANG Zhong-Chuo
    Computer Engineering. 2012, 38(24): 62-64. https://doi.org/10.3969/j.issn.1000-3428.2012.24.015
    To analyze the delay performance of self-similar traffic, this paper re-expresses the arrival envelope and the effective service curve using the Moment Generating Function(MGF) and effective bandwidth theory, and proposes a novel probabilistic concept based on the MGF. On this basis, the end-to-end statistical delay bound for self-similar traffic is modeled. Numerical results show that the model largely improves statistical multiplexing, and performance evaluation on Fractional Brownian Motion(FBM) traffic shows good adaptability.
  • ZHANG Xin, XUE Zhi, FAN Lei
    Computer Engineering. 2012, 38(24): 65-69. https://doi.org/10.3969/j.issn.1000-3428.2012.24.016
    The rapidly expanding network scale of recent years poses ever more severe challenges to distributed network management and testing. In this paper, a new approach to the full-connectivity test suite generation problem for distributed networks is presented, which increases test efficiency by introducing a space parameter based on Minimum Set Cover(MSC) theory. Experimental results show that the proposed algorithm is more effective than other generation algorithms: with a space parameter of four, it uses about 20% fewer test probes than the Greedy Search Algorithm(GSA) and about 99.9% less calculation time than the GRASP algorithm.
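The greedy set-cover baseline the abstract compares against can be sketched as follows: repeatedly pick the probe covering the most still-uncovered links. Probe and link names are illustrative.

```python
# Greedy Minimum Set Cover: each probe covers a set of links; choose
# probes until every link in the universe is covered.

def greedy_cover(universe, probes):
    uncovered, chosen = set(universe), []
    while uncovered:
        best = max(probes, key=lambda p: len(probes[p] & uncovered))
        if not probes[best] & uncovered:
            break  # remaining links cannot be covered by any probe
        chosen.append(best)
        uncovered -= probes[best]
    return chosen

links = {1, 2, 3, 4, 5}
probes = {"A": {1, 2, 3}, "B": {3, 4}, "C": {4, 5}, "D": {5}}
picked = greedy_cover(links, probes)
assert set().union(*(probes[p] for p in picked)) == links
assert picked[0] == "A"  # probe with the largest gain is taken first
```

The greedy rule gives the standard ln(n) approximation guarantee for set cover, which is the baseline the space-parameter approach improves on.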
  • LI Nian-Qiong, HUANG Hong-Guang, LI Feng
    Computer Engineering. 2012, 38(24): 70-73. https://doi.org/10.3969/j.issn.1000-3428.2012.24.017
    In a Wireless Sensor Network(WSN), an unreasonable node clustering algorithm inevitably leads to premature node death and loss of regional information. An improved LEACH algorithm based on remaining energy and location is presented. The clustering algorithm divides cluster head selection into a temporary cluster head stage and a formal cluster head stage, takes node distance and remaining energy as important selection factors, and then chooses the best cluster head within the region. OMNET++ simulation results show that the improved LEACH algorithm achieves higher node utilization and network reliability.
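For context, the standard LEACH cluster-head election that such improvements start from can be sketched as below; this is the textbook baseline, not the paper's energy- and location-aware algorithm.

```python
import random

# Standard LEACH: each node not yet elected head in the current epoch
# elects itself with threshold T(r), where p is the desired head fraction.

def leach_threshold(p, r):
    return p / (1 - p * (r % round(1 / p)))

def elect_heads(node_ids, eligible, p, r, rng):
    t = leach_threshold(p, r)
    return [n for n in node_ids if n in eligible and rng.random() < t]

nodes = list(range(100))
heads = elect_heads(nodes, set(nodes), p=0.1, r=0, rng=random.Random(1))
assert all(h in nodes for h in heads)  # roughly p * len(nodes) heads expected
```

The threshold rises over the rounds of an epoch so that every node eventually serves as head once, spreading the energy cost.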
  • DAI Huan, HE Lei, GU Xiao-Feng
    Computer Engineering. 2012, 38(24): 74-77. https://doi.org/10.3969/j.issn.1000-3428.2012.24.018
    To reduce the impact of Received Signal Strength Indicator(RSSI) ranging error on positioning accuracy, this paper proposes a new centralized localization algorithm based on statistically uncorrelated vector sets. The solving equation of the double-centered matrix is simplified by a coordinate transformation. To reduce noise disturbance, a new double-centered matrix is reconstructed using statistically uncorrelated vector sets, from which the node coordinates can be calculated directly. Simulation results indicate that the proposed algorithm improves localization accuracy efficiently when the distance-measuring error is relatively large, making it particularly suitable for Wireless Sensor Network(WSN) nodes built on low-cost hardware.
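The double-centering step such centralized range-based localization builds on is classical multidimensional scaling; a minimal sketch follows, without the paper's statistically-uncorrelated-vector de-noising refinement.

```python
import numpy as np

# Classical MDS: recover node coordinates (up to rotation/translation)
# from a matrix of pairwise distances via the double-centered matrix B.

def classical_mds(D, dim=2):
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ (D ** 2) @ J              # double-centered matrix
    w, V = np.linalg.eigh(B)
    idx = np.argsort(w)[::-1][:dim]          # keep the largest eigenvalues
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0))

pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
D = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
rec = classical_mds(D)
D_rec = np.linalg.norm(rec[:, None] - rec[None, :], axis=-1)
assert np.allclose(D, D_rec, atol=1e-8)  # exact distances are reproduced
```

With noisy RSSI-derived distances, B is perturbed; the paper's contribution is reconstructing a cleaner B before this eigendecomposition step.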
  • CHEN Bai-Yang, LIU Si-An, ZHANG Jiang
    Computer Engineering. 2012, 38(24): 78-80. https://doi.org/10.3969/j.issn.1000-3428.2012.24.019
    For the slow convergence and divergence problems of the traditional Unscented Kalman Filter(UKF) algorithm in target tracking, this paper puts forward an improved UKF algorithm. By introducing an adjustment factor, it adjusts the covariances of the state vector and observation vector in real time, so as to improve the weighting between state information and observation information in the filter result and thereby improve the performance of the tracking system. Simulation results show that the improved UKF algorithm not only restrains the divergence of the UKF but also raises the convergence rate of the tracking system in target tracking.
  • WANG Zhong-Wei, JIA Xiao-Yan, DENG Lei, QIN Ti-Zhong, GU Zhen-Gong
    Computer Engineering. 2012, 38(24): 81-85. https://doi.org/10.3969/j.issn.1000-3428.2012.24.020
    Aiming at the high call blocking rate of the Movable Boundary and Guard Channel(MBGC) method, this paper presents a novel call admission control algorithm. The strategy is that new call users may use the guard channels reserved for handoff call users with a certain probability when the handoff call dropping rate is below a threshold. The scheme also takes into account the priority of different data types: packets in the general data queue are transmitted only when the high-priority data queue is empty. Simulation results show that, while affecting the handoff call dropping rate only slightly, the scheme effectively reduces the new call blocking rate and the high-priority data blocking rate.
  • LIU Gao
    Computer Engineering. 2012, 38(24): 86-89. https://doi.org/10.3969/j.issn.1000-3428.2012.24.021
    The flooding-based search mechanism in unstructured Peer-to-Peer(P2P) networks increases the system load, while structured P2P networks require great expense to maintain their topological structure. This paper proposes a hierarchical searching scheme with social network features for P2P networks. It adopts the rationale of social networks: nodes with high semantic similarity are placed in the same virtual community, and search links are actively built between nodes within a community. Experimental results show that the scheme effectively improves the search efficiency of P2P networks.
  • BO Pan, CHEN Lan, ANG Zhi-Min, LI Ying
    Computer Engineering. 2012, 38(24): 90-95. https://doi.org/10.3969/j.issn.1000-3428.2012.24.022
    ZigBee’s tree topology shows better reliability and power consumption at the network layer than star and mesh topologies. However, command and data packets may become inconsistent in the tree structure under potential attack. The proposed model, which uses a signal-power-comparison method, can add parent-child nodes automatically to form the tree topology and transport data packets successfully. Simulation results show that the model effectively resolves the conflict between command and data packets, and provides a complete and extensible flow from topology building to data transmission.
  • GE Jun, ZHOU Lian-Yang
    Computer Engineering. 2012, 38(24): 96-99. https://doi.org/10.3969/j.issn.1000-3428.2012.24.023
    This paper focuses on how to use different encoding schemes to improve network throughput; specifically, different rates/codes are studied between the sensor nodes and the Network Master(NM) and between the NM and the sink. Research results show that throughput increases with the BCH coding rate. Simulation results also show that, under a single coding scheme, a high-rate BCH code provides higher throughput than a low-rate BCH code, while the multi-coding scheme is more efficient and markedly improves network throughput over the network lifetime.
  • LI Zhou, LIU Jian, CHENG Zi-Jing
    Computer Engineering. 2012, 38(24): 100-104. https://doi.org/10.3969/j.issn.1000-3428.2012.24.024
    To address the real-time problem in redundant networks, this paper presents a fast network topology discovery method based on the Parallel Redundancy Protocol(PRP). On demand, a request packet is sent by the network management system, and each agent sends a topology discovery packet when it receives the request. From the results processed by the network management system, the network topology is obtained. The method is demonstrated on an OMNET++ platform; results show that network topology discovery takes about 60 ms and that the method is compatible with the Simple Network Management Protocol(SNMP).
  • LIU Qi-Yong, BANG Hua
    Computer Engineering. 2012, 38(24): 105-107. https://doi.org/10.3969/j.issn.1000-3428.2012.24.025
    Building on the ability of Physical-layer Network Coding(PNC) to greatly improve communication throughput, this paper proposes a new scheme that enhances communication efficiency further. In the second stage of PNC, the scheme uses the idea of PCMA to resolve the problem that the end nodes only wait to receive messages and do not send messages to the relay station. Under the new scheme, the two nodes exchange messages continuously and receive two data frames in three slots. Experimental results show that the improved scheme reduces message exchange time and enhances performance by 60% compared with other PNC schemes.
  • HUANG Bin, SHI Liang, DENG Xiao-Hong
    Computer Engineering. 2012, 38(24): 108-110. https://doi.org/10.3969/j.issn.1000-3428.2012.24.026
    This paper shows that the efficient identity-based signature scheme proposed by Li Jiguo et al. is insecure, and gives an attack in which any attacker can forge a valid signature on any message with respect to any identity; therefore, the scheme does not satisfy existential unforgeability. By making a component of the signature part of the user’s public key, an improved scheme is proposed that satisfies existential unforgeability without reducing the efficiency of the original scheme.
  • MAO Yi, CHEN Na
    Computer Engineering. 2012, 38(24): 111-114. https://doi.org/10.3969/j.issn.1000-3428.2012.24.027
    The MD5 algorithm has important applications in data encryption fields such as digital signatures and identity authentication, but it cannot resist differential attacks and dictionary attacks. This paper applies special transforms to related variables during the message preprocessing stage of MD5 to raise the probability of data overflow, strengthening the avalanche effect and increasing the difficulty of differential cryptanalysis, so that the collision problem of the MD5 algorithm is addressed. Collision detection and coincidence detection of encrypted values before and after the improvement show that the improved MD5 is safe and practical.
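The avalanche effect the improvement aims to strengthen can be observed directly with the standard library: flipping a single input bit changes roughly half of MD5's 128 output bits. The input string is arbitrary.

```python
import hashlib

# Measure the Hamming distance between MD5 digests of two inputs that
# differ in exactly one bit.

def md5_bits(data):
    return int.from_bytes(hashlib.md5(data).digest(), "big")

msg = bytearray(b"computer engineering")
h1 = md5_bits(bytes(msg))
msg[0] ^= 0x01                     # flip one input bit
h2 = md5_bits(bytes(msg))
flipped = bin(h1 ^ h2).count("1")  # differing output bits
assert 16 <= flipped <= 112        # close to 64 of 128 bits on average
```

Differential cryptanalysis looks for input differences whose output differences are far from this random-looking behavior, which is why enlarging the avalanche effect raises its difficulty.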
  • LIN Xiu-Li, ZHANG Chen
    Computer Engineering. 2012, 38(24): 115-118. https://doi.org/10.3969/j.issn.1000-3428.2012.24.028
    In a Wireless Sensor Network(WSN), captured nodes can inject large amounts of false data into the network. This paper proposes an en-route filtering enhancement scheme to solve this problem. It employs encryption keys and authentication keys to prevent en-route nodes from distorting data. When en-route nodes are destroyed and can no longer transfer or verify data, a safety-enhanced scheme is applied that uses the encryption keys of backup nodes to validate the authenticity of transferred data and discard false data. A parameter MAX_FALSE is introduced to eliminate the influence of incompletely filtered false data on the data received by base stations. Simulation results show that the scheme is more effective and energy-efficient than SEF, DEF and FIMA.
  • CHEN Shu-Quan, ZHANG Zhi-Yong, YANG Li-Jun, PU Jie-Shen
    Computer Engineering. 2012, 38(24): 119-122. https://doi.org/10.3969/j.issn.1000-3428.2012.24.029
    In a Multimedia Social Network(MSN), the trust relationships between entities directly affect how digital contents are shared and disseminated. To evaluate these trust relationships accurately, a novel MSN trust model for Digital Rights Management(DRM) based on small-world theory is proposed, which introduces property factors of the digital content sharing scenario such as credible feedback on digital contents, a feedback balancing factor, and sharing similarity between users. Experimental results show that the model accurately evaluates trust relationships by updating trust values in real time, and effectively identifies malicious users in the sharing virtual community.
  • LU Guo-Feng, XIE Hua-Xi, HONG Yun-Lu, ZHANG Yan
    Computer Engineering. 2012, 38(24): 123-127. https://doi.org/10.3969/j.issn.1000-3428.2012.24.030
    The IQTM-F5 method may destroy the Discrete Cosine Transform(DCT) coefficient histogram statistics at high embedding rates. Aiming at this problem, this paper proposes a new F5 steganographic method that uses the same modified quantization table as IQTM-F5 but adopts a different embedding strategy to preserve the DCT coefficient histogram statistics. Experimental results show that the proposed method achieves the same steganographic capacity and embedding efficiency as IQTM-F5, while obtaining better stego-image quality and stronger resistance to histogram-based statistical steganalysis.
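Standard F5 matrix embedding, the mechanism both IQTM-F5 and the proposed method build on, can be sketched for the smallest case (k = 2 message bits into n = 3 coefficient LSBs with at most one change); the modified quantization table and the paper's embedding strategy are not modeled.

```python
# Hamming-code style F5 matrix embedding: the syndrome of the LSBs
# carries the message; at most one coefficient is flipped.

def embed(lsbs, bits):
    s = 0
    for i, b in enumerate(lsbs, 1):   # syndrome: XOR of 1-based positions
        if b:
            s ^= i
    m = bits[0] * 2 + bits[1]
    d = s ^ m
    out = list(lsbs)
    if d:
        out[d - 1] ^= 1               # flip at most one coefficient
    return out

def extract(lsbs):
    s = 0
    for i, b in enumerate(lsbs, 1):
        if b:
            s ^= i
    return [s >> 1 & 1, s & 1]

for bits in ([0, 0], [0, 1], [1, 0], [1, 1]):
    stego = embed([1, 0, 1], bits)
    assert extract(stego) == bits
    assert sum(a != b for a, b in zip([1, 0, 1], stego)) <= 1
```

Embedding strategies differ in which coefficients they allow to be flipped; that choice is what determines how much the DCT histogram is disturbed.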
  • LI Zhi-Jiang, YAN Ying-Jian, DUAN Er-Peng
    Computer Engineering. 2012, 38(24): 128-132. https://doi.org/10.3969/j.issn.1000-3428.2012.24.031
    Aiming at the problem of determining the number of samples needed for a differential power attack on a block cipher, this paper establishes an SNR model of the differential power signal and derives an expression for the required sample amount. After the parameters σ and ε are measured, the numerical value is obtained, about 8 000. Differential Power Attacks(DPA) on the Advanced Encryption Standard(AES) are carried out with 5 000 samples and 8 000 samples respectively; the correct key is recovered with 8 000 samples, a better result than with 5 000, so the proposed expression is reasonable.
  • JU Wei, LI Yuan-Xiang, YANG Dun-Jie, ZHOU Ze-Meng
    Computer Engineering. 2012, 38(24): 133-135. https://doi.org/10.3969/j.issn.1000-3428.2012.24.032
    The Sparse Representation-based Classification(SRC) method performs excellently in face recognition but has high computational complexity. This paper proposes a face recognition method based on Compressed Sensing(CS) named Classified Orthogonal Matching Pursuit(COMP). The L1-norm minimization representation algorithm is replaced by the Orthogonal Matching Pursuit(OMP) algorithm to reduce complexity, and class information is introduced into OMP to give the method stronger classification ability. Experiments on the YaleB face database show that the recognition rate of COMP is higher than that of OMP.
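Plain Orthogonal Matching Pursuit, the sparse solver COMP starts from, can be sketched as below; the paper's class-information extension is not reproduced, and the dictionary is a random matrix with orthonormalized columns so that recovery is exact.

```python
import numpy as np

# OMP: greedily pick the dictionary column most correlated with the
# residual, re-fit by least squares on the chosen support, repeat.

def omp(A, y, n_nonzero):
    residual, support = y.astype(float), []
    x = np.zeros(A.shape[1])
    coef = np.zeros(0)
    for _ in range(n_nonzero):
        j = int(np.argmax(np.abs(A.T @ residual)))
        support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x[support] = coef
    return x

rng = np.random.default_rng(0)
A, _ = np.linalg.qr(rng.standard_normal((20, 10)))  # orthonormal columns
x_true = np.zeros(10)
x_true[[2, 7]] = [1.5, -2.0]
x_hat = omp(A, A @ x_true, n_nonzero=2)
assert np.allclose(x_hat, x_true, atol=1e-8)
```

Each iteration costs one matrix-vector product and one small least-squares solve, which is the source of OMP's speed advantage over L1 minimization.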
  • CHEN Yi-Qun, MAO Lai-Pan, CHEN Guo-Meng, LI Zhi-Ye
    Computer Engineering. 2012, 38(24): 137-140. https://doi.org/10.3969/j.issn.1000-3428.2012.24.033
    This paper takes advantage of the Delaunay triangulation of all customers (including the depot), keeping most edges of the solution overlapping edges of the Delaunay triangulation, to accelerate an improved Tabu search algorithm. Experimental results show that the algorithm solves the m-Open Vehicle Routing Problem(OVRP) well with stable performance, and the solutions stay close to the upper bound. The proposed search technique can easily be applied to other meta-heuristics.
  • LI Jie, FENG Rui
    Computer Engineering. 2012, 38(24): 141-145. https://doi.org/10.3969/j.issn.1000-3428.2012.24.034
    This paper presents a tracking algorithm based on multi-feature fusion in the particle filter framework to solve the problem of pedestrian tracking in onboard videos. To deal with the nonlinearity and non-Gaussianity caused by the motions of the pedestrians and the camera, a particle filter tracking algorithm based on Monte-Carlo sampling is employed: the targets’ states are predicted by first-order autoregressive dynamic models, and an observation model is proposed that fuses four complementary features. Experimental results show that, at the same precision level, the recall of the proposed algorithm is more than 20% higher than that of tracking without particle filtering and multi-feature fusion.
  • CHEN Yi-Meng, DUAN Ling-Yu, HUANG Yan, LI Bing, LIN Jie, HUANG Tie-Jun
    Computer Engineering. 2012, 38(24): 146-151. https://doi.org/10.3969/j.issn.1000-3428.2012.24.035
    To apply the distributed retrieval method of document cluster partitioning directly in the visual search field, this paper proposes a distributed visual retrieval model based on latent topics. The model framework is given, including a dataset partitioning method over image visual words and an image subset selection method, to optimize distributed image retrieval performance. Experimental results show that the model searches only a few selected image collections without losing retrieval accuracy, and improves query throughput.
  • ZHANG Ye, TIAN Wen, LIU Cheng-Feng
    Computer Engineering. 2012, 38(24): 152-155. https://doi.org/10.3969/j.issn.1000-3428.2012.24.036
    A new combined forecasting model for fire time series is proposed based on a combination of Ensemble Empirical Mode Decomposition(EEMD), multivariate phase-space reconstruction and Support Vector Regression(SVR). The fire time series is decomposed by EEMD into a series of Intrinsic Mode Functions(IMF) at different scales, and the phase space of each IMF is reconstructed by multivariate phase-space reconstruction. A nonlinear SVR prediction model is built for each IMF, and the forecasts of all IMFs are combined with SVR again to obtain the final forecast. Experimental results show that this method is more accurate than the single-variable phase-space reconstruction method and the plain SVR method.
  • FANG Ai-Dong, HU Hua-Gang, CHENG Peng
    Computer Engineering. 2012, 38(24): 156-160. https://doi.org/10.3969/j.issn.1000-3428.2012.24.037
    This paper proposes a fast method for paper sequence number recognition. Characters are divided into several coarse categories through connected-domain and scanline characteristics, and these are subdivided into fine categories with the Hausdorff distance and closed-curve characteristics. A character differentiator is used to identify the fine categories. Experimental results show that the method is simple to implement, avoids complicated calculation, recognizes characters quickly, and improves the character recognition rate.
  • LIU Jia-Qi, BIAN Hua-Min, YAN Jian-Feng
    Computer Engineering. 2012, 38(24): 161-165. https://doi.org/10.3969/j.issn.1000-3428.2012.24.038
    A news story unit segmentation method based on multi-modal feature fusion is proposed by analyzing news video structure. The news video is divided into an audio stream and a video stream. Mute intervals are detected as audio candidate points; shot boundaries of the news video are detected and chosen as video candidate points; anchorperson shots and topic captions are also detected. Story units are obtained by fusing the audio candidate points, video candidate points, anchorperson shots and topic captions along the time axis. Experimental results show that this method achieves 83.18% recall and 83.92% precision.
  • DAI Wang, FANG Yu-Chun, LI Yang
    Computer Engineering. 2012, 38(24): 166-170. https://doi.org/10.3969/j.issn.1000-3428.2012.24.039
    Existing feature selection algorithms cannot effectively reduce the feature dimension and have low stability. To solve this problem, this paper proposes a feature selection algorithm fusing the filter and wrapper modes. In the wrapper stage, it designs a feature selection criterion that maintains the topological structure between images; in the filter stage, it uses the Fisher Score as the criterion; an individual-optimum search strategy is adopted. Experimental results show that the algorithm improves recognition rate, reduces feature dimension, and has good stability in face recognition applications.
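    The Fisher Score criterion used in the filter stage can be sketched as follows; the data are synthetic, and the scoring function is the standard formulation, not necessarily the exact variant in the paper.

```python
import numpy as np

def fisher_score(X, y):
    """Per-feature Fisher Score: between-class variance of the class means
    over pooled within-class variance (higher = more discriminative)."""
    classes = np.unique(y)
    mean = X.mean(axis=0)
    num = np.zeros(X.shape[1])
    den = np.zeros(X.shape[1])
    for c in classes:
        Xc = X[y == c]
        num += len(Xc) * (Xc.mean(axis=0) - mean) ** 2
        den += len(Xc) * Xc.var(axis=0)
    return num / (den + 1e-12)

rng = np.random.default_rng(1)
y = np.repeat([0, 1], 50)
X = rng.normal(0, 1, (100, 3))
X[y == 1, 0] += 3.0        # feature 0 separates classes; 1 and 2 are noise
scores = fisher_score(X, y)
ranking = np.argsort(scores)[::-1]   # filter step: keep top-ranked features
```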
  • XIAO Xing-Xing, FENG Rui
    Computer Engineering. 2012, 38(24): 171-174. https://doi.org/10.3969/j.issn.1000-3428.2012.24.040
    Under short-duration conditions, the performance of existing speaker recognition methods drops markedly. To solve this issue, this paper proposes a short-duration speaker recognition algorithm based on common feature selection. Using the speakers' voice data, the method trains Gaussian Mixture Models(GMM), extracts the common overlapping parts among speakers, and establishes a common overlap model and non-overlap models. Based on the two models, the method performs feature selection on the test speech, calculates its similarity to all non-overlapping speaker models, and makes decisions by the principle of similarity maximization. Experimental results show that this method is robust and keeps the system identification error rate low.
  • CHEN Ji, KA Mi-Li·Mu-Yi-Ding, ZHANG Hui-Yu
    Computer Engineering. 2012, 38(24): 175-178. https://doi.org/10.3969/j.issn.1000-3428.2012.24.041
    For offline, text-independent writer identification, and considering the connected strokes and complex glyphs of Uighur text, this paper applies a microstructure-feature handwriting identification method based on probability distribution functions to Uighur handwriting identification. The method captures the writing trend of local fine structures in handwriting and uses Euclidean and Manhattan distance metrics to measure similarity between handwriting samples. Handwriting samples of 120 Uighur students are tested. Results show that the method improves the accuracy of Uighur handwriting identification.
  • QIN Chuan-Dong, LIU San-Yang
    Computer Engineering. 2012, 38(24): 179-181. https://doi.org/10.3969/j.issn.1000-3428.2012.24.042
    When the L1-norm and L2-norm support vector machines are used to analyse datasets with small samples, high dimension and high correlation among some variables, neither performs satisfactorily. Taking advantage of both methods, an improved doubly regularized support vector machine algorithm is proposed. The inequality constraints and the non-differentiable norm complicate optimization, so a positive function and a quadratic polynomial loss function are introduced to transform the problem into a differentiable, unconstrained one that many optimization algorithms can solve easily. Experimental results show that the improvement gains better effects.
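    The doubly regularized (L1 plus L2) idea can be approximated with a hinge-loss linear model under an elastic-net penalty. This scikit-learn sketch is only an analogy on synthetic small-sample, high-dimension, correlated data; it is not the paper's smoothed unconstrained formulation.

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

# 40 samples, 100 features; the first 10 features are correlated
# copies of one informative signal, the rest are noise.
rng = np.random.default_rng(2)
y = np.repeat([0, 1], 20)
signal = np.where(y == 1, 1.0, -1.0)
X = rng.normal(0, 1, (40, 100))
X[:, :10] += signal[:, None]

# hinge loss + elasticnet penalty = an L1/L2 doubly regularized linear SVM
clf = SGDClassifier(loss="hinge", penalty="elasticnet",
                    l1_ratio=0.5, alpha=0.01, max_iter=2000,
                    random_state=0).fit(X, y)
acc = clf.score(X, y)
```

The L1 part drives noise coefficients to zero while the L2 part spreads weight over the correlated informative features instead of arbitrarily picking one.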
  • CHEN Ke, CHENG Yi, XIE Meng-Xia, AI Ban
    Computer Engineering. 2012, 38(24): 182-187. https://doi.org/10.3969/j.issn.1000-3428.2012.24.043
    Aiming at problems in traditional Web service discovery methods, such as indistinct service matching degrees and low service discovery precision, an algorithm for automatic spatial information service discovery based on service clusters is proposed. The algorithm clusters the advertised spatial information services and selects the most matching cluster by computing the similarity between the service request and each cluster center. The most matching spatial information service is then determined by the semantic similarity between the service request and each matching service belonging to that cluster. Experimental results show that the algorithm quantifies the Web service matching degree while improving its distinction, service discovery recall and efficiency.
  • CAO Shi-Tong, CHEN Xian-Fu
    Computer Engineering. 2012, 38(24): 188-190. https://doi.org/10.3969/j.issn.1000-3428.2012.24.044
    Aiming at the problem that quantum physical characteristics are hard to simulate with traditional evolutionary algorithms, a novel quantum evolutionary algorithm is proposed. Quantum computation is combined with an evolutionary algorithm, and random interference is added to the routine chromosome, so that the superposition and entanglement characteristics of quantum computation are simulated mathematically. The algorithm is applied to the Multidimensional Knapsack Problem(MKP); experimental results show that the genetic diversity of the population is increased and the global optimization capability is improved, verifying the algorithm's effectiveness.
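    A minimal quantum evolutionary algorithm on a toy (one-dimensional) knapsack instance might look like this. The Q-bit rotation step is a simplified stand-in for the rotation-gate and random-interference operators the abstract describes, and the instance and step size are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
values   = np.array([6, 5, 4, 3])
weights  = np.array([3, 2, 2, 1])
capacity = 5                      # optimum: items 1, 2, 3 with value 12

def observe(theta):
    """Collapse each Q-bit: P(bit = 1) = sin(theta)**2."""
    return (rng.random(theta.shape) < np.sin(theta) ** 2).astype(int)

def fitness(bits):
    w = int((bits * weights).sum())
    return int((bits * values).sum()) if w <= capacity else 0

theta = np.full((10, 4), np.pi / 4)      # 10 individuals, 4 Q-bits each
best_bits, best_fit = np.zeros(4, int), -1
for _ in range(50):
    pop  = np.array([observe(t) for t in theta])
    fits = np.array([fitness(b) for b in pop])
    if fits.max() > best_fit:
        best_fit  = int(fits.max())
        best_bits = pop[fits.argmax()]
    # rotate every Q-bit a small step toward the best solution found so far;
    # clipping keeps some superposition, preserving genetic diversity
    delta = 0.05 * np.pi * (2 * best_bits - 1)
    theta = np.clip(theta + delta, np.pi / 8, 3 * np.pi / 8)
```

Because every Q-bit stays strictly between 0 and 1 in observation probability, the population keeps exploring even after it concentrates around a good solution.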
  • HOU Li-Bin, LI Pei-Feng, SHU Qiao-Meng
    Computer Engineering. 2012, 38(24): 191-195. https://doi.org/10.3969/j.issn.1000-3428.2012.24.045
    Event extraction is an important component of information extraction, and event detection and recognition is its basis. It is implemented in two stages, trigger word detection and event recognition, both of which are studied here. In the first stage, words are clustered with an LDA model to alleviate word overfitting, and a character-based trigger word detection method using CRFs is proposed in view of the inconsistency between Chinese word segmentation and trigger word boundaries. In the second stage, cross-event inference is applied to Chinese event recognition to improve its results. Experimental results show that the approach significantly improves system performance, achieving F-measures of 66.2 and 62.0 for trigger detection and event recognition respectively.
  • CUI Wen-Bo, LIN Xiang-Gong, XU Man-Yi
    Computer Engineering. 2012, 38(24): 196-199. https://doi.org/10.3969/j.issn.1000-3428.2012.24.046
    According to the difficulties of the coding strategy of Spiking Neural Networks(SNN) for image segmentation, two types of Time-to-First-Spike coding methods are proposed: linear coding and non-linear coding. Linear coding uses a linear function to map pixel values to the spike times of the neurons, while non-linear coding maps them through a Sigmoid function. Image segmentation experiments show that the segmentation result using non-linear coding is better than that using linear coding, and the segmented image of non-linear coding has greater Shannon entropy. The non-linear coding method also makes it easier to select optimal parameters and acquires the best image segmentation result.
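    The two coding schemes can be illustrated as follows; the 20 ms coding window and the sigmoid steepness are arbitrary illustrative parameters, not values from the paper.

```python
import numpy as np

T = 20.0  # coding window in ms; brighter pixels fire earlier

def linear_ttfs(pixels):
    """Linear map: pixel 255 -> spike at t = 0, pixel 0 -> t = T."""
    return T * (1.0 - pixels / 255.0)

def sigmoid_ttfs(pixels, k=0.05, mid=128.0):
    """Non-linear map through a sigmoid, compressing the extremes and
    spreading mid-range gray levels over more distinct spike times."""
    s = 1.0 / (1.0 + np.exp(-k * (pixels - mid)))
    return T * (1.0 - s)

pixels = np.array([0.0, 64.0, 128.0, 192.0, 255.0])
lin = linear_ttfs(pixels)
sig = sigmoid_ttfs(pixels)
```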
  • CUI Wen-Chao, WANG Yi, FAN Yang-Tu, FENG Yan
    Computer Engineering. 2012, 38(24): 200-204. https://doi.org/10.3969/j.issn.1000-3428.2012.24.047
    Medical image segmentation based on the Local Binary Fitting(LBF) model is sensitive to the initial contour and only applicable to a single object. If the manually chosen initial contour is unsuitable, segmentation takes too much CPU time and sometimes fails. To overcome these disadvantages, a method integrating Fuzzy C-means(FCM) clustering into the LBF model is proposed for automated image segmentation. The image is clustered into objects and background with the FCM algorithm, and the resulting fuzzy membership of each object is transformed into the initial value of the level set function of the LBF model. Starting from this initial value, the LBF model evolves until convergence, accomplishing the segmentation. Experimental results on synthetic and real images(blood vessel and brain images) show that the proposed algorithm obtains a suitable initial value automatically. As a result, the sensitivity to the initial contour is resolved effectively and the number of iterations is reduced considerably. Moreover, multiple objects can be segmented by choosing different clusters generated by the FCM algorithm.
  • WANG Qi-Fan, GU Zhen-Gong, QIN Ti-Zhong, YANG Jie, HU Yang-Jie
    Computer Engineering. 2012, 38(24): 205-207. https://doi.org/10.3969/j.issn.1000-3428.2012.24.048
    To solve the problem that enhanced images have poor visual effect and fuzzy detail, this paper proposes a new image enhancement algorithm. It calculates each pixel's relative entropy between its data model and a uniform distribution model, and sets a threshold on this value. Pixels whose relative entropy exceeds the threshold are enhanced using their neighborhood information, while the remaining pixels are enhanced with the BHEPL algorithm. Experimental results show that the image enhanced by the proposed algorithm has good visual effect and clearer detail.
  • LI Nan, HE Hong, FANG Chao
    Computer Engineering. 2012, 38(24): 208-210. https://doi.org/10.3969/j.issn.1000-3428.2012.24.049
    Remote sensing images contain huge amounts of data, and their objects are closely related to spatial scale. This paper proposes an adaptive multi-scale integrated remote sensing image segmentation method. It uses color variance to define the distance between regions and achieves fast region merging based on the region adjacency graph and nearest neighbor graph. It also establishes the relationship between scale and threshold, obtains multi-scale segmentation results with different thresholds, and integrates them into the final result. Experimental results show that over-segmentation and under-segmentation can be eliminated effectively.
  • SHU Jian-Song, CAO Dong-Lin, LI Chao-Ci, LIN Da-Zhen
    Computer Engineering. 2012, 38(24): 211-215. https://doi.org/10.3969/j.issn.1000-3428.2012.24.050
    To date, research on Web-based automatic image annotation has mainly addressed the relevance assumption between images and text, while the main problem of content-based automatic image annotation is the limitation of the database. Aiming at these problems, this paper proposes Internet-search-based automatic image annotation with feedback verification, combining the content-based and Web-based approaches. It extracts candidate labels from search results using the Web texts associated with the image, and then verifies the final results by matching the Internet search results of the candidate labels against content-based image features. Experimental results show that this method can annotate large-scale databases with high accuracy, achieving a 7.92% improvement over Web-based automatic image annotation.
  • TU Qing, CENG Cha-Xian, XIE Chi
    Computer Engineering. 2012, 38(24): 216-219. https://doi.org/10.3969/j.issn.1000-3428.2012.24.051
    To effectively extract and describe image features and improve image retrieval performance, this paper presents a novel image retrieval algorithm based on the fusion of texture, color and shape features. The color image edge is detected, and a motif transformed image is obtained by transforming the edge image. A Motif Co-occurrence Matrix(MCM) is obtained by traversing the motif transformed image, and the gradients of all motifs are calculated to build the motif gradient histogram. The color histogram is obtained by uniformly quantizing the RGB color image into 64 colors. The image is described by these three features, which are used for retrieval. Experimental results indicate that the algorithm achieves higher precision and recall than the BCTF and MCM algorithms, and reduces computational complexity.
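    The 64-color histogram step can be sketched like this; quantizing each RGB channel to 4 uniform levels (4 x 4 x 4 = 64 bins) is one standard way to reach 64 colors and is assumed here.

```python
import numpy as np

def color_histogram_64(rgb):
    """Uniformly quantize an RGB image to 4 levels per channel
    (64 bins total) and return the normalized color histogram."""
    q = (rgb // 64).astype(int)            # 0..255 -> 0..3 per channel
    bins = q[..., 0] * 16 + q[..., 1] * 4 + q[..., 2]
    hist = np.bincount(bins.ravel(), minlength=64).astype(float)
    return hist / hist.sum()

rng = np.random.default_rng(4)
img = rng.integers(0, 256, (32, 32, 3))
h = color_histogram_64(img)                # 64-bin feature vector
```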
  • TONG Fu-Shui, TUN Xiao-Dun
    Computer Engineering. 2012, 38(24): 220-224. https://doi.org/10.3969/j.issn.1000-3428.2012.24.052
    An improved multi-focus image fusion algorithm based on Pulse Coupled Neural Network(PCNN) is proposed to improve fusion quality and efficiency. A well-justified image quality evaluation index is calculated for each block, and the difference of the two normalized indices is fed into the PCNN model as the external stimulus to obtain the output pulse. When the output pulse exceeds a given threshold, the fused block is taken from the source image block with the larger evaluation index; otherwise it is taken from the block with the smaller index. Performance is evaluated with six criteria: mutual information, cross entropy, root mean squared error, peak signal-to-noise ratio, structural similarity index and correlation coefficient. Experimental results show that the algorithm improves the image fusion effect.
  • FU He-Ping, ZHENG Qi-Long, CHEN Sai-Ling, FENG Yu-Qian
    Computer Engineering. 2012, 38(24): 225-227. https://doi.org/10.3969/j.issn.1000-3428.2012.24.053
    According to the problem that the compiler cannot make full use of the complex multiplication instruction offered by the Digital Signal Processing(DSP) chip, causing low complex multiplication performance, this paper proposes a complex multiplication optimization based on compilation guidance. Guided by compilation directives, the compiler identifies all instructions related to a complex multiplication within a control block and replaces them with a single complex multiplication instruction through recognition algorithms. Experimental results show that the optimization effectively reduces the execution cycles of the fft_radix2 and fft_radix4 programs.
  • LAI Xin, LIU Cong, WANG Zhi-Yang
    Computer Engineering. 2012, 38(24): 228-234. https://doi.org/10.3969/j.issn.1000-3428.2012.24.054
    When the Cache coherence protocol is used, the global arbiter is hard to implement and the Thread-level Speculation(TLS) runtime cannot swap speculative threads out of their host processors. To remove these restrictions, the speculative data management policy in TLS is analyzed, showing that the buffer of modified data for each speculative thread is small and that the probability of speculative threads being swapped out of their host cores is negligible. This paper designs a shared-distributed memory system that uses invalidation vectors and speculation version priority registers for data dependence checking during cache coherence checking. A region of the L2 Cache is reserved to buffer and restore the data modified by speculative threads, which makes speculative swapping possible. The SESC simulator is modified to confirm the performance and correctness of the proposed memory system. Experimental results show that, while keeping the ideal speedups, the proposed memory system supports speculative thread swapping well.
  • CHEN Jia-Liang, DUAN Zhong-Xin
    Computer Engineering. 2012, 38(24): 235-238. https://doi.org/10.3969/j.issn.1000-3428.2012.24.055
    To address the problems of present oil color detection methods, an embedded on-line oil color detection system based on image processing is designed, with an ARM processor as the core combined with a Complex Programmable Logic Device(CPLD) and image sensors. A BP neural network optimized by the genetic and Levenberg-Marquardt algorithms(GA-LMBP) is put forward, and the hardware design and software process of the system are introduced. Experimental results show that the system has high accuracy and stability, effectively avoids subjective error, reduces costs, improves portability, and can be widely used in oil color detection.
  • ZHANG Lei, TAO Pei-Yang, XU Xue-Ji, ZHOU Xi-Jian
    Computer Engineering. 2012, 38(24): 239-243. https://doi.org/10.3969/j.issn.1000-3428.2012.24.056
    Aiming at the task executability restriction in strike combat task allocation, a mathematical model is established and solved with a Genetic Algorithm(GA). To suit the model's characteristics, a chromosome coding scheme is designed, together with an arithmetic crossover process in which two randomly fixed bits are used and a mutation process based on consistent single-bit correction. Case experimental results show that the algorithm solves the model under the task executability restriction, and demonstrate the dependability and time effectiveness of the approach.
  • TANG Yong-Bei, GUI Wei-Hua, OU Yang-Wei
    Computer Engineering. 2012, 38(24): 244-246.
    A nonlinear fault detection method based on Kernel Principal Component Analysis(KPCA) and the Chaos Particle Swarm Optimization(CPSO) algorithm is presented. KPCA performs a nonlinear transformation through a kernel function that maps the nonlinear input space into a linear feature space, computes principal components and detects faults using the SPE statistic. The kernel parameters of KPCA are optimized to enhance fault detection performance. To address the premature convergence of the Particle Swarm Optimization(PSO) algorithm, the CPSO algorithm is adopted to exploit the search properties of chaos optimization. Experimental results on transformer data show that the proposed method has better detection performance than the PCA, KPCA and PSO-KPCA methods.
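    A minimal sketch of the KPCA-plus-SPE detection step (without the CPSO parameter search) using scikit-learn follows; the synthetic quadratic manifold, kernel width and 99% empirical control limit are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import KernelPCA

rng = np.random.default_rng(5)
# normal operating data lying on a nonlinear (quadratic) manifold
x = rng.uniform(-1, 1, 200)
X_train = np.column_stack([x, x ** 2]) + rng.normal(0, 0.02, (200, 2))

kpca = KernelPCA(n_components=2, kernel="rbf", gamma=2.0,
                 fit_inverse_transform=True).fit(X_train)

def spe(X):
    """SPE (squared prediction error): reconstruction residual in
    input space, used here as the fault detection statistic."""
    X_hat = kpca.inverse_transform(kpca.transform(X))
    return ((X - X_hat) ** 2).sum(axis=1)

limit = np.quantile(spe(X_train), 0.99)   # empirical control limit
fault = np.array([[0.0, 3.0]])            # sample far off the manifold
is_fault = bool(spe(fault)[0] > limit)
```

A sample consistent with normal operation reconstructs well and stays under the limit; a faulty sample off the learned manifold produces a large SPE.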
  • HUANG Yao-Guang, GAO Bo, LI Jian-Xin, YIN Chuan
    Computer Engineering. 2012, 38(24): 247-250. https://doi.org/10.3969/j.issn.1000-3428.2012.24.058
    To satisfy the requirement of high location speed, this paper presents a method based on spatial-frequency domain observations from a fixed single observer. The location error of the method is analyzed, and the factors influencing it, together with their degrees of influence, are examined by means of Geometric Dilution of Precision(GDOP) graphs. Simulation results show that the method is sensitive to angular velocity measurement error and is evidently affected by the measurement errors of the Doppler frequency rate-of-change and the velocity of the moving emitter, while its location error is less affected by frequency measurement error.
  • YANG Le, TUN Ji, LV Ping
    Computer Engineering. 2012, 38(24): 251-253. https://doi.org/10.3969/j.issn.1000-3428.2012.24.059
    To solve the Out-of-Vocabulary(OOV) problem in speech retrieval tasks, this paper presents a construction algorithm for sub-word units based on Maximum Mutual Information and Minimum Description Length(MMI-MDL). It selects candidate pairs according to the mutual information of sub-word pairs and judges whether to combine a pair into a new sub-word through MDL. After the sub-word set is obtained, words are mapped into sub-words for retrieval. Experimental results show that, compared with the MDL algorithm, the proposed method performs better and achieves a 12.1% relative improvement in OOV recall rate.
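    The mutual information selection step (with the MDL judgment omitted) might be sketched as follows; the toy corpus and the count-weighted MI score are illustrative assumptions rather than the paper's exact criterion.

```python
import math
from collections import Counter

def top_pair_by_mi(words):
    """Score adjacent sub-word unit pairs by count-weighted mutual
    information and return the best merge candidate."""
    units, pairs = Counter(), Counter()
    for w in words:
        seq = list(w)
        units.update(seq)
        pairs.update(zip(seq, seq[1:]))
    n_units = sum(units.values())
    n_pairs = sum(pairs.values())
    def score(p):
        a, b = p
        pmi = math.log((pairs[p] / n_pairs) /
                       ((units[a] / n_units) * (units[b] / n_units)))
        return pairs[p] * pmi   # weight by count so hapax pairs don't win
    return max(pairs, key=score)

corpus = ["searching", "research", "searches", "sea", "archive"]
best = top_pair_by_mi(corpus)   # the pair to merge into a new sub-word unit
```

In the full algorithm this selection would repeat, with MDL deciding at each round whether the merged unit actually shortens the description of the corpus.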
  • LI Lei, ZHOU Xu-Shun, FEI Lei, WANG Geng
    Computer Engineering. 2012, 38(24): 254-257. https://doi.org/10.3969/j.issn.1000-3428.2012.24.060
    This paper analyses current TV payment modes and presents a TV payment platform model based on real-name accounts. The overall architecture and main workflows are discussed, and the security, reliability and integrity of the platform are analyzed. The platform is now running on the Shanghai Next Generation Broadcasting(NGB) network and works well in practice.
  • ZHANG Jun, YAN Dao, CHEN Ling-Hui
    Computer Engineering. 2012, 38(24): 258-261. https://doi.org/10.3969/j.issn.1000-3428.2012.24.061
    With the development of Next Generation Sequencing(NGS) technology, the default raw data processing pipeline faces more and more challenges. This paper presents a new data processing pipeline, NRDPT, which includes a new raw data processing algorithm based on edges and the Hough transform, together with a high-performance two-step registration algorithm. Experimental results show that, at the same precision, the two-step algorithm is about 9 times faster than the classical algorithm.
  • TUN Dun-Qi, NI Hong, LI Dun
    Computer Engineering. 2012, 38(24): 262-265. https://doi.org/10.3969/j.issn.1000-3428.2012.24.062
    To reduce the difficulty of predicting real-time Variable Bit Rate(VBR) video traffic, this paper proposes a novel algorithm called Variable Bandwidth Kernel Density Estimation(VBKDE). The algorithm is based on kernel density estimation and dynamically updates the bandwidth of every sample when a scene change happens, so as to accelerate convergence. Simulation results show that, compared with the variable step size normalized least mean square algorithm, it reduces prediction errors by 10%.
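    A variable-bandwidth Gaussian KDE, the core ingredient named above, can be sketched like this. The per-sample bandwidth update on scene change is simplified to a single widening step, and the Gaussian samples stand in for traffic measurements.

```python
import numpy as np

def vb_kde(x_grid, samples, bandwidths):
    """Variable-bandwidth Gaussian KDE: each sample carries its own
    bandwidth, so samples after a scene change can be smoothed differently."""
    d = x_grid[:, None] - samples[None, :]
    k = np.exp(-0.5 * (d / bandwidths) ** 2) / (bandwidths * np.sqrt(2 * np.pi))
    return k.mean(axis=1)

rng = np.random.default_rng(6)
samples = rng.normal(0.0, 1.0, 500)   # stand-in for VBR traffic samples
h = np.full(500, 0.3)
h[-50:] = 0.6                         # widen bandwidth after a "scene change"
grid = np.linspace(-5.0, 5.0, 201)
density = vb_kde(grid, samples, h)
area = float(density.sum() * (grid[1] - grid[0]))   # should be close to 1
```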
  • GUO Zheng-Gong, GUO Chao-Zhong
    Computer Engineering. 2012, 38(24): 266-268. https://doi.org/10.3969/j.issn.1000-3428.2012.24.063
    Aiming at the characteristics of register allocation in a basic mathematics library, this paper presents a register allocation method based on a layered, multi-level strategy. Considering the characteristics of the different registers, the method adopts a multi-layer model to use registers reasonably. This eases register resource inefficiency to a large extent, reduces or avoids spilling during register allocation, and thereby improves the performance of the basic mathematics library. Experimental results indicate that the allocation strategy improves library performance by about 6%.
  • LE Jia-Jin, TAO Lan
    Computer Engineering. 2012, 38(24): 269-273. https://doi.org/10.3969/j.issn.1000-3428.2012.24.064
    Aiming at the problem of searching sports video on the Internet, this paper proposes a Solr-based full-text search system for sports video information. The scheme covers collecting and preprocessing the raw sports video information, creating indexes and searching with Solr, and processing and presenting the search results; the paper gives the architecture of the application system. Experimental results show that the system achieves higher recall and accuracy, and performs better when searching multi-source data.
  • CHEN Jian-Xin, CANG Jing, YANG Lu-Lu
    Computer Engineering. 2012, 38(24): 274-278. https://doi.org/10.3969/j.issn.1000-3428.2012.24.065
    In some scenarios, such as in bed or on a bus, there is no desktop and a conventional mouse is not feasible. To address this, a new mouse is proposed using Micro-Electro-Mechanical Systems(MEMS) inertial sensors, exploiting their attitude and azimuth measurements to control mouse movement in three-dimensional space. The design is verified on the Windows platform, showing that it is feasible, overcomes the desktop constraint, and extends mouse applications to more scenarios.
  • YUE Feng, LONG Jian-Min, ZHANG Yi-Chi, TU Yong
    Computer Engineering. 2012, 38(24): 279-282. https://doi.org/10.3969/j.issn.1000-3428.2012.24.066
    Compared with traditional serial program migration, parallel program migration is complicated by the huge diversity of architectures. To migrate Compute Unified Device Architecture(CUDA) programs to other heterogeneous multi-cores, a method of mapping the CUDA architecture to Cell is proposed. Through execution model mapping, parallel granularity enhancement, memory mapping and optimization, the massive threads in CUDA can execute correctly on the Cell architecture via source code migration. Experimental results show that the translated programs reach 72% of the execution speed of natively compiled programs.
  • LI Feng-Jing, TAO Pei-Yang, MO Lu-Jun, ZHANG Jie-Yong, TANG Jian
    Computer Engineering. 2012, 38(24): 283-287. https://doi.org/10.3969/j.issn.1000-3428.2012.24.067
    Aiming at the combat target decision problem of communication countermeasures in a complex battlefield, this paper proposes a situation-oriented solving method based on combat capability. A dynamic entity model of the battlefield situation and a multi-layer capability aggregation method are proposed based on the characteristics of group decision making, together with a campaign target decision-making method based on the decision maker's risk attitude. Analysis shows that the method presents battlefield state information in a formalized, unified and dynamic way, is easy to compute, and enables quantitative analysis of combat decisions.
  • LI Ai-Guo, FENG Guo-Song
    Computer Engineering. 2012, 38(24): 288-290. https://doi.org/10.3969/j.issn.1000-3428.2012.24.068
    Aiming at the security risks of Universal Serial Bus(USB) mobile storage and the vulnerabilities of recent security schemes, a secure USB2.0 device controller is designed, featuring bidirectional authentication using a Hash function and the block-oriented XTS-AES encryption algorithm. The controller provides a security scheme covering bidirectional authentication between the host and the device and encryption of the stored data, and supports the security scheme of USB storage devices in hardware, protecting stored data at the chip level with fewer secret keys.
  • XU Yue, XIAO Gang, ZHANG Dan
    Computer Engineering. 2012, 38(24): 291-294. https://doi.org/10.3969/j.issn.1000-3428.2012.24.069
    A particle filter tracking algorithm based on an adaptive spatio-temporal codebook detection model is designed to realize multi-object tracking. The spatio-temporal codebook model is used for foreground-background segmentation and foreground target detection, and an adaptive update phase is added to the model. The prior target state distribution of the particle filter is generated from the detected foreground targets, and data association together with particle filtering realizes multi-object tracking. Compared with the traditional codebook and spatio-temporal codebook models, the proposed adaptive model is robust to interference and noise. Experimental results show that the tracking algorithm captures moving targets rapidly and effectively against moving backgrounds with illumination changes, interference and noise.