
Table of Contents

    21 September 2015, Volume 0 Issue 9
    Improved Apriori Algorithm Based on Matrix Reduction
    REN Wei-jian, YU Bo-wen
    2015, 0(9):  1-5. 
    When the Apriori algorithm searches for frequent itemsets, it scans the database repeatedly and generates a large number of useless candidate sets. To address this problem, an improved Apriori algorithm based on matrix reduction is put forward. The algorithm scans the database only once, converts the database information into a Boolean matrix, and reduces that structure according to conclusions drawn from the properties of frequent k-itemsets, which effectively lowers the number of invalid candidate itemsets generated. Comparison with existing algorithms validates that the proposed algorithm effectively improves the efficiency of mining frequent itemsets.
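    As an illustration of the Boolean-matrix idea described above, the following Python sketch scans a hypothetical transaction list once, builds the Boolean matrix, prunes infrequent columns, and counts candidate itemsets with column-wise AND operations. It is a sketch of the general technique, not the authors' implementation, and the data and support threshold are hypothetical.

        import numpy as np

        # One scan of a hypothetical transaction database -> Boolean matrix
        transactions = [{"A", "B", "C"}, {"A", "C"}, {"A", "D"}, {"B", "C", "E"}]
        items = sorted(set().union(*transactions))
        M = np.array([[item in t for item in items] for t in transactions], dtype=bool)
        min_support = 2

        def frequent_itemsets(M, items, min_support):
            # Reduction step: drop columns whose support is already below the threshold
            keep = M.sum(axis=0) >= min_support
            M, items = M[:, keep], [it for it, k in zip(items, keep) if k]
            frequent, k, current = {}, 1, [(i,) for i in range(len(items))]
            while current:
                survivors = []
                for combo in current:
                    # Support of an itemset = column-wise AND of its item columns
                    support = int(np.logical_and.reduce(M[:, list(combo)], axis=1).sum())
                    if support >= min_support:
                        frequent[tuple(items[i] for i in combo)] = support
                        survivors.append(combo)
                # Build (k+1)-candidates only from surviving k-itemsets
                current = sorted({tuple(sorted(set(a) | set(b)))
                                  for a in survivors for b in survivors
                                  if len(set(a) | set(b)) == k + 1})
                k += 1
            return frequent

        print(frequent_itemsets(M, items, min_support))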
    Statistical Properties and Dynamics Analysis of Enterprise Information Systems Users’ Access Behavior
    REN Jia-jia, WANG Nian-xin, GE Shi-lun
    2015, 0(9):  6-12. 
    In order to explore the statistical rules of users’ access behavior in an enterprise information system, human dynamics is employed to study the access distributions of individual users, group users, and all users of the information system. The results show that the inter-access time distributions at all three levels have pronounced fat tails but cannot be described by a single distribution. Individual users’ access intervals follow a power-law distribution, the group-user level follows a power law with exponential truncation, and the all-user level is better described by a combination of exponential and power-law distributions.
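    The following Python sketch illustrates the kind of analysis the abstract describes: computing inter-access intervals from (hypothetical) timestamps and estimating a power-law exponent with the standard maximum-likelihood estimator. It is not the authors' analysis code.

        import numpy as np

        # Hypothetical access timestamps (seconds) for one user
        timestamps = np.sort(np.array([10, 40, 41, 300, 305, 2000, 2100, 9000], float))
        intervals = np.diff(timestamps)

        def powerlaw_mle_exponent(x, xmin):
            """Continuous power-law exponent via maximum likelihood:
            alpha = 1 + n / sum(ln(x_i / xmin)) over samples x_i >= xmin."""
            x = x[x >= xmin]
            return 1.0 + len(x) / np.sum(np.log(x / xmin))

        alpha = powerlaw_mle_exponent(intervals, xmin=intervals.min())
        print(f"estimated power-law exponent: {alpha:.2f}")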
    Land-use Classification Method Based on Landsat8 OLI Images
    LIU Wei
    2015, 0(9):  13-16,21. 
    This research aims to find the most suitable land-use classification method for Landsat8 OLI images by comparing supervised classification based on maximum likelihood with the method proposed in this paper. There are 11 kinds of land-use type in the OLI images of Dingbian County. Firstly, after routine preprocessing, the OLI images were fused with the panchromatic band and then processed with three levels of wavelet filtering. Secondly, the LBV transform was applied to the OLI images. Thirdly, training samples were collected for each land-use type, and supervised classification based on SVM followed by opening-closing operations from mathematical morphology was carried out to obtain precise information for each land-use type. Fourthly, the classification results of the proposed method and of maximum likelihood were assessed using overall classification accuracy and the Kappa coefficient as evaluation indexes. The results show that the overall accuracy and Kappa coefficient of the image classified with the proposed method were 83.62% and 0.785, improvements of 12.82% and 14.26% over the maximum-likelihood classification. Meanwhile, the proposed method removed salt-and-pepper noise from the classification image more effectively.
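    The two evaluation indexes named above, overall accuracy and the Kappa coefficient, can be computed from a confusion matrix as in the Python sketch below; the matrix values are hypothetical.

        import numpy as np

        def overall_accuracy_and_kappa(confusion):
            """Overall accuracy and Cohen's Kappa from an n-class confusion matrix."""
            confusion = np.asarray(confusion, dtype=float)
            total = confusion.sum()
            observed = np.trace(confusion) / total                       # overall accuracy
            expected = (confusion.sum(0) * confusion.sum(1)).sum() / total ** 2
            kappa = (observed - expected) / (1.0 - expected)
            return observed, kappa

        # Hypothetical 3-class confusion matrix (rows: reference, columns: classified)
        cm = [[50, 3, 2],
              [4, 45, 6],
              [1, 5, 60]]
        oa, kappa = overall_accuracy_and_kappa(cm)
        print(f"overall accuracy = {oa:.2%}, kappa = {kappa:.3f}")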
     An Image Retrieval Algorithm Combining Multi-content Features
    ZHU Ling-yun1, ZHU Zheng-yu1,2, QI Xin-yong1
    2015, 0(9):  17-21. 
    In order to enhance the accuracy of image retrieval, a new content-based image retrieval algorithm, FCTL, is presented in this paper. In FCTL, the feature of an image is described by a 300-bin histogram that contains three types of information: color, texture, and location. The similarity of two images is represented by the Matsushita distance between their histograms. Retrieval results are returned sorted by their similarity to a given query image. ANMRR, which is recommended by MPEG-7, is used to evaluate performance in our comparative experiments. The comparative results on the WANG and UCID image libraries show that FCTL has higher retrieval accuracy than several frequently used algorithms.
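    A small Python sketch of the histogram-comparison step, assuming the Matsushita distance referred to above is the usual form d(p, q) = sqrt(sum_i (sqrt(p_i) - sqrt(q_i))^2) between normalized histograms; the 300-bin feature vectors are randomly generated placeholders.

        import numpy as np

        def matsushita_distance(h1, h2):
            """Distance between two normalized histograms:
            d(p, q) = sqrt( sum_i (sqrt(p_i) - sqrt(q_i))**2 )."""
            p = np.array(h1, dtype=float); p /= p.sum()
            q = np.array(h2, dtype=float); q /= q.sum()
            return np.sqrt(np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

        # Hypothetical 300-bin feature histograms (color + texture + location)
        rng = np.random.default_rng(0)
        query, candidate = rng.random(300), rng.random(300)
        print(matsushita_distance(query, candidate))   # smaller = more similar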
    An LED Taping Machine Vision Detection System Developed with VC
    BAI Ying-qian, MAO Hong-mei, PANG Xu-qing
    2015, 0(9):  22-24,29. 
    This article describes the vision detection system of an automated surface-mount LED taping machine based on digital image processing. The system uses a brightness-value method to identify missing components of the 5050 six-pin front-emitting LED. The detection method first converts the image to grayscale on the computer, then removes noise with a median filter, and then measures the brightness value of the carrier tape with and without material. The average brightness deviation is calculated and taken as a critical value: during detection, a brightness value greater than the critical value indicates that the carrier tape holds material, and a lower value indicates that it does not. The detection algorithm is simple and efficient; detection results are given in the article, and the feasibility and correctness of the method are validated both theoretically and in practice.
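    A minimal Python sketch of the detection chain described above (grayscale frame, median filtering, mean-brightness thresholding), using NumPy and SciPy rather than the VC implementation in the paper; the frame and calibration values are hypothetical.

        import numpy as np
        from scipy.ndimage import median_filter

        def carrier_has_material(gray_image, threshold):
            """Median-filter the grayscale frame, then compare its mean brightness
            with a previously calibrated critical value: above -> material present."""
            denoised = median_filter(gray_image, size=3)
            return denoised.mean() > threshold

        # Hypothetical calibration: midpoint of mean brightness with and without material
        mean_with, mean_without = 180.0, 90.0
        threshold = (mean_with + mean_without) / 2.0

        frame = np.full((64, 64), 175, dtype=np.uint8)    # hypothetical camera frame
        print(carrier_has_material(frame, threshold))      # True: tape pocket is filled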
    Application of Improved ISOMAP Algorithm for Face Recognition
    GUO Hai-feng1,2, CHEN Yue-xia1, SUN Zhou-bao2
    2015, 0(9):  25-29. 
    Image data is high-dimensional, which makes it prone to the curse of dimensionality. Traditional dimensionality reduction methods cannot recover its inherent structure. Manifold learning is a nonlinear dimensionality reduction technique that aims to find low-dimensional compact representations of high-dimensional observation data and to explore the inherent laws and intrinsic dimension of the data. In this paper, the SIFT feature extraction method and the adaptive ISOMAP method are combined and applied to a real face image dataset. The paper analyzes and discusses the effects of the neighborhood parameter and the intrinsic dimension on face image recognition.
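    A sketch of the dimensionality-reduction step with scikit-learn's Isomap, applied to hypothetical feature vectors standing in for the SIFT descriptors; the neighborhood size and target dimension are exactly the parameters whose effect the paper studies.

        import numpy as np
        from sklearn.manifold import Isomap
        from sklearn.neighbors import KNeighborsClassifier

        rng = np.random.default_rng(0)
        X = rng.random((200, 128))            # hypothetical SIFT-like descriptors per face
        y = rng.integers(0, 10, size=200)     # hypothetical identity labels

        # Neighborhood size and intrinsic dimension are the parameters under study
        embedding = Isomap(n_neighbors=8, n_components=20).fit_transform(X)

        clf = KNeighborsClassifier(n_neighbors=3).fit(embedding[:150], y[:150])
        print("recognition accuracy:", clf.score(embedding[150:], y[150:]))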
    Digital Halftoning Algorithm Based on Quantum Signal Processing
    XI Liang
    2015, 0(9):  30-34,41. 
    A new halftoning method based on quantum signal processing is proposed for digital halftoning. Firstly, the normalized gray image is expressed in the form of quantum bits; then a quantum measurement with a random observation is performed, collapsing the original image into a binary image. The error between the binary image and the original image is calculated, and the product of the error and an adaptive feedback coefficient is used for image enhancement. Finally, the halftone image is generated from the enhanced image using the traditional error diffusion algorithm. Simulation results show that the method reduces structural texture and increases the contrast of the halftone image, outperforming conventional digital halftoning methods in both subjective visual perception and objective evaluation indexes.
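    A rough Python sketch of the measurement-plus-error-diffusion idea: each normalized gray value is treated as the probability of measuring white, the measurement error is fed back to enhance the image, and conventional Floyd-Steinberg error diffusion produces the final halftone. The feedback coefficient and test image are hypothetical, and this is not the authors' implementation.

        import numpy as np

        def quantum_style_halftone(gray, feedback=0.5, seed=0):
            # Treat each normalized gray value as the probability of "measuring" white
            rng = np.random.default_rng(seed)
            measured = (rng.random(gray.shape) < gray).astype(float)
            # Feed the measurement error back to enhance the image (coefficient is hypothetical)
            enhanced = np.clip(gray + feedback * (gray - measured), 0.0, 1.0)

            # Conventional Floyd-Steinberg error diffusion on the enhanced image
            img = enhanced.copy()
            h, w = img.shape
            out = np.zeros_like(img)
            for y in range(h):
                for x in range(w):
                    out[y, x] = 1.0 if img[y, x] >= 0.5 else 0.0
                    err = img[y, x] - out[y, x]
                    if x + 1 < w:                 img[y, x + 1]     += err * 7 / 16
                    if y + 1 < h and x > 0:       img[y + 1, x - 1] += err * 3 / 16
                    if y + 1 < h:                 img[y + 1, x]     += err * 5 / 16
                    if y + 1 < h and x + 1 < w:   img[y + 1, x + 1] += err * 1 / 16
            return out

        print(quantum_style_halftone(np.linspace(0, 1, 64).reshape(8, 8)))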
    Underwater Dam Crack Image Enhancement Algorithm Based on Rough Set
    WANG Geng-ren, FAN Xin-nan, SHI Peng-fei, CHEN Wei
    2015, 0(9):  35-41. 
    In view of the complex environmental conditions of underwater image acquisition for dam cracks, such as the low signal-to-noise ratio caused by light scattering and attenuation in water, the extremely uneven light distribution, and the weakened crack texture, this paper proposes an adaptive enhancement algorithm that adopts the rule discovery and classification capability of rough sets. The algorithm exploits the inherent advantage of rough sets in mining data for useful information. Based on rough set theory, the defect images are partitioned according to an equivalence relation, brightness layers are computed from the upper and lower approximations, and the crack texture is then enhanced layer by layer. Furthermore, approximate classification accuracy and the significance of the system parameters are introduced to feed back the best number of layers. Finally, by calculating these values and analyzing their convergence, the best adaptive enhancement of the dam crack image is obtained. Simulation results verify the effectiveness of the algorithm.
    A Breakage Detection Method Based on Context-aware Saliency Detection
    DING Cao-kai, ZHOU Wu-neng
    2015, 0(9):  42-45. 
    A simple combination of image segmentation and edge detection is not the best way to detect breakage of transparent plastics in industrial settings. For this reason, a breakage detection method based on context-aware saliency detection is proposed for transparent plastics in industry. First, the point Hough transform (PHT) is carried out on the original image. Then graph-based superpixel segmentation is used to segment the transformed image. Finally, context-aware saliency detection is used to detect the salient area of the image. Experiments show that the method achieves high accuracy with a short running time.
    Algorithm of Hilbert-Huang Transform in Forecast of Passenger and Freight Volume
    PAN Wei-jun, PAN Yue-xiao, LU Guo-pan, ZENG Chen
    2015, 0(9):  46-49. 
    In order to improve the prediction accuracy for non-stationary time series, this paper uses the Empirical Mode Decomposition (EMD) method of Hilbert-Huang transform theory to decompose a non-stationary time series into several single-frequency IMF components. A neural network model is used to predict each IMF, and the prediction results are reconstructed and weighted, which improves the prediction accuracy. The method can also predict transport volume over a given period on the basis of historical passenger data. The experimental results show that the improved algorithm is better than the plain neural network method and the other comparison methods.
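    A sketch of the decompose-predict-reconstruct pipeline, assuming the third-party PyEMD package for the EMD step and scikit-learn's MLPRegressor as the neural network; the series, window length, and equal weighting of the per-IMF forecasts are hypothetical choices, not the paper's.

        import numpy as np
        from PyEMD import EMD                      # assumed third-party package (PyEMD)
        from sklearn.neural_network import MLPRegressor

        t = np.arange(200, dtype=float)
        series = np.sin(0.1 * t) + 0.3 * np.sin(0.7 * t) + 0.01 * t   # synthetic volume series

        imfs = EMD().emd(series)                   # decompose into IMF components

        def forecast_next(component, window=10):
            """Train a small neural network on sliding windows of one IMF
            and predict its next value."""
            X = np.array([component[i:i + window] for i in range(len(component) - window)])
            y = component[window:]
            model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
            model.fit(X, y)
            return model.predict(component[-window:].reshape(1, -1))[0]

        # Reconstruct the forecast by summing the per-IMF predictions (equal weights here)
        prediction = sum(forecast_next(imf) for imf in imfs)
        print("next-step forecast:", prediction)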
     A Balance Algorithm Between Handwriting Error and User Effort
    SHANG Xue-lian, LIANG Chuan-jun
    2015, 0(9):  50-56. 
    To solve the problem of poor performance in current computer-assisted transcription of handwritten text documents, a new algorithm is proposed that predicts the error rate in a block of automatically recognized words and estimates how much effort is required to correct a transcription to a user-defined error rate. Firstly, the main problem in traditional error estimation methods is analyzed. Then, the error is estimated for a whole block of words to raise the accuracy of the estimate. Finally, the best-performing techniques presented in previous work are combined to form our method. The proposed method is included in an interactive approach to transcribing handwritten text documents that efficiently employs user interactions by means of active and semi-supervised learning techniques. Transcription results, in terms of the trade-off between user effort and transcription accuracy, are reported for two real handwritten documents and prove the effectiveness of the proposed algorithm.
     BP Neural Network Algorithm Based on Frog Leaping Particle Swarm Optimization
    GU Quan-yu1, ZHANG Meng-ting2
    2015, 0(9):  57-59,65. 
    In order to solve the problems that the computation of the BP neural network is complex and its convergence is slow, an iterative method for the weights and thresholds of the BP neural network based on a heuristic algorithm is put forward. The method combines two advantages of the frog-leaping particle swarm algorithm: fewer control parameters than conventional approaches and a fast convergence speed. In essence, the weights and thresholds of the neural network are treated as particles; the BP neural network is trained by particle updating, and the accuracy of the algorithm is about 1.5342e-03.
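    A compact Python sketch of the underlying idea of treating the network's weights and thresholds as particles and updating them with swarm velocities. A plain PSO is shown here rather than the frog-leaping/PSO hybrid of the paper, and the network size, data, and hyperparameters are hypothetical.

        import numpy as np

        rng = np.random.default_rng(0)
        X = rng.random((50, 3))                        # hypothetical training inputs
        y = (X.sum(axis=1) > 1.5).astype(float)        # hypothetical targets

        dim = 3 * 4 + 4 + 4 * 1 + 1                    # weights + thresholds of a 3-4-1 net

        def mse(p):
            """Decode one particle into network parameters and return training MSE."""
            W1 = p[:12].reshape(3, 4); b1 = p[12:16]
            W2 = p[16:20].reshape(4, 1); b2 = p[20]
            hidden = np.tanh(X @ W1 + b1)
            out = 1.0 / (1.0 + np.exp(-(hidden @ W2).ravel() - b2))
            return np.mean((out - y) ** 2)

        # Plain PSO over the particle = (weights, thresholds) vector
        particles = rng.uniform(-1, 1, (30, dim))
        velocity = np.zeros_like(particles)
        pbest = particles.copy()
        pbest_val = np.array([mse(p) for p in particles])
        gbest = pbest[pbest_val.argmin()].copy()
        for _ in range(200):
            r1, r2 = rng.random((2, 30, dim))
            velocity = 0.7 * velocity + 1.5 * r1 * (pbest - particles) + 1.5 * r2 * (gbest - particles)
            particles += velocity
            vals = np.array([mse(p) for p in particles])
            better = vals < pbest_val
            pbest[better], pbest_val[better] = particles[better], vals[better]
            gbest = pbest[pbest_val.argmin()].copy()
        print("training MSE of best particle:", pbest_val.min())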
     Fusion Recommendation Algorithm Based on Hidden Markov Models
    YANG An-ju, YANG Yun, ZHOU Yuan-yuan, MIN Yu-juan, QIN Yi
    2015, 0(9):  60-65. 
    In view of the data sparseness and low recommendation accuracy of the traditional item-based collaborative filtering algorithm, this paper puts forward the HMM-ItemCF recommendation algorithm, which combines a Hidden Markov Model with traditional item-based collaborative filtering. The algorithm uses the Hidden Markov Model to analyze the rating behavior of all users in the system together with the historical behavior of the target user, finds the group of items the target user is most likely to rate highly at the next moment, and weights these occurrence probabilities into the traditional item-similarity calculation to obtain a new recommendation similarity and ultimately produce the recommendation results. Simulation experiments are carried out to train an important parameter of the algorithm and to compare it with other algorithms, which proves that the improved algorithm is effective.
    Review of Blind Classification for STBC
    YAN Wen-jun1, ZHANG Li-min2, LING Qing1, KONG Dong-ming3
    2015, 0(9):  66-71,76. 
    Blind signal classification, a major task in the wireless communication area, has important applications. Classification of space-time block codes (STBC) is a new and important topic in blind signal classification. This paper surveys the maximum-likelihood-based and feature-based algorithms for STBC classification. Firstly, the role of STBC classification in wireless communication systems is introduced. Secondly, the categories of classification algorithms are illustrated. Thirdly, the key steps and basic ideas of the algorithms are analyzed. Finally, the existing algorithms are summarized and future developments in STBC classification are pointed out.
    Research and Simulation of Routing Strategy for Resource Sharing Based on Opportunistic Networks
    QI Na
    2015, 0(9):  72-76. 
    Opportunistic networks use the contact opportunities generated by node movement to transmit information. Thanks to their self-organization and lack of a fixed communication infrastructure, they can be widely used for data sharing over short distances. However, traditional routing algorithms in opportunistic networks do not take the characteristics of data sharing into account, so messages can only be disseminated in traditional forms such as flooding. This paper first analyzes the specific features and attributes of data-sharing opportunistic networks and puts forward the concepts of the anchor zone and the destination node cluster in the process of message dissemination. Furthermore, we propose a new scheduling algorithm based on the time-to-live of messages with different importance and lifetime. Moreover, we put forward new routing strategies by combining the attributes of the data-sharing network with the scheduling algorithm, so that messages spread quickly and efficiently through the network according to their anchor region and destination node cluster. Finally, we verify the correctness and efficiency of the routing strategy through simulation experiments.
     An Adaptive Focused Crawling Algorithm Based on Link and Content Analysis
    ZHU Qing-sheng, XU Ning, ZHOU Yu
    2015, 0(9):  77-80,89. 
    Focused crawling is a key technique of focused search engines. To address the incomplete set of parameters considered in the On-line Topical Importance Estimation (OTIE) algorithm, this paper proposes an adaptive algorithm that combines link analysis with content analysis to estimate the priority of unvisited URLs in the frontier. Moreover, we consider the tunneling problem in the process of topical crawling. We select topics and seed pages from the Open Directory Project (ODP) and conduct comparative experiments with four crawling algorithms: Best-First, Shark-Search, OTIE, and our algorithm. The experimental results indicate that the proposed method improves the performance of the focused crawler, significantly outperforming the other three algorithms on average target recall while maintaining an acceptable harvest rate.
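    One possible frontier design for blending link-based and content-based evidence into a URL priority, as the abstract describes at a high level, is sketched below in Python; the scoring functions and mixing weight are hypothetical and are not the OTIE formula or the authors' exact method.

        import heapq

        def content_score(anchor_text, topic_terms):
            """Fraction of topic terms appearing in the anchor/context text (hypothetical score)."""
            words = set(anchor_text.lower().split())
            return len(words & topic_terms) / max(len(topic_terms), 1)

        def priority(link_score, anchor_text, topic_terms, alpha=0.6):
            """Blend link analysis with content analysis; alpha is a hypothetical weight."""
            return alpha * link_score + (1 - alpha) * content_score(anchor_text, topic_terms)

        topic = {"wavelet", "compression", "sensor"}
        frontier = []   # max-priority frontier via negated scores in a min-heap
        for url, link_score, anchor in [
                ("http://example.org/a", 0.9, "image compression with wavelets"),
                ("http://example.org/b", 0.2, "holiday photos")]:
            heapq.heappush(frontier, (-priority(link_score, anchor, topic), url))

        print(heapq.heappop(frontier)[1])   # most promising unvisited URL is crawled first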
    Analysis and Research on Performance of Compression Algorithm Based on Lifting Scheme Wavelet Transform in WSN
    TIAN Hong-zhou1,2, YING Bei-hua2
    2015, 0(9):  81-89. 
    In the design of a WSN (Wireless Sensor Network), using energy more effectively is the primary design goal. Data compression reduces the amount of data transmitted, which not only improves collection efficiency but also reduces the energy consumed by wireless communication, and is therefore very important for prolonging the life of the network. In existing research on data compression in WSNs, compression algorithms based on the lifting-scheme wavelet transform focus on computational complexity and ignore the real energy-saving effect when choosing the wavelet transform. In view of this situation, this paper takes three different types of data as samples, analyzes the lifting-scheme wavelet transforms commonly used for data compression, and compares their energy-saving performance. The validation results show that, under the same error tolerance, the 2/6 wavelet achieves better compression and larger energy savings than the other wavelets, which makes it more suitable for data compression applications in WSNs.
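    A Python sketch of a one-level lifting-scheme transform, showing the predict/update structure these compression algorithms share. The 5/3 (CDF(2,2)) lifting steps are used for illustration, not necessarily the 2/6 wavelet the paper finds best, and the sample readings are hypothetical.

        import numpy as np

        def lifting_53_forward(x):
            """One level of the 5/3 (CDF(2,2)) lifting wavelet transform.
            Predict: d[i] = odd[i] - (even[i] + even[i+1]) / 2
            Update:  s[i] = even[i] + (d[i-1] + d[i]) / 4
            Boundaries are handled periodically (np.roll) for brevity."""
            even, odd = x[0::2].astype(float), x[1::2].astype(float)
            d = odd - 0.5 * (even + np.roll(even, -1))          # predict step
            s = even + 0.25 * (d + np.roll(d, 1))               # update step
            return s, d

        # Hypothetical sensor readings: smooth data -> small detail coefficients,
        # which is what makes lifting-based compression attractive on a WSN node.
        samples = np.array([20.0, 20.1, 20.3, 20.2, 20.4, 20.6, 20.5, 20.7])
        approx, detail = lifting_53_forward(samples)
        print("approximation:", approx)
        print("detail (near zero, easy to compress):", detail)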
     Method of Web Information Extraction Based on Ontology Theory
    LIU Li-juan, ZHANG Yin, YANG Yi
    2015, 0(9):  90-94. 
    To obtain Web information on a specific topic, an ontology-based method is used to measure topic correlation in order to improve the quality of Web information extraction. Based on the Vector Space Model (VSM), the ontology method calculates topic correlation from the weights of feature words, which improves the quality of topic-specific Web information extraction. The method not only simplifies the dimensional computation in the VSM but also extends the semantic range. A practical application system with a layered architecture is used to demonstrate the implementation of the method. Practical application results show that the proposed method extracts Web information on a specific topic more accurately while reducing the computational complexity of the system, improves the recall and precision of Web information extraction, and thereby reduces missing page information and improves the overall quality of Web information extraction.
     Research on Optimal Cluster Number of LEACH Routing Protocol
    ZHANG Fei-ge
    2015, 0(9):  95-99,104. 
    Low-Energy Adaptive Clustering Hierarchy (LEACH) is a wireless sensor network routing protocol. It randomly and cyclically selects cluster heads and establishes clusters to reduce the data volume, with the aim of distributing the energy load evenly among the nodes in the network, but it does not consider whether the number of selected cluster-head nodes meets the optimal value. First, to prevent the cluster-head nodes from adversely affecting the network energy consumption, it is proposed to determine the optimal number of cluster heads according to the energy consumption of the nodes in the set-up stage and the steady-state stage, with information sent to the base station by a combination of single-hop and multi-hop transmission. Second, the cluster-head selection is made more reasonable by combining the optimal number of cluster heads with the residual energy of the nodes and their distance to the cluster head, so that less reliance is placed on each node's random number alone. The simulation results show that energy consumption can be reduced when the number of selected cluster heads lies in the optimal range.
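    For reference, the classic LEACH cluster-head election threshold T(n) = p / (1 - p (r mod 1/p)) makes the random, cyclic selection concrete; in the scheme above, the fraction p would be set from the derived optimal cluster number. The Python sketch below uses hypothetical network parameters and omits the bookkeeping of nodes that have already served as cluster heads in the current cycle.

        import random

        def leach_threshold(p, r):
            """Classic LEACH election threshold for round r with cluster-head fraction p."""
            return p / (1.0 - p * (r % int(round(1.0 / p))))

        def elect_cluster_heads(node_ids, p, r, seed=0):
            """Each eligible node becomes a cluster head if its random draw falls
            below the threshold; p would be tuned to the optimal cluster number."""
            rng = random.Random(seed + r)
            t = leach_threshold(p, r)
            return [n for n in node_ids if rng.random() < t]

        nodes = list(range(100))           # hypothetical 100-node network
        print(elect_cluster_heads(nodes, p=0.05, r=3))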
     Predicting Product Variant Design Time Based on PSO Algorithm
    WANG Zhao-hua1, TONG Shu-rong1, HUANG Li2
    2015, 0(9):  100-104. 
    Before the secondary development of a product, it is very difficult to forecast the variant design time. A variant design time prediction method based on the combination of extremum-disturbed particle swarm optimization (tPSO) and a fuzzy neural network (FNN) is proposed. First of all, the set of time factors is designated and the corresponding FNN time prediction model is established. However, the typical FNN training algorithm easily falls into local minima, converges slowly, and has low learning efficiency, so the FNN model is optimized by tPSO to overcome these disadvantages. Finally, the method is verified on the time prediction of a printer variant design. The result indicates that the model is feasible and effective.
     A Quality Evaluation Method for Banks Based on AHP
    MING Bang-xiang1, TENG Yi1, ZHANG Zi-zhen2
    2015, 0(9):  105-108,112. 
    In order to strengthen centralized control over provincial agency financial outlets, a comprehensive service quality evaluation model is established to evaluate the service quality of all the banks with a uniform standard. This paper uses the AHP method to establish the mathematical model, which includes three important steps. Firstly, the index evaluation system is established according to the provincial office's management and control requirements and the properties that affect its operation. Secondly, the hierarchical structure and the judgment matrices are established to calculate all the index weights. Thirdly, consistency checking is used to verify the index weights, and the quality evaluation method for China Postal Savings banks is derived. Lastly, this paper compares the outcomes of the model with the real operations of the banks. The model gives results consistent with the real situation for three banks, which indicates that the model used to predict bank service quality is correct and reliable and can be used for decision support.
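    A small Python sketch of the AHP steps listed above: index weights are taken from the principal eigenvector of a pairwise judgment matrix and verified with the consistency ratio CR = CI / RI; the judgment matrix here is hypothetical.

        import numpy as np

        # Saaty's random consistency index RI by matrix order
        RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45}

        def ahp_weights(judgment):
            """Index weights = normalized principal eigenvector of the judgment matrix;
            consistency ratio CR = CI / RI with CI = (lambda_max - n) / (n - 1)."""
            A = np.asarray(judgment, dtype=float)
            n = A.shape[0]
            eigvals, eigvecs = np.linalg.eig(A)
            k = np.argmax(eigvals.real)
            w = np.abs(eigvecs[:, k].real)
            w /= w.sum()
            ci = (eigvals[k].real - n) / (n - 1)
            return w, ci / RI[n]

        # Hypothetical 3-criterion judgment matrix for service quality indexes
        J = [[1,   3,   5],
             [1/3, 1,   2],
             [1/5, 1/2, 1]]
        weights, cr = ahp_weights(J)
        print("weights:", weights.round(3), "CR:", round(cr, 3), "(acceptable if CR < 0.1)")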
    Meat Classification of Salmon Based on Near Infrared Spectroscopy and Sparse Representation
    WANG Lei1, YIN Jiao-jiao1, YU Xin-jie2
    2015, 0(9):  109-112. 
    The meat characteristics of salmon are an important indicator for evaluating its quality; if these characteristics can be distinguished accurately, the discrimination time can be greatly reduced and the breeding success rate increased. In this paper, near-infrared spectroscopy and sparse representation are used to analyze and classify the meat characteristics of salmon. With astaxanthin content as the index of meat characteristics, principal component analysis (PCA) and sparse representation are compared as two different methods for reducing the dimensionality of the spectral data. On the reduced spectral data, classification models based on linear discriminant analysis (LDA) and the least-squares support vector machine (LS-SVM) are established. The test results show that the correct classification rate and dimensionality-reduction accuracy of the sparse representation model are higher than those of principal component analysis. Therefore, sparse representation classification provides a new and effective way to classify meat.
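    A sketch of the baseline pipeline compared above, PCA for spectral dimensionality reduction followed by LDA classification, using scikit-learn on synthetic spectra standing in for the near-infrared data; the sparse-representation counterpart is not reproduced here.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        # Hypothetical NIR spectra: 120 samples x 256 wavelengths, 2 meat-quality classes
        labels = rng.integers(0, 2, size=120)
        spectra = rng.normal(size=(120, 256)) + labels[:, None] * 0.4

        X_train, X_test, y_train, y_test = train_test_split(
            spectra, labels, test_size=0.3, random_state=0)

        pca = PCA(n_components=10).fit(X_train)          # spectral dimensionality reduction
        lda = LinearDiscriminantAnalysis().fit(pca.transform(X_train), y_train)
        print("LDA accuracy on PCA scores:", lda.score(pca.transform(X_test), y_test))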
    Research on ER Training Assessment Method Based on Fuzzy Mathematics
    YAN Bo1, QIN Yong1, LI Guo-he2,3, LIU Lu1
    2015, 0(9):  113-116,126. 
    In order to improve the accuracy of writing and implementing ER training plans in oil depots, a hierarchical fuzzy comprehensive assessment model is adopted to assess the results of ER training objectively and fairly. Firstly, the set of influencing factors is determined. Then the evaluation levels and their assignments are set. After that, the membership vectors are calculated. Finally, the results of the two-level fuzzy assessment and the comprehensive assessment are calculated, which realizes ER training assessment of plan writing, implementation, and effect evaluation. The assessment method overcomes the low response quality obtained when a staff questionnaire survey is used alone, and it has been successfully used to assess ER training in oil depots.
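    A Python sketch of the two-level fuzzy comprehensive assessment computation: factor weights are composed with the membership matrix (here with the weighted-average operator), first for each sub-factor set and then at the top level; all weights and membership values are hypothetical.

        import numpy as np

        def fuzzy_evaluate(weights, membership):
            """B = W . R with the weighted-average operator; rows of `membership`
            are the membership vectors of each factor over the evaluation levels."""
            b = np.asarray(weights, float) @ np.asarray(membership, float)
            return b / b.sum()                      # normalize the result vector

        # Hypothetical first-level factor sets (e.g. plan writing, implementation)
        b1 = fuzzy_evaluate([0.6, 0.4], [[0.5, 0.3, 0.2],
                                         [0.4, 0.4, 0.2]])
        b2 = fuzzy_evaluate([0.5, 0.5], [[0.3, 0.5, 0.2],
                                         [0.2, 0.5, 0.3]])

        # Second level: combine the first-level results with hypothetical weights
        overall = fuzzy_evaluate([0.55, 0.45], np.vstack([b1, b2]))
        levels = ["good", "fair", "poor"]
        print("assessment level:", levels[int(np.argmax(overall))], overall.round(3))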
    An Empirical Study of Institutional Pressures’ Impact on Cloud Computing Assimilation
    HU Dong-lan, WANG Nian-xin, JIA Yu, GE Shi-lun
    2015, 0(9):  117-121,126. 
    Based on institutional theory, this paper examines the links among institutional pressures, top management support, and the assimilation of cloud computing. Survey data from 376 Chinese firms show that mimetic pressures positively affect top management support, which in turn positively affects the assimilation of cloud computing, but mimetic pressures do not have a direct significant impact on cloud computing assimilation. Coercive pressures have neither a direct nor an indirect significant impact on cloud computing assimilation. Normative pressures positively affect cloud computing assimilation both directly and through the mediation of top management support. We conclude by highlighting the managerial implications and suggestions.
    Predictive Control Algorithm of PV Grid-connected Inverter Based on MLD Model
    LU Lu, GONG Ren-xi, WEI Qian
    2015, 0(9):  122-126. 
    The slow response of PV grid-connected inverter systems based on the traditional switching-function model affects the quality of the power fed into the grid. To solve this problem and improve the power quality of the inverter, the hybrid characteristics of the PV grid-connected inverter are analyzed, logical variables are introduced for the inverter switching dynamics, and the MLD (mixed logical dynamical) model of the system is built. A predictive control strategy is then proposed based on this kind of model. Comparative simulation results show that the predictive control algorithm based on the MLD model improves the dynamic performance of the system and reduces the time needed for the system to meet grid requirements, which verifies the feasibility of the proposed method.