WSEAS Transactions on Computers


Print ISSN: 1109-2750
E-ISSN: 2224-2872

Notice: As of 2014 and for the forthcoming years, the publication frequency/periodicity of WSEAS Journals follows the 'continuously updated' model. This means that, instead of being separated into issues, new papers are added on a continuous basis, allowing a more regular flow and shorter publication times. The papers appear in reverse chronological order, so the most recent one is on top.


Volume 15, 2016


Title of the Paper: Secure and Convenient Secret Management in Distributed Computer Systems

Authors: Blazej Adamczyk

Abstract: Team administration of large, distributed computer systems can become challenging when it comes to access control mechanisms. Some devices provide a multi-user environment and support centralized authentication servers, while others lack these features, which creates the need for shared secrets among all administrators. Shared secrets are also quite frequently used to provide database and datastore access for application servers. This article summarizes different approaches to building a secret sharing system and discusses their features and security. The author first shows an example of an improperly designed secret sharing system and discusses its consequences. Finally, the paper presents a secure design pattern for such a system that combines proper use of cryptography, a zero-knowledge server and a convenient user interface.

Keywords: shared secrets, password management, authentication, security

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 15, 2016, Art. #31, pp. 327-333
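
The abstract above argues for a zero-knowledge server, i.e. secrets are encrypted on the administrator's machine so the server only ever stores ciphertext. The following is a minimal sketch of that idea using the Fernet recipe from the Python cryptography package; the SecretStore class, its methods and the secret names are hypothetical illustrations, not the author's design.

```python
# Minimal sketch of client-side ("zero-knowledge") secret storage.
# The SecretStore class is a hypothetical illustration, not the paper's design.
from cryptography.fernet import Fernet


class SecretStore:
    """Server-side store that only ever sees ciphertext."""

    def __init__(self):
        self._blobs = {}  # secret name -> encrypted bytes

    def put(self, name: str, ciphertext: bytes) -> None:
        self._blobs[name] = ciphertext

    def get(self, name: str) -> bytes:
        return self._blobs[name]


# Client side: the key never leaves the administrator's machine.
key = Fernet.generate_key()          # in practice derived from a passphrase via a KDF
fernet = Fernet(key)

store = SecretStore()
store.put("db-password", fernet.encrypt(b"s3cr3t-value"))

# Only clients holding the key can recover the plaintext.
plaintext = fernet.decrypt(store.get("db-password"))
assert plaintext == b"s3cr3t-value"
```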


Title of the Paper: A Whale Optimization Algorithm with Inertia Weight

Authors: Hongping Hu, Yanping Bai, Ting Xu

Abstract: The Whale Optimization Algorithm (WOA) is a novel nature-inspired meta-heuristic optimization algorithm proposed by Seyedali Mirjalili and Andrew Lewis in 2016, which mimics the social behavior of humpback whales. A new control parameter, an inertia weight, is introduced to tune the influence of the current best solution, yielding an improved whale optimization algorithm (IWOA). IWOA is tested on 31 high-dimensional continuous benchmark functions. The numerical results demonstrate that the proposed IWOA is a powerful search algorithm. The optimization results show that the proposed IWOA not only significantly improves the basic whale optimization algorithm but also performs much better than both the artificial bee colony algorithm (ABC) and the fruit fly optimization algorithm (FOA).

Keywords: whale optimization algorithm, artificial bee colony algorithm, fruit fly optimization algorithm, inertia weight, benchmark functions

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 15, 2016, Art. #30, pp. 319-326
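
The abstract above only states that an inertia weight tunes the influence of the current best solution; it does not give the update rule. The sketch below shows a simplified WOA position update in which an assumed weight w multiplies the best position in both the encircling and spiral moves, with an assumed linear weight schedule; it is an illustration of the idea, not the authors' exact IWOA.

```python
# Simplified inertia-weighted WOA update; where the weight enters and how it
# decays are assumptions, since the abstract does not specify them.
import numpy as np

rng = np.random.default_rng(0)

def iwoa_step(positions, best, t, max_iter, b=1.0, w_min=0.4, w_max=0.9):
    """One simplified IWOA iteration; scalar A and C per whale for brevity."""
    n, dim = positions.shape
    a = 2.0 * (1 - t / max_iter)                   # decreases linearly from 2 to 0
    w = w_max - (w_max - w_min) * t / max_iter     # assumed linear weight schedule
    new_pos = np.empty_like(positions)
    for i in range(n):
        A = 2 * a * rng.random() - a
        C = 2 * rng.random()
        if rng.random() < 0.5:
            if abs(A) < 1:                         # encircle the (weighted) best
                D = np.abs(C * w * best - positions[i])
                new_pos[i] = w * best - A * D
            else:                                  # explore around a random whale
                rand = positions[rng.integers(n)]
                D = np.abs(C * rand - positions[i])
                new_pos[i] = rand - A * D
        else:                                      # spiral (bubble-net) update
            l = rng.uniform(-1, 1)
            D = np.abs(w * best - positions[i])
            new_pos[i] = D * np.exp(b * l) * np.cos(2 * np.pi * l) + w * best
    return new_pos

# Toy usage on the sphere function.
sphere = lambda P: (P ** 2).sum(axis=1)
X = rng.uniform(-5, 5, size=(20, 10))
best = X[sphere(X).argmin()].copy()
for t in range(200):
    X = iwoa_step(X, best, t, 200)
    cand = X[sphere(X).argmin()]
    if sphere(cand[None])[0] < sphere(best[None])[0]:
        best = cand.copy()
print("best fitness:", sphere(best[None])[0])
```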


Title of the Paper: Multiobjective Optimization of Construction Project Time-Cost-Quality Trade-off Using a Genetic Algorithm

Authors: J. Magalhães-Mendes

Abstract: In construction projects, time, cost and quality are the most important factors to be considered. In the present study, a hybrid genetic algorithm is used to solve this multiobjective time-cost-quality optimization problem. The chromosome representation of the problem is based on random keys. Schedules are constructed using a priority rule in which the priorities are defined by the genetic algorithm, and the execution mode of each activity is also selected by the genetic algorithm. Schedules are built by a procedure that generates parameterized active schedules. The results indicate that this approach can assist decision-makers in obtaining good solutions for project duration and cost while incorporating quality, with a minimum number of function evaluations.

Keywords: Construction Management, Project management, Genetic algorithms, Multiobjective optimization, Time-cost-quality tradeoff

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 15, 2016, Art. #29, pp. 310-318
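
The abstract above describes a random-key chromosome that defines activity priorities and execution modes. The following is a minimal sketch of how such a chromosome might be decoded; the two-part gene layout, the activity names and the mode counts are illustrative assumptions, not the paper's exact encoding.

```python
# Sketch of decoding a random-key chromosome into activity priorities and
# execution modes. The gene layout (priority keys first, mode keys second) is
# an assumption for illustration only.
import numpy as np

rng = np.random.default_rng(1)

activities = ["A", "B", "C", "D"]                       # hypothetical activities
modes_per_activity = {"A": 3, "B": 2, "C": 3, "D": 2}   # time/cost/quality modes

n = len(activities)
chromosome = rng.random(2 * n)                          # random keys in [0, 1)

# First n genes: priorities (higher key = scheduled earlier by the SGS rule).
priorities = dict(zip(activities, chromosome[:n]))
priority_order = sorted(activities, key=priorities.get, reverse=True)

# Last n genes: map each key onto one of the activity's execution modes.
modes = {
    a: int(chromosome[n + i] * modes_per_activity[a])
    for i, a in enumerate(activities)
}

print("scheduling order:", priority_order)
print("selected modes:  ", modes)
```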


Title of the Paper: An Improved Support Vector Machine Based on Particle Swarm Optimization in Laser Ultrasonic Defect Detection

Authors: Ting Xu, Hongping Hu, Xiaoyan Wang, Hui Liu, Yanping Bai

Abstract: Laser ultrasonic defect detection and classification is widely used in engineering and material defect detection, so detecting and classifying defect targets accurately is significant. In order to obtain higher classification accuracy, an improved support vector machine (SVM) based on a particle swarm optimization algorithm is used as the classifier in this paper. To search for the optimal SVM parameters, a new Tangent Decreasing Inertia Weight particle swarm optimization (TPSO) algorithm is proposed. In addition, to further improve the classification accuracy, sparse representation is used to extract the target features from the real target echo waveforms obtained in experiments. Experimental results show that the proposed TPSO-SVM achieves higher classification accuracy than the commonly used PSO-SVM, the classical SVM and a BP neural network (BPNN) in classifying laser ultrasonic defect signals.

Keywords: Laser ultrasonic defect detection, Classification method, Support vector machine, Particle swarm optimization, Sparse representation

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 15, 2016, Art. #28, pp. 303-309
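
The abstract above does not give the tangent-decreasing inertia-weight formula, so the schedule used below is an assumed form that merely decreases from w_max to w_min along a tangent curve. The sketch embeds it in a standard PSO velocity update searching SVM hyper-parameters (C, gamma) on a toy dataset; none of this is the authors' exact TPSO-SVM.

```python
# Sketch: assumed tangent-decreasing inertia weight inside a plain PSO that
# tunes SVM hyper-parameters by cross-validated accuracy on a toy dataset.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(2)
X, y = load_iris(return_X_y=True)

def fitness(p):
    """Cross-validated accuracy of an RBF SVM with C=10**p[0], gamma=10**p[1]."""
    clf = SVC(C=10 ** p[0], gamma=10 ** p[1])
    return cross_val_score(clf, X, y, cv=3).mean()

T, n = 20, 10
pos = rng.uniform(-2, 2, size=(n, 2))          # particles encode log10(C), log10(gamma)
vel = np.zeros_like(pos)
pbest, pbest_f = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_f.argmax()].copy()

for t in range(T):
    # Assumed tangent-decreasing schedule: w_max at t=0, w_min at t=T.
    w = 0.4 + (0.9 - 0.4) * np.tan((np.pi / 4) * (1 - t / T))
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = w * vel + 2.0 * r1 * (pbest - pos) + 2.0 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, -2, 2)
    f = np.array([fitness(p) for p in pos])
    improved = f > pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[pbest_f.argmax()].copy()

print("best log10(C), log10(gamma):", gbest)
```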


Title of the Paper: Supervisory Host Model: A New Paradigm to Enhance the Security and Power Management for Mobile Grid Environment

Authors: H. Parveen Begam, M. A. Maluk Mohamed

Abstract: Digital certificates and signatures provide protection in legally binding situations. Even though digital certificates are used in banking and other legally binding processes, they have their own drawbacks, which can be financial or technical: financial in the form of subscription fees for the service, and technical in the form of building a platform that can accept all digital certificates, as well as human errors. If the CA is subverted, the security of the entire system is lost, resulting in a security breach of all entities that trust the compromised CA. Maintaining a CA at two or more levels requires more cost and time for authentication. Moreover, a CA is a third party, and most business organizations, corporations, colleges and governments that use such applications expect more security but do not like to depend on a third party. This paper proposes a new concept called the Supervisory Host (SH), a static node that takes care of certificate generation and distribution. The advantages of the Supervisory Host are a simple registration process, a reduced level of key exchange and an improved authentication process between two parties. Since an SH is maintained independently for a particular application, the trust level is high, and the authentication process can be improved with the help of the ECDSA (Elliptic Curve Digital Signature Algorithm). Existing projects use the RSA (Rivest-Shamir-Adleman) algorithm for encrypting messages, which has a key size of 1024 bits, whereas in ECDSA the key size is 168 bits, which is far smaller than that of the RSA algorithm. The main advantages of ECDSA are greater security for a given key size and effective, compact implementations of cryptographic operations requiring smaller chips, which results in less heat generation and lower power consumption; it is suitable for machines with low bandwidth, low computing power and little memory, and allows easy hardware implementation. Thus the combination of the SH and the ECDSA algorithm helps to improve the trust level in organizations.

Keywords: Mobile Grid Computing, Authentication, Secure Communication, Secure Service Certificate (SSC), Certificate generation and encryption, Digital Signature, ECDSA

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 15, 2016, Art. #27, pp. 287-302
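
The Supervisory Host described above relies on ECDSA for signing and verifying certificates. Below is a minimal sketch of ECDSA key generation, signing and verification with the Python cryptography package; the curve (NIST P-256) and the message payload are illustrative assumptions and do not correspond to the 168-bit key size mentioned in the paper.

```python
# Minimal ECDSA sign/verify sketch with the `cryptography` package. The curve
# choice (NIST P-256) is an assumption, not the paper's key size.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

# Supervisory Host side: generate a key pair and sign a certificate payload.
private_key = ec.generate_private_key(ec.SECP256R1())
message = b"node-42 service certificate"          # hypothetical payload
signature = private_key.sign(message, ec.ECDSA(hashes.SHA256()))

# Relying party side: verify with the public key.
public_key = private_key.public_key()
try:
    public_key.verify(signature, message, ec.ECDSA(hashes.SHA256()))
    print("signature valid")
except InvalidSignature:
    print("signature invalid")
```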


Title of the Paper: Human Pose Estimation for RGBD Imagery with Multi-Channel Mixture of Parts and Kinematic Constraints

Authors: Enrique Martinez-Berti, Antonio J. Sánchez-Salmerón, Carlos Ricolfe-Viala, Oliver Nina, Mubarak Shah

Abstract: In this paper, we present an approach that combines monocular and depth information with a multi-channel mixture-of-parts model constrained by a structured linear quadratic estimator for more accurate estimation of joints in human pose estimation. Furthermore, in order to speed up our algorithm, we introduce an inverse kinematics optimization that allows us to infer additional joints that were not included in the original solution. This allows us to train in less time and with only a subset of the total number of joints in the final solution. Our results show a significant improvement over state-of-the-art methods on the CAD60 dataset and on our own dataset. In addition, our method can be trained in less time and with a smaller fraction of the training samples compared to state-of-the-art methods.

Keywords: DPM, Kalman Filter, Pose Estimation, Kinematic Constraints

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 15, 2016, Art. #26, pp. 279-286


Title of the Paper: Generic Data Services for the Management of Large-Scale Data Applications

Authors: Wolfgang Süß, Karl-Uwe Stucky, Wilfried Jakob, Heiko Maaß, Hüseyin K. Cakmak

Abstract: A new metadata-driven concept of generic data management services for a wide range of applications processing large-scale data of arbitrary structure is introduced. The novelty of the approach is a metadata structure that subdivides metadata into three basic types according to their functionality: description of the structure and type of the data, specification of key values for data identification, and security and organizational purposes. A first implementation of the core features and the feasibility of the concept are demonstrated by managing the data storage for an application that measures and analyses voltage data from electrical power grids. Since electrical measurement data needs privacy protection, our proposal to address this requirement is also outlined.

Keywords: generic data services, metadata, object-oriented data management, large-scale data, data security

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 15, 2016, Art. #25, pp. 265-278
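
The abstract above names three metadata types: structure/type description, key values for identification, and security/organizational information. A minimal sketch of one way such a split could be represented follows; every class and field name here is a hypothetical illustration, not the authors' schema.

```python
# Hypothetical representation of the three metadata types named in the
# abstract; the class and field names are illustrative assumptions.
from dataclasses import dataclass, field


@dataclass
class StructureMetadata:          # describes structure and type of the data
    fields: dict[str, str]        # e.g. {"voltage": "float64", "timestamp": "datetime"}


@dataclass
class KeyMetadata:                # specifies key values used to identify records
    key_fields: list[str]         # e.g. ["grid_id", "timestamp"]


@dataclass
class SecurityMetadata:           # security and organizational information
    owner: str
    access_roles: list[str] = field(default_factory=list)


@dataclass
class DatasetDescriptor:
    structure: StructureMetadata
    keys: KeyMetadata
    security: SecurityMetadata


descriptor = DatasetDescriptor(
    structure=StructureMetadata({"voltage": "float64", "timestamp": "datetime"}),
    keys=KeyMetadata(["grid_id", "timestamp"]),
    security=SecurityMetadata(owner="grid-lab", access_roles=["analyst"]),
)
print(descriptor)
```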


Title of the Paper: Selectional Preferences Based on Distributional Semantic Model

Authors: Chao Li, Fuji Ren, Xin Kang

Abstract: In this paper, we propose an approach based on a distributional semantic model to selectional preference in the verb & dobj (direct object) relationship. The distributional representations of words, obtained with the Word2Vec algorithm, are employed as the semantic feature. A machine learning method is used to build the discrimination model. Experimental results show that the proposed approach is effective in discriminating the compatibility of object words, and that the performance can be improved by increasing the amount of training data. Compared with the previous method, the proposed method obtains promising results with a clear improvement. Moreover, the results demonstrate that semantics is a universal, effective and stable feature for this task, which is consistent with our intuition about word usage.

Keywords: Selectional preferences, Distributional semantic model (DSM), Lexical semantics, Word2vec algorithm

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 15, 2016, Art. #24, pp. 258-264
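
The pipeline described above (Word2Vec vectors as features, a learned model judging verb-object compatibility) can be sketched as follows. The toy corpus, the compatibility labels and the use of concatenated vectors with logistic regression are all assumptions for illustration; the paper's actual training data and classifier may differ.

```python
# Sketch of the verb-object compatibility idea: distributional vectors from
# Word2Vec feed a simple classifier. Corpus and labels are toy assumptions.
import numpy as np
from gensim.models import Word2Vec
from sklearn.linear_model import LogisticRegression

corpus = [
    ["she", "eats", "apple"], ["he", "eats", "bread"],
    ["she", "reads", "book"], ["he", "reads", "paper"],
    ["she", "eats", "book"],  ["he", "reads", "bread"],
]
model = Word2Vec(sentences=corpus, vector_size=50, window=2, min_count=1, epochs=200)

def pair_vector(verb, obj):
    """Feature for a (verb, direct object) pair: concatenated word vectors."""
    return np.concatenate([model.wv[verb], model.wv[obj]])

# Toy training pairs labelled 1 = compatible, 0 = incompatible.
pairs = [("eats", "apple", 1), ("eats", "bread", 1), ("reads", "book", 1),
         ("reads", "paper", 1), ("eats", "book", 0), ("reads", "bread", 0)]
X = np.array([pair_vector(v, o) for v, o, _ in pairs])
y = np.array([label for _, _, label in pairs])

clf = LogisticRegression(max_iter=1000).fit(X, y)
print(clf.predict([pair_vector("eats", "paper")]))   # compatibility guess
```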


Title of the Paper: Enumerating all the Irreducible Polynomials over Finite Field

Authors: Nader H. Bshouty, Nuha Diab, Shada R. Kawar, Robert J. Shahla

Abstract: In this paper we give a detailed analysis of deterministic and randomized algorithms that enumerate any number of irreducible polynomials of degree n over a finite field, together with their roots in the extension field, at quasilinear time cost per element. Our algorithm is based on an improved algorithm for enumerating all Lyndon words of length n with linear delay and on the known reduction of Lyndon words to irreducible polynomials.

Keywords: Finite field, irreducible polynomials, Lyndon words

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 15, 2016, Art. #23, pp. 248-257
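
The enumeration above builds on generating Lyndon words of length n and on their known correspondence with monic irreducible polynomials of degree n over GF(q). The sketch below shows the classic Duval-style generation only; the reduction to irreducible polynomials, which is the paper's contribution, is not reproduced here.

```python
# Generate all Lyndon words of length exactly n over a q-letter alphabet in
# lexicographic order (Duval-style generation). The count of such words equals
# the number of monic irreducible polynomials of degree n over GF(q).
def lyndon_words(q, n):
    w = [-1]
    while w:
        w[-1] += 1
        if len(w) == n:
            yield tuple(w)
        m = len(w)
        while len(w) < n:                 # extend periodically up to length n
            w.append(w[len(w) - m])
        while w and w[-1] == q - 1:       # discard trailing maximal symbols
            w.pop()

print(list(lyndon_words(2, 4)))   # [(0, 0, 0, 1), (0, 0, 1, 1), (0, 1, 1, 1)]
```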


Title of the Paper: Methods and Algorithms for a High-Level Synthesis of the Very-Large-Scale Integration

Authors: Oleg Nepomnyashchy, Alexandr Legalov, Valery Tyapkin, Igor Ryzhenko, Vladimir Shaydurov

Abstract: We develop methods and algorithms for the high-level synthesis and formal verification of very-large-scale integration (VLSI) architectures. The proposed approach is based on the functional-flow paradigm of parallel computing and enables architecture-independent VLSI synthesis through the construction of a computing model in the form of intermediate control and data graphs. This approach also provides an opportunity to verify a design at the formal description stage, before synthesis of the register-gate representation. Algorithms and methods are developed for the construction and optimization of the intermediate representation of the computing model, for its verification, and for the transition to the register-gate description of the VLSI. The stages of high-level VLSI synthesis are formulated in the context of the proposed technique. The synthesis of a typical digital signal processing module is considered as an example, and results of practical modeling are presented for it.

Keywords: Parallel computing, functional programming, high-level synthesis, formal verification

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 15, 2016, Art. #22, pp. 239-247


Title of the Paper: Probabilistic Space-Time Analysis of Human Mobility Patterns

Authors: Ha Yoon Song

Abstract: Human mobility models are widely required in many academic and industrial fields. Due to the spread of portable devices with positioning functionality, such as smartphones, ordinary people can obtain their current position or record their mobility history. Thus, mobility history can be processed in order to identify human mobility patterns. A human mobility pattern can be analysed in two ways: space and time. Space analysis focuses on a user's location, and time analysis emphasises a user's mobility on a daily basis. From the raw positioning data of various sets of user mobility, we analysed a personal human mobility model. Each user's positioning data set is pre-processed and clustered by space and time. For spatial clustering, we developed a clustering mechanism based on expectation maximisation; for temporal clustering, stay and transition probabilities over a 24-hour period were analysed. We represented the resulting personal human mobility model using a continuous time Markov chain (CTMC) with spatial transition probabilities over time. We developed a process to construct a personal human mobility model from a person's positioning data set. This personal mobility model can act as a basis for many other academic or industrial applications.

Keywords: Human Mobility Model, Space-Time Analysis, Location Clustering, Continuous Time Markov Chain, Individual Personal Mobility

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 15, 2016, Art. #21, pp. 213-229
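
The spatial step of the pipeline above (EM clustering of position fixes into places, then transitions between places) can be sketched with a Gaussian mixture model. The synthetic coordinates, the two-place layout and the plain row-normalised count matrix are assumptions; the paper's CTMC additionally conditions transitions on the time of day.

```python
# Sketch of the space-clustering step: EM (Gaussian mixture) over positioning
# records, then empirical transition counts between the resulting places.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)
# Toy GPS trace: two frequently visited places (e.g. home and work).
home = rng.normal([37.55, 127.00], 0.002, size=(200, 2))
work = rng.normal([37.50, 127.03], 0.002, size=(200, 2))
trace = np.vstack([home, work])[rng.permutation(400)]

gmm = GaussianMixture(n_components=2, random_state=0).fit(trace)
states = gmm.predict(trace)                      # place label per position fix

# Empirical transition matrix between places (row-normalised counts).
K = 2
counts = np.zeros((K, K))
for a, b in zip(states[:-1], states[1:]):
    counts[a, b] += 1
P = counts / counts.sum(axis=1, keepdims=True)
print(P)
```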


Title of the Paper: New Constructions for Thin Line Problem in Visual Cryptography

Authors: Maged Wafy

Abstract: Visual cryptography schemes (VCSs) are techniques that divide a secret image into n shares; stacking exactly n shares reveals the secret image. One VCS technique is the deterministic VCS (D-VCS), which suffers from the pixel expansion issue. This problem has been solved by the probabilistic size-invariant VCS (P-SIVCS); however, the visual quality of the revealed secret image produced by P-SIVCS is low. In D-VCS, a thin line is turned into a thick line because of the pixel expansion problem, while it may not be visible at all in the revealed secret image created by P-SIVCS. In this paper, two new constructions are introduced for resolving the thin line problem with good visual quality and without pixel expansion, using image manipulation, D-VCS, and a mixture of D-VCS and P-SIVCS.

Keywords: Visual cryptography, Thin line problem, Pixel expansion

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 15, 2016, Art. #20, pp. 212-221
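
To make the thin-line problem concrete, the sketch below implements the classic deterministic (2, 2) scheme with 2x2 pixel expansion, which is the baseline whose behaviour the paper improves on; it is not one of the paper's new constructions. A one-pixel line in the secret becomes a two-sub-pixel-thick line in the stacked result.

```python
# Classic deterministic (2, 2) VCS with 2x2 pixel expansion (baseline only).
import numpy as np

rng = np.random.default_rng(4)
# Each secret pixel becomes a 2x2 block containing exactly two black sub-pixels.
PATTERNS = [np.array([[0, 1], [1, 0]]), np.array([[1, 0], [0, 1]])]  # 1 = black

def encode(secret):                      # secret: 2-D array, 1 = black pixel
    h, w = secret.shape
    s1 = np.zeros((2 * h, 2 * w), dtype=int)
    s2 = np.zeros_like(s1)
    for i in range(h):
        for j in range(w):
            p = PATTERNS[rng.integers(2)]
            s1[2*i:2*i+2, 2*j:2*j+2] = p
            # White pixel: identical blocks (stack to half black).
            # Black pixel: complementary blocks (stack to fully black).
            s2[2*i:2*i+2, 2*j:2*j+2] = p if secret[i, j] == 0 else 1 - p
    return s1, s2

secret = np.zeros((8, 8), dtype=int)
secret[4, :] = 1                         # a one-pixel-thin horizontal line
share1, share2 = encode(secret)
revealed = np.maximum(share1, share2)    # stacking = pixel-wise OR
print(revealed.sum(), "black sub-pixels; the thin line is now 2 sub-pixels thick")
```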


Title of the Paper: Empirical Study on Adoption and Institutionalization of B2B e-Commerce in SMEs in Saudi Arabia

Authors: Almaaf Bader Ali A., Jian-Jun Miao, Quang-Dung Tran

Abstract: Experts and business pundits forecast drastic changes in the e-commerce climate of Saudi Arabia when the country became an official member of the World Trade Organization (WTO) in 2005. In fact, over the last decade many Saudi Arabian enterprises have adopted e-commerce, though almost exclusively at the entry level. In this research, a model of e-commerce adoption and institutionalization is developed based on the Technology-Organization-Environment framework. Data collected from 133 small and medium-sized enterprises that are adopters in the country were analyzed. The results show that (1) the major factors affecting initial adoption of e-commerce are very different from those influencing institutionalization of the technology, and (2) the experience of small and medium-sized adopters of e-commerce does not significantly influence whether they move up towards e-commerce sophistication. Both theoretical and practical implications for promoting e-commerce adoption by SMEs in advanced developing economies, such as Saudi Arabia, are discussed.

Keywords: e-commerce adoption, e-commerce institutionalization, SMEs, Saudi Arabia

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 15, 2016, Art. #19, pp. 189-211


Title of the Paper: A Markovian Model for Adaptive E-Assessment

Authors: G. S. Nandakumar, S. Thangasamy, V. Geetha, V. Haridoss

Abstract: The development of Information and Communication Technologies (ICT) has increased the popularity of web-based learning and E-assessment. The success of any online assessment is largely dependent on the quality of the question bank from which the questions are drawn. Various techniques are available for dynamically generating questions of different difficulty levels during E-assessment. Calibrating the question bank to determine the measurement characteristics of the questions is a necessary part of large-scale E-assessment, and classification of a question involves assigning a difficulty level to it. An adaptive E-assessment strategy has been formulated to test proficiency in 'Programming using C'. This paper deals with the application of Markov chains to assess the reliability of question classification and to classify the performance of students based on how they handle difficulty levels over a period of time.

Keywords: Adaptive E-assessment, Question Classification, Multiple Choice Questions (MCQ), Degree of Toughness (DT), Markov Chain, Steady State Probability

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 15, 2016, Art. #18, pp. 179-188
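
A central quantity in the Markov-chain analysis mentioned above is the steady-state probability vector of a transition matrix over difficulty levels. The sketch below computes it for an assumed 3-state toy matrix (easy/medium/hard); the matrix values are illustrative, not the paper's calibrated data.

```python
# Steady-state probabilities of a discrete-time Markov chain over assumed
# difficulty levels (toy transition matrix, not the paper's data).
import numpy as np

P = np.array([
    [0.5, 0.4, 0.1],   # from "easy"
    [0.2, 0.5, 0.3],   # from "medium"
    [0.1, 0.3, 0.6],   # from "hard"
])

# Solve pi P = pi with sum(pi) = 1: left eigenvector for eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1))])
pi /= pi.sum()
print("steady-state probabilities:", pi)
```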


Title of the Paper: A Novel Approach for Protecting Privacy in Cloud Storage Based Database Applications

Authors: Ambika Pawar, Ajay Dani

Abstract: Cloud computing technology is appealing to many organizations and has become a popular approach to reducing the cost of services. Adoption of cloud storage is, however, limited due to privacy problems in storing and retrieving sensitive information. This paper presents a solution to the privacy problem that splits a table containing identity and sensitive information into two separate tables, one holding the identity information and the other holding only the sensitive information. It provides k-anonymity while performing insert, select, update and delete operations, and the degree of privacy increases with the value of k. In this paper we carry out a performance analysis of the proposed scheme for different values of k, which shows that system performance does not degrade significantly for higher values of k. Regression models relating dependent variables such as response time, local computation cost and local communication cost to the independent variable k have also been developed. These help system designers to identify the degree of privacy required and to predict the time required for different database operations. The residual analysis shows that the developed models are a good fit for the experimental data.

Keywords: cloud storage, privacy, k-anonymity, database, splitting, shuffling, insert, retrieve, update, delete, query, sensitive

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 15, 2016, Art. #17, pp. 167-178
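
The table split described above can be sketched as a vertical split into an identity table and a sensitive table linked by an opaque key. The pandas layout, the column names and the shuffling step are assumptions for illustration; the k-anonymity machinery of the paper is not shown here.

```python
# Sketch of the vertical split: identity attributes and sensitive attributes
# go to separate tables linked by an opaque key (illustrative columns only).
import uuid
import pandas as pd

patients = pd.DataFrame({
    "name":      ["Alice", "Bob", "Carol"],
    "zip":       ["41001", "41002", "41003"],
    "diagnosis": ["flu", "diabetes", "asthma"],
})

# Opaque link key; shuffling one table's rows weakens direct linkage further.
patients["link"] = [uuid.uuid4().hex for _ in range(len(patients))]

identity_table  = patients[["link", "name", "zip"]]
sensitive_table = patients[["link", "diagnosis"]].sample(frac=1.0, random_state=0)

# An authorised query re-joins the two tables on the link key.
rejoined = identity_table.merge(sensitive_table, on="link")
print(rejoined[["name", "diagnosis"]])
```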


Title of the Paper: Dynamic Workload-Aware Elastic Scale-Out in Cloud Data Stores

Authors: Swati Ahirrao, Rajesh Ingle

Abstract: NoSQL databases store the huge amounts of data generated by modern web applications. To improve scalability, a database is partitioned and distributed among different nodes, which is called scale-out. However, this scale-out feature of NoSQL databases is oblivious to the data access patterns of web applications, which results in poorly distributed data across the nodes and therefore increases the cost of query execution. This paper describes a partition placement strategy that assigns data partitions to the available domains in Amazon SimpleDB according to the data access pattern of the web application, which leads to an increase in throughput. We present a workload-aware elasticity algorithm that not only adds and removes domains according to the load, but also places the partitions according to the data access pattern. We have validated the workload-aware elasticity and load distribution algorithms experimentally on a cloud data store, Amazon SimpleDB, running in the Amazon Cloud. The throughput of the load distribution algorithm is predicted using regression and a multilayer perceptron model.

Keywords: partition placement, workload-aware elasticity, data partitioning, database scalability, placement strategy

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 15, 2016, Art. #16, pp. 158-166


Title of the Paper: A Fuzzy Based Classification Approach for Efficient Fake and Real Fingerprint Classification with Intelligent Feature Selection

Authors: V. Sasikala, V. Lakshmi Prabha

Abstract: Fake and real fingerprint classification has become an attractive research area in the last decade, and a number of research works have been carried out to classify fake and real fingerprints. However, most existing techniques do not utilize swarm intelligence in their fingerprint classification systems, even though swarm intelligence has been widely used in various applications due to its robustness and its potential for solving complex optimization problems. This paper aims to develop a new and efficient fingerprint classification approach, based on swarm intelligence and fuzzy neural network techniques, which overcomes the limitations of existing classification approaches. The proposed classification methodology comprises four steps: image preprocessing, feature extraction, feature selection and classification. This work uses efficient min-max normalization and median filtering for preprocessing, and multiple static features are extracted with 2D Gabor filtering. From these features, the best ones are selected using Artificial Bee Colony (ABC) optimization based on certain fitness values; this optimization-based feature selection keeps only the optimal set of features used for classification, which lessens the complexity of and the time taken by the classifier. The approach uses a Fuzzy Feed Forward Neural Network (FFFNN) for classification, and its performance is compared with an SVM classifier. The evaluation is performed on real and fake fingerprint images obtained from the LivDet2015 database and shows that the proposed work provides better results in terms of sensitivity, specificity, precision and classification accuracy.

Keywords: Fake and real Fingerprint classification, multiple static features, normalization, median filtering, Gabor filtering, Artificial Bee Colony (ABC) optimization, Fuzzy Feed Forward Neural network (FFFNN)

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 15, 2016, Art. #15, pp. 143-157
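
The preprocessing and feature-extraction steps named above (min-max normalisation, median filtering, Gabor features) can be sketched with OpenCV as follows. The input file name, kernel sizes and Gabor parameters are assumptions; the ABC-based selection and the FFFNN classifier are not shown.

```python
# Sketch of preprocessing + Gabor feature extraction; parameters are assumed
# values, and the ABC feature selection / FFFNN classifier are not included.
import cv2
import numpy as np

img = cv2.imread("fingerprint.png", cv2.IMREAD_GRAYSCALE)  # hypothetical file

# Min-max normalisation to [0, 255], then median filtering to remove noise.
norm = cv2.normalize(img, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
smooth = cv2.medianBlur(norm, 3)

# Gabor responses at several orientations; means/std-devs act as static features.
features = []
for theta in np.arange(0, np.pi, np.pi / 4):
    kernel = cv2.getGaborKernel(ksize=(21, 21), sigma=4.0, theta=theta,
                                lambd=10.0, gamma=0.5, psi=0)
    response = cv2.filter2D(smooth, cv2.CV_32F, kernel)
    features.extend([response.mean(), response.std()])

print("feature vector:", np.round(features, 3))
```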


Title of the Paper: Mining Data Streams with Concept Drift in Massive Online Analysis Frame Work

Authors: P. K. Srimani, Malini M. Patil

Abstract: Advances in technology have resulted in data being generated and distributed at an increasing rate. Such generated data is called a 'data stream', and data streams can be mined only with sophisticated techniques. Stream data mainly comes from mobile applications, sensor applications, network monitoring, traffic management, weblogs, etc. The underlying concepts often change with time; weather forecasting data is a good example. A model built on old data becomes inconsistent with new data, and regular updating of the model is necessary. This type of change in a data stream is called concept drift. The paper aims at mining data streams with concept drift in the Massive Online Analysis framework using the Naive Bayes classification algorithm. The authors also built their own data set generator, OURGENERATOR, for the analysis. The other generators used are LED, RANDOMRBF, WAVEFORM, SEA, STAGGER and HYPERPLANE with concept drift. In addition to these generators, three static data sets are used: Electricity, Airline and Forest Cover. The performance of Naive Bayes on the RANDOMRBF generator is found to be excellent, and it performs equally well on OUR_GENERATOR, which is the first of its kind in the literature.

Keywords: Concept drift, Massive data mining, Data streams, Naive Bayes, Accuracy, our_generator

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 15, 2016, Art. #14, pp. 133-142
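
MOA itself is a Java framework; the sketch below is a Python analogue of the prequential (test-then-train) evaluation of Naive Bayes on a stream with an abrupt concept drift, using scikit-learn's incremental GaussianNB. The synthetic stream and drift point are assumptions standing in for the paper's generators.

```python
# Minimal prequential (test-then-train) evaluation of Naive Bayes on a stream
# with an abrupt concept drift half-way through (synthetic stand-in stream).
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(5)

def sample(concept):
    """One (x, y) pair; the decision boundary flips after the drift."""
    x = rng.normal(size=2)
    y = int(x[0] + x[1] > 0) if concept == 0 else int(x[0] - x[1] > 0)
    return x, y

nb = GaussianNB()
correct, seen = 0, 0
for t in range(10_000):
    concept = 0 if t < 5_000 else 1           # abrupt drift at t = 5000
    x, y = sample(concept)
    if seen:                                   # test first ...
        correct += int(nb.predict(x.reshape(1, -1))[0] == y)
    nb.partial_fit(x.reshape(1, -1), [y], classes=[0, 1])  # ... then train
    seen += 1
    if (t + 1) % 2_500 == 0:
        print(f"t={t+1:5d}  prequential accuracy={correct / seen:.3f}")
```

The accuracy drop after t = 5000 illustrates why drift-aware evaluation is needed; MOA's drift detectors would react to it, which plain incremental Naive Bayes does not.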


Title of the Paper: 1-NRTSVM via ADMM for Automatical Feature Selection and Classification Simultaneously

Authors: Haitao Xu, Liya Fan

Abstract: In this paper, a novel classifier with linear and nonlinear versions for simultaneous data classification and automatic feature selection is proposed, named the 1-norm regularized twin support vector machine (1-NRTSVM). By means of the alternating direction method of multipliers (ADMM), two implementation algorithms for 1-NRTSVM are presented. A major feature of the proposed method is that it directly solves the primal problems rather than the dual problems. Experimental results show that the proposed 1-NRTSVM is an effective and competitive classifier for data classification and automatic feature selection.

Keywords: Twin support vector machine, 1-norm of a vector, alternating direction method of multipliers, automatic feature selection

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 15, 2016, Art. #13, pp. 125-132
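
The abstract above does not give the 1-NRTSVM objective, so the sketch below shows the ADMM machinery on a generic 1-norm regularised least-squares problem (lasso) instead. The x/z/u updates and the soft-thresholding step, which drives coefficients to exactly zero and hence yields embedded feature selection, are the parts the paper's solver also relies on; the problem being solved here is an assumption, not the paper's.

```python
# ADMM for min 0.5*||Ax - b||^2 + lam*||x||_1 (lasso), illustrating the
# sparsifying soft-threshold z-update; not the 1-NRTSVM objective itself.
import numpy as np

def soft_threshold(v, kappa):
    return np.sign(v) * np.maximum(np.abs(v) - kappa, 0.0)

def admm_l1(A, b, lam=0.1, rho=1.0, iters=200):
    n = A.shape[1]
    x = z = u = np.zeros(n)
    AtA, Atb = A.T @ A, A.T @ b
    L = np.linalg.cholesky(AtA + rho * np.eye(n))   # factor once, reuse
    for _ in range(iters):
        x = np.linalg.solve(L.T, np.linalg.solve(L, Atb + rho * (z - u)))
        z = soft_threshold(x + u, lam / rho)        # drives coefficients to 0
        u = u + x - z
    return z

rng = np.random.default_rng(6)
A = rng.normal(size=(100, 20))
true = np.zeros(20); true[[2, 7, 11]] = [1.5, -2.0, 1.0]
b = A @ true + 0.05 * rng.normal(size=100)
print(np.round(admm_l1(A, b, lam=5.0), 2))          # mostly exact zeros
```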


Title of the Paper: On Lock-Free Programming Patterns

Authors: Dmitry Namiot

Abstract: Lock-free programming is a well-known technique for multithreaded programming. It is a way to share changing data among several threads without paying the cost of acquiring and releasing locks. In practice, parallel programming models must include scalable concurrent algorithms and patterns, and lock-free programming patterns play an important role in scalability. This paper is devoted to lock-free data structures and algorithms. Our task was to choose the data structures for a concurrent garbage collector. We aim to provide a survey of lock-free patterns and approaches and to estimate the potential performance gain of lock-free solutions. In our opinion, the most challenging problem in concurrent programming is coordination and data-flow organization, rather than the relatively low-level data structures. Therefore, as the most practically promising lock-free programming pattern, we choose a framework based on Software Transactional Memory.

Keywords: parallel programming, thread, lock, mutex, semaphore, scalability, atomic

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 15, 2016, Art. #12, pp. 117-124


Title of the Paper: Adaptation of Multilayer Perceptron Neural Network to Unsupervised Clustering Using a Developed Version of k-Means Algorithm

Authors: Harchli Fidae, Joudar Nour-Eddine, Es-Safi Abdelatif, Ettaouil Mohamed

Abstract: Cluster analysis plays a very important role in many fields and is mandatory in some. This is due to the huge amount of web services, products and information created and provided on the internet, as well as the need for representation, visualization and reduction of large vectors. In order to facilitate the treatment of information and reduce the search space, data must be classified; in other words, the need for a good clustering technique is continually growing. Many clustering algorithms (supervised and unsupervised) exist in the literature: hierarchical and non-hierarchical clustering methods, k-means, artificial neural networks (ANNs) and so on. All of these methods suffer from drawbacks related to initialization issues, supervision or running time. For instance, the number of classes, the initial code vectors and the choice of the best learning set in k-means and the Multi-Layer Perceptron (MLP) seriously affect the clustering results. To deal with these problems, we develop a new approach to unsupervised clustering. It consists of using a developed version of the k-means algorithm, which determines the number of clusters and the best learning set, in order to train the MLP in an unsupervised way. The effectiveness of this approach is tested on well-known data sets and compared to other classifiers proposed in recent research.

Keywords: MLP Neural Network, Retro-propagation, Supervised and Unsupervised Learning, Cluster Analysis, k-means algorithm, parameters initialization, Assessment of Classification

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 15, 2016, Art. #11, pp. 103-116


Title of the Paper: A New P System with Hybrid MDE-k-Means Algorithm for Data Clustering

Authors: Wei Sun, Laisheng Xiang, Xiyu Liu

Abstract: Clustering is an important part of data mining. It can immensely simplify data complexity and helps discover the underlying patterns and knowledge in massive quantities of data points. The popular and efficient clustering algorithm k-means has been widely used in many fields. However, the k-means method also suffers from several drawbacks: it selects the initial cluster centers randomly, which greatly influences the clustering performance. This study proposes a new P system with a modified differential evolution k-means (MDE-k-means) algorithm to improve the quality of the initial cluster centers of the k-means algorithm. The P system has three types of membranes: elementary, local store and global store, each with different rules. Based on this membrane structure, the elementary membranes evolve the objects with the modified differential evolution algorithm, while the other membrane types update the local-best and global-best objects synchronously via communication rules. Under the control of the P system, the hybrid algorithm achieves a good partition of the data sets compared with the classical k-means algorithm and the DE-k-means algorithm.

Keywords: Data mining, clustering, unsupervised learning, k-means, modified DE algorithm, membrane computing

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 15, 2016, Art. #10, pp. 93-102
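
The core idea above, better initial centres for k-means found by differential evolution, can be sketched as follows. scipy's stock differential evolution stands in for the modified DE variant, and the membrane/P-system organisation is not modelled; the data set and parameter values are assumptions.

```python
# Sketch: evolve initial k-means centres with (stock) differential evolution,
# then refine with k-means. Not the paper's modified DE or P-system structure.
import numpy as np
from scipy.optimize import differential_evolution
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=300, centers=3, random_state=0)
k, dim = 3, X.shape[1]

def sse(flat_centres):
    """Within-cluster sum of squared errors for a candidate set of centres."""
    centres = flat_centres.reshape(k, dim)
    d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2)
    return (d.min(axis=1) ** 2).sum()

bounds = [(X[:, j].min(), X[:, j].max()) for _ in range(k) for j in range(dim)]
result = differential_evolution(sse, bounds, maxiter=50, seed=0, tol=1e-6)
init_centres = result.x.reshape(k, dim)

# Hand the evolved centres to k-means for final refinement.
km = KMeans(n_clusters=k, init=init_centres, n_init=1, random_state=0).fit(X)
print("DE-initialised SSE:", round(km.inertia_, 2))
```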


Title of the Paper: Knowledge Extraction from the Problem-Oriented Data Warehouses on Chemical Kinetics and Thermochemistry

Authors: Vladimir E. Tumanov

Abstract: This paper discusses the technology of extracting chemical knowledge from structured electronic sources, problem-oriented systems of science data analytics, and methods of science data analysis. The application of a feed-forward artificial neural network to predicting the reactivity of a compound (the classical potential barrier) in reactions of hydrocarbons with the hydrogen atom in solution is presented. Empirical indexes of the reaction centers for groups of such reactions have been identified. The artificial neural network is trained on a set of experimental thermochemical and kinetic data, and it predicts the classical potential barrier for the hydrogen atom in reactions with hydrocarbons in solution at a temperature of 298 K with satisfactory accuracy. Special attention is paid to the use of a fuzzy knowledge base to predict the classical potential barrier and to calculate the rate constants of bimolecular radical reactions of the phenyl radical with hydrocarbons in the liquid phase from experimental data. A hybrid algorithm for calculating the rate constants of bimolecular radical reactions from experimental thermochemical data and an empirical index of the reaction center is offered. This algorithm uses a fuzzy knowledge base or an artificial neural network to predict the classical potential barrier of bimolecular radical reactions at a given temperature, a database of experimental reaction characteristics, and the Arrhenius formula to calculate the rate constant. Results of the prediction of the classical potential barrier are discussed.

Keywords: chemical kinetics, thermochemistry, data warehouse, data science analytics, expert system, artificial neural network, fuzzy logic, radical reaction, activation energy, classical potential barrier, reaction center

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 15, 2016, Art. #9, pp. 83-92


Title of the Paper: Lossy Compression in the Chroma Subsampling Process

Authors: Pavel Pokorny

Abstract: Nowadays, compression algorithms are frequently used in computer software because of the large amounts of data processed by applications. These compression algorithms, in different forms, are deployed in many raster image processing cases. The reasons are the quite high compression ratios and the usually high compression/decompression speed (the success of these two major parameters of a compression algorithm depends on the image information content). In addition, images often allow the use of lossy compression algorithms, which achieve a substantially greater reduction of the image data size. The JPEG graphic format is the most widely used; here, the loss rate is determined in two ways: by subsampling the color components and by quantizing the coefficients obtained from the Discrete Cosine Transform. In this paper, attention is focused on the subsampling process of the color components, and the most frequently used sampling techniques are compared, mainly with respect to the quality and size of the processed image.

Keywords: Image Compression, Lossy Compression, Image Quality, Sampling, Colors

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 15, 2016, Art. #8, pp. 76-82
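
As a concrete illustration of the subsampling step discussed above, the sketch below converts RGB to YCbCr using the BT.601 formulas and then applies 4:2:0 subsampling by averaging 2x2 blocks of the chroma channels. The averaging strategy and the random test image are assumptions; the paper compares several subsampling variants.

```python
# RGB -> YCbCr (BT.601) followed by 4:2:0 chroma subsampling: Y stays at full
# resolution, Cb and Cr are kept at quarter resolution (one sample per 2x2 block).
import numpy as np

def rgb_to_ycbcr(rgb):                      # rgb: float array with values in [0, 255]
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  =       0.299    * r + 0.587    * g + 0.114    * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5      * b
    cr = 128 + 0.5      * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

def subsample_420(channel):
    """Average each 2x2 block; a quarter of the original samples remain."""
    h, w = channel.shape
    return channel[:h - h % 2, :w - w % 2].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

rng = np.random.default_rng(7)
image = rng.integers(0, 256, size=(8, 8, 3)).astype(float)
y, cb, cr = rgb_to_ycbcr(image)
cb_small, cr_small = subsample_420(cb), subsample_420(cr)
print(y.shape, cb_small.shape, cr_small.shape)   # (8, 8) (4, 4) (4, 4)
```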


Title of the Paper: Implementing Security for Multidimensional Expression Queries

Authors: Ahmad Mousa Altamimi, Mahmood Ghaleb Albashayreh

Abstract: As security has taken center stage in recent years, many of the most important projects underway are those associated with security. Organizations collect and process data to operate their day-to-day business successfully. Such data is stored in enterprise data warehouses and presented so that users can make decisions more easily. Online Analytical Processing (OLAP) is the prevalent component of these systems, where data is organized in a multidimensional representation known as a data cube. A specialized query language called multidimensional expressions (MDX), with specific syntax and semantics, is then used for manipulating OLAP cubes. However, MDX users can override, either intentionally or unintentionally, the security policy of the system to disclose sensitive information and breach an individual's privacy. In this paper, we present a framework that removes such security threats by rewriting MDX queries to ensure consistent data access. Experimental results demonstrate that security can be ensured without affecting the usefulness of OLAP systems.

Keywords: OLAP, data cube, MDX, data privacy, security policies, query rewriting

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 15, 2016, Art. #7, pp. 64-75


Title of the Paper: Generalization Process for Loose Topological Relations of Regions

Authors: Brahim Lejdel, Okba Kazar

Abstract: In Geographic Information Systems (GIS), three principal feature types, PolyLine, Point and loose region, are used to represent geographic objects. The loose region is used to represent area objects that have a loose shape, such as a building or a green area. Loose regions can be seen as an extension of the simple regions described in the Egenhofer model. In this paper, we treat the mutation of loose topological relations into other relations when the scale of the map changes. A new topological model is presented, based on the simple regions defined in the Egenhofer model and on assertions describing the mutation of the topological relationships according to certain criteria.

Keywords: Region, generalization, visual acuity, loose topological relations

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 15, 2016, Art. #6, pp. 55-63


Title of the Paper: ISLIQ: Improved Supervised Learning in Quest to Nowcast Snow/No-Snow

Authors: Kishor Kumar Reddy C., Rupa C. H., Vijaya Babu

Abstract: Nowcasting the presence of snow/no-snow is a major problem for researchers, academicians and scientists, as it affects the lives of humans, animals and aquatic life, vegetation and the tourism sector to a great extent across the globe. Previously, many scientists, researchers and academicians provided solutions, but these were limited to the use of satellite imagery, radar imagery and so on. We have provided decision-tree approaches to the same problem, but their main drawback is the computational complexity of evaluating split points. With this focus, in this paper we provide the Improved Supervised Learning in Quest (ISLIQ) decision tree algorithm for the nowcasting of snow/no-snow; the splitting criterion is evaluated over interval ranges instead of at every change in the class label, which drastically reduces the number of split points. Further, we evaluate performance measures such as specificity, sensitivity, precision, dice, accuracy and error rate, and compare the results with various decision tree algorithms.

Keywords: Classification, Decision Tree, ISLIQ, Nowcasting, Snow, Split Points

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 15, 2016, Art. #5, pp. 43-54
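
The split-point reduction idea described above can be illustrated by contrasting exhaustive candidate splits (one per class-label change along a sorted attribute) with a handful of interval boundaries evaluated by the Gini criterion. The synthetic temperature/snow data, the interval count and the use of Gini impurity are assumptions; the paper's exact ISLIQ criterion may differ.

```python
# Sketch of interval-based split evaluation versus exhaustive candidates.
import numpy as np

rng = np.random.default_rng(8)
temperature = np.sort(rng.uniform(-10, 10, 500))          # one numeric attribute
snow = (temperature + rng.normal(0, 2, 500) < 0).astype(int)

def gini_of_split(threshold):
    def gini(y):
        if len(y) == 0:
            return 0.0
        p = np.bincount(y, minlength=2) / len(y)
        return 1.0 - (p ** 2).sum()
    left, right = snow[temperature <= threshold], snow[temperature > threshold]
    n = len(snow)
    return len(left) / n * gini(left) + len(right) / n * gini(right)

# Exhaustive candidates: one per change in class label along the sorted attribute.
exhaustive = temperature[:-1][np.diff(snow) != 0]
# Interval-based candidates: a small number of evenly spaced boundaries.
intervals = np.linspace(temperature.min(), temperature.max(), num=11)[1:-1]

best = min(intervals, key=gini_of_split)
print(f"{len(exhaustive)} exhaustive vs {len(intervals)} interval candidates; "
      f"best interval split at {best:.2f}")
```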


Title of the Paper: Distributed Query Processing in Cloud Using Canonicalization Approach

Authors: Senthil K., Chitra P.

Abstract: Nowadays, citations and their formats play a vital role in scientific publication digital libraries (DLs). For various reasons, the citation metadata extraction process is difficult: the citations used in the extraction process are gathered from the web, there is no standard citation style, and data gathered from the web is hard to process due to its erroneous nature. In order to overcome these problems, this paper proposes distributed query processing based on a canonicalization approach. A NoSQL database is used for distributed query support and to enhance sustained data throughput. A query is retrieved from the user and then verified against the created metadata; if a match occurs, index prediction is performed and the results are evaluated. Semantic query matching is used to match the given query against the database, and the Citation Parser (CP) method converts the query string into simple symbols to minimize query processing time and enhance system performance. The experimental results show that the proposed method achieves higher accuracy with lower query processing time, response time and computation cost.

Keywords: Big Data, Citation Parser, Metadata, NoSQL Database, Semantic Query Matching

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 15, 2016, Art. #4, pp. 34-42


Title of the Paper: CIM-Based System for Implementing a Dynamic Dashboard and Analysis Tool for Losses Reduction in the Distribution Power Systems in México

Authors: M. Molina-Marin, E. Granados-Gomez, A. Espinosa-Reza, H. R. Aguilar-Valenzuela

Abstract: This paper presents the process of developing a fully CIM-based Web application, the standards and tools needed to achieve semantic interoperability between legacy systems of the Comision Federal de Electricidad (CFE, by its Spanish acronym) in Mexico, and the functionality of the resulting system.

Keywords: Common Information Model (CIM), Semantic Interoperability, Smart Grid, Enterprise Service Bus (ESB), CIM Wrappers, Web Services Definition Language (WSDL)

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 15, 2016, Art. #3, pp. 24-33


Title of the Paper: An Investigation of the Use of MJEA in Image Encryption

Authors: Jamal N. Bani Salameh

Abstract: In today's digital world, electronic communication means are susceptible to attacks and eavesdropping, and security problems such as modification and forgery have reached critical levels. Encryption has been used for securing data communication and is considered one of the best tools for protecting sensitive information from cryptanalysts when it is stored or transmitted via insecure communication channels. Most of the available encryption algorithms are designed for text data and are not suitable for multimedia data, which contains different types of content including text, audio, video, graphics and images. With the increasing use of multimedia data over the Internet, there is a growing demand for securing it. The main goal of this work is to investigate the use of our encryption algorithm (MJEA) as a new approach to image encryption. MJEA attempts to dissipate the high correlation among pixels and to increase the entropy value by dividing the image into blocks and shuffling their positions. In this research I use correlation, histograms and entropy to measure the security level of the encrypted images. Experimental results show the feasibility of applying MJEA to digital image encryption: MJEA was able to achieve high embedding capacity and high quality of the encrypted image. A comparison has been conducted between MJEA and other block ciphers, such as RC5 and RC6, for encrypting the Lena image, based on correlation coefficients. From the obtained results, I observed that the MJEA algorithm achieved the minimum correlation coefficient and the maximum encryption quality. A detailed description of the design process, together with results and analyses, is given to define the proposed technique precisely.

Keywords: Image encryption, Image Correlation, Image Entropy, Information Security, Encryption Algorithm

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 15, 2016, Art. #2, pp. 12-23
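
Two of the security metrics mentioned above, adjacent-pixel correlation and image entropy, can be computed as sketched below. A uniformly random image stands in for MJEA ciphertext; the test images are assumptions, not the paper's experimental data.

```python
# Adjacent-pixel correlation and Shannon entropy, two common metrics for
# judging image-cipher output (random noise stands in for MJEA ciphertext).
import numpy as np

def adjacent_correlation(img):
    """Correlation of horizontally adjacent pixel pairs (ideal cipher: near 0)."""
    x = img[:, :-1].ravel().astype(float)
    y = img[:, 1:].ravel().astype(float)
    return np.corrcoef(x, y)[0, 1]

def shannon_entropy(img):
    """Entropy in bits per pixel (ideal cipher for 8-bit images: near 8)."""
    hist = np.bincount(img.ravel(), minlength=256) / img.size
    hist = hist[hist > 0]
    return -(hist * np.log2(hist)).sum()

rng = np.random.default_rng(9)
plain = np.tile(np.arange(256, dtype=np.uint8), (256, 1))       # smooth gradient
cipher = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)  # stand-in cipher

for name, img in [("plain", plain), ("cipher", cipher)]:
    print(f"{name}: corr={adjacent_correlation(img):+.3f}, "
          f"entropy={shannon_entropy(img):.3f} bits")
```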


Title of the Paper: Semantic Interoperability for Historical and Real Time Data Using CIM and OPC-UA for the Smart Grid in Mexico

Authors: A. Espinosa-Reza, M. L. Torres-Espindola, M. Molina-Marin, E. Granados-Gomez, H. R. Aguilar-Valenzuela

Abstract: This paper describes the logic architecture implemented for exchanging data gathered in real time as well as data stored in historical repositories, adopting a semantic interoperability architecture based on the Common Information Model (CIM), defined in the standards IEC 61968, IEC 61970 and IEC 62325, and the OPC Unified Architecture (OPC-UA), defined in IEC 62541. The main objective of these artifacts and this architecture is to share the information gathered in real time from the Electric Power System with the many information systems that require it, in order to support Management, Operational Analysis and other Advanced Functions in the Smart Grid context. The procedure used to develop CIM Wrappers is shown, as well as the results obtained and some conclusions about the work to integrate legacy systems of the Comision Federal de Electricidad (CFE) in Mexico.

Keywords: Smart Grid, Common Information Model (CIM), IEC61968, IEC 61970, OPC-UA, Semantic Interoperability, Electric Power System, SCADA

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 15, 2016, Art. #1, pp. 1-11


 
