WSEAS Transactions on Computers

Print ISSN: 1109-2750
E-ISSN: 2224-2872

Volume 14, 2015

Notice: As of 2014 and for the forthcoming years, the publication frequency/periodicity of WSEAS Journals has been adapted to the 'continuously updated' model. This means that instead of being separated into issues, new papers are added on a continuous basis, allowing a more regular flow and shorter publication times. The papers appear in reverse chronological order, so the most recent one is on top.

Title of the Paper: Big Data in Changes: Is Big Data Bigger than Sustainable Development and Research Design?

Authors: Milena Janakova

Abstract: This paper addresses research design for big data in the global information society. Big data is a phenomenon of modern society, in which firms, organizations, and individuals must work with data, information, and knowledge. At first glance this is a standard matter, but the deluge of data brings many difficulties and instability. Work with big data is oriented around deriving the needed, correct information and knowledge from stored data. The proposed solution is an optimal research design that treats big data as a multidimensional object based on defined preferences. The key is a complete perception of reality grounded in such a design, which works with many objects and different contexts using the available analytical disciplines (layers). The spectrum of suitable analytical disciplines is wide; the presented design focuses on browsing selected layers via disciplines such as Artificial Intelligence, Business Intelligence, Customer Intelligence, Competitive Intelligence and Swarm Intelligence. For active work with data across the various layers, simulation and a multidimensional approach are good helpers. The approach is useful for data warehouses and brings high value in support of sustainable development. The benefit lies in uncovering unexpected relations between the specified layers, drawing on the advantages of information technology and the controversies of the global information society. A natural requirement is intuitive movement from one layer to another, in the form of a zoom to the needed data.

Keywords: Big data, design, information, information technology, research, sustainable development

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #81, pp. 835-842

Title of the Paper: The Optimization by Ant Algorithm for a Project-Based Learning

Authors: Yassine Benjelloun Touimi, Nourrdine El Faddouli, Samir Bennani, Mohammed Khalidi Idrissi

Abstract: Research and development in ICT plays an effective role in distance learning. Several electronic platforms have been created and must accommodate various pedagogic currents, including teaching by project. Project-based teaching is a pedagogical approach that puts the learner at the center of all learning and develops their skills and knowledge. During the project, students are required to make decisions regarding solutions to the posed problems. These decisions generate conflicts between students due to heterogeneity in the group and differences in level between the learners' profiles. To overcome this problem, this paper develops a new approach based on the ant algorithm for adapting the learning path to different learner profiles. The basic idea is to associate the adequate learning activities with the group's level of homogeneity and the cognitive level of the learners.
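The ant-based adaptation described above can be sketched as a small ant colony optimization loop over a layered set of candidate learning activities. This is an illustrative sketch only, not the authors' algorithm: the layer structure, the `score` fitness function standing in for the cognitive/homogeneity match, and all parameter values are assumptions.

```python
import random

def aco_select_path(layers, score, n_ants=50, n_iter=30, rho=0.1, seed=1):
    """Ant-colony selection of one learning activity per layer.
    layers: list of lists of candidate activities (one list per course step);
    score: function(path) -> fitness in [0, 1], higher = better suited to
    the learner profile (a stand-in for the paper's cognitive-level match)."""
    rng = random.Random(seed)
    tau = [[1.0] * len(layer) for layer in layers]   # pheromone per activity
    best_path, best_fit = None, -1.0
    for _ in range(n_iter):
        for _ in range(n_ants):
            choice = []
            for i in range(len(layers)):
                # roulette-wheel pick, proportional to pheromone
                r = rng.uniform(0.0, sum(tau[i]))
                acc = 0.0
                for j, t in enumerate(tau[i]):
                    acc += t
                    if r <= acc:
                        choice.append(j)
                        break
            path = [layers[i][j] for i, j in enumerate(choice)]
            fit = score(path)
            if fit > best_fit:
                best_path, best_fit = path, fit
            # evaporate everywhere, then reinforce the sampled path
            for i in range(len(layers)):
                tau[i] = [(1.0 - rho) * t for t in tau[i]]
                tau[i][choice[i]] += fit
    return best_path, best_fit

# hypothetical course with two candidate activities per step
layers = [["video", "text"], ["quiz", "essay"], ["project", "exam"]]
target = ["video", "quiz", "project"]   # assumed ideal path for one profile
fitness = lambda p: sum(a == b for a, b in zip(p, target)) / len(target)
path, fit = aco_select_path(layers, fitness)
```

Pheromone evaporation plus reinforcement lets paths that better match a profile accumulate desirability over iterations, which is the core mechanism any ant algorithm for path adaptation relies on.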

Keywords: project-based learning, adaptation, heterogeneity, cognitive degree, ant colony optimisation

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #80, pp. 826-834

Title of the Paper: Software Library for Fast Digital Input and Output for the Arduino Platform

Authors: Jan Dolinay, Petr Dostálek, Vladimír Vašek

Abstract: This article presents a software library for the Arduino platform which significantly improves the speed of the functions for digital input and output. This allows users to apply these functions in a whole range of applications, without being forced to resort to direct register access or various third-party libraries when the standard Arduino functions are too slow for a given application. The method used in this library is also applicable to other libraries which aim to abstract access to the general-purpose pins of a microcontroller.

Keywords: Arduino, AVR, digitalRead, digitalWrite, embedded system, pin abstraction, software library

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #79, pp. 819-825

Title of the Paper: Image Segmentation Based on the Self-Balancing Mechanism in Virtual 3D Elastic Mesh

Authors: Xiaodong Zhuang, N. E. Mastorakis, Jieru Chi, Hanping Wang

Abstract: In this paper, a novel model of a 3D elastic mesh is presented for image segmentation. The model is inspired by stress and strain in physical elastic objects, while the repulsive force and elastic force in the model are defined slightly differently from the physical forces to suit the segmentation problem. The self-balancing mechanism in the model guarantees the stability of the segmentation method. The shape of the elastic mesh at the balance state is used for region segmentation, in which the sign distribution of the points' z coordinate values is taken as the basis for segmentation. The effectiveness of the proposed method is demonstrated by analysis and by experimental results on both test images and real-world images.

Keywords: Image segmentation, elastic mesh, self balancing, physics inspired method, stress and strain

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #78, pp. 805-818

Title of the Paper: Toward Core Point Evolution Using Water Ripple Model

Authors: Zhibing Yu, Kun Ma

Abstract: With the continually increasing capability of computer hardware and scale of computer software, the complexity of software is also continually increasing, and has manifested itself as one of the key factors limiting significant improvement of software quality and productivity. Traditional methods, such as the waterfall-like model, component-based programming, software architecture methods and agent-oriented programming, have encountered fundamental difficulties in developing complex systems. To address this challenge, we draw on the water ripple model, whose idea originates from the evolution of water waves when a stone is thrown into the water. In this paper, we propose three core point models for design, including the high-level abstract core point model, the feature core point model and the function core point model, together with their modes of interaction, aiming to control the structural complexity of the whole software life cycle. Furthermore, we propose a water ripple model of the software development process with loosely coupled, correlated core point evolution. The development of a complex system is translated into the evolution of core points. With the persistent-evolution idea of the water ripple, the development process gains strong expansibility and flexible reusability.

Keywords: software development process, water ripple model, core point evolution, software architecture, agent oriented programming, component-based programming

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #77, pp. 792-804

Title of the Paper: Service to Fault Tolerance in Cloud Computing Environment

Authors: Bakhta Meroufel, Ghalem Belalem

Abstract: Cloud computing refers to the use of the memory and computing capabilities of computers and servers around the world, so that the user has considerable computing power without needing powerful machines. The probability of a failure occurring during execution grows as the number of nodes increases; since it is impossible to fully prevent failures, one solution is to implement fault tolerance mechanisms. In this work, we have developed a fault tolerance service based on checkpointing in cloud computing. Our service uses semi-coordinated checkpointing, which minimizes the consumed energy and the overhead by decreasing the time of the coordination phase; it also decreases the rollback cost. The experimental results show the effectiveness of our proposition in terms of execution time, energy consumption and SLA violation.

Keywords: Fault tolerance, Cloud computing, Virtualization, Checkpointing, Overhead, Rollback, Coordination, Recovery line, SLA violation, I/O

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #76, pp. 782-791

Title of the Paper: Formal Cognitive Models of Data, Information, Knowledge, and Intelligence

Authors: Yingxu Wang

Abstract: It is recognized that data, information, knowledge, and intelligence are the fundamental cognitive objects in the brain and in cognitive systems. However, formal studies and rigorous models of them are lacking. This paper explores the cognitive and mathematical models of these cognitive objects. The taxonomy and cognitive foundations of abstract mental objects are explored. A set of mathematical models of data, information, knowledge, and intelligence is formally created. On the basis of these cognitive and mathematical models, the formal properties and relationships of contemporary data, information, knowledge, and intelligence are rigorously explained.

Keywords: Cognitive informatics, brain science, mathematical models, formal theories, data science, information science, knowledge science, intelligence science, system science

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #75, pp. 770-781

Title of the Paper: Formal Verification of Embedded Systems for Remote Attestation

Authors: G. Cabodi, P. Camurati, C. Loiacono, G. Pipitone, F. Savarese, D. Vendraminetto

Abstract: Embedded systems are increasingly pervasive and interdependent, and in many cases critical to our everyday life and safety. As such devices are more and more subject to attacks, new protection mechanisms are needed to provide the required resilience and dependability at low cost. Remote attestation (RA) is a software-hardware mechanism that securely checks the internal state of remote embedded devices. The protocol is executed by: (1) a prover that, given a secret key and its actual state, generates a result through an attestation algorithm; (2) a verifier that, given the key and the expected prover state, accepts or rejects the result through a verification algorithm. As the security of a protocol is only as good as its weakest link, a comprehensive validation of its security requirements is paramount. In this paper, we present a methodology for formal verification of the hardware security requirements of RA architectures. First we analyze and compare three selected RA architectures; then we define security properties for RA systems and verify them using a complete framework for formal verification.

Keywords: Formal Verification, Security, Remote Attestation, Embedded Systems

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #74, pp. 760-769

Title of the Paper: Analyzing the Weighted Dark Networks Using Scale-Free Network Approach

Authors: Abdul Waheed Mahesar, Ahmad Waqas, Nadeem Mehmood, Asadullah Shah, Mohamed Ridza Wahiddin

Abstract: The task of identifying the main key nodes in dark (covert) networks is very important for researchers in the field of dark network analysis. This analysis locates the major nodes in the network, as the network's functionality can be minimized by disrupting its major key nodes. In this paper, we primarily focus on two basic network analysis metrics, degree and betweenness centrality. Traditionally, both centrality measures have been applied on the basis of the number of links connected to the nodes, without considering link weights. Like many other networks, dark networks follow scale-free behavior and thus the power-law distribution, where a few nodes have the maximum number of links. This inhomogeneous network structure causes the creation of key nodes. In this research, we analyze the behavior of nodes in dark networks based on degree and betweenness centrality measures, using the 9/11 terrorist network dataset. We analyze both measures with weighted and unweighted links to show that weighted networks are much closer to the scale-free phenomenon than unweighted networks.
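The contrast between link-count centrality and weight-aware centrality that the abstract describes can be seen on a toy graph. This sketch is illustrative and uses an invented five-node network, not the 9/11 dataset: with unweighted degree the hub of many weak ties ranks first, while summing tie strengths promotes a node with few but strong ties.

```python
def degree_centrality(edges, weighted=False):
    """Degree centrality from (u, v, weight) links: count incident links,
    or sum their weights when weighted=True."""
    c = {}
    for u, v, w in edges:
        inc = w if weighted else 1
        c[u] = c.get(u, 0) + inc
        c[v] = c.get(v, 0) + inc
    return c

# hypothetical covert-network fragment: B has many weak ties (weight as
# tie strength, e.g. contact frequency), C has few but strong ones
edges = [("A", "B", 1), ("B", "D", 1), ("B", "E", 1),
         ("A", "C", 5), ("C", "D", 4)]
unweighted = degree_centrality(edges)
weighted = degree_centrality(edges, weighted=True)
key_unweighted = max(unweighted, key=unweighted.get)   # "B"
key_weighted = max(weighted, key=weighted.get)         # "C"
```

The same disagreement carries over to betweenness centrality, where weighted shortest paths follow strong ties rather than hop counts; that computation (Brandes' algorithm) is omitted here for brevity.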

Keywords: Dark networks, power-law distribution, scale-free networks, node degree centrality, betweenness centrality, weighted network analysis

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #73, pp. 748-759

Title of the Paper: Using Local Speed Information as Routing Metric for Delay Tolerant Networks

Authors: Fuquan Zhang, Inwhee Joe, Demin Gao, Yunfei Liu

Abstract: In many practical situations, mobility is a critical factor for intended message delivery. Recent studies have found that mobility itself affects network performance (capacity of the network, security, connectivity, coverage, etc.). However, how to effectively use these characteristics of mobility in designing DTN routing protocols has not yet been well studied. In this paper, we propose a routing protocol that uses the speed of the local node as the routing metric for opportunistic packet relay in DTNs (Delay Tolerant Networks). Moreover, the choice of relay nodes is based only on a comparison of speed values. It does not require any history information, global knowledge or mobility patterns of other nodes. This makes our scheme ideal for resource-limited (computational power, memory, etc.) and dynamic large-scale networks. The performance of our protocol is compared with several related works. Simulations show that our scheme has better performance.
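A speed-as-metric relay decision of the kind described above can be stated in a few lines. This is a hypothetical sketch, not the authors' protocol: the function name, the n-copy budget, and the encounter trace are all assumptions made for illustration.

```python
def should_forward(carrier_speed, peer_speed, copies_left):
    """Relay rule sketch: on an encounter, hand over a message copy only
    when the peer currently moves faster than the carrier and spare
    copies remain (an n-copy spray keeps the overhead bounded)."""
    return copies_left > 1 and peer_speed > carrier_speed

# hypothetical encounter trace: (carrier speed, peer speed) in m/s
encounters = [(1.2, 0.8), (1.2, 3.5), (1.2, 1.2)]
copies = 4
handovers = 0
for mine, theirs in encounters:
    if should_forward(mine, theirs, copies):
        copies -= 1          # one copy sprayed to the faster peer
        handovers += 1
```

Note that the rule needs only the two nodes' current speeds, which is exactly the "no history, no global knowledge" property the abstract claims.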

Keywords: DTN, n-copy routing, mobility, resources, ONE

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #72, pp. 740-747

Title of the Paper: Tag Management in a Reconfigurable Tagged-Token Dataflow Architecture

Authors: Bruno De Abreu Silva, Jorge Luiz E Silva

Abstract: Combining dataflow concepts with reconfigurable computing provides great potential to exploit application parallelism efficiently. However, expressing such parallelism is not a trivial task. Therefore, there is a great effort to automatically translate programs originally written in procedural languages (like C and Java) into dataflow architectures, which express parallelism in a natural way. Our previous work presented a static dataflow architecture as part of a framework to translate C programs into reconfigurable dataflow architectures. In this paper, we discuss an implementation of tag management in a reconfigurable tagged-token dataflow architecture implemented on a field-programmable gate array (FPGA). Although tagged tokens are a traditional concept for implementing dynamic dataflow machines, they have not been well explored in FPGA-based dataflow architectures. The FPGA-based dynamic dataflow architecture shows the potential for high computation rates, allowing more efficient execution and a more effective way to exploit parallelism for several program constructs, such as nested loops and function calls, when compared to static dataflow architectures.

Keywords: Tagged-Token Dataflow Architecture, Dynamic Dataflow Model, Reconfigurable Computing

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #71, pp. 730-739

Title of the Paper: An Integration Approach for XML Query Parallelization on Multi-Thread Systems

Authors: Rongxin Chen, Zongyue Wang, Husheng Liao

Abstract: The key functional parts of an XML query system are XML parsing, XPath evaluation and XQuery evaluation. Each part has its specific parallel opportunities and approaches, and the efficiency of each part directly affects the overall effect of XML query parallelization. It is therefore necessary to coordinate the parts in a real query application to achieve the best overall parallel performance. In this paper, we propose a novel integration approach for parallelizing XML queries. Our approach is based on the workflow of XQuery parallelization, into which both parallel XML parsing and parallel XPath evaluation are seamlessly integrated. The approach realizes automatic parallelization of XML queries and makes full use of multi-thread computing resources for parallel processing. Experimental results indicate that our approach can effectively improve the overall performance of an XML query application through parallel computing on multi-thread systems.

Keywords: XML query, XML parsing, XPath, XQuery, Parallelization, Multi-thread system, Integration approach

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #70, pp. 719-729

Title of the Paper: A Model of Virtual Carrier Immigration in Digital Images for Region Segmentation

Authors: Xiaodong Zhuang, N. E. Mastorakis

Abstract: A novel model for image segmentation is proposed, which is inspired by the carrier immigration mechanism in physical P-N junction. The carrier diffusing and drifting are simulated in the proposed model, which imitates the physical self-balancing mechanism in P-N junction. The effect of virtual carrier immigration in digital images is analyzed and studied by experiments on test images and real world images. The sign distribution of net carrier at the model’s balance state is exploited for region segmentation. The experimental results for both test images and real-world images demonstrate self-adaptive and meaningful gathering of pixels to suitable regions, which prove the effectiveness of the proposed method for image region segmentation.

Keywords: Region segmentation, image analysis, virtual carrier immigration, self balancing, P-N junction

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #69, pp. 708-718

Title of the Paper: A Principal Component Analysis Using SPSS for Multi-Objective Decision Location Allocation Problem

Authors: Zipeng Zhang

Abstract: In order to solve the location allocation problem with multi-objective decision (MDLAP), this paper combines a cost-based mathematical optimization model that transforms the distribution location problem into a two-stage logistics location selection decision. In solving the bottom model, the paper puts forward methods such as data standardization, entropy weighting, principal component analysis and mathematical expressions, using SPSS software. In solving the top model, the paper uses an immune algorithm to run experimental simulations based on logistics demand and location data. In the analysis, several methods are provided to solve the above model, such as the weighted linear regression method, the similarity-analysis system clustering method, the principal component regression method and the immune algorithm. As a result, the 97 candidate service areas in Shandong Province are reduced to 9 service areas, including Weifang, Qingdao, Pingdu, Qufu and so on, as the better optimal logistics development areas. This model avoids the ambiguity of traditional methods and better solves the optimal number and location problem in the LAP.

Keywords: SPSS, location selection model, principal component regression method

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #68, pp. 698-707

Title of the Paper: Matching of Images with Rotation Transformation Based on the Virtual Electromagnetic Interaction

Authors: Xiaodong Zhuang, N. E. Mastorakis

Abstract: A novel approach to image matching under rotation transformation is presented and studied. The approach is inspired by the electromagnetic interaction force between physical currents. The virtual current in images is proposed, based on the significant edge lines extracted as the fundamental structural feature of images. The virtual electromagnetic force and the corresponding moment between two images are studied after the extraction of the virtual currents in the images. Image matching under rotation transformation is then implemented by exploiting the interaction between the virtual currents in the two images to be matched. The experimental results prove the effectiveness of the novel idea, which indicates the promising application of the proposed method in image registration.

Keywords: Image matching, rotation transformation, virtual current, electromagnetic interaction, significant edge line

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #67, pp. 679-697

Title of the Paper: An Original Continuous Hopfield Network for Optimal Images Restoration

Authors: Joudar Nour-Eddine, El Moutouakil Karim, Ettaouil Mohamed

Abstract: Image restoration is a very important task in image processing. The Artificial Neural Network (ANN) approach has been used to solve this problem, especially the Discrete Hopfield Network (DHN). This approach suffers from the fluctuation problem due to the use of the hard-limit function as the activation function. To overcome this shortcoming, in this work we use the Continuous Hopfield Network (CHN), which uses a probabilistic density as the activation function. Indeed, this kind of function avoids the fluctuation behaviour and permits extending the search area of the solution. In this regard, we propose our own energy function with appropriate parameters to obtain feasible equilibrium points. The performance of our method is demonstrated by several computational tests.

Keywords: Artificial neural network, Image restoration problem, Degraded image, Continuous Hopfield network, Discrete Hopfield network, Linear filtering, Fluctuation problem

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #66, pp. 668-678

Title of the Paper: A Dynamic Landslide Simulation Algorithm Based on Multitask Spatio-Temporal Data Model

Authors: Zhu Jiacheng, Wu Chonglong, Gan Xinglin

Abstract: In traditional real-time landslide dynamic simulation, problems such as a cumbersome and inefficient data interaction process limit the engineering application of landslide dynamic simulation. To address these problems, an event-driven process based on a 4D spatio-temporal data model is adopted. Considering the intricacy and multiplicity of the factors influencing the movement process, the original model is improved and enlarged to support the landslide process in both data and logic. We apply this algorithm to real test data and demonstrate an efficient landslide dynamic simulation. It solves the problem that landslide movement and its sophisticated spatio-temporal process can hardly be described, and finally provides a new idea for solving similar questions.

Keywords: Geologic event, Multitasking, Spatio-temporal data model, Dynamic simulation of landslide

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #65, pp. 658-667

Title of the Paper: Visualization of Knowledge Maps by Linking Social Network Analysis with the Analytic Hierarchy Process for Assessing Academic Performance – The Case of Taiwan’s Intelligent Transportation Systems Discipline

Authors: Chi-Chung Tao, Ruei-Jhih Jian

Abstract: This paper introduces empirical studies of visualizing Taiwan’s Intelligent Transportation Systems (ITS) knowledge maps and assessing academic performance. Social network analysis is applied to visualize ITS knowledge maps to quantify the strength of the connections between individuals or universities. The results from the empirical study show that the development of ITS research is very promising worldwide. CiteSpace and UCINET have proven to be useful tools for demonstrating ITS knowledge maps with clusters by fields, institutions, and authors. Taiwan's ITS knowledge maps with clusters can be summarized to assess their academic performance by using AHP analysis. The ranking indicators and corresponding knowledge maps provide an overview of current performance status of ITS researchers in different ITS fields in Taiwan. This can eventually identify the most important ITS fields in local areas and contribute to highlighting hot topics of ITS researchers in the future.

Keywords: Intelligent Transportation Systems, Visualization of knowledge maps, Academic performance assessment

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #64, pp. 646-657

Title of the Paper: Access Plan Recommendation Using SQL Queries Similarity

Authors: Jihad Zahir, Abderrahim El Qadi, Driss Aboutajdine

Abstract: Plan reuse is a database optimization technique whose main purpose is to reuse old access plans stored in the database to execute future queries, instead of generating new plans. To carry out this task, the optimizer needs to identify similarity between new and old queries. Questions such as which techniques are needed and which SQL query representation best produces accurate similarity estimation remain poorly addressed. The main goal of this work is to propose an approach for access plan recommendation using four SQL query representations and clustering techniques to identify similarity between queries. We study SQL query similarity at the intentional level by considering the uninterpreted SQL sentences; therefore SQL queries are represented using TF-IDF and N-grams. Then, feature selection algorithms are used to identify the most significant descriptors in the feature space. Next, clustering is applied using partitioning clustering, density-based clustering and competitive-learning-based clustering. Finally, an access plan is recommended to the optimizer. Results show that the feature selection process is relevant to our work, especially for the TF-IDF representation, while the most accurate and efficient clustering is obtained with the k-means algorithm.
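The TF-IDF representation stage of such a pipeline can be sketched with toy SQL text. This is an illustrative sketch of the representation and similarity step only (cosine over TF-IDF vectors), not the paper's full clustering pipeline; the sample queries and tokenization by whitespace are assumptions. Note how keywords common to every query (SELECT, FROM) receive zero weight, so similarity is driven by the discriminative parts.

```python
import math

def tfidf(docs):
    """docs: list of token lists -> one {term: tf-idf weight} dict per doc.
    Terms occurring in every document get IDF log(n/n) = 0."""
    n, df = len(docs), {}
    for d in docs:
        for t in set(d):
            df[t] = df.get(t, 0) + 1
    vecs = []
    for d in docs:
        tf = {}
        for t in d:
            tf[t] = tf.get(t, 0) + 1
        vecs.append({t: (c / len(d)) * math.log(n / df[t]) for t, c in tf.items()})
    return vecs

def cosine(a, b):
    """Cosine similarity between two sparse term-weight dicts."""
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

queries = [
    "select name from users where age > 30",
    "select name from users where age > 18",
    "select sum(price) from orders group by day",
    "select count(*) from orders group by region",
]
vecs = tfidf([q.split() for q in queries])
```

Queries over the same table end up close in this space, which is what lets a clustering stage (e.g. k-means over these vectors) group them so that one stored access plan can serve the whole cluster.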

Keywords: Clustering, Access plan reuse, SQL Query similarity, Plan recommendation, Feature selection

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #63, pp. 638-645

Title of the Paper: Trusted Access Control Based on FCE of User Behavior in Cloud Environment

Authors: Leiyue Yao

Abstract: In a complex, dynamic cloud computing environment, analyzing abnormal user behavior and identifying untrustworthy users are both effective security measures. Fuzzy mathematics is used to reflect the ambiguous judgment of experts, and the AHP method is used to compute the weight of each attribute of network users' behavior. A comprehensive method based on FCE is thus used in this study to evaluate a user's trust value. Experimental results show that the trust of different types of users can be evaluated effectively, the service rejection rate for malicious nodes is improved, and the success rate of service interactions is also improved for trustworthy users. The evaluation method thus supports quantitative analysis for dynamic trust-based security controls, and provides reliable evidence for service providers in responding to user requests.
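The FCE step the abstract relies on reduces to combining AHP-derived attribute weights with a fuzzy membership matrix. A minimal sketch, with hypothetical behaviour attributes, trust grades, and numbers (not the paper's data), using the common weighted-average fuzzy operator:

```python
def fuzzy_eval(weights, R):
    """Weighted-average FCE operator: b_j = sum_i w_i * r_ij.
    weights: attribute weights (summing to 1, e.g. from AHP);
    R[i][j]: degree to which behaviour attribute i places the user
    in trust grade j."""
    grades = len(R[0])
    return [sum(w * row[j] for w, row in zip(weights, R))
            for j in range(grades)]

# hypothetical behaviour attributes: login pattern, resource usage, API misuse
W = [0.5, 0.3, 0.2]
# hypothetical trust grades: (trusted, neutral, untrusted)
R = [[0.7, 0.2, 0.1],
     [0.5, 0.3, 0.2],
     [0.1, 0.3, 0.6]]
B = fuzzy_eval(W, R)
verdict = max(range(len(B)), key=lambda j: B[j])   # max-membership rule
```

Because each row of R and the weight vector each sum to 1, the combined membership vector B also sums to 1, and the max-membership rule turns it into an access-control decision.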

Keywords: cloud platform, user behavior, access control, Fuzzy Comprehensive Evaluation, trust evaluation

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #62, pp. 629-637

Title of the Paper: A Dynamic, Real-Time Alarm System for Power Plants

Authors: Ilse Leal-Aulenbacher, José M. Suárez-Jurado, Efren R. Coronel-Flores

Abstract: Alarm systems have been widely used to help power plant operators identify abnormal conditions that require their attention. However, there are certain well-known issues with alarm systems that have a negative effect on their ability to provide useful information to operators. Alarms can become difficult to manage when too many of them are presented simultaneously. Power plants often perform maintenance tasks or change configurations periodically; when alarm limits are configured according to ideal operation conditions alone, alarms may become irrelevant when operation conditions change. In this paper, we present a real-time alarm system which incorporates tactics such as hierarchical organization of alarms and alarm processing relative to current power plant operation conditions. We show how these strategies can be used to improve how alarms are presented to power plant operators.

Keywords: alarm system, real-time, data acquisition system

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #61, pp. 620-628

Title of the Paper: Trapezoidal Fuzzy AHP and Fuzzy Comprehensive Evaluation Approaches for Evaluating Academic Library Service

Authors: Quang Hung Do, Jeng-Fung Chen, Ho-Nien Hsieh

Abstract: Academic libraries contribute to educational processes; therefore, the evaluation of library services plays an important role in enhancing a university’s quality. In this paper, we propose a novel framework for academic library service evaluation based on the combination of fuzzy analytical hierarchy process (FAHP) and fuzzy comprehensive evaluation method. Specifically, the evaluation hierarchical structure is established and then the criterion and attribute weights are determined by the trapezoidal FAHP method. Employing the FAHP in group decision-making facilitates a consensus of decision-makers, and reduces uncertainty in decision-making. The evaluation of the academic library service can then be conducted by the use of the comprehensive evaluation method. A case application is also used to illustrate the proposed framework. The application of this framework can make the evaluation results more scientific, accurate, and objective. It is expected that this work may serve as a tool for managers of higher education institutions in improving the educational quality level.

Keywords: Academic library service, Fuzzy analytical hierarchy process, Decision making process, Fuzzy sets

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #60, pp. 607-619

Title of the Paper: New Rate Control Optimization Algorithm for HEVC Aiming at Discontinuous Scene

Authors: Shengyang Xu, Mei Yu, Shuqing Fang, Zongju Peng, Xiaodong Wang, Gangyi Jiang

Abstract: To combat lagging rate control (RC) parameter settings when encoding video sequences with discontinuous scenes, a novel RC algorithm at the group-of-pictures (GOP) level is proposed for high efficiency video coding (HEVC) in this paper. Firstly, the variation of the GOP is analyzed and used to detect discontinuous scenes. Then, a new bit allocation algorithm is presented by constructing the correlation between the bit allocation for every GOP and the intensity of the scene change. Finally, the impact of RC parameter updating at the frame level is investigated to obtain high accuracy of bit allocation for discontinuous scenes. Experimental results show that, compared with the current RC algorithm for HEVC, the proposed algorithm obtains better performance in reducing the fluctuation of peak signal-to-noise ratio and in rate-distortion optimization, with almost the same encoding complexity.

Keywords: High efficiency video coding, rate control, discontinuous scene, video quality fluctuation

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #59, pp. 598-606

Title of the Paper: A New Local Binary Probabilistic Pattern (LBPP) and Subspace Methods for Face Recognition

Authors: Abdellatif Dahmouni, Karim El Moutaouakil, Khalid Satori

Abstract: In this paper, we present a new model for extracting local characteristics, named the Local Binary Probabilistic Pattern (LBPP), based on the Local Binary Pattern (LBP). The model draws on a very important result of probability theory, the law of large numbers. In this respect, the distribution of gray levels over areas of homogeneous texture in a face image follows a probability law which is the sum of several normal laws. This view allows the current pixel to be evaluated using the concept of a confidence interval, which permits overcoming some LBP shortcomings, especially the information loss associated with LBP's deterministic nature. We have combined the proposed model with the best-known dimensionality reduction algorithms of data analysis in the face recognition field. To evaluate our approach, various experiments are carried out on the ORL and YALE databases. In this context, we compared the performance of the systems LBP+PCA, LBP+LDA, LBP+2DPCA, LBP+2DLDA, LBPP+PCA, LBPP+LDA, LBPP+2DPCA and LBPP+2DLDA. The obtained results show the effectiveness of our systems, in particular LBPP+2DPCA and LBPP+2DLDA. The observed experimental accuracy is 96.5%.
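The deterministic LBP operator that LBPP builds on can be shown in a few lines. This sketch implements only the classic 8-neighbour LBP (the probabilistic extension would replace the hard `>=` threshold with a confidence-interval test around the centre value, per the abstract); the bit ordering chosen here is one common convention.

```python
def lbp_code(img, x, y):
    """Basic 8-neighbour LBP: threshold each neighbour against the
    centre pixel and read the resulting bits clockwise from top-left."""
    c = img[y][x]
    nbrs = [img[y-1][x-1], img[y-1][x], img[y-1][x+1],
            img[y][x+1],   img[y+1][x+1], img[y+1][x],
            img[y+1][x-1], img[y][x-1]]
    code = 0
    for bit, p in enumerate(nbrs):
        if p >= c:            # hard threshold: the "deterministic nature"
            code |= 1 << bit  # LBPP replaces with an interval test
    return code

# toy 3x3 grey-level patch: a smooth diagonal gradient
img = [[10, 20, 30],
       [40, 50, 60],
       [70, 80, 90]]
code = lbp_code(img, 1, 1)   # 0b01111000 == 120
```

Histograms of such codes over image blocks form the feature vectors that PCA, LDA, 2DPCA or 2DLDA then project to a low-dimensional subspace for recognition.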

Keywords: LBP, LBPP, PCA, LDA, 2DPCA, 2DLDA, Confidence interval

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #58, pp. 588-597

Title of the Paper: A Hybrid Load Balancing Policy Underlying Cloud Computing Environment

Authors: S. C. Wang, S. C. Tseng, S. S. Wang, K. Q. Yan

Abstract: Network bandwidth and hardware technology are developing rapidly, resulting in the vigorous development of the Internet. Cloud computing is a concept that uses low-power hosts to achieve high usability; it refers to a class of systems and applications that employ distributed resources to perform a function in a decentralized manner. Cloud computing utilizes the computing resources (service nodes) on the network to facilitate the execution of complicated tasks that require large-scale computation. Thus, the selection of nodes for executing a task in cloud computing must be considered. In this study, a hybrid load balancing policy is proposed, which integrates static and dynamic load balancing technologies to assist in the selection of effective nodes. In addition, if any selected node can no longer provide resources, it can be promptly identified and replaced with a substitute node to maintain the execution performance and the load balance of the system.
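The abstract does not give the selection rule itself; a minimal sketch of combining a static filter with a dynamic least-loaded choice (the node names and the 0.8 threshold are hypothetical) might look like:

```python
def pick_node(nodes, threshold=0.8):
    """Hybrid selection sketch: a static filter keeps only nodes whose
    current load is below `threshold` (excluding saturated nodes up
    front), then a dynamic step picks the least-loaded survivor.

    `nodes` maps node name -> current load in [0, 1].  Returns the
    chosen node name, or None if every node is saturated (the caller
    would then substitute a fresh node, as the abstract describes).
    """
    candidates = {name: load for name, load in nodes.items()
                  if load < threshold}
    if not candidates:
        return None
    return min(candidates, key=candidates.get)
```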

Keywords: Cloud computing, Distributed computing, Load balancing

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #57, pp. 580-587

Title of the Paper: Hidden Object Detection for Computer Vision Based Test Automation System

Authors: Rajibul Anam, Md. Syeful Islam, Mohammad Obaidul Haque

Abstract: In Software Quality Assurance, computer-vision-based automation tools are used to test window-based applications, and a window contains many types of objects such as buttons, boxes, lists, etc. An automation tool detects window objects by comparing images. Most objects are visible on the screen, but some are not visible at first; only through proper interaction with the window application do hidden objects such as drop-down list items, editor text objects, list box items and sliders become visible. With vision-based automation systems, these hidden objects cannot be searched for directly. This paper proposes methods that use images and shortcut keys to interact with the software under test in order to find the hidden objects. These methods enable automation tools to access the hidden objects of a window application faster.

Keywords: test case automation; software quality assurance; vision based; window application

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #56, pp. 570-579

Title of the Paper: Acoustic Mapping of Visual Text Signals through Advanced Text-to-Speech: the Case of Font Size

Authors: Georgios Kouroupetroglou

Abstract: Current Text-to-Speech systems, commonly used for document accessibility, do not include an effective, standard acoustic provision of the visual typographic cues embedded in documents. In this work, we first introduce text signals (i.e., the writing devices that emphasize aspects of a text's content or structure), along with an appropriate architecture for the structure of documents and the main principles and technological topics of document accessibility. Then, the emerging technological approach of Document-to-Audio (DtA) that we have developed is presented. DtA essentially constitutes the next generation of Text-to-Speech systems: it supports the acoustic mapping of visual text signals and thus provides much better accessibility to documents. Finally, for the case of font size in the typographic layer, we present the results of two quantitative approaches (direct mapping and emotion-based mapping) for the rendering of text signals through DtA.

Keywords: Text-to-Speech, Text Signals, Document-to-Audio, Document Visual Elements, Accessibility

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #55, pp. 559-569

Title of the Paper: Transactional Memory on a Dataflow Architecture for Accelerating Haskell

Authors: Roberto Giorgi

Abstract: Dataflow architectures have been explored extensively in the past and are now being re-evaluated from a different perspective, as they can provide a viable solution for efficiently exploiting multi/many-core chips. Indeed, the dataflow paradigm provides an elegant solution for distributing computations across the available cores by starting computations based on the availability of their input data. In this paper, we refer to the DTA (Decoupled Threaded Architecture), which relies on a dataflow execution model, to show how Haskell could benefit from an architecture that matches the functional nature of that language. A compilation toolchain based on the so-called External Core, an intermediate representation used by Haskell, has been implemented for the most common data types and operations, and in particular to support concurrency primitives (e.g., MVars, forkIO) and Transactional Memory (TM). We performed initial experiments to understand the efficiency of our code, both against hand-coded DTA programs and against GHC-generated code for the x86 architecture. Moreover, we analyzed the performance of a simple shared-counter benchmark that uses TM in Haskell on both DTA and x86. The results of these experiments clearly show great potential for accelerating Haskell: for example, the number of dynamically executed instructions can be more than one order of magnitude lower for Haskell+DTA compared to x86. The number of memory accesses is also drastically reduced on DTA.

Keywords: Multithreaded Architecture, Data-flow Architecture, Haskell, Transactional Memory

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #54, pp. 546-558

Title of the Paper: A Representational Analysis on Relationships in an ER Model

Authors: Lin Liu, Junkang Feng, Kaibo Xu

Abstract: We observe that a database is a type of representation system, in that the data in it represent something in the real world. Thus the notion of representation should be highly relevant to databases, and especially to data modeling. In this paper, we focus on identifying the representational aspect of the relationships of an ER model by means of notions of representation, including information carrying and nomic constraint. In particular, we revisit a known problem in ER modeling, namely connection traps. Our main findings are: a transitive nomic constraint arises when the multiplicities of the participating non-transitive relationships are *:1 or 1:1; connection traps can be accounted for in terms of the nomic constraint, or the lack of it, in a path: a fan trap is formed when a transitive structure contains no transitive nomic constraint, and a chasm trap is the result of the partial participation of an entity in the nomic constraint of a relationship. Furthermore, even when a path is free of connection traps, the transitive nomic constraint in the path has to project to (i.e., match) a constraint in the target domain (normally the real world) in order for the path to be able to represent it.

Keywords: Representation, ER modeling, Information carrying, Connection traps, Constraint projection

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #53, pp. 538-545

Title of the Paper: Cooperative Architecture for Signature Based Alarm System

Authors: V. Dhanakoti, D. Meenaakshi

Abstract: Existing system security architectures are too vulnerable to withstand the offensiveness of the various system risks. An advanced security framework is needed to obtain better security in the various layers of the network. This article presents a collective design for several Intrusion Recognition Systems (IRS) that identify intrusions during authentication. The recognition is efficient and effective, achieved by collective smart agents, an appropriate knowledge base and the blending of several recognition sensors. The design has three divisions, namely Collective Alarm Indexing, Signature Based Alarm Estimation and Alarm Association. The design reduces the alarm volume by correlating the outcomes of many sensors to produce condensed views. This decreases false positives, as the host system, the system information from the estimation technique, and measures combined on the basis of reasonable relations together produce a universal alarm intelligence database. The design sits above the intrusion recognition layer for post-recognition alarm scrutiny and precautionary actions. The paper discloses the design executed to obtain the results.

Keywords: System security, Intrusion recognition, Alarm, Smart agents, Collective work

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #52, pp. 526-537

Title of the Paper: Image Analysis by Discrete Orthogonal Tchebichef Moments for 3d Object Representation

Authors: Mostafa El Mallahi, Abderrahim Mesbah, Hakim El Fadili, Khalid Zenkouar, Hassan Qjidaa

Abstract: Discrete Tchebichef moments are widely used in image processing applications and pattern recognition. In this paper we propose a compact method for computing 3D Tchebichef moments. This new method, based on Clenshaw's recurrence formula and the symmetry property, produces a drastic reduction in complexity and computational time. A recursive algorithm is then developed for fast computation of the inverse Tchebichef moment transform for image reconstruction. We also extract 3D scale and translation moment invariants using a proposed direct method. The validity of the proposed algorithm is demonstrated by simulated experiments on 3D images/objects.

Keywords: Digital image processing, Pattern analysis, Image reconstruction techniques, Three-dimensional image processing, Moment methods

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #51, pp. 513-525

Title of the Paper: Trust Based Suspicious Route Categorization for Wireless Networks and its Applications to Physical Layer Attack

Authors: S. Raja Ratna, R. Ravi

Abstract: With the increased usage of networks, security becomes a significant issue. Owing to their open nature, an adversary can corrupt packets by injecting a high level of noise, keeping the channel busy so that legitimate traffic is completely blocked, resulting in packet loss at the receiver side. Although several schemes have been proposed to prevent these attacks, none of the existing works has analyzed trust-based routing. Nowadays trust-based routing is an effective way to prevent physical layer attacks in wireless networks. In this paper, the prevention of physical layer attacks is studied by comparing trust metrics. A new scheme known as Trust based Suspicious Route Categorization (TSRC) is proposed, which identifies misbehaving suspicious routes and operates in two modules. In module one, misbehaving routes are marked as suspicious based on their trust condition. In module two, the marked suspicious routes are categorized into four groups using a classifier, and from the no-risk group a reliable route is selected for data transmission. Simulation studies show that the proposed scheme identifies suspicious routes with a higher detection rate and a lower false positive probability; it also achieves higher throughput and lower delay.
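Module two's grouping step can be illustrated as follows (the group names match the abstract's four-group idea, but the trust-score cut-offs are illustrative assumptions, not the paper's classifier):

```python
def categorize_routes(trust_scores):
    """Map each suspicious route's trust score in [0, 1] into one of
    four risk groups; a route from the no-risk group would then be
    chosen for data transmission.
    """
    groups = {"no risk": [], "low risk": [],
              "medium risk": [], "high risk": []}
    for route, trust in trust_scores.items():
        if trust >= 0.75:
            groups["no risk"].append(route)
        elif trust >= 0.5:
            groups["low risk"].append(route)
        elif trust >= 0.25:
            groups["medium risk"].append(route)
        else:
            groups["high risk"].append(route)
    return groups
```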

Keywords: Categorization, Delay, Misbehaving, Physical layer, Suspicious, Throughput

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #50, pp. 502-512

Title of the Paper: A Genetic Task Migration Algorithm for Fault Recovery in NoC-Based Manycore Systems

Authors: Jinxiang Wang, Zixu Wu, Fangfa Fu

Abstract: Recovery from permanent core faults in NoC-based manycore systems usually requires migrating tasks from faulty cores to fault-free cores, where balanced workloads are commonly desired. Finding optimal migration destinations for tasks, however, is a challenging issue due to the time complexity of the search process. To cope with this, a genetic-algorithm-based task migration algorithm is proposed in this paper, in which an adaptive crossover (AC) scheme and a chaotic mapping disturbance (AD) scheme are incorporated to improve the search efficiency. Experiments show that the modifications to the standard genetic algorithm (SGA) are effective, and that the proposed task migration algorithm outperforms two PSO-based algorithms, the SGA and a deterministic algorithm (DA) in finding more balanced migration solutions. The average improvements of the proposed algorithm over the DA, PSO, DPSO, and SGA algorithms are 86%, 88%, 37%, and 16%, respectively.
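A chromosome in such a GA is typically a task-to-core assignment; a hedged sketch of one plausible fitness function (balance measured as the standard deviation of per-core load; the paper's exact objective is not given in the abstract) is:

```python
import statistics

def migration_fitness(assignment, task_loads, cores):
    """Score a candidate task-to-core assignment: sum each core's
    load, then return the negative population standard deviation of
    the per-core loads, so perfectly balanced assignments score
    highest (0.0).
    """
    load = {core: 0.0 for core in cores}
    for task, core in assignment.items():
        load[core] += task_loads[task]
    return -statistics.pstdev(load.values())
```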

Keywords: Task migration, genetic algorithm, fault recovery, NoC, manycore system

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #49, pp. 487-501

Title of the Paper: A Comparative Study on Syntax Matching Algorithms in Semantic Web

Authors: V. P. Sumathi, K. Kousalya, R. Kalaiselvi

Abstract: In the semantic web, the extraction of meaningful information involves many tedious processes due to similarity between pieces of information, the context of the words used, structural similarity and the relationships between words. Ontology helps in understanding the context of the heterogeneous information available on the web. Domain-specific ontologies can be merged to extract integrated information from various semantic websites. Different algorithms are in practice for finding the similarity between class names that exist in different ontologies. Class names that are syntactically and semantically equal are identified and merged to produce a global ontology that can be used for information retrieval. In this paper we have implemented and compared various syntax matching algorithms and identified the best syntax matching algorithm for the semantic web environment. The performance of the identified algorithms is analyzed and evaluated with respect to precision, recall and F-measure.
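One representative syntax-matching measure for class names is normalised edit distance (a standard baseline of the kind the paper compares, not necessarily the algorithm it ranks best):

```python
def edit_similarity(a, b):
    """Normalised Levenshtein similarity between two class names.
    Returns 1.0 for identical strings and 0.0 for maximally different
    ones.  Uses the classic two-row dynamic-programming formulation.
    """
    m, n = len(a), len(b)
    prev = list(range(n + 1))
    for i in range(1, m + 1):
        cur = [i] + [0] * n
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            cur[j] = min(prev[j] + 1,        # deletion
                         cur[j - 1] + 1,     # insertion
                         prev[j - 1] + cost) # substitution
        prev = cur
    return 1.0 - prev[n] / max(m, n, 1)
```

Two ontology classes would be treated as syntactically equal when this score exceeds a chosen threshold; semantic checks (synonyms, context) are a separate step.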

Keywords: Syntax matching algorithm, Semantic Web, Information theory, Heterogeneity, Similarity measures

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #48, pp. 474-486

Title of the Paper: A Hybrid Method Applied to Multiple Sequence Alignment Problem

Authors: Lamiche Chaabane, Moussaoui Abdelouahab

Abstract: Multiple sequence alignment (MSA) is an important tool in biological analysis. However, it is difficult to solve this class of problems because of their exponential time complexity as the number of sequences and their lengths increase. In this research paper, we present a new method for the multiple sequence alignment problem based on the classical tabu search (TS) and simulated annealing (SA) techniques. The developed approach, called TSSA, is implemented in order to obtain optimal multiple sequence alignments. The essential idea of our TSSA approach is to integrate the Metropolis criterion of the SA algorithm to construct an elite solutions list, which the TS algorithm can exploit to intensify the search around the best elements of this list. At the same time, it helps to diversify the search when TS restarts from worse solutions in the constructed elite solutions list. Computational results on a wide range of datasets taken from the BAliBASE database show the degree to which simulated annealing enhances the performance of the tabu search algorithm and demonstrate the superiority of the TSSA algorithm over six well-known multiple sequence alignment methods: PRRP, ClustalW, SAGA, DiAlign, ML_PIMA and MultiAlign. The proposed method also finds solutions faster than the binary particle swarm optimization (BPSO) algorithm, the TS-R algorithm, the AIS method and the IMSA approach.
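The Metropolis criterion that TSSA borrows from simulated annealing is compact enough to state directly (written for a minimisation problem; the injectable `rng` argument exists only to make the sketch testable):

```python
import math
import random

def metropolis_accept(delta, temperature, rng=random.random):
    """Metropolis acceptance rule: always accept an improving move
    (delta <= 0), and accept a worsening move with probability
    exp(-delta / temperature).  High temperatures accept almost
    anything; low temperatures become nearly greedy.
    """
    if delta <= 0:
        return True
    return rng() < math.exp(-delta / temperature)
```

In TSSA, candidate alignments passing this test would populate the elite solutions list that tabu search then intensifies around.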

Keywords: Multiple sequence alignment, tabu search, simulated annealing, TSSA, elite solutions list

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #47, pp. 465-473

Title of the Paper: Web Usage Based Analysis of Web Pages Using RapidMiner

Authors: M. Santhanakumar, C. Christopher Columbus

Abstract: In recent times, due to the rapid growth of the World Wide Web, websites have become the main information providers for Internet users. Storing and retrieving information from the web is always a challenging task. Web mining is defined as extracting the information users need from the Web; the information provided is not only the exact information a user needs but also suggestions of information associated with it. Web mining is classified into three sub-tasks: Web Content, Web Structure and Web Usage Mining. This paper introduces the applications and the mining process of the open-source data mining tool RapidMiner. The proposed work analyzes the usage of web pages (i.e., the browsing behavior of users) using two different clustering algorithms: k-means, which is incorporated in the tool, and Fuzzy c-means (FCM) clustering, both run in RapidMiner. The results show the operational behavior of the FCM and k-means clustering algorithms based on the cluster centroids.

Keywords: Web Mining, Web Usage Mining, k-means, FCM, RapidMiner

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #46, pp. 455-464

Title of the Paper: Novel Planar Self-Calibration Approach of a Camera having Variable Intrinsic Parameters

Authors: I. El Batteoui, A. Baataoui, A. Saaidi, K. Satori

Abstract: The self-calibration of a camera is an important computer vision problem, as it allows us to estimate the camera's intrinsic parameters without any knowledge of the scene used. In this work we propose a novel method of camera self-calibration with varying intrinsic parameters from an unknown planar scene. The principle of the present approach consists of using some projection points of the scene in only two image planes to obtain a system of equations in the intrinsic camera parameters and the image of the absolute conic in both views. From this system of equations we formulate a non-linear cost function whose minimization permits us to estimate the intrinsic camera parameters in both images. Experimental results on synthetic and real data demonstrate the effectiveness of our method.

Keywords: Self-calibration, Computer vision, Varying intrinsic parameters, Image of the absolute conic, nonlinear cost function, Unknown planar scene

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #45, pp. 444-454

Title of the Paper: Linked Spectral Graph Based Cluster Ensemble Approach Using Weighted Spectral Quality Algorithm for Medical Data Clustering

Authors: S. Sarumathi, N. Shanthi, M. Sharmila

Abstract: Over a certain span of time, cluster ensembles have emerged as a means of solving the problem of extracting efficient clustering results. Although many efforts have been made, it has been observed that these techniques adversely create the final data partition based on imperfect information. The original ensemble information matrix exposes only the cluster-data-object relations, with many entries left empty. This paper presents an investigation that addresses the degraded quality of the final partition through a Linked Spectral Graph based Cluster Ensemble approach. In particular, an effective Weighted Spectral Quality algorithm is proposed for the underlying similarity measurement among the ensemble members, which in turn can be used to avoid the local optima and ill-posedness issues arising from high-dimensional samples. Subsequently, to obtain the final clustering results, a Spectral Clustering based Consensus Function is applied to the Distilled Similarity Matrix (DSM) formulated from the similarity assessment algorithm. Experimental results on medical datasets retrieved from the UCI repository demonstrate that the proposed approach outperforms traditional ones in data clustering.

Keywords: Clustering algorithms, Cluster Ensemble, Spectral graph partitioning, Consensus Function, Data Mining, Similarity Measures

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #44, pp. 430-443

Title of the Paper: Retargeting GCC Compiler for Specific Embedded System on Chips

Authors: Benbin Chen, Xiaochao Li, Donghui Guo

Abstract: This paper describes the High-performance C Compiler (HCC) and its implementation for industrial application-specific embedded Systems on Chips (SoCs). The HCC compiler is a C language compiler based on the retargetable GCC compiler. Because of the specialized architectures and features of specific embedded SoCs, machine-dependent compiler implementation is an important and challenging task in embedded systems. To quickly implement a compiler for specific embedded SoCs, compiler extension methods are proposed in the HCC compiler. We extend identifiers and attributes in the Abstract Syntax Tree (AST) for the language-specific programming syntax of the compiler front-end, which is the syntax of extended standard ANSI C. Then, the machine-dependent classification of assembly generation for the specific embedded SoCs is designed and implemented. After completing the ABI (Application Binary Interface) and MD (Machine Description) of the compiler back-end, the HCC compiler is completed by retargeting the GCC compiler to the application-specific embedded SoCs. These implementation methods can serve as a reference for other embedded chips. From cross-comparisons and tests against multiple compilers of the same type, the conclusion can be drawn that the proposed HCC compiler has stable performance with excellent improvement in the generated assembly code.

Keywords: Compiler, Specific Embedded SoCs, Language-Specific, Machine-dependent, Abstract Syntax Tree, Attributes

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #43, pp. 420-429

Title of the Paper: Enhancement of Phase Order Searching Using an Effective Tuning Strategy

Authors: J. Andrews

Abstract: Modern compilers provide a large number of compiler options. Each option has two states, enabled or disabled, and enabling some options may improve or degrade the performance of a program. The objective is to increase the performance of the source program by adjusting the compiler options. The selection and ordering of the most efficient compiler options is required to improve execution time, code size and speedup. Some option combinations may affect the execution time. The best combination of options is found by using a modified genetic algorithm with the help of genetic operators. Finding a better ordering of the best options will change the final performance of the program. The ordering of the best combination of options obtained from the selection algorithm is fed to a phase order search algorithm. Existing algorithms such as combined elimination, batch elimination, branch and bound, push and pop with combined elimination, optimality random search and iterative random search are modified, and the results are compared with the newly created combined push and pop with modified genetic algorithm. It is found that the combined push and pop with modified genetic algorithm shows better performance than the other algorithms. The phase searching algorithm shows a greater increase in program performance for some benchmark applications than the combined push and pop with modified genetic algorithm alone. The experimental results show a 9% improvement in both tuning time and normalized tuning time. The speedup exhibits an 11% increase over the set of benchmark applications. The combination of combined push and pop with modified genetic algorithm and phase order searching provides a 10% overall improvement in program performance over the existing algorithms.

Keywords: Optimization, Combined Push and pop with modified Genetic Algorithm, Phase order searching

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #42, pp. 411-419

Title of the Paper: A Performance Evaluation Framework for Mobile P2P Overlays under Churn

Authors: Jun Li, Guoyin Zhang, Xianghui Wang

Abstract: With the popular use of mobile intelligent equipment, including smartphones and mobile tablets, mobile peer-to-peer (P2P) networks have become increasingly important, and in recent years several mobile P2P overlays have been proposed. We propose a three-dimensional evaluation framework for mobile P2P overlays under churn. Three P2P overlays, GIA, M-GIA, and KCCO (k-Clique Community Overlay), are chosen to be evaluated according to our framework. Two kinds of churn models are used to compare the performance of the three P2P overlays in order to achieve more comprehensive results. Simulation results show that our framework can be used for the performance evaluation of various overlays under different conditions, especially under high churn. Within this framework, KCCO shows better suitability for mobile networks, because it achieves a much higher query success rate than the other two overlays and a lower average query delay in most cases.

Keywords: performance evaluation, mobile peer-to-peer overlay, churn, k-clique, query success rate, average query delay

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #41, pp. 402-410

Title of the Paper: Predictive Models for Consistency Index of a Data Object in a Replicated Distributed Database System

Authors: Shraddha Phansalkar, A. R. Dani

Abstract: Consistency is a qualitative measure of database performance. The Consistency Index (CI) quantifies the consistency of a data unit as the percentage of correct reads out of the total reads observed on the data unit in a given time. The consistency guarantee of a replicated database logically depends on the number of reads, the number of updates, the number of replicas, and the workload distribution over time. The objective of our work is to establish this dependency and find the level of interaction of these factors with consistency guarantees, in order to develop a predictor model for CI. We have implemented the Transaction Processing Council-C (TPC-C) online transaction benchmark on Amazon SimpleDB, which is used as big-data storage. We controlled the database design parameters and measured CI on 100 samples of workload and database design. The findings helped us implement a prototype CI-based consistency predictor using statistical predictive techniques: a) a regression model, b) a multilayer perceptron neural network model, and c) a hidden Markov model. The data statistics show that the neural-network-based CI predictor causes less error and results in a better coefficient of determination R2 and mean square error (MSE). The hidden Markov model based CI predictor is capable of modelling the effect of the sequence of the input workload on the probability of obtaining a desired CI.
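From the definition in the abstract, CI is straightforward to compute (the zero-reads convention below is an assumption of this sketch; the paper does not state that case):

```python
def consistency_index(correct_reads, total_reads):
    """Consistency Index as defined in the abstract: the percentage of
    correct reads among all reads observed on a data unit in a given
    window.  Returns 0.0 when no reads were observed.
    """
    if total_reads == 0:
        return 0.0
    return 100.0 * correct_reads / total_reads
```

The predictor models then estimate this quantity from workload features (reads, updates, replica count) instead of measuring it after the fact.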

Keywords: Consistency Index (CI), predictive models, regression, neural network, Hidden Markov model

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #40, pp. 395-401

Title of the Paper: Coordinate Inverse and Automatically Mapping of Roadside Stakes for Complex Curve Route

Authors: Huai Ping Feng, Zhipeng Wang, Chuang Jiang, Jianmei Chang

Abstract: With the construction of high-speed railways, the coordinate inverse problem, especially for the more complex types of transition curves, becomes time consuming and error prone. The traditional methods used in practice have shortcomings: they apply only to limited types of curves, and they are complex and tedious. In this paper, a new inverse algorithm is put forward that is suitable for various types of curves; the first iteration point need not be specified and can be chosen at random, and the iteration is guaranteed to converge. Using this algorithm, software has been developed in C# with ACCESS for lofting any type of transition curve, which makes the laying-out work more accurate and quicker than by hand calculator. The locations of the roadside stakes are displayed using the Teigha.NET graphics API. Tests against existing works show that this method is feasible and can provide more accurate results quickly and automatically.

Keywords: Coordinate inverse, Automatically mapping, Roadside stake, Complex transition curve

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #39, pp. 389-394

Title of the Paper: A Hybrid Algorithm Combining Weighted and Hasht Apriori Algorithms in Map Reduce Model Using Eucalyptus Cloud Platform

Authors: R. Sumithra, Sujni Paul, D. Ponmary Pushpa Latha

Abstract: Data mining plays a major role in data analysis, which benefits the enhancement of business. In the modern era, distributed data mining has become a major research area due to distributed computing. In distributed data mining, data can be physically or virtually distributed, which helps greatly in finding interesting facts. Virtual distribution and distributed data mining have their major impact on big data analysis with the help of cloud and grid computing. This research work performs data mining in a cloud environment using the Eucalyptus platform with VMware Workstation and the Hadoop platform. The work mines distributed data in a cloud environment using a hybrid Apriori association rule algorithm that combines the benefits of both the hash-T and weighted Apriori algorithms. Finally, the work compares the performance with the existing implementations of the weighted and hash-T algorithms.

Keywords: Weighted apriori, Association rule mining, MapReduce, Hadoop, Eucalyptus, Cloud, Data mining, HashT

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #38, pp. 382-388

Title of the Paper: A New Spatial Fuzzy C-Means for Spatial Clustering

Authors: Yingdi Guo, Kunhong Liu, Qingqiang Wu, Qingqi Hong, Haiying Zhang

Abstract: Fuzzy C-means is a widely used clustering algorithm in data mining. Since traditional fuzzy C-means algorithms do not take spatial information into consideration, they often cannot effectively explore geographical data. In this paper we therefore design a Spatial Distance Weighted Fuzzy C-Means algorithm, named SDWFCM, to deal with this problem. The algorithm makes full use of spatial features to assign samples to clusters, and it needs to calculate the memberships only once, which greatly reduces the running time compared with other spatial fuzzy C-means algorithms. In addition, we propose two new criteria, named DESC and PESC, for evaluating spatial clustering results by measuring spatial and regular information separately. Experiments are carried out on real petroleum geology data and artificial data, and the results show that SDWFCM achieves better performance than traditional clustering methods, and that our spatial cluster indices can assess clusters by effectively taking spatial structure into consideration.
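For reference, the classical fuzzy C-means membership rule that SDWFCM extends with a spatial distance weight (the extension itself is not reproduced here) is, for one 1-D sample:

```python
def fcm_memberships(point, centroids, m=2.0):
    """Standard fuzzy C-means membership update for a single sample:
    u_i = 1 / sum_k (d_i / d_k)^(2/(m-1)), where d_i is the distance
    of the point to centroid i and m > 1 is the fuzzifier.
    """
    dists = [abs(point - c) for c in centroids]
    if 0.0 in dists:  # point coincides with a centroid: crisp membership
        return [1.0 if d == 0.0 else 0.0 for d in dists]
    p = 2.0 / (m - 1.0)
    return [1.0 / sum((d_i / d_k) ** p for d_k in dists)
            for d_i in dists]
```

The memberships always sum to 1, and the closest centroid receives the largest share; SDWFCM would fold the geographic distance between samples into the `dists` term.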

Keywords: spatial clustering, fuzzy c-means, evaluation criteria

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #37, pp. 369-381

Title of the Paper: Register Linear Based Model for Question Classification Using Costa Level Questions

Authors: Shanthi Palaniappan, Ilango Krishnamurthi

Abstract: The question classification module of a Question Answering System plays a very important role in identifying and providing results according to user expectations. Different classification methods, such as machine learning and lexical databases, can be applied across domains, and identifying the appropriate approach to question classification for a specific domain is one of the foremost tasks. A study of different levels of questions, including Bloom's taxonomy and Costa's taxonomy, has led researchers to focus more on the different categories of questions. To address these issues, we employ a question classifier using Register Linear (RL) models for a specific domain. The Register Linear classification model classifies complex questions in a linear manner, where each input is assigned to only one class. The RL classification model identifies the role of the semantics provided in the input space, which is divided into decision regions whose decision surfaces are linear functions of the input x (sentence) for the different sets of classes. Initially, the Register Linear model identifies the role of semantics in a sentence; with these roles identified, statistical relations between the concepts in the sentence are derived, producing a probability distribution over the different sets of classes. With these classifications, the exact answer type is identified. The proposed model gives better results in terms of execution time (the time taken to categorize queries), classification accuracy and result-analysis efficiency.

Keywords: Register Linear, Question Answering System, World Wide Web, Semantic Features, Statistical Information, Hierarchical Structure

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #36, pp. 358-368

Title of the Paper: Image-Based Ink Diffusion Simulation and 3D Chinese Wash-Ink Paintings Rendering

Authors: Da-Jin Li, Cheng-Jie Bai

Abstract: An image-based synthesis method for Chinese wash-ink strokes is proposed in this paper. A wash-ink stroke is divided into two regions: the primary stroke region and the diffused region. Synthesis of the diffused region begins from the outline of the primary stroke region. First, the outline of the primary region is magnified outward in all directions by the same distance; then the irregular stroke border is created. To simulate pigment granulation in wash-ink strokes, we gather statistics on the colour variation of hand-painted brush strokes and add the variation errors to the stroke images. We also construct a nonlinear function to compute the colours in the areas where two strokes overlap. Based on our image-based synthesis method for wash-ink strokes, we present a novel method to render 3D scenes as wash-ink landscape paintings. The rendering results show that our method can synthesize realistic wash-ink brush strokes, and that it is efficient as well.

Keywords: Simulation of ink diffusion, 3D wash-ink paintings, Non-photorealistic rendering, Image-based rendering, Synthesizing brush strokes, Synthesis algorithm

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #35, pp. 347-357

Title of the Paper: CCBIR: A Cloud Based Implementation of Content Based Image Retrieval

Authors: R. Madana Mohana, A. Rama Mohan Reddy

Abstract: Content Based Image Retrieval (CBIR) is the efficient retrieval of relevant images from large databases based on features extracted from the images. This paper proposes a system for retrieving images related to a query image from a large set of distinct images. It follows an image-segmentation-based approach to extract the different features present in an image. These features are stored in vectors called feature vectors, which are compared with the feature vectors of the query image, and the images are sorted in decreasing order of similarity. The processing is done in the cloud: the CBIR system is an application built on the Windows Azure platform. It is a parallel-processing problem in which a large set of images must be ranked by their similarity to the query image provided by the user. Numerous instances of the algorithm run on virtual machines in Microsoft data centers running Windows Azure, Microsoft's operating system for the cloud.

Keywords: Content Based Image Retrieval (CBIR), Wavelet Transformation, Euclidean Algorithm, Windows Azure, Cloud Computing

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #34, pp. 333-346
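The feature-vector comparison described in the abstract can be illustrated with a minimal sketch. The `color_histogram` extractor below is a hypothetical stand-in (the paper uses segmentation-based features, which are not reproduced); ranking by ascending Euclidean distance is equivalent to sorting in decreasing order of similarity.

```python
import numpy as np

def color_histogram(img, bins=4):
    """Hypothetical feature extractor: per-channel intensity histograms,
    concatenated and normalised to sum to 1."""
    h = [np.histogram(img[..., ch], bins=bins, range=(0, 256))[0]
         for ch in range(3)]
    v = np.concatenate(h).astype(float)
    return v / v.sum()

def rank_by_similarity(query_vec, db_vecs):
    """Sort database images by Euclidean distance to the query feature
    vector: ascending distance = decreasing similarity."""
    d = np.linalg.norm(db_vecs - query_vec, axis=1)
    order = np.argsort(d)
    return order, d[order]
```

In the cloud setting the abstract describes, the distance computation over the image database is what gets parallelised across worker instances.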

Title of the Paper: The Comparison and Analysis of Scale-Invariant Descriptors Based on the SIFT

Authors: Jian Gao

Abstract: Based on feature-matching theory for SIFT (Scale-Invariant Feature Transform) keypoints, a concentric-circle structure and a color feature vector for the scale-invariant descriptor are proposed in this paper. In the concentric-circle structure, the radii of the circles are proportional to the scale factor, which achieves scale invariance. To achieve rotation invariance, the descriptor coordinates are also rotated relative to the keypoint's orientation. Compared with the square structure of the SIFT descriptor, the concentric-circle structure is not only simpler to compute but also more robust to image rotation. The color feature vector takes the mean values of the color components R, G and B in each subregion of the descriptor as its elements. Compared with the gray feature vector of the SIFT descriptor, the color feature vector fully utilizes the image's color information, has stronger rotation invariance, and noticeably reduces the vector's dimension and hence the computation. The experimental results confirm the theoretical analysis.

Keywords: computer vision, feature matching, scale-invariant keypoint, scale-invariant descriptor, concentric circle structure, color feature vector

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #33, pp. 324-332

Title of the Paper: Capturing the Semantic Structure of Documents Using Summaries in Supplemented Latent Semantic Analysis

Authors: Karthik Krishnamurthi, Vijayapal Reddy Panuganti, Vishnu Vardhan Bulusu

Abstract: Latent Semantic Analysis (LSA) is a mathematical technique used to capture the semantic structure of documents based on correlations among the textual elements within them. Summaries of documents contain the words that actually contribute to the documents' concepts. In the present work, summaries are used in LSA along with supplementary information such as document category and domain information. This modification is referred to as Supplemented Latent Semantic Analysis (SLSA). SLSA is used to capture the semantic structure of documents from summaries of various proportions instead of entire full-length documents. The performance of SLSA on summaries is empirically evaluated in a document-classification application by comparing classification accuracies against plain LSA on full-length documents. It is empirically shown that summaries, rather than full-length documents, can be used to capture the semantic structure of documents.

Keywords: Dimensionality Reduction, Document Classification, Latent Semantic Analysis, Semantic Structure, Singular Value Decomposition

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #32, pp. 314-323
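The core of LSA, a truncated singular value decomposition of a term-document matrix, can be sketched as below; the supplementary category and domain information that distinguishes SLSA is not reproduced here.

```python
import numpy as np

def lsa_doc_vectors(term_doc, k):
    """Project documents into a k-dimensional latent semantic space via
    truncated SVD. Rows of the result are the document coordinates in
    the latent space."""
    U, s, Vt = np.linalg.svd(term_doc, full_matrices=False)
    return (np.diag(s[:k]) @ Vt[:k]).T

def cosine(a, b):
    """Cosine similarity between two latent document vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
```

Documents that share vocabulary end up close in the latent space even after the dimensionality reduction, which is what the classification experiments in the paper exploit.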

Title of the Paper: Visual Salience Estimation of 3D Mesh Models

Authors: Xiang-peng Liu, Xing-qiang Yang, Yi Liu, Cai-ming Zhang

Abstract: Visual salience is an important property of 3D mesh models. There are many salience metrics, including methods based on geometry and methods based on images. It is difficult for the former to infer a direct relationship between geometry and visual stimuli, while the latter are affected by the viewpoint and the light source. This paper is concerned with the visual salience of 3D mesh geometry elements such as vertices, edges and patches. We regard brightness differences and contour lines as the direct factors that stimulate the eye, and deduce the relation between these direct stimuli and the mesh model. The brightness contrast of 3D mesh geometry elements is determined by the model, the viewpoint and the light. Treating the light source and viewpoint as variables, we compute the maximum brightness contrast of each geometry element over all viewpoints and light sources. The maximum brightness contrast is determined only by the model, not by the viewpoint or light, so it can be used to measure the visual salience of a geometry element. The contour line is another important factor for vision: the probability of a geometry element appearing on contour lines is measured and also treated as a salience metric. Extensive experimental results show that our metric accords with human perception.

Keywords: Salient feature, Visual perception, 3D mesh model, contour line

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #31, pp. 307-313

Title of the Paper: A Novel Approach for Outlier Detection Using Rough Entropy

Authors: E. N. Sathishkumar, K. Thangavel

Abstract: Outlier detection is an important task in data mining and its applications. An outlier is defined as a data point that differs greatly from the rest of the data under some measure; such data often contain useful information about abnormal behavior of the system described by the patterns. In this paper, a novel method is proposed for detecting outliers in inconsistent datasets. The method exploits the framework of rough set theory: a rough set is defined by a pair of lower and upper approximations, and the difference between them is the boundary region. Objects in the boundary region are more likely to be outliers than objects in the lower approximation. Hence, rough entropy is established as a uniform measure for outlier detection, applied separately to class-wise consistent (lower-approximation) and inconsistent (boundary) objects. An example shows that the Novel Rough Entropy Outlier Detection (NREOD) algorithm is effective and suitable for identifying outliers, and experimental studies show that the NREOD-based technique outperforms existing techniques.

Keywords: Data Mining, Outlier, Rough Set, Classification, Pattern recognition

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #30, pp. 296-306
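The lower and upper approximations and the boundary region on which the method rests follow directly from the rough-set definitions; the helper below is a generic sketch, not the NREOD algorithm itself (whose rough-entropy computation is not given in the abstract).

```python
def approximations(equiv_classes, target):
    """Rough-set approximations of a target set under a partition of the
    universe into equivalence classes. Objects in the boundary region
    (upper minus lower) are the outlier candidates."""
    lower, upper = set(), set()
    for block in equiv_classes:
        if block <= target:
            lower |= block      # block lies entirely inside the target
        if block & target:
            upper |= block      # block intersects the target
    return lower, upper, upper - lower
```

For example, with the partition {1,2}, {3,4}, {5,6} and target {1,2,3}, the lower approximation is {1,2}, the upper is {1,2,3,4}, and {3,4} is the boundary of outlier candidates.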

Title of the Paper: A Hex-Mesh Generation Method Using Mean Value Coordinates

Authors: Chen Ruizhi, Xi Ping

Abstract: In order to find an applicable way of generating high-quality all-structured hexahedral meshes, a method for generating structured hexahedral meshes using mean value coordinates is proposed. First, a new edge-classification method and its execution condition are proposed, which greatly relax the limits on sub-mappable volumes by propagating a local coordinate system. Second, virtual edges are created according to the values of the mean value coordinates in the computational domain and added to the volume's corresponding directed graph to eliminate inner loop vertices in the graph, so that the virtual volume-decomposition process is avoided. Finally, examples are given to illustrate the applicability of the proposed method. The meshing results show that the proposed method can stably generate high-quality hex-meshes, and indicate its potential for hexahedral meshing of sub-mappable volumes.

Keywords: hexahedral mesh generation, submapping, mean value coordinates

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #29, pp. 287-295
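For reference, 2D mean value coordinates for a point inside a polygon can be computed as below. This is the general construction (Floater's tangent formula), not the paper's hex-meshing pipeline; the function and its parameters are illustrative.

```python
import math

def mean_value_coords(poly, p):
    """Mean value coordinates of point p with respect to polygon `poly`
    (a list of (x, y) vertices, p strictly inside):
    w_i = (tan(a_{i-1}/2) + tan(a_i/2)) / |v_i - p|, then normalised."""
    n = len(poly)
    r = [(vx - p[0], vy - p[1]) for vx, vy in poly]
    def angle(u, v):
        # angle at p between the rays towards consecutive vertices
        c = (u[0]*v[0] + u[1]*v[1]) / (math.hypot(*u) * math.hypot(*v))
        return math.acos(max(-1.0, min(1.0, c)))
    a = [angle(r[i], r[(i + 1) % n]) for i in range(n)]
    w = [(math.tan(a[i - 1] / 2) + math.tan(a[i] / 2)) / math.hypot(*r[i])
         for i in range(n)]
    s = sum(w)
    return [wi / s for wi in w]
```

The coordinates sum to 1 and reproduce p exactly as an affine combination of the polygon vertices (linear precision), which is the property that makes them useful for mapping points between domains.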

Title of the Paper: An Improved Neural Network Based Approach for Identification of Self & Non-Self Processes

Authors: Amit Kumar, Shishir Kumar

Abstract: Security of computer systems is a crucial issue. Nowadays various security approaches and tools are used to protect computer systems from viruses, worms and attacks. These security tools require regular signature-based updates to protect the system against the latest viruses and worms. If a system's security tool is not up to date, the system may become infected, and the operating system then spawns processes that are harmful to the computer; such processes are categorized as non-self processes. In this paper an Artificial Neural Network is designed to identify self and non-self operating-system processes. The backpropagation algorithm is used to train the network: the Artificial Neural Network is initially created with random input weights, these weights are updated by backpropagation over various training examples, and the trained network is then evaluated on test data. Comparison with various computer-security approaches shows that the Artificial Neural Network provides better security by identifying self and non-self processes.

Keywords: Security, Self Process, Non Self Process, Backpropagation, Weight, Activation Function

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #28, pp. 272-286

Title of the Paper: Development of a Personalized Learning System Using Gaze Tracking System

Authors: Jae-Chu Song, Sanchun Nam, Ki-Sang Song, Se Young Park

Abstract: Providing timely feedback is crucial for a successful personalized e-learning system. A personalized learning system combining e-learning contents with an eye-movement tracking device has been developed using an open-source gaze-tracking system. The effect of feedback delivered to learners according to their attention while gazing at the e-learning contents was tested on the basis of academic achievement. The experimental group, which received feedback based on eye movement, showed higher scores than the control group, which received no such feedback. Learners in the experimental group also showed longer eye-fixation times and more fixation points than the control group, so it can be said that the gaze-tracking feedback system enabled greater visual attention. These findings show that feedback using eye-movement information may contribute to better interaction between learner and contents, and thus to a more effective e-learning system.

Keywords: E-learning, Eye-tracking, Gaze information, Feedback strategy, Academic achievement

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #27, pp. 264-271

Title of the Paper: A Novel Weight Assignment Approach for Detection of Clones in Simulink Diagrams

Authors: S. Mythili, S. Sarala

Abstract: Clone detection is the process of detecting duplicate patterns that resemble an original. Clone detection is carried out for several purposes, such as code-clone identification, clone-software identification, clone-image detection, clone-object detection and clone-language identification. Textual techniques (dynamic pattern matching, latent semantic indexing, dot plots), lexical techniques (token-based and line-based approaches), syntactic techniques (tree-based and metric-based approaches) and semantic techniques (program dependency graphs and hybrid approaches) are used for clone detection. The proposed method detects clones in Simulink block diagrams; until now this has been done with graph-based techniques. The proposed model uses a weight-assignment method to identify clones faster and more accurately, and it can identify both exactly matched and similarly matched clones. The method is evaluated in various experimental setups and the results are compared with existing tools.

Keywords: Clone Detection, Reusability, Model Driven Architecture, Simulink, UML, Software Maintenance

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #26, pp. 253-263

Title of the Paper: Advantages Analysis of Synchronous Modeling Technology Based on Solid Edge

Authors: Lang Wang, Jing-Ying Zhang, Wei Yang, Qiong Fan

Abstract: Common three-dimensional (3D) modeling software mainly uses a sequence modeling method based on a feature-tree structure. This kind of parametric modeling is achieved using dimensional constraints; however, re-editing any historical feature causes regeneration of all subsequent features, resulting in substantial waste of computing resources and time. Compared with sequence modeling technology, synchronous modeling is a new technology whose advantages are studied here through the analysis of several examples, and with which nonparametric modeling can be achieved truly and efficiently. Synchronous modeling technology helps accelerate the pace of product innovation and is of great significance for intelligent 3D CAD modeling.

Keywords: synchronous modeling, sequence modeling, features, dimensional constraints

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #25, pp. 246-252

Title of the Paper: The Analysis of Local Motion and Deformation in Image Sequences Inspired by Physical Electromagnetic Interaction

Authors: Xiaodong Zhuang, N. E. Mastorakis

Abstract: In order to analyze the motion and deformation of objects in an image sequence, a novel way is presented to analyze the local changes of object edges between two related images (such as two adjacent frames in a video sequence), inspired by physical electromagnetic interaction. The changes of edges between adjacent frames are analyzed by simulating a virtual current interaction, which can reflect changes in an object's position or shape. A virtual current along the main edge line is proposed based on significant-edge extraction. The virtual interaction between the current elements in the two related images is then studied by imitating the interaction between physical current-carrying wires. The experimental results prove that the distribution of magnetic forces applied to the current elements of one image by the other reflects the local change of edge lines from one image to the other, which is important for further analysis.

Keywords: Image sequence processing, motion, deformation, virtual current, electromagnetic interaction

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #24, pp. 231-245

Title of the Paper: Dynamic Scheduling of Resource Based on Virtual Machine Migration

Authors: Kamakshi R, Sakthivel A

Abstract: An improved method is proposed for achieving high utilization of server resources and minimizing power usage in a cloud environment. The cloud allows users to scale their resource usage up and down according to need, but multiplexing and sharing of resources can decrease resource utilization and increase power consumption. Virtual machine (VM) migration is used to increase server resource utilization and minimize the number of physical machines (PMs) in use; it helps to eliminate hotspots, improve overall performance and minimize power consumption. The approach is useful in various autonomic environments for handling different types of workload efficiently and effectively.

Keywords: Cloud computing, resource management, scheduling techniques, Virtual machine migration

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #23, pp. 224-230

Title of the Paper: IDEA: Classification of Security Events, their Participants and Detection Probes

Authors: Pavel Kácha

Abstract: For the IDEA (Intrusion Detection Extensible Alert) format to be truly usable for exchanging security-event data, taxonomies for description and classification must be defined in addition to the container and formats. We therefore distil a common classification by analysing and mutually mapping a number of existing taxonomies (creating translations between them along the way); by identifying omissions, unsuitable semantics, and unusual or overly specific cases, and by adding information conveyed in various types of real-life security events, we also populate auxiliary dictionaries: a classification of the sources and destinations of attacks, and description tags for detection probes. An IDEA security-event description can thus serve as a form that is simple to create and easy to understand, onto which most existing automatically gathered security information can be mapped.

Keywords: alert, security event, incident response, taxonomy, classification, ids, honeypot, json

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #22, pp. 213-223

Title of the Paper: Compliance Management Model for Interoperability Faults towards Enhanced COBIT Governance of Enterprise Software

Authors: Kanchana Natarajan

Abstract: A quantitative and analytical approach is essential to explore the impact of risks on an enterprise due to interoperability in a multi-software activation process. The main objective of this research is to propose a software compliance management model for interoperability faults caused by regulatory non-compliance in IT industries, towards better quality governance. Entities that do not adhere to the standards and fail to follow the enumerated regulations are analyzed for non-compliance. Non-compliances in procedure-oriented processes and in coding are mapped to the associated risks, with their severity and impact on the chosen applications. Interoperability faults due to non-compliance are identified and detected much earlier, giving better governance and minimal risk. Interoperability faults within the COBIT (Control Objectives for Information and Related Technology) framework are considered in order to detect and categorize injected faults towards enhanced system governance, while accounting for non-compliances during compilation of an application. Conformance to the requirement specifications pertaining to process, people, product and quality is verified as a distributed system to manage the non-compliances. The existing information governance can be improved by the proposed Governance Enhancement Technology (GET), demonstrated through a case study on a healthcare management system with deployed web services. This work shows the integration of IT compliance with risk management through the risk of non-compliance.

Keywords: Business Risks, Non-Compliances, COBIT Compliance, Interoperability Fault, Verification Standards, Goal-risk model

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #21, pp. 199-212

Title of the Paper: Multi-Integer Somewhat Homomorphic Encryption Scheme with China Remainder Theorem

Authors: Chao Feng, Yang Xin, Yixian Yang, Hongliang Zhu

Abstract: As an effective way to protect the privacy of data, homomorphic encryption has become a hot research topic. Existing homomorphic schemes are not truly practical due to their high computational complexity and huge key size. In 2013, Coron et al. proposed a batch homomorphic encryption scheme, i.e. a scheme that supports encrypting and homomorphically evaluating several plaintext bits as a single ciphertext. Based on the Chinese remainder theorem, we propose a multi-integer somewhat homomorphic encryption scheme. It can be regarded as a generalization of Coron's scheme with a larger message space. Furthermore, we put forward a new hardness problem, called the random approximate greatest common divisor (RAGCD) problem, and prove that it is a stronger version of the approximate greatest common divisor (AGCD) problem. Our variant remains semantically secure under the RAGCD assumption. As a consequence, we obtain a shorter public key without sacrificing the security of the scheme. The estimates are backed up with experimental data. The proposed scheme is expected to make encrypted data processing practical for suitable applications.

Keywords: Information Security, Cryptography, Somewhat Homomorphic Encryption, Multi-Integer, Chinese Remainder Theorem, Random Approximate Greatest Common Divisor (RAGCD)

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #20, pp. 186-198
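The batching idea, packing several plaintext residues into one integer via the Chinese remainder theorem so that a single ciphertext carries several plaintext slots, can be illustrated in isolation. This is only the CRT packing, not the encryption scheme itself.

```python
from math import prod

def crt_pack(residues, moduli):
    """Return the unique x mod prod(moduli) with x = r_i (mod m_i) for
    pairwise-coprime moduli: one integer carrying several slots."""
    M = prod(moduli)
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)    # pow(Mi, -1, m): inverse of Mi mod m
    return x % M

def crt_unpack(x, moduli):
    """Recover each slot by reducing modulo its modulus."""
    return [x % m for m in moduli]
```

Addition and multiplication of packed integers act slot-wise modulo each m_i, which is precisely what makes homomorphic evaluation over several plaintexts in one ciphertext possible.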

Title of the Paper: Recognition of Handwritten Amazigh Characters Based on Zoning Methods and MLP

Authors: Nabil Aharrane, Karim El Moutaouaki, Khalid Satori

Abstract: The main purpose of this work is to develop an optical character recognition (OCR) system for handwritten Amazigh characters employing a feature set of 79 elements based on statistical methods. The feature set consists of 37 density features and 42 shadow features, based on a specific zoning used to represent the Amazigh characters; in the recognition phase, a multilayer perceptron (MLP) is used as the classifier. The accuracy observed experimentally on a database of 24,180 characters is 96.47%. The experimental evaluation performed on this large set of handwritten characters not only verifies that the proposed approach provides a very satisfactory recognition rate, but also shows a reasonable time during the test phase.

Keywords: OCR, MLP, Handwritten Amazigh characters, Statistical methods, Feature extraction, Neural network

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #19, pp. 178-185
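Zoning-based density features, the first half of the paper's feature set, can be sketched generically: the character image is split into a grid, and each feature is the fraction of foreground pixels in one zone. The grid size below is illustrative; the paper's specific 37/42 feature split is not reproduced.

```python
import numpy as np

def zoning_density(img, rows=3, cols=3):
    """Split a binary character image into rows x cols zones and use the
    fraction of foreground pixels in each zone as a feature."""
    h, w = img.shape
    feats = []
    for i in range(rows):
        for j in range(cols):
            zone = img[i*h//rows:(i+1)*h//rows, j*w//cols:(j+1)*w//cols]
            feats.append(zone.mean())   # density of "on" pixels
    return np.array(feats)
```

The resulting fixed-length vector is what a classifier such as an MLP consumes, regardless of the original image size.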

Title of the Paper: A Hybrid ACO Based Feature Selection Method for Email Spam Classification

Authors: Karthika Renuka D, Visalakshi P

Abstract: In recent days, the Internet plays a major role. The Internet mail system is a store-and-forward mechanism used for exchanging documents across computer networks. Spam is unwanted mail containing unsolicited and harmful content irrelevant to the intended users. In the proposed system, spam classification is implemented using the Support Vector Machine (SVM), a powerful nonlinear classifier applicable to complex classification problems. The proposed system is evaluated by calculating its accuracy. Feature selection using Ant Colony Optimization (ACO) proves more efficient, giving good results for the system proposed in this paper.

Keywords: Ant Colony Optimization, Feature selection, Spam classification, E-mail, Spam, SVM, Spam dataset

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #18, pp. 171-177

Title of the Paper: Cache System for Frequently Updated Data in the Cloud

Authors: Fusen Dong, Kun Ma, Bo Yang

Abstract: Maintaining data indexes and a query cache becomes the bottleneck of a database, especially for frequently updated data. To reduce the burden on the database, a cache system for frequently updated data is proposed in this paper. In the system, update statements are parsed first; the updated data are then saved as key-value pairs in the cache and synchronized to the database at idle time. Experimental results show that the proposed cache system can not only accelerate the data-update rate, but also greatly improve write performance while maintaining indexes and the consistency of cached data.

Keywords: Frequently updated data, Query cache, Data index, Cache system, Update merging method, Cloud database

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #17, pp. 163-170
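The parse-then-merge idea can be sketched as follows. The statement grammar, class name and flush interface are simplifications for illustration, not the authors' system; a real parser would handle arbitrary WHERE clauses and quoting.

```python
import re

class UpdateCache:
    """Write-behind cache for UPDATE statements: parsed assignments are
    merged into key-value pairs per (table, row id), so repeated updates
    to the same row cost only one database write at flush time."""
    def __init__(self):
        self.cache = {}

    def apply(self, sql):
        # toy grammar: UPDATE <table> SET a=1, b=2 WHERE id=<n>
        m = re.match(r"UPDATE\s+(\w+)\s+SET\s+(.+?)\s+WHERE\s+id\s*=\s*(\d+)",
                     sql, re.IGNORECASE)
        table, assigns, row_id = m.group(1), m.group(2), int(m.group(3))
        fields = dict((k.strip(), v.strip())
                      for k, v in (a.split("=", 1) for a in assigns.split(",")))
        self.cache.setdefault((table, row_id), {}).update(fields)

    def flush(self, db):
        """Synchronise the merged updates into the backing store at idle
        time, then clear the cache."""
        for (table, row_id), fields in self.cache.items():
            db.setdefault(table, {}).setdefault(row_id, {}).update(fields)
        self.cache.clear()
```

Because later assignments overwrite earlier ones in memory, N updates to one row reach the database (and its index maintenance) only once.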

Title of the Paper: COCHCOMO: An Extension of COCOMO II for Estimating Effort for Requirement Changes During the Software Development Phase

Authors: Sufyan Basri, Nazri Kama, Roslina Ibrahim

Abstract: Software undergoes changes at all stages of the software development process. Accepting too many changes causes expense and delay, while rejecting changes may cause customer dissatisfaction. One of the inputs that helps software project management decide whether to accept or reject changes is a reliable estimate of the change effort. From a software development perspective, the estimation has to take into account the inconsistent states of software artifacts across the project lifecycle, i.e., fully developed or partially developed. These inconsistent states require different estimation treatments: fully developed artifacts may need a different calculation than partially developed artifacts. Many change-effort estimation models have been developed, and one of them uses impact analysis. The main challenge of this technique from a development perspective is that it is designed for the software maintenance phase, in which all software artifacts have been completely developed. This research introduces a new change-effort estimation model that can apply different estimation techniques to different states of software artifacts. The outcome is a new change-effort estimation model for the software development phase using extended versions of the static and dynamic analysis techniques from our previous work. The significant achievements of the approach are demonstrated through an extensive experimental validation using several case scenarios.

Keywords: Software development, change impact analysis, change effort estimation, impact analysis, effort estimation

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #16, pp. 152-162

Title of the Paper: Comparative Analysis of Turbo and LDPC Codes for Reduced Storage and Retrieval of Data

Authors: N. Gopika Rani, G. Sudha Sadasivam, R. M. Suresh

Abstract: Turbo codes are the channel coding scheme used in wireless cellular networks, as they can reach close to the Shannon limit. This paper proposes the use of turbo codes and LDPC codes for data storage. Turbo encoding is performed using parallel Recursive Systematic Convolutional (RSC) encoders and an interleaver, while turbo decoding is based on the Bahl-Cocke-Jelinek-Raviv (BCJR) algorithm, a maximum a posteriori (MAP) algorithm. The Low-Density Parity-Check (LDPC) encoding technique is based on the generator matrix of the code word to be identified, and LDPC decoding follows a hard-decision decoding algorithm. Finally, a comparative analysis of turbo and LDPC codes is presented. Theoretical and experimental results show that turbo codes perform better than LDPC codes.

Keywords: BCJR algorithm, Check nodes, encoding algorithm, Hard-decision decoding algorithm, LDPC codes, Turbo codes, Variable nodes

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #15, pp. 142-151
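The hard-decision decoding the abstract mentions can be illustrated with the classic bit-flipping rule. The parity-check matrix below is a (7,4) Hamming code standing in for a real (large, sparse) LDPC matrix, and this toy decoder is only guaranteed on errors the majority vote resolves; the paper's actual matrices are assumptions away from this sketch.

```python
import numpy as np

# toy parity-check matrix (Hamming (7,4)); real LDPC matrices are large
# and sparse, but the flipping rule is the same
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def bit_flip_decode(received, H, max_iter=10):
    """Hard-decision bit flipping: while some parity checks fail, flip
    the bit that participates in the most unsatisfied checks."""
    r = received.copy()
    for _ in range(max_iter):
        syndrome = H @ r % 2
        if not syndrome.any():
            break                     # all parity checks satisfied
        votes = syndrome @ H          # per-bit count of failed checks
        r[np.argmax(votes)] ^= 1
    return r
```

Each iteration is just a matrix-vector product over GF(2) plus one flip, which is why hard-decision decoding is so cheap compared with soft (BCJR-style) decoding.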

Title of the Paper: New Modeling Methods of Spiral Bevel and Hypoid Gear Based on the CAD Surface Design Features

Authors: Han Ding, Xieeryazidan Adayi

Abstract: Based on a comprehensive analysis of previous research on spiral bevel and hypoid gears, a principle for forming the spherical-involute tooth surface is proposed, applying the spherical involute as a new theory in this modeling field. According to the precise shape of the tooth surfaces, quick and accurate derivations of the basic parametric equations of each part of the tooth-profile curve are made, focusing in particular on the generating line and the tooth trace. In addition, based on CAD surface-design features, namely constructing surfaces by rotation, by sweeping, and from point clouds, corresponding new modeling approaches are proposed: modeling with a rotating generating line, sweeping from the tooth profile, and using discrete point clouds on the tooth surface. Each approach is given a certain optimization in order to obtain higher accuracy, faster efficiency and better flexibility, providing a new theoretical foundation and means for fast and accurate modeling of spiral bevel and hypoid gears.

Keywords: The spiral bevel and hypoid gear, the spherical involute, tooth profile curve, CAD features, Modeling

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #14, pp. 134-141

Title of the Paper: Secure Key Sharing in Mobile Ad Hoc Network Using Content Invisibility Scheme

Authors: A. Jegatheesan, D. Manimegalai

Abstract: In mobile ad hoc networks (MANETs), privacy protection and efficient use of memory are challenging tasks due to mobile and dynamic behavior. Existing schemes provide anonymity, unlinkability and unobservability, but they consume more memory and incur computation overhead and communication delay. In our proposed scheme, keys are established only between neighbors, which reduces memory usage, computational overhead and communication delay. ID-based encryption is employed, allowing us to overcome both internal and external attacks while achieving privacy in terms of anonymity, unlinkability and unobservability.

Keywords: Security, Hash function, Key Exchange, Authentication, MANET, ID-Based Encryption

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #13, pp. 124-133

Title of the Paper: The Virtual Electromagnetic Interaction Between Digital Images for Image Matching with Shifting Transformation

Authors: Xiaodong Zhuang, N. E. Mastorakis

Abstract: A novel way of matching two images under shifting transformation is studied. The approach is based on the representation of virtual edge currents in images and on the virtual electromagnetic interaction between two related images, inspired by electromagnetism. The edge current is proposed as a discrete simulation of physical current, based on the significant edge lines extracted by Canny-like edge detection. The virtual interaction of the edge currents between related images is then studied by imitating the electromagnetic interaction between current-carrying wires. Based on the virtual interaction force between two related images, a novel method is presented and applied to image matching under shifting transformation. Preliminary experimental results indicate the effectiveness of the proposed method.
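The shift-search idea behind this abstract can be illustrated with a toy sketch: each edge pixel is treated as a current element, every pair of elements attracts with a strength that decays with distance (echoing the force between parallel current-carrying wires), and the recovered shift is the one maximising total attraction. The 1/(d + 1) decay law and the `attraction`/`best_shift` names are illustrative assumptions, not the paper's formulation.

```python
import math

def attraction(edges_a, edges_b, shift):
    """Total 'virtual current' attraction between two edge-pixel sets
    when edges_b is shifted by (dx, dy). Each pair of edge elements
    attracts with strength 1/(d + 1); the +1 avoids the singularity
    at d = 0. This decay law is an illustrative assumption."""
    dx, dy = shift
    total = 0.0
    for xa, ya in edges_a:
        for xb, yb in edges_b:
            d = math.hypot(xa - (xb + dx), ya - (yb + dy))
            total += 1.0 / (d + 1.0)
    return total

def best_shift(edges_a, edges_b, max_shift=3):
    """Exhaustively search integer shifts and keep the most attractive one."""
    shifts = [(dx, dy)
              for dx in range(-max_shift, max_shift + 1)
              for dy in range(-max_shift, max_shift + 1)]
    return max(shifts, key=lambda s: attraction(edges_a, edges_b, s))
```

At perfect overlap every matched pair contributes its maximum strength, so the true displacement strictly dominates all other candidate shifts.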

Keywords: Virtual edge current, electro-magnetic interaction, image matching, shift transformation

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #12, pp. 107-123

Title of the Paper: Fuzzy Based Faulty Link Isolation Technique in Dynamic Wireless Sensor Networks

Authors: Mercilin Raajini X., Raja Kumar R., Indumathi P.

Abstract: Transmission link failures are common in both static and dynamic wireless sensor networks due to factors such as sensor node failure, node mobility, network congestion and interference. Faulty links need to be localized and repaired to sustain the health of the network, and the dynamic topology of the network must be maintained in order to monitor link quality. Hence, in this paper we adopt the Kinetic PR quad tree, modified to maintain the network topology, and propose a Fuzzy-Link Quality Assessment (F-LQA) algorithm that estimates link quality from parameters such as asymmetry, node energy level and delay. The proposed algorithm isolates the link responsible for the faulty topology. The F-LQA algorithm computes a Malfunction Index (MI) value whose reliability is increased by the variety of parameters used to ascertain link quality, so the conclusion obtained regarding link reliability is accurate. The proposed algorithm is found to be more energy efficient than existing approaches, thus extending the network lifetime.
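As a rough illustration of how several link parameters could be folded into a single Malfunction Index, the sketch below combines normalised asymmetry, residual energy and delay with a crisp weighted sum. The weights, the function name and this weighted-sum treatment are assumptions; the paper's actual fuzzy rule base is not given in the abstract.

```python
def malfunction_index(asymmetry, residual_energy, delay,
                      weights=(0.4, 0.3, 0.3)):
    """Crisp sketch of an F-LQA-style link score. All inputs are assumed
    normalised to [0, 1]; higher return values mean a more suspect link.
    The weights and the weighted-sum combination are illustrative."""
    faulty_asym = asymmetry                 # more asymmetry -> more faulty
    faulty_energy = 1.0 - residual_energy   # low residual energy -> more faulty
    faulty_delay = delay                    # longer delay -> more faulty
    w_a, w_e, w_d = weights
    return w_a * faulty_asym + w_e * faulty_energy + w_d * faulty_delay
```

A link whose index crosses a chosen threshold would then be flagged for repair or routing around.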

Keywords: Sensor Networks, Localization, Faulty links, Link Quality Assessment, Fuzzy Logic, Malfunction Index

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #11, pp. 97-106

Title of the Paper: An Improved PSO Clustering Algorithm with Entropy-Based Fuzzy Clustering

Authors: Yuyan Zheng, Yang Zhou, Jianhua Qu

Abstract: Particle swarm optimization (PSO) is a population-based heuristic global optimization technique and is referred to as a swarm-intelligence method. In general, each particle is initialized randomly, which increases the number of iterations and makes the result unstable. In this paper an improved clustering algorithm combined with entropy-based fuzzy clustering (EFC) is presented. First, the EFC algorithm obtains initial cluster centers. These centers are then used as the input of one of the particles instead of being initialized randomly. Finally, clustering is performed with the improved algorithm, which guarantees a unique clustering. The experimental results show that the improved clustering algorithm offers not only high accuracy but also a degree of stability.
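The EFC seeding step described above can be sketched as follows: similarity between points decays exponentially with distance, the point with the lowest total entropy becomes a centre, and points sufficiently similar to it are removed before the next centre is picked. The `alpha`/`beta` constants and this simplified loop are assumptions; in the paper's scheme the resulting centres would then seed one PSO particle.

```python
import math

def entropy_fuzzy_centers(points, alpha=0.5, beta=0.7):
    """Pick initial cluster centres by entropy-based fuzzy clustering
    (a simplified sketch; alpha and beta are assumed tuning constants,
    not values from the paper)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    remaining = list(range(len(points)))
    centers = []
    while remaining:
        # the point with the lowest total fuzzy entropy is the next centre
        best_i, best_e = None, float("inf")
        for i in remaining:
            e = 0.0
            for j in remaining:
                if i == j:
                    continue
                s = math.exp(-alpha * dist(points[i], points[j]))
                if 0.0 < s < 1.0:
                    e -= s * math.log2(s) + (1 - s) * math.log2(1 - s)
            if e < best_e:
                best_i, best_e = i, e
        centers.append(points[best_i])
        # points similar to the new centre join its cluster and drop out
        remaining = [j for j in remaining
                     if math.exp(-alpha * dist(points[best_i], points[j])) <= beta]
    return centers
```

Seeding one particle with these centres removes the random-initialization instability the abstract describes while leaving the rest of the swarm free to explore.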

Keywords: Particle Swarm Optimization (PSO), Entropy-based Fuzzy Clustering, Cluster Center Initialization

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #10, pp. 88-96

Title of the Paper: Information Security Using Surrogate Object Based Encryption in Mobile Cloud Systems

Authors: S. Ravimaran, A. N. Gnana Jeevan, M. A. Maluk Mohamed

Abstract: Advances in wireless technology and the rapid growth in the use of mobile devices in the cloud paradigm have the potential to revolutionize computing through the illusion of a virtually infinite computing infrastructure, namely the mobile cloud. However, the amount of unprotected data on the mobile cloud platform will grow in the forthcoming years, leading to various security issues such as potential attacks, authentication, faster data access, and challenges arising from data residency, access mechanisms, mobility, bandwidth restrictions, the number of wireless and wired access points, and the number of communication messages. It is therefore the right time to enhance existing mechanisms and make the overall security of this platform more robust. This paper proposes a new standardization for information security using a surrogate object that can store and gather private information and ensure that individuals' private information is kept and accessed in a secure manner. The Surrogate Object Based Encryption (SOE) model thus enhances privacy when business information is uploaded to data centres. The performance of the proposed model has been evaluated by simulation and compared with existing models on the mobile cloud platform, showing that the proposed SOE scheme provides better protection for information in the mobile cloud.

Keywords: Surrogate object, Information Security, Mobile Cloud Computing, Authentication, RSA Algorithm

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #9, pp. 79-87

Title of the Paper: A Multi-View Approach for Formalizing UML State Machine Diagrams Using Z Notation

Authors: Khadija El Miloudi, Aziz Ettouhami

Abstract: Due to the missing formal foundation of UML, the semantics of a number of UML constructs is not precisely defined. Building on our previous work on formalizing class and sequence diagrams, a method for transforming a subset of UML state machine diagrams into Z specifications is proposed for the purpose of formally checking consistency in multi-view modeling. The consistency of the resulting specification is guaranteed by a set of well-formedness and consistency rules. It is worth noting that our multi-view approach is the first work on state machine diagram formalization based on Z notation. The approach is illustrated with an example taken from the literature.

Keywords: UML State Machine Diagram, Z, Formal Methods, Consistency Checking, Multi View Modeling

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #8, pp. 72-78

Title of the Paper: A Hybrid Approach for Discovery of OWL-S Services Based on Functional and Non-Functional Properties

Authors: M. Deepa Lakshmi, Julia Punitha Malar Dhas

Abstract: Web services are independent software systems designed to offer machine-to-machine interaction over the WWW through well-described operations. Typically, service providers expose their services to the public with brief descriptions of the services' operations; the challenge is to discover the right service in response to a specific request based on these rather sparse descriptions. In this work, we present a hybrid semantic web service discovery framework that performs discovery based on both the functional (Input, Output, Precondition and Effect, IOPE) and non-functional (textual description) properties of OWL-S (Semantic Markup for Web Services) services. For functional parameter matching, we use a bipartite-graph-based approach to match the parameters of services. For non-functional matching, we apply natural language processing techniques to the textual description of a web service. The cumulative similarity measures determine the overall similarity of an advertised service to the service request. We evaluated the performance of our service matchmaking framework on the OWLS-TC4 (Test Collection version 4) dataset and compared it with existing discovery models. Our results indicate that the proposed framework offers an improved discovery mechanism with a significant increase in recall.
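The functional-parameter matching step pairs request parameters with service parameters one-to-one so that total similarity is maximal. For the handful of IOPE parameters a service typically exposes, this optimal bipartite assignment can even be found by brute force, as the hedged sketch below shows. The precomputed `sim` similarity matrix is an assumption, and the paper's framework uses a proper bipartite-graph algorithm rather than enumeration.

```python
from itertools import permutations

def best_bipartite_match(sim):
    """Optimal one-to-one assignment of request parameters (rows) to
    advertised-service parameters (columns), maximising total similarity.
    sim[i][j] is an assumed precomputed semantic similarity in [0, 1].
    Brute force over permutations; requires len(sim) <= len(sim[0])."""
    n_req, n_srv = len(sim), len(sim[0])
    best, best_score = None, float("-inf")
    for perm in permutations(range(n_srv), n_req):
        score = sum(sim[i][j] for i, j in enumerate(perm))
        if score > best_score:
            best, best_score = list(enumerate(perm)), score
    return best, best_score
```

For larger parameter sets a polynomial-time matching algorithm (e.g. the Hungarian method) would replace the enumeration without changing the result.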

Keywords: Bipartite matching, Discovery, Functional parameters, OWL-S, Text-description

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #7, pp. 62-71

Title of the Paper: Performance Evaluation of Interconnection Schemes for Shared Cache Memory Multi-core Architectures

Authors: Medhat Hussein Ahmed Awadalla

Abstract: Current system-on-chip (SoC) designs integrate an increasingly large number of cores, and their number is predicted to grow significantly in the near future. This paper focuses on the interconnection design issues of area, power and performance in chip multiprocessors with shared cache memory. It shows that shared cache memory contributes to performance improvement; however, the typical crossbar interconnection between the cores and the shared cache occupies most of the chip area, consumes considerable power and does not scale efficiently with an increasing number of cores. New interconnection mechanisms are needed to address these issues. This paper proposes an architectural paradigm that attempts to retain the advantages of a shared cache while avoiding the penalty imposed by the crossbar interconnect. The proposed architecture occupies a smaller area, leaving more space for additional cache memory, and reduces power consumption compared to the existing crossbar architecture. Furthermore, the paper presents a modified cache coherence algorithm called Tuned-MESI, based on the typical MESI cache coherence algorithm but tuned and tailored for the suggested architecture. Results of the conducted simulation experiments show that the developed architecture produces fewer broadcast operations than the typical algorithm.
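For readers unfamiliar with the baseline protocol that Tuned-MESI starts from, the standard MESI state transitions can be written as a small lookup table. The event names below are illustrative, and the paper's tuned transitions are not reproduced here.

```python
# Baseline MESI coherence transitions: Modified, Exclusive, Shared, Invalid.
# Local processor events plus snooped bus events; event names are illustrative.
MESI = {
    ("I", "read_miss_shared"): "S",     # another cache holds the line
    ("I", "read_miss_exclusive"): "E",  # no other cache holds the line
    ("I", "write_miss"): "M",
    ("S", "read_hit"): "S",
    ("S", "write_hit"): "M",            # must broadcast an invalidate
    ("E", "read_hit"): "E",
    ("E", "write_hit"): "M",            # silent upgrade, no bus traffic
    ("M", "read_hit"): "M",
    ("M", "write_hit"): "M",
    # snooped bus events from other cores
    ("M", "bus_read"): "S",             # supply data, downgrade to Shared
    ("E", "bus_read"): "S",
    ("S", "bus_read"): "S",
    ("M", "bus_write"): "I",            # another core wants exclusive access
    ("E", "bus_write"): "I",
    ("S", "bus_write"): "I",
}

def next_state(state, event):
    """Look up the next coherence state for a cache line."""
    return MESI[(state, event)]
```

A "tuned" variant would adjust entries like these so that fewer transitions require broadcasts on the proposed interconnect.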

Keywords: Chip Multi Processors, Shared cache memory, Interconnection mechanisms

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #6, pp. 49-61

Title of the Paper: An Efficient Data Access Policy in Shared Last Level Cache

Authors: Nitin Chaturvedi, S Gurunarayanan

Abstract: Future multi-core systems will execute massive memory-intensive applications with significant data sharing. On-chip memory latency increases as more cores are added, since the diameter of most on-chip networks grows with the number of cores; this makes it difficult to implement caches with a single uniform access latency, leading to non-uniform cache architectures (NUCA). Data movement and management further impact memory access latency and consume power. We observed that previous D-NUCA designs used a costly data access scheme to search for data in the NUCA cache in order to obtain significant performance benefits. In this paper, we propose an efficient and implementable data access algorithm for D-NUCA designs using a set of pointers attached to each bank. Our scheme relies on low-overhead, highly accurate in-hardware pointers to reduce miss latency and on-chip network contention. Using simulations of an 8-core multi-core processor, we show that our proposed data search mechanism reduces the dynamic energy consumed per memory request by 40% and outperforms the multicast access policy with an average performance speedup of 6%.

Keywords: Non-Uniform Cache Architecture (NUCA), Last Level Cache (LLC), Multi-core Processors (CMP)

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #5, pp. 38-48

Title of the Paper: The Analysis, Design and Implementation of Optimized Web Structures

Authors: Zdenka Prokopova, Radek Silhavy, Petr Silhavy

Abstract: The goal of this work was to develop an optimized web structure combined with the preparation of a web-based e-commerce platform for a small manufacturing company. The aim of the optimization was to increase the number of website visits, reduce the bounce rate, and increase the percentage of visits arriving from search engines. Evaluation and testing of the website after optimization showed that the number of visits increased by 55%, the bounce rate went down by nearly 40%, and the percentage of visits from search engines went up by 15%. An analysis and comparison of the options for utilizing an e-commerce platform was also undertaken.

Keywords: Analysis, Evaluation, Implementation, Optimization, Search engines, Website

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #4, pp. 29-37

Title of the Paper: A Hierarchical Load Balanced Fault Tolerant Grid Scheduling Algorithm with User Satisfaction

Authors: Keerthika P., Suresh P.

Abstract: Advances in human civilization lead to increasingly complex problems in science and engineering. By handling heterogeneous, geographically distributed resources, grid computing acts as a technology for solving these complicated problems. In grid computing, scheduling is an important area that needs particular focus. This research proposes a hierarchical scheduling algorithm that considers load balancing, fault tolerance and user satisfaction. The proactive fault-tolerance approach used here achieves a better hit rate. The proposed hierarchical scheduling methodology reduces communication overhead, and user-deadline-based scheduling yields better user satisfaction than recently proposed algorithms based on these factors. The GridSim toolkit is used to compare the efficiency of the hierarchical algorithm with existing algorithms. Overall system performance is measured using makespan, which proves to be better for the proposed hierarchical approach.

Keywords: Communication overhead, Resource utilization, Load balancing, Fault tolerance, Hierarchical scheduling, User satisfaction

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #3, pp. 15-28

Title of the Paper: Construction of 3-D Model and its Visualization of Road Based on Terrain

Authors: Ziru Wang, Fan Zhang, Bing Qiu, Yaolong Liang

Abstract: This paper combines terrain with the features of road engineering to set up a constrained Delaunay triangulation partitioned by terrain characteristic lines, thereby forming a range-constrained Delaunay triangulation. From the data of the road centre line, the road slope toe line and its intersection points can be calculated, and the terrain characteristic lines can be used to construct a model of the road and the whole digital terrain. On the Windows platform, VC++ is combined with OpenGL to build a simple, easy-to-operate 3-D visualization system for terrain and roads. The system consists of four subsystems: data input, terrain, road and query. The method has been applied to the design of a mountain highway from Chengdu to Shenzhen. It realizes visualization, providing the designer with a vivid analysis method that improves efficiency and quality, and it has definite engineering application value. The case study shows that the proposed model can be viewed directly in a 3-D scene and can be used to optimize the road design plan.

Keywords: terrain, constraint Delaunay triangle net, terrain character line, road, visual simulation

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #2, pp. 9-14

Title of the Paper: S-PAC: A Novel Hard Index Based Cluster Validation Technique for Ad Hoc Network

Authors: S. Thirumurugan , E. George Dharma Prakash Raj, V. Sinthu Janita Prakash, M. Nandhini

Abstract: Gone are the days when transmission media had to be physically established to transmit data from one location to another. Today, transmission over the air has become an indispensable step in conserving resources and making investments cost effective, and the ad hoc network has emerged as a solution for setting up the communication process. Supporting mechanisms such as clustering are well known to increase the efficiency of network functionality, but the clusters need to be fine-tuned while maintaining an efficient network. This work therefore proposes S-PAC, a validation technique for measuring cluster stability. The stability of clusters is measured in terms of the strength of their inter- and intra-cluster communication, and this strength factor decides the proper time for a re-clustering event to occur in the ad hoc network. The approach is demonstrated experimentally using the OMNET++ simulator.
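Since the keywords point to the silhouette index, a generic sketch of the classic silhouette coefficient, the kind of hard cluster-validity measure an S-PAC-style scheme builds on, may be helpful. This is the textbook formulation, not the paper's exact S-PAC computation.

```python
import math

def silhouette(points, labels):
    """Mean silhouette coefficient over all points: for each point,
    a = mean distance to its own cluster, b = mean distance to the
    nearest other cluster, score = (b - a) / max(a, b)."""
    def d(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    scores = []
    for i, p in enumerate(points):
        own = [d(p, q) for j, q in enumerate(points)
               if labels[j] == labels[i] and j != i]
        a = sum(own) / len(own) if own else 0.0
        other = {}
        for j, q in enumerate(points):
            if labels[j] != labels[i]:
                other.setdefault(labels[j], []).append(d(p, q))
        if not other:
            continue  # only one cluster: the index is undefined
        b = min(sum(v) / len(v) for v in other.values())
        scores.append((b - a) / max(a, b) if max(a, b) > 0 else 0.0)
    return sum(scores) / len(scores)
```

Values near 1 indicate tight, well-separated clusters; values near 0 or below would signal that a re-clustering event is due.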

Keywords: S-PAC, W-PAC, Silhouette

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 14, 2015, Art. #1, pp. 1-8

