WSEAS Transactions on Information Science and Applications


Print ISSN: 1790-0832
E-ISSN: 2224-3402

Volume 10, 2013


Issue 1, Volume 10, January 2013


Title of the Paper: A Review of Lean-Kanban Approaches in the Software Development

Authors: Erika Corona, Filippo Eros Pani

Abstract: We present a review of the state of the art in the adoption of a specific Agile Methodology (AM), Lean-Kanban, in different software development contexts. This approach requires breaking down the software development process into smaller steps, which are implemented with the aid of a Kanban board. We studied 14 different Kanban boards, examined how features are represented on the boards, and compared 22 software tools for implementing virtual Kanban boards, also analyzing resources available on the web. We analyzed the main features of the Kanban boards actually used, the main activities defining the software development process, the content of the cards representing work units, and the automation tools available for Kanban board management. Our survey shows that neither standard definitions of Kanban practices exist for software development, nor have specific practices for Kanban board management been rigorously defined; thus, the standardization and improvement of Lean development remains an unaccomplished task for software development.

Keywords: Kanban, Lean, software development, agile methodologies
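
A minimal sketch (ours, not from the paper) of the board mechanics the survey examines: columns model the steps of the development process, cards are work units, and work-in-progress (WIP) limits are enforced whenever a card enters a column. Column names and limits are illustrative assumptions.

```python
# Minimal Kanban board: columns are development steps, each with a WIP
# limit; cards are work units moving left to right across the board.

class KanbanBoard:
    def __init__(self, columns):
        # columns: list of (name, wip_limit) in process order
        self.order = [name for name, _ in columns]
        self.limits = dict(columns)
        self.cards = {name: [] for name in self.order}

    def add(self, card, column=None):
        column = column or self.order[0]
        if len(self.cards[column]) >= self.limits[column]:
            raise RuntimeError(f"WIP limit reached in '{column}'")
        self.cards[column].append(card)

    def move(self, card, dst):
        # Enforce the WIP limit of the destination column.
        if len(self.cards[dst]) >= self.limits[dst]:
            raise RuntimeError(f"WIP limit reached in '{dst}'")
        for col in self.order:
            if card in self.cards[col]:
                self.cards[col].remove(card)
                self.cards[dst].append(card)
                return
        raise KeyError(card)

board = KanbanBoard([("To Do", 5), ("Develop", 3), ("Test", 2), ("Done", 99)])
board.add("feature-42")
board.move("feature-42", "Develop")
```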


Title of the Paper: E-Learning Developing Using Ontological Engineering

Authors: Sarma Cakula, Abdel-Badeeh M. Salem

Abstract: A high level of education in society, which includes e-learning, is one of the most important prerequisites in any country's plan for long-term development. The time is coming when global tasks can be solved only through communication and learning at a worldwide level. Ontological engineering has become an efficient methodology for knowledge representation and management in many domains and tasks. Ontology design, approaches, and methodologies are very important issues when building ontologies for a specific task. This paper presents the application of the ontological engineering methodology to the e-Learning domain through the development of two web-based ontologies in the area of artificial intelligence technology: an “Artificial Intelligence in Education” ontology and an “Expert Systems” ontology. The developed ontologies were encoded in OWL-DL format using the Protégé-OWL editing environment. The ontological engineering methodology is widely used in many domains of computer and information science, including cooperative information systems, intelligent information integration, information retrieval and extraction, knowledge representation, and database management systems. Several attempts to introduce a universal ontology for e-learning materials have had only modest success.

Keywords: artificial intelligence in education, e-learning, knowledge management, ontological engineering


Title of the Paper: A Risk Perception Analysis on the Use of Electronic Payment Systems by Young Adult

Authors: Noor Raihan Ab Hamid, Aw Yoke Cheng

Abstract: The use of technology in commercial activities has, over many years, led to the emergence of various new supporting services in the marketplace. Among these services is e-payment, which enables payments to be made electronically without any physical cash. E-payment systems have received different levels of acceptance throughout the world: some methods are widely adopted while adoption of others is relatively low. The perceived risk associated with a payment system is believed to be one of the main factors contributing to a low adoption rate. This study aimed to identify young adults' perceptions of e-payment risk and their behaviour towards different payment methods. For data collection, survey questionnaires were distributed to students from tertiary institutions in a metropolitan city in Malaysia as the study sample. The findings showed a significant difference in perceived risk between cash and e-payment, but a less significant one in terms of volume of purchase. We discuss the implications of the findings for service providers and policy makers and offer some recommendations to improve the quality of e-payment systems. Finally, the limitations of the study and future directions of research are discussed.

Keywords: E-commerce, payment systems, risks, cash, payment cards


Issue 2, Volume 10, February 2013


Title of the Paper: Design of UAV Flight Simulation Software Based on Simulation Training Method

Authors: Chao Yun, Xiaomin Li

Abstract: UAVs (unmanned aerial vehicles) have been widely used in both military and civilian fields in recent years. With the development of UAVs, high-quality ground training for UAV operators is necessary and urgent, and the fidelity of the simulation training system is very important: a high-fidelity training system can simulate the in-flight environment, and the flight simulation software at the core of the training simulator system must be faithful and accurate to ensure the reliability and efficiency of the simulation. The study of the structure, composition, and function of the UAV system is the premise of this work; in this paper, object-oriented flight simulation software is proposed based on the actual training of UAVs. This research is necessary for the development of UAV training simulator systems.

Keywords: UAV (Unmanned aerial vehicle), Simulation training, Flying simulation software, Object oriented, Dynamic simulation


Title of the Paper: Applying an Integrated Fuzzy MCDM Method to Select Hub Location for Global Shipping Carrier-Based Logistics Service Providers

Authors: Ji-Feng Ding

Abstract: A good hub centre effectively helps expand agglomeration economy effects and increase competitive advantages for global shipping carrier-based logistics service providers (GSLPs). Hence, the selection of the hub location of a logistics centre is very important for GSLP companies. In light of this, the main purpose of this paper is to develop an integrated fuzzy MCDM model to evaluate the best hub location for GSLPs. First, some concepts and methods used to develop the proposed model are briefly introduced. Then, we develop an integrated fuzzy MCDM method. Finally, a step-by-step example illustrates the computational process of the proposed fuzzy MCDM model, showing that the proposed approach accomplishes our goal.

Keywords: Hub location, Shipping, Logistics, Service provider, Fuzzy, MCDM
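
As background for readers unfamiliar with fuzzy MCDM, the sketch below shows the kind of triangular-fuzzy-number arithmetic such models typically rest on; the weights, ratings, and the simple weighted aggregation are illustrative assumptions, not values or formulas from the paper.

```python
# Triangular fuzzy number (l, m, u) arithmetic commonly used in fuzzy
# AHP/MCDM, with centroid defuzzification to obtain a crisp score.

def tfn_add(a, b):
    return (a[0] + b[0], a[1] + b[1], a[2] + b[2])

def tfn_mul(a, b):
    # Approximate product for positive TFNs.
    return (a[0] * b[0], a[1] * b[1], a[2] * b[2])

def defuzzify(a):
    # Centroid of a triangular membership function.
    return (a[0] + a[1] + a[2]) / 3.0

# Aggregate two weighted fuzzy ratings of a candidate hub location.
w1, r1 = (0.2, 0.3, 0.4), (5.0, 7.0, 9.0)   # weight and rating as TFNs
w2, r2 = (0.1, 0.2, 0.3), (3.0, 5.0, 7.0)
score = tfn_add(tfn_mul(w1, r1), tfn_mul(w2, r2))
print(defuzzify(score))
```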


Title of the Paper: Analysis of Unmanned Aerial Vehicle MIMO Channel Capacity Based on Aircraft Attitude

Authors: Xijun Gao, Zili Chen, Yongjiang Hu

Abstract: In this paper, the demands of Unmanned Aerial Vehicle Multi-Input Multi-Output (UAV-MIMO) communication are taken into consideration, and a four-element circular antenna array is adopted on the UAV. In order to analyze the UAV-MIMO communication system, a uniform coordinate system is proposed, and a three-dimensional geometrically based single-bounce cylinder model (3D-GBSBCM) of the UAV-MIMO channel, with four transmitting antennas and two receiving antennas, is put forward. Channel matrix factorization and channel coefficient normalization are used to derive the average channel correlation matrix of the UAV-MIMO system. The influences of UAV attitude parameters on UAV-MIMO channel capacity are then simulated and analyzed. The simulation and analysis results have good reference and application value for improving UAV-MIMO system capacity by changing the attitude parameters of the UAV.

Keywords: UAV-MIMO, Circular Antenna Array, Attitude Change, Channel Capacity
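
For context, the standard MIMO capacity formula C = log2 det(I + (SNR/Nt) H H^H) that such capacity analyses build on can be computed as below; the i.i.d. Rayleigh channel used here is an illustrative assumption, not the paper's 3D-GBSBCM model.

```python
# Standard MIMO capacity C = log2 det(I + (SNR/Nt) * H H^H) for a channel
# matrix H of shape (Nr, Nt); illustrative, not the paper's channel model.
import numpy as np

def mimo_capacity(H, snr_linear):
    nr, nt = H.shape
    hh = H @ H.conj().T
    return np.real(np.log2(np.linalg.det(np.eye(nr) + (snr_linear / nt) * hh)))

rng = np.random.default_rng(0)
# Rayleigh channel, 2 receive / 4 transmit antennas as in the abstract.
H = (rng.standard_normal((2, 4)) + 1j * rng.standard_normal((2, 4))) / np.sqrt(2)
print(mimo_capacity(H, snr_linear=10.0))  # bits/s/Hz at 10 dB SNR (linear 10)
```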


Issue 3, Volume 10, March 2013


Title of the Paper: A Novel Algorithm for SOA-based Cross-Layer QoS Support in Multiservice Heterogeneous Environments

Authors: Gordana Gardasevic, Dejan Stjepanovic, Aleksandar Damljanovic, Dejan Cvijanovic

Abstract: There are many challenging issues involved in the design of contemporary networking systems. The ultimate goal is to provide end-to-end Quality of Service (QoS) support in multidomain and multiservice interactive environments. This paper proposes an algorithmic framework for cross-layer QoS adaptation in multiservice heterogeneous environments (MHEs) based on Service-Oriented Architecture (SOA) principles. The algorithm takes into account the diversity of user/application requirements, networking technologies, and terminal and provider capabilities. Service description and reasoning rules are implemented using the Web Ontology Language (OWL) and the Semantic Web Rule Language (SWRL). The paper also discusses issues relevant to an overall QoS implementation in MHEs.

Keywords: Multiservice Heterogeneous Environments; Quality of Service; Cross-Layer; Service-Oriented Architecture; Adaptive; Ontology


Title of the Paper: High Precision Cohesion Metric

Authors: N. Kayarvizhy, S. Kanmani, R.V. Uthariaraj

Abstract: Metrics have been used to measure many attributes of software. For object-oriented software, cohesion indicates the level of binding among the elements of a class. High cohesion is one of the desirable properties of a good object-oriented design: a highly cohesive class is less prone to faults and is easier to develop and maintain. Several object-oriented cohesion metrics have been proposed in the literature. These metrics have provided valuable insight into the mechanism of cohesion and how best to capture it; however, they suffer from certain criticisms. In this paper, we propose a new cohesion metric named the High Precision Cohesion Metric (HPCM), which addresses the drawbacks of existing cohesion metrics. We introduce two novel concepts, link strength and the average attributes used in a class, and apply them to arrive at the proposed metric. The metric is presented using the unified cohesion framework to avoid ambiguity and is validated theoretically following the approach suggested in the unified framework for cohesion metrics. Empirical validation is also carried out using data from open-source Java projects. The newly proposed metric overcomes the shortfalls of earlier cohesion metrics.

Keywords: Object Oriented Metrics, Cohesion, High Precision, Link Strength
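
The abstract does not give HPCM's formula, so as a baseline illustration of the metric family it refines, the sketch below computes a classic attribute-sharing cohesion score: the fraction of method pairs in a class that use at least one common attribute. This is not HPCM itself.

```python
# Classic attribute-sharing cohesion: fraction of method pairs in a class
# that use at least one common attribute. Illustrative baseline only; the
# paper's HPCM refines this family of metrics with link strength.
from itertools import combinations

def cohesion(methods):
    # methods: dict mapping method name -> set of attributes it uses
    pairs = list(combinations(methods.values(), 2))
    if not pairs:
        return 1.0
    shared = sum(1 for a, b in pairs if a & b)
    return shared / len(pairs)

cls = {"open": {"path", "mode"}, "read": {"path", "buf"}, "close": {"path"}}
print(cohesion(cls))  # 1.0: every method pair shares the 'path' attribute
```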


Title of the Paper: An Evaluation of Teaching Performance: The Fuzzy AHP and Comprehensive Evaluation Approach

Authors: Quang Hung Do, Jeng-Fung Chen

Abstract: Teaching performance evaluation is one of the main instruments for improving teaching quality and plays an important role in strengthening the management of higher education institutions. In this paper, we present a framework for teaching performance evaluation based on the fuzzy AHP and fuzzy comprehensive evaluation methods. First, the teaching performance index system is determined, and the factor and sub-factor weights are calculated by the fuzzy AHP method. Employing the fuzzy AHP in group decision making facilitates consensus among decision-makers and reduces uncertainty. Fuzzy comprehensive evaluation is then employed to evaluate teaching performance. A case application illustrates the proposed framework, whose application can yield a scientific and objective evaluation result. It is expected that this work may serve as a tool that helps educational institution managers improve teaching quality.

Keywords: Evaluation, Higher education, Fuzzy AHP, Fuzzy comprehensive evaluation, Fuzzy theory, Teaching performance
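
A minimal sketch of the fuzzy comprehensive evaluation step the abstract pairs with fuzzy AHP: factor weights are composed with a membership matrix over rating grades. All numbers and grade names are illustrative assumptions.

```python
# Fuzzy comprehensive evaluation: weights W (e.g., from fuzzy AHP) are
# composed with a membership matrix R (factors x grades) to give a
# membership degree for each grade.
import numpy as np

W = np.array([0.5, 0.3, 0.2])           # factor weights, sum to 1
R = np.array([[0.6, 0.3, 0.1],          # each row: a factor's membership in
              [0.2, 0.5, 0.3],          # the grades (good, fair, poor)
              [0.1, 0.4, 0.5]])
B = W @ R                               # weighted-average composition M(*,+)
grades = ["good", "fair", "poor"]
print(dict(zip(grades, B.round(3))))
print(grades[int(np.argmax(B))])        # grade with maximum membership
```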


Issue 4, Volume 10, April 2013


Title of the Paper: A New Supervised Dimensionality Reduction Algorithm Using Linear Discriminant Analysis and Locality Preserving Projection

Authors: Di Zhang, Yun Zhao, Minghui Du

Abstract: Linear discriminant analysis (LDA) is one of the most popular supervised dimensionality reduction (DR) techniques in computer vision, machine learning, and pattern classification. However, LDA captures only the global geometrical structure of the data and ignores the geometrical variation of local data points within the same class. In this paper, a new supervised DR algorithm called local intraclass geometrical variation preserving LDA (LIPLDA) is proposed. More specifically, LIPLDA first casts LDA as a least squares problem and then explicitly incorporates the local intraclass geometrical variation into the least squares formulation via a regularization technique. We also show that the proposed algorithm extends to non-linear DR scenarios by applying the kernel trick. Experimental results on four image databases demonstrate the effectiveness of our algorithm.

Keywords: Dimensionality reduction, locality preserving projection, linear discriminant analysis, pattern classification
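
For reference, classical LDA can be computed from within-class and between-class scatter matrices as sketched below; LIPLDA's least-squares recasting and locality regularizer are not reproduced here.

```python
# Classical LDA via within/between-class scatter matrices; LIPLDA adds a
# locality-preserving regularizer on top of this formulation.
import numpy as np

def lda(X, y, n_components):
    # X: (n_samples, n_features), y: integer class labels
    mean = X.mean(axis=0)
    Sw = np.zeros((X.shape[1], X.shape[1]))
    Sb = np.zeros_like(Sw)
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        d = (mc - mean).reshape(-1, 1)
        Sb += len(Xc) * (d @ d.T)
    # Solve Sw^-1 Sb w = lambda w and keep the leading eigenvectors.
    evals, evecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
    idx = np.argsort(evals.real)[::-1][:n_components]
    return evecs.real[:, idx]

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (20, 5)), rng.normal(3, 1, (20, 5))])
y = np.array([0] * 20 + [1] * 20)
W = lda(X, y, 1)          # projection matrix
Z = X @ W                 # 1-D discriminant projection
```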


Title of the Paper: Toward Real-world Activity Recognition: An SVM Based System Using Fuzzy Directional Features

Authors: Samy Sadek, Ayoub Al-Hamadi, Bernd Michaelis

Abstract: Despite their attractive properties of invariance, robustness, and reliability, fuzzy directional features have hitherto not received the attention they deserve in the activity recognition literature. In this paper, we adopt an innovative approach to activity recognition in real-world scenes, in which a new fuzzy motion descriptor is developed to model activities as time series of fuzzy directional features. A set of one-vs.-all SVM classifiers is trained on these features for activity classification. When evaluated on our dataset (the IESK action dataset), which incorporates a large and diverse collection of realistic video data, the proposed approach yields encouraging results that compare very favorably with those reported in the literature, while maintaining real-time performance.

Keywords: Human activity recognition, fuzzy directional features, one-vs.-all SVM, video interpretation


Title of the Paper: Partitioning of State-Space with the Sum-of-Digits and Hashing with Reduced Number of Address Collision

Authors: Eleazar Jimenez Serrano

Abstract: We present a method for partitioning the state-space based on the sum of digits in order to conduct parallel state-space exploration and hashing. The method has a configuration for storing the universe of the state-space in multiple hash tables while generating a reduced number of address collisions. This paper presents the partitioning result, the number of address collisions generated by the proposed hashing method, the efficiency of the reduction in address collisions, an estimate of the number of address collisions, and the probability that our estimate of the required sizes of the multiple hash tables does not fit the real sizes.

Keywords: Partition, Hash, State-space, Petri nets
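
The core idea can be sketched as follows: the digit sum of a state's integer encoding selects one of several hash tables, so disjoint partitions can be explored and stored in parallel. The encoding and table count below are illustrative assumptions, not the paper's configuration.

```python
# Sum-of-digits partitioning sketch: each state (encoded as an integer)
# is assigned to one of several hash tables by the digit sum of its
# encoding, so workers can explore partitions in parallel.

def digit_sum(n):
    s = 0
    while n:
        s, n = s + n % 10, n // 10
    return s

class PartitionedStore:
    def __init__(self, n_tables):
        self.tables = [set() for _ in range(n_tables)]

    def add(self, state):
        # state: integer encoding of a marking/state vector
        t = self.tables[digit_sum(state) % len(self.tables)]
        if state in t:
            return False          # already explored
        t.add(state)
        return True               # newly discovered state

store = PartitionedStore(n_tables=8)
for s in (1024, 2048, 1024):
    print(store.add(s))           # True, True, False
```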


Issue 5, Volume 10, May 2013


Title of the Paper: A New Approach for Knowledge Management and Optimization using an Open Source Repository

Authors: Giulio Concas, Filippo Eros Pani, Maria Ilaria Lunesu

Abstract: Institutional Repositories (IRs) based on Open Archives represent one of the main free-access tools for the results of scientific research, and their diffusion is continuously growing. In the context of the “Analytic Sound Archive of Sardinia” project, which aims to create an institutional archive with a linguistically annotated electronic corpus, this work proposes a new approach to knowledge management using DSpace, an open-source software package developed in 2000 as a joint project of the Massachusetts Institute of Technology and Hewlett-Packard. The purpose is to offer an original way to associate linguistic annotations (information attached to specific text portions) with the corpus by treating them as metadata, so that they can be inserted and managed in the chosen archive after being formalized in XML. By formalizing linguistic annotation as a structured metadata schema, this approach allows effective text retrieval and easy, quick corpus interrogation. Such a tool is needed to classify and store the vast amount of knowledge contained in an electronic corpus of spoken Sardinian texts, linguistically annotated at various levels, while offering high usability in terms of ease of reference, query, and communication.

Keywords: Knowledge Management, Open Archive, Institutional Repository, Multimedia content, Metadata


Title of the Paper: An Effective Visualization Method for Large-Scale Terrain Dataset

Authors: Hang Qiu, Lei-Ting Chen, Guo-Ping Qiu, Hao Yang

Abstract: Visualization of very large digital elevation models plays an important role in many applications, ranging from computer games to professional Geographic Information Systems (GIS). However, terrain datasets are much larger than the capacity of main memory, which poses serious challenges for the interactive visualization of massive terrain. In this paper, an effective visualization method for large-scale terrain is proposed. In the preprocessing stage, the full-resolution terrain dataset is divided into several uniform blocks. For each block, an LOD hierarchy containing elevation data at different resolutions is constructed by quadtree splitting. In order to save external memory, a non-redundant storage method is put forward: unlike the traditional pyramid model, we store only the newly added vertices at each level. The data quantity during real-time rendering is effectively decreased by sight-line-dependent screen error computation, view frustum culling, and a detailed culling scheme. To ensure the continuity of navigation, a preloading scheme based on view frustum extension is presented and implemented. Experiments show that, compared with traditional methods, our method reduces the external memory cost by 60% and achieves good visual effects at a high rendering speed.

Keywords: Levels of Detail (LOD), Model Error, Static Error, Multi-resolution model, Quadtree


Title of the Paper: Optimized Routing for Sensor Networks using Wireless Broadcast Advantage

Authors: Jasmine Norman

Abstract: Routing in wireless sensor networks is a challenging task. Barring a few applications, these networks coexist with other wired and wireless networks. Proactive protocols do not fare well because the topology changes continuously due to the mobility of the nodes. Most reactive routing protocols follow a request-reply pattern to establish a path and involve only sensors; these protocols deplete energy through control packets and have high latency. In this paper I propose a scheme that eliminates the need for control packets by exploiting the wireless broadcast advantage. The proposed model is also highly energy efficient when it coexists with other networks.

Keywords: WSN, Routing, Wireless Broadcast Advantage, Energy efficient


Issue 6, Volume 10, June 2013


Title of the Paper: Methodology and Toolset for Model Verification, Hardware/Software Co-Simulation, Performance Optimisation and Customisable Source-Code Generation

Authors: M. Berger, J. Soler, L. Brewka, H. Yu, M. Tsagkaropoulos, Y. Leclerc, C. Olma

Abstract: The MODUS project aims to provide a pragmatic and viable solution that will allow SMEs to substantially improve their positioning in the embedded-systems development market. The MODUS toolset will provide a model verification and hardware/software co-simulation tool (TRIAL) and a performance optimisation and customisable source-code generation tool (TUNE). The concept centres on the automated modelling and optimisation of embedded-systems development. The toolset will enable model verification by guiding the selection of existing open-source model verification engines, based on automated analysis of system properties, and by producing the inputs to be fed into these engines; interfacing with standard (SystemC) simulation platforms for HW/SW co-simulation; customisable source-code generation that respects coding standards and conventions; and software performance-tuning optimisation through automated design transformations.

Keywords: Model Verification, HW/SW co-simulation, customisable code generation, SW optimization


Title of the Paper: Person Authentication System with Quality Analysis of Multimodal Biometrics

Authors: A. Annis Fathima, S. Vasuhi, Teena Mary Treesa, N. T. Naresh Babu, V. Vaidehi

Abstract: This work is aimed at developing a multi-modal, multi-sensor Person Authentication System (PAS) using the JDL model. The research investigates the necessity of multiple sensors, multiple recognition algorithms, and multiple levels of fusion, and their efficiency, for a PAS with face, fingerprint, and iris biometrics. Multiple modalities address the issues of non-universality encountered by unimodal systems. The PAS can aptly be called ‘smart’, since several environmental factors have been considered in the design. If one sensor is not functional, the others contribute to the system, making it fault-tolerant. Smartness is administered to the processing module by employing different efficient algorithms for a given modality, with selection of the recognition algorithms rooted in the attributes of the input. Multiplicity is employed to establish a unanimous decision, and information fusion is introduced at various levels: sensor-level fusion, local decision-level fusion at the algorithmic level, and global decision-level fusion provide the right inference. A multitude of decisions are fused locally to decide the weight for a particular modality; algorithms are tagged with weights based on their recognition accuracy, and weights are assigned to sensors based on their identification accuracy. Adaptability is incorporated by modifying the weights according to the environmental conditions. All local decisions are then combined into a global decision, and the final aggregation concludes whether the person is authenticated or not.

Keywords: Biometric; Image quality; Fusion; Multi-modal; Multi-sensor


Title of the Paper: Extended Asynchronous SN P Systems for Solving Sentiment Clustering of Customer Reviews in E-commerce Websites

Authors: Xue Bai, Xiyu Liu

Abstract: Customer reviews of goods sold on shopping sites are pivotal factors affecting potential purchasing behavior. However, some traditional classification methods are too simplistic to take buyers' feelings and emotions fully into consideration. In this paper, an extended asynchronous spiking neural P system with local synchronization and weighted synapses is proposed to achieve sentiment clustering of Chinese customer reviews based on a graph representation. Each sentiment word in a review is regarded as one unique node of a senti-graph, and the extended SN P system carries out the clustering algorithm by taking each senti-graph as a vertex in the 2D scheme and using the sentiment similarity between senti-graphs to measure the edge weight. Neurons in the system are divided into four main sets and several subsets, with rules applied asynchronously among different sets but synchronously within the same set. The computational complexity is limited to O(n²) in the worst case and optimized to O(n) in the best case. A case study shows the computing effectiveness of the approach; customer feedback clustered by emotional orientation can provide a better understanding of customer feelings and product features.

Keywords: Membrane computing, Spiking neural P system, Sentiment clustering, Graph representation, Customer review


Issue 7, Volume 10, July 2013


Title of the Paper: Privacy Preservation and Enhanced Utility in Search Log Publishing

Authors: S. Belinsha, A. P. V. Raghavendra

Abstract: Search engines are widely used by web users, and search engine companies are concerned with producing the best search results. Search logs record the interactions between users and the search engine. Various search patterns and user behaviours can be analyzed from these logs, which helps to enhance search results; however, publishing these logs to third parties for analysis raises a privacy issue. The Zealous algorithm, which filters out all but the frequent search items in the log, loses utility in the course of providing privacy. The proposed Confess algorithm extends this work by qualifying the infrequent search items in the log, which increases the utility of the published search log while preserving privacy. Confess involves qualifying the infrequent keywords and URL clicks in the search log and publishing them along with the frequent items.

Keywords: Information service, privacy, infrequent items, threshold
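
A frequency-threshold filter in the spirit of the Zealous step can be sketched as below; the distinct-user criterion and threshold are illustrative assumptions, and the proposed Confess qualification of infrequent items is not reproduced here.

```python
# Frequency-threshold filtering sketch: keep a search item only if enough
# distinct users issued it. The paper's Confess algorithm additionally
# qualifies selected infrequent items before publication.
from collections import defaultdict

def filter_log(records, threshold):
    # records: iterable of (user_id, item) pairs, item = keyword or URL click
    users_per_item = defaultdict(set)
    for user, item in records:
        users_per_item[item].add(user)
    return {i: len(u) for i, u in users_per_item.items() if len(u) >= threshold}

log = [(1, "flu symptoms"), (2, "flu symptoms"), (3, "flu symptoms"),
       (4, "rare disease x"), (1, "flu symptoms")]
print(filter_log(log, threshold=3))  # only the frequent query survives
```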


Title of the Paper: A New Hybrid Algorithm for the Multiple-Choice Multi-Dimensional Knapsack Problem

Authors: Mahmoud Zennaki

Abstract: In this paper, we approximately solve the multiple-choice multi-dimensional knapsack problem. We propose a hybrid algorithm based on the branch-and-bound method and Pareto-algebraic operations. The algorithm starts from an initial solution and then combines the groups of the problem instance one by one to generate partial solutions at each iteration. Most of these partial solutions are discarded by Pareto dominance and a bounding process, leading in the end to optimality, or to near-optimality when only a subset of partial solutions is maintained at each step. Furthermore, a rounding procedure is introduced to improve the bounding process by generating high-quality feasible solutions during algorithm execution. The performance of the proposed heuristic has been evaluated on several problem instances, and encouraging results have been obtained.

Keywords: Combinatorial optimization, heuristics, knapsacks, branch and bound
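
The Pareto-dominance pruning at the heart of such hybrid schemes can be sketched as follows; the (profit, weights) encoding is an illustrative assumption, and the full branch-and-bound machinery is omitted.

```python
# Pareto-dominance pruning for partial knapsack solutions: a candidate with
# no higher profit and no smaller weight in every dimension is discarded.
# Sketch of the pruning step only, not the full hybrid algorithm.

def dominates(a, b):
    # a, b: (profit, weights); a dominates b if a has >= profit and
    # <= weight in every dimension, with at least one strict inequality.
    pa, wa = a
    pb, wb = b
    ge = pa >= pb and all(x <= y for x, y in zip(wa, wb))
    strict = pa > pb or any(x < y for x, y in zip(wa, wb))
    return ge and strict

def pareto_filter(candidates):
    return [c for c in candidates
            if not any(dominates(o, c) for o in candidates if o is not c)]

parts = [(10, (4, 3)), (8, (5, 4)), (12, (4, 3)), (9, (2, 1))]
print(pareto_filter(parts))   # (10,(4,3)) and (8,(5,4)) are dominated
```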


Title of the Paper: Efficient Framework Architecture for Improved Tuning Time and Normalized Tuning Time

Authors: J. Andrews, T. Sasikala

Abstract: Improving the performance of a program by finding and applying the best set of optimization techniques among those available in the GCC compiler is a non-trivial task. The GCC compiler provides three different levels of optimization, and individual techniques can be turned on or off. Turning on all the techniques for an application may degrade performance and increase compilation time, and the best selection depends on the program domain, the compiler settings, and the system architecture. The objective is to find the best set of techniques from the various options provided by the GCC compiler. The framework should handle the new techniques added with each release and be capable of finding the best set of optimization techniques for any program provided as input. Since there are many techniques, finding the best set manually is not feasible, so the selection process is automated in order to minimize execution time, compilation time, tuning time, and normalized tuning time. Existing algorithms such as Combined Elimination, Batch Elimination, Branch and Bound, and Optimality Random Search are modified, and the results are compared with the newly developed Push and Pop Optimal Search algorithm. We implemented the algorithms on both a Pentium IV machine and a Core 2 Duo machine, measuring the performance of MiBench benchmark programs under a set of 65 GCC compiler options. The Push and Pop algorithm is found to perform better than the other algorithms.

Keywords: Optimization; Execution Time; Orchestration Algorithms; GCC Compiler Options; Benchmark Applications


Issue 8, Volume 10, August 2013


Title of the Paper: Publishing RDF from Relational Database Based on D2R Improvement

Authors: Ying Chen, Xiaoming Zhao, Shiqing Zhang

Abstract: As a key technology for implementing the Semantic Web, linked data has gradually become an academic and industrial concern. Linked data is a practice for publishing and connecting structured data on the web; its goal is to enable people to share structured data on the web as easily as they can share documents today. On a web of data structured with linked data, users can jump from one dataset to another; compared with the web of documents, which lets one jump from one document to another, the data web provides many more links and carries semantics. Mapping relational databases to RDF is a fundamental problem for the development of linked data. This paper first improves D2R, an open-source tool for publishing linked data, with R2RML, and then takes as an example data concerning college enrollment in Sichuan, China in 2011, demonstrating that the new approach can publish relational data as RDF linked data more efficiently and make it browsable.

Keywords: Semantic web; relational database; RDF; D2R; R2RML; linked data
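
As a minimal illustration of the relational-to-RDF mapping problem that D2R and R2RML address declaratively, the following sketch uses rdflib; the table, columns, namespace, and data are hypothetical.

```python
# Minimal relational-to-RDF mapping with rdflib (table and column names are
# hypothetical); D2R/R2RML generalize this idea with declarative mappings.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF

EX = Namespace("http://example.org/enrollment/")
g = Graph()

rows = [  # pretend result of: SELECT id, name, score FROM enrollment
    (1, "Li Wei", 612),
    (2, "Zhang Min", 598),
]
for rid, name, score in rows:
    s = EX[f"student/{rid}"]            # mint a URI per primary key
    g.add((s, RDF.type, EX.Student))
    g.add((s, EX.name, Literal(name)))
    g.add((s, EX.score, Literal(score)))

print(g.serialize(format="turtle"))     # browsable, linkable RDF output
```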


Title of the Paper: An Optimization of a Planning Information System Using Fuzzy Inference System and Adaptive Neuro-Fuzzy Inference System

Authors: Bikram Pal Kaur, Himanshu Aggrarwal

Abstract: This paper designs a Mamdani Fuzzy Inference System (FIS) and a Sugeno Adaptive Neuro-Fuzzy Inference System (ANFIS) model for the development of an effective information system. A comparative study of the two systems, ascertained after testing the models, showed that the ANFIS model performs better than the FIS [6]. A Sugeno-type ANFIS has the advantage that it integrates with neural networks, genetic algorithms, or other optimization techniques, so an IS planning model using ANFIS can adapt to individual user inputs and environments. In this work, the datasets loaded into the FIS and ANFIS contain the responses of 86 managers regarding the factors responsible for the success, challenge, and failure of an information system. A Fuzzy Inference System was designed whose input fields are the various factors under three sub-dimensions (strategic planning, top management, and IS infrastructure) responsible for information system planning, and whose output fields are the success, challenge, or failure of IS planning. In the ANFIS model for IS planning, input values of the same planning factors determine the success, challenge, or failure of IS planning. Comparing the two systems after design and testing shows that the ANFIS results are better: for the IS planning output fields, the ANFIS training, testing, and checking errors are 0.0204, 0.4732, and 0.27607, whereas the FIS yields an average error of 4.87.

Keywords: Information System (IS), Critical Success Factors (CSFs), Critical Failure Factors (CFFs), ANFIS, FIS, Planning Information System


Title of the Paper: Determining the Type of Long Bone Fractures in X-Ray Images

Authors: Mahmoud Al-Ayyoub, Duha Al-Zghool

Abstract: Computer-aided diagnosis is a very active field of research. Specifically, using medical images to generate a quick and accurate diagnosis can save time, effort, and cost, as well as reduce errors. Previous works have considered the problem of detecting the existence of fractures in long bones using x-ray images. In addition to the existence of fractures, this paper considers the problem of determining the fracture type; to the best of our knowledge, ours is the first work to address this problem. After preprocessing the images, we extract distinguishing features and use them with different classification algorithms to detect the existence of a fracture along with its type (if one exists). The experiments we conduct show that the proposed system is very accurate and efficient.

Keywords: X-Ray Images, Long Bone Fracture, Machine Learning, Image Processing


Issue 9, Volume 10, September 2013


Title of the Paper: Cache Access Pattern Based Algorithm for Performance Improvement of Cache Memory Management

Authors: Reetu Gupta, Urmila Shrawankar

Abstract: Cache performance is usually improved by changes in cache size or architecture, but a single replacement policy cannot adapt to changing workloads, and non-detection-based policies cannot exploit reference regularities, suffering from cache pollution and thrashing. Cache Access Pattern (CAP) is a policy that detects patterns, at the file and program-context level, in the references issued to buffer cache blocks. Precise identification is achieved because frequently and repeatedly accessed patterns are distinguished through reference recency. The cache is partitioned, with each sub-cache holding the blocks for an identified pattern: blocks accessed only once are not stored, repeatedly identified patterns are managed by MRU, and frequently identified and unidentified patterns are managed by ARC. Future block references are predicted from the detected patterns. This improves the hit ratio, which in turn reduces the time spent in I/O and the overall execution time.

Keywords: Access pattern, program counter, reference recency, reuse distance, buffer cache, reference regularities, replacement policies


Title of the Paper: IT-Integrated Design Collaboration Engagement Model for Interface Innovations

Authors: Naeimeh Delavari, Rahinah Ibrahim, Normahdiah S. Said, Muhamad Taufik Abdullah

Abstract: Due to the advent of computer-based technologies, data sharing and communication in collaborative design environments have significantly improved. However, current systems' interfaces do not support mutual understanding among professional design team members, which contributes to increasing levels of miscommunication and poorer understanding among design professionals in IT-integrated design collaboration processes. The purpose of this paper is to understand and nurture the engagement of design professionals within the building and construction sectors with computing system interfaces, so that they will actively participate in IT-integrated design collaborations. This study utilizes Grounded Theory Methodology to explicate our understanding of engaging design professionals. While control and feedback are common parameters of engagement in prior models, the study found the need to add functionality to engage design professionals in IT-integrated collaborative interfaces. These parameters would enable design professionals to mitigate the process rigidity caused by inefficient knowledge allocation, knowledge retrieval, and decision-making control, and would also mitigate inadequate user performance, which results in operational deficiency. Thus, the present IT-integrated design collaboration engagement model is expected to increase communication and collaboration among design professionals. With more user-friendly interfaces, design professionals will be motivated to adopt collaborative technologies, encouraging them to develop their proficiency for working in global collaborative projects.

Keywords: Engagement Model, Grounded Theory, IT-Integrated Design Collaboration, Human Computer Interaction, Human Computer Interface.


Title of the Paper: Experimental Analysis and Implementation of Bit Level Permutation Instructions for Embedded Security

Authors: Gaurav Bansod, Aman Gupta, Arunika Ghosh, Gajraj Bishnoi, Chitrangdha Sawhney, Harshita Ankit

Abstract: With the increasing use of electronic control units (ECUs) in automobiles and other embedded systems, security becomes an area of grave concern. Information is exchanged between ECUs over a CAN (Controller Area Network) bus and through vehicle-to-infrastructure (V2I) and vehicle-to-vehicle (V2V) communication. These interactions open a wide gateway for manipulating information, which could lead to disastrous results. EVITA, SEVECOM, and SHE are existing security models that address these concerns in automobiles, but at the cost of a large footprint area and higher power consumption, as they use cryptographic engines such as AES-128, ECC, and HMAC. We propose the use of the GRP (group operations) bit-level permutation in a cryptographic environment, which not only accelerates cryptography but also provides a low-cost security solution with good encryption standards, a relatively small footprint area, low cost, and low power consumption. The use of GRP in a cryptographic environment is a unique solution for all security applications where footprint area and power consumption are constraints. This paper shows an implementation of GRP in embedded C, over a CAN bus, on an ARM7 (LPC2129) and on an FPGA; it is the first successful attempt to produce a universal and optimized structure for GRP and its implementation. Countermeasures against side-channel attacks on GRP, such as differential power analysis (DPA), are also incorporated. This architecture, with its use of a bit permutation instruction, will pave a new way toward securing small-scale embedded systems.

Keywords: Security, Automobile, Embedded system, GRP, CAN bus, ECU
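
A software sketch of the GRP permutation follows; gathering the 1-controlled bits to the right is one common convention, and real implementations use O(log n) hardware stages or a dedicated instruction rather than this bit-list form.

```python
# Software sketch of the GRP (group) bit permutation: data bits whose
# control bit is 1 are gathered to one side, the rest to the other, each
# group keeping its relative order.

def grp(data_bits, control_bits):
    # data_bits, control_bits: equal-length lists of 0/1 (MSB first)
    left = [d for d, c in zip(data_bits, control_bits) if c == 0]
    right = [d for d, c in zip(data_bits, control_bits) if c == 1]
    return left + right

data = [1, 0, 1, 1, 0, 0, 1, 0]
ctrl = [0, 1, 0, 1, 1, 0, 0, 1]
print(grp(data, ctrl))  # [1, 1, 0, 1, 0, 1, 0, 0]
```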


Issue 10, Volume 10, October 2013


Title of the Paper: Generating C++ Log File Analyzers

Authors: Ilse Leal-Aulenbacher, James H. Andrews

Abstract: Software testing is a crucial part of the software development process because it helps developers ensure that software works correctly and according to stakeholders' requirements and specifications. Faulty or problematic software can cause huge financial losses; therefore, the automation of testing tasks can have a positive impact on software development by reducing costs and minimizing human error. To evaluate test results, testers need to examine the output of the software under test (SUT) to determine whether it performed as expected. Test oracles can be used to automate the evaluation of test results. However, it is not always easy to directly capture the inputs and outputs of a program. It is already common practice for developers to instrument their code so that important events are recorded in a log file; a good alternative is therefore to use test oracles capable of analyzing log files, known as log file analyzers. The work presented in this paper builds upon previous research by James H. Andrews, who proposed a framework in which log file analyzers are generated automatically from the SUT's expected behavior; these analyzers are used as test oracles to determine whether the log file reveals faults in the software. In this paper, we describe how we extended Andrews' log file analysis framework to incorporate new capabilities that make our log file analyzers more flexible and powerful. One of our main motivations was to improve performance: the original log file analyzers were based on the Prolog language, whereas our new analyzers are based on C++, which allowed us to take advantage of the object-oriented paradigm to add new features. We discuss those features and the experiments we performed to evaluate the performance of our analyzers. Our results show that the C++ analyzers are faster than their Prolog counterparts. We believe that log file analyzers have great potential in software testing, by making it easier for testers to automate the evaluation of test results.

Keywords: Software Testing, Test Oracles, Log File Analysis
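
To illustrate what a log file analyzer checks (the paper's analyzers are generated C++, not hand-written Python; this is only an illustration of the oracle concept), here is a tiny oracle that verifies a hypothetical open/close protocol in a log.

```python
# Tiny hand-written log-file analyzer used as a test oracle: it checks that
# every "open" event is eventually matched by a "close" for the same id.
import re

LINE = re.compile(r"(open|close)\s+(\w+)")

def analyze(lines):
    open_ids = set()
    for n, line in enumerate(lines, 1):
        m = LINE.search(line)
        if not m:
            continue                     # ignore unrelated log lines
        event, rid = m.groups()
        if event == "open":
            open_ids.add(rid)
        elif rid not in open_ids:
            return f"line {n}: close without open for {rid}"
        else:
            open_ids.remove(rid)
    return f"unclosed ids: {sorted(open_ids)}" if open_ids else "OK"

print(analyze(["open f1", "open f2", "close f1"]))  # unclosed ids: ['f2']
```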


Title of the Paper: Modified Group Decision Algorithm for Performance Appraisal of Indian Premier League Cricketers

Authors: Pabitra Kumar Dey, Dipendra Nath Ghosh, Abhoy Chand Mondal

Abstract: Multi-Criteria Decision Making (MCDM) is a major technique in the field of performance appraisal, and numerous MCDM procedures have been proposed over time to solve multi-criteria problems. Different methods may give different results on the same problem, which is a major weakness of MCDM. To overcome this, our proposed technique, Modified Group Decision Analysis (MGDA), plays the vital role. The Indian Premier League (IPL) T-20 cricket tournament dataset is used for applying MGDA: assessments of the players by four different MCDM techniques are taken as inputs to the group decision method, and the output produces a ranking of the players. The results show that the proposed model judges the players in a more realistic way and resolves the deficiency of the MCDM process.

Keywords: Group Decision, IPL, MCDM method, Performance Appraisal, Spearman Ranking


Title of the Paper: A Novel Algorithm for Source Localization Based on Nonnegative Matrix Factorization using αβ-Divergence in Cochleagram

Authors: M. E. Abd El Aziz, Wael Khedr

Abstract: In this paper, a localization framework based on nonnegative matrix factorization with the family of αβ-divergences and a cochleagram representation is introduced. The method provides accurate localization performance under very adverse acoustic conditions. The system consists of a three-stage analysis. In the first stage, source separation is carried out using NMF based on the αβ-divergence, with the decomposition performed in the cochleagram domain. In the second stage, the estimated mixing matrix is used to estimate the Time Difference of Arrival (TDOA). Finally, the TDOA estimates are exploited to localize the sources individually using the Scaled Conjugate Gradient (SCG) algorithm, which has advantages over other conjugate gradient algorithms. Experiments in real and simulated environments with different microphone setups in the 2-D plane and 3-D space are discussed, showing the validity of the proposed approach and comparing its performance with other state-of-the-art methods.

Keywords: Blind source separation (BSS), Nonnegative Matrix Factorization (NMF), αβ divergence, sound source localization (SSL)
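
For context, the basic factorization V ≈ WH underlying the separation stage can be sketched with the standard Frobenius-norm multiplicative updates below; the paper's αβ-divergence generalizes these update rules, and the random matrix stands in for a real cochleagram.

```python
# Standard NMF with Frobenius-norm multiplicative updates (Lee & Seung
# style). Sketch of the factorization V ~ W H only; the paper uses the
# more general alpha-beta divergence.
import numpy as np

def nmf(V, rank, iters=200, eps=1e-9):
    rng = np.random.default_rng(0)
    W = rng.random((V.shape[0], rank))
    H = rng.random((rank, V.shape[1]))
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

V = np.abs(np.random.default_rng(1).random((64, 100)))  # stand-in cochleagram
W, H = nmf(V, rank=4)
print(np.linalg.norm(V - W @ H) / np.linalg.norm(V))    # relative error
```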


Issue 11, Volume 10, November 2013


Title of the Paper: Design of Traditional/Hybrid Software Project Tracking Technique: State Space Approach

Authors: Manoj Kumar Tyagi, Srinivasan M., L. S. S. Reddy

Abstract: Software projects must be tracked during execution in order to control them. In the state space approach, the tracking problem leads us to a project state transition model and a project status model. A key factor in modeling software projects is to capture the uncertainty in the parameters of these two models. A traditional/hybrid software project tracking technique is formulated, modeled with the state space approach in plan-space and execution-space, and simulated using discrete event simulation. The uncertainty considered here is ontological, modeled as a normal distribution using an approximation method. The state space model consists of a project-state transition equation and a project-measurement equation in plan-space, and is formulated with the Monte Carlo method in execution-space. The developed state space model is used to track the status of a traditional/hybrid project executed with an iterative and incremental development process. Across Monte Carlo simulation runs, the results show the variation of the ontological uncertainty iteration by iteration, both individually and cumulatively, and the effect of the uncertainty on project status, by showing project status in execution-space and plan-space. The simulation also shows project completion occurring at some point during the last iteration.

Keywords: Ontological Uncertainty, Normal Distribution function, State-Space Model, Traditional Project, Hybrid Project, Monte Carlo Simulation, Discrete Event Simulation, Execution-Space, Plan-Space
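
A minimal sketch of the simulation pattern the abstract describes: planned progress follows a transition equation, execution adds normally distributed uncertainty, and Monte Carlo runs give the status spread. All coefficients are illustrative assumptions, not the paper's model.

```python
# Monte Carlo project-tracking sketch: progress evolves by a planned rate
# per iteration plus normally distributed (ontological) uncertainty.
import numpy as np

rng = np.random.default_rng(0)
iterations, runs = 10, 1000
planned_rate = 0.1                       # fraction of scope per iteration

progress = np.zeros(runs)
for _ in range(iterations):
    noise = rng.normal(0.0, 0.02, runs)  # per-run ontological uncertainty
    progress = np.clip(progress + planned_rate + noise, 0.0, 1.0)

# Project status spread across runs, and fraction completed on time.
print(progress.mean(), progress.std(), (progress >= 1.0).mean())
```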


Title of the Paper: Camera Self-Calibration with Varying Parameters from Two Views

Authors: Nabil El Akkad, Mostafa Merras, Abderrahim Saaidi, Khalid Satori

Abstract: This work presents a practical new approach to the self-calibration of cameras with varying parameters, using an unknown planar scene. We show that the different parameters of the cameras can be estimated from only two matches between two images of the planar scene. The strength of our method lies in minimizing the constraints on the self-calibration system: first, we estimate the parameters from only two images; second, we use only two matches between these images; and third, we allow cameras with varying parameters. The principal idea of our approach is to derive the relationship between two matches that have the best ZNCC correlation score and the relationship between the images of the absolute conic for the pair of images. These relations permit the formulation of a non-linear cost function whose resolution provides the intrinsic parameters of the cameras used. The robustness of our method, in terms of simplicity, stability, accuracy, and convergence, is shown by experimental results and simulations.

Keywords: Interest points, Matching, Homography, Self-Calibration, Varying parameters, Non-linear optimization
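
The ZNCC matching score the abstract relies on can be computed as below; the patches are synthetic, and the example only demonstrates ZNCC's invariance to affine intensity changes.

```python
# Zero-mean normalized cross-correlation (ZNCC) between two image patches;
# values close to 1 indicate a strong match.
import numpy as np

def zncc(a, b):
    a = a.astype(float) - a.mean()
    b = b.astype(float) - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom else 0.0

rng = np.random.default_rng(2)
p = rng.random((11, 11))
q = 0.5 * p + 0.2          # same pattern, different gain and offset
print(zncc(p, q))          # ~1.0: invariant to affine intensity change
```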


Title of the Paper: Certain Investigations on Drowsiness Alert system based on Heart Rate Variability using LabVIEW

Authors: R. Sukanesh, S. Vijayprasath

Abstract: Monitoring the cardiac activity of post-operative patients while driving is one of the emerging applications in biomedical research. The detection of QRS complexes in an electrocardiogram (ECG) signal provides information about the heart rate as well as various abnormalities such as hyperkalemia and cardiac hypertrophy. A system that detects drowsiness and impending heart attack in a driver in advance would prevent major accidents. The emphasis here is on QRS complex detection and the analysis of heart rate variability: we have performed an analysis for detecting drowsiness in post-operative patients through heart rate variability, using LabVIEW as a tool. Results with a lower heart rate are classified as fatigue and those with an increased heart rate as abnormality.

Keywords: Active filters, Electrocardiogram (ECG), Heart rate, Signal processing, Vehicle safety
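
A minimal sketch of the heart-rate estimation step such a system performs: R-peaks are picked as local maxima above a threshold with a refractory gap, and the heart rate follows from the R-R intervals. The threshold, sampling rate, and synthetic signal are illustrative assumptions, not the paper's LabVIEW pipeline.

```python
# Simple R-peak picking and heart-rate estimate from an ECG-like signal.
import numpy as np

def r_peaks(sig, fs, thresh, refractory=0.3):
    peaks, last = [], -10**9
    for i in range(1, len(sig) - 1):
        if sig[i] > thresh and sig[i] >= sig[i-1] and sig[i] >= sig[i+1]:
            if (i - last) / fs >= refractory:   # skip too-close detections
                peaks.append(i)
                last = i
    return np.array(peaks)

fs = 250                                  # sampling rate in Hz
ecg = np.zeros(10 * fs)
ecg[::fs] = 1.0                           # crude 60-BPM spike train
peaks = r_peaks(ecg, fs, thresh=0.5)
rr = np.diff(peaks) / fs                  # R-R intervals in seconds
print(60.0 / rr.mean())                   # heart rate in BPM -> ~60
```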


Issue 12, Volume 10, December 2013


Title of the Paper: Similarity Measurement Method between Two Songs by Using the Conditional Euclidean Distance

Authors: Min Woo Park, Eui Chul Lee

Abstract: Since numerous songs are being released at an increasing rate, clustering songs by genre has become more important for the audience's choice, and claims of plagiarism are continually being raised. For these reasons, similarity measurement between two songs is important. Although similarity measurement has been actively researched in the field of query by humming, previous works focused only on partial matching of the input humming. To solve this problem, we propose a novel similarity measurement method between two songs. The proposed method has several advantages over previous works. First, it can measure the overall similarity between two different songs. Second, the overall region of a song is represented as a 1-dimensional signal obtained by run-length representation of the 2-dimensional note information (pitch, duration). Third, by sequentially applying a median filter, an average filter, and Z-score normalization to the 1-dimensional signal, we obtain the overall flow of the song without noise features such as eccentric notes. Lastly, a new distance metric, the conditional Euclidean distance, is used, combining two distance concepts: the Euclidean distance and the Hamming distance. For a feasibility test, several famous songs by the Beatles and the MIREX'08 dataset were used in our experiments. By applying our method to a comparison of two songs involved in a plagiarism issue, we confirmed that a very high similarity score between the two songs was measured.

Keywords: Music similarity measurement, Music clustering, Music plagiarism
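
The smoothing-and-normalization pipeline named in the abstract (median filter, average filter, Z-score) can be sketched as follows; the plain Euclidean distance stands in for the paper's conditional Euclidean distance, whose Hamming-based condition is not reproduced, and the synthetic melody is an illustrative assumption.

```python
# Sketch of the pipeline: a 1-D note signal is median filtered, average
# filtered, Z-score normalized, and two songs are then compared with a
# plain Euclidean distance.
import numpy as np
from scipy.signal import medfilt

def preprocess(signal, med_k=5, avg_k=5):
    s = medfilt(signal, kernel_size=med_k)              # drop eccentric notes
    s = np.convolve(s, np.ones(avg_k) / avg_k, "same")  # average filter
    return (s - s.mean()) / s.std()                     # Z-score normalization

def distance(song_a, song_b):
    n = min(len(song_a), len(song_b))
    a, b = preprocess(song_a[:n]), preprocess(song_b[:n])
    return float(np.linalg.norm(a - b) / np.sqrt(n))

rng = np.random.default_rng(3)
melody = np.repeat(rng.integers(50, 70, 30), 4).astype(float)  # run-length style
print(distance(melody, melody + 2.0))  # ~0: transposition vanishes after Z-score
```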


Title of the Paper: An Improved Energy Efficient Intrusion Detection System for False Misbehavior Elimination in Wireless Sensor Networks

Authors: A. Babu Karuppiah, S. Rajaram

Abstract: A Wireless Sensor Network (WSN) consists of many sensor nodes with low cost and low power capability. The nature of a WSN makes it prone to security attacks and paves the way for attackers to easily eavesdrop on the network. One of the deadliest attacks is the sinkhole attack, which causes inexplicable destruction to the network: the intruder lures all the packets and drops them, ultimately disrupting the sensor network's functionality. The existing Watchdog mechanism consumes considerable energy to identify the sinkhole node in the network, and its trustworthiness is also debatable. In this paper, a novel algorithm is developed that improves the existing Watchdog monitoring system for intrusion detection, detecting falsely misbehaving nodes and eliminating them. The simulation results show that the malicious node is precisely eliminated. Moreover, the proposed method achieves a large percentage reduction in energy consumption, making the system more viable for WSNs.

Keywords: Wireless Sensor Networks, Intrusion Detection, Network life time, Energy efficient mechanism, Watchdog


Title of the Paper: A Comparative Study of Hierarchical ANFIS and ANN in Predicting Student Academic Performance

Authors: Quang Hung Do, Jeng-Fung Chen

Abstract: The accurate prediction of student academic performance is of importance to institutions as it provides valuable information for decision making in the admission process and enhances educational services by allocating customized assistance according to the predictions. The purpose of this study is to investigate the predictive ability of two models: the hierarchical ANFIS and ANN. We used previous exam results and other factors, such as the location of the student’s high school and the student’s gender, as input variables, and predicted the student’s expected performance. The simulation results of the two models were then discussed and analyzed. It was found that the hierarchical ANFIS model outperformed the ANN model in the prediction of student academic performance. These results show the potential of the hierarchical ANFIS model as a predictor. It is expected that this work may be used to assist in student admission procedures and strengthen the service system in educational institutions.

Keywords: Adaptive Neuro-Fuzzy Inference System, Artificial neural network, Prediction, Student academic performance, Higher education

